Computer underground Digest Wed Apr 06, 1994 Volume 6 : Issue 30
|
|
ISSN 1004-042X
|
|
|
|
Editors: Jim Thomas and Gordon Meyer (TK0JUT2@NIU.BITNET)
|
|
Archivist: Brendan Kehoe (He's Baaaack)
|
|
Acting Archivist: Stanton McCandlish
|
|
Shadow-Archivists: Dan Carosone / Paul Southworth
|
|
Ralph Sims / Jyrki Kuoppala
|
|
Ian Dickinson
|
|
Suspercollater: Shrdlu Nooseman
|
|
|
|
CONTENTS, #6.30 (Apr 06, 1994)
|
|
File 1--"Who Holds the Keys?" (CFP '94 Transcript)
|
|
|
|
Cu-Digest is a weekly electronic journal/newsletter. Subscriptions are
|
|
available at no cost electronically.
|
|
|
|
CuD is available as a Usenet newsgroup: comp.society.cu-digest
|
|
|
|
Or, to subscribe, send a one-line message: SUB CUDIGEST your name
|
|
Send it to LISTSERV@UIUCVMD.BITNET or LISTSERV@VMD.CSO.UIUC.EDU
|
|
The editors may be contacted by voice (815-753-0303), fax (815-753-6302)
|
|
or U.S. mail at: Jim Thomas, Department of Sociology, NIU, DeKalb, IL
|
|
60115, USA.
|
|
|
|
Issues of CuD can also be found in the Usenet comp.society.cu-digest
|
|
news group; on CompuServe in DL0 and DL4 of the IBMBBS SIG, DL1 of
|
|
LAWSIG, and DL1 of TELECOM; on GEnie in the PF*NPC RT
|
|
libraries and in the VIRUS/SECURITY library; from America Online in
|
|
the PC Telecom forum under "computing newsletters;"
|
|
On Delphi in the General Discussion database of the Internet SIG;
|
|
on RIPCO BBS (312) 528-5020 (and via Ripco on internet);
|
|
and on Rune Stone BBS (IIRGWHQ) (203) 832-8441.
|
|
CuD is also available via Fidonet File Request from
|
|
1:11/70; unlisted nodes and points welcome.
|
|
|
|
EUROPE: from the ComNet in LUXEMBOURG BBS (++352) 466893;
|
|
In ITALY: Bits against the Empire BBS: +39-461-980493
|
|
|
|
FTP: UNITED STATES: etext.archive.umich.edu (141.211.164.18) in /pub/CuD/
|
|
aql.gatech.edu (128.61.10.53) in /pub/eff/cud/
|
|
EUROPE: nic.funet.fi in pub/doc/cud/ (Finland)
|
|
|
|
ftp.warwick.ac.uk in pub/cud/ (United Kingdom)
|
|
|
|
COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
|
|
information among computerists and to the presentation and debate of
|
|
diverse views. CuD material may be reprinted for non-profit as long
|
|
as the source is cited. Authors hold a presumptive copyright, and
|
|
they should be contacted for reprint permission. It is assumed that
|
|
non-personal mail to the moderators may be reprinted unless otherwise
|
|
specified. Readers are encouraged to submit reasoned articles
|
|
relating to computer culture and communication. Articles are
|
|
preferred to short responses. Please avoid quoting previous posts
|
|
unless absolutely necessary.
|
|
|
|
DISCLAIMER: The views represented herein do not necessarily represent
|
|
the views of the moderators. Digest contributors assume all
|
|
responsibility for ensuring that articles submitted do not
|
|
violate copyright protections.
|
|
|
|
----------------------------------------------------------------------
|
|
|
|
Date: Sun, 27 Mar 1994 22:09:47 -0800
|
|
From: fen@IMAGINE.COMEDIA.COM(Fen Labalme)
|
|
Subject: File 1--"Who Holds the Keys?" (CFP '94 Transcript)
|
|
|
|
((MODERATORS' NOTE: Over the next few weeks, we'll try to include
|
|
occasional summaries or transcripts of CFP '94 sessions as they
|
|
become available))
|
|
|
|
Transcript of
|
|
|
|
DATA ENCRYPTION: WHO HOLDS THE KEYS? (Panel)
|
|
at the Fourth Conference on Computers, Freedom and Privacy
|
|
|
|
Chicago, Illinois, March 24, 1994
|
|
|
|
This is a verbatim transcript of the session on "Data Encryption;
|
|
Who Holds the Keys?" held at the Fourth Conference on Computers,
|
|
Freedom and Privacy in Chicago on March 24, 1994. The
|
|
transcription was done by an independent local transcription
|
|
agency. Light editing was done by CFP volunteers to resolve items
|
|
the agency could not be expected to have knowledge of (for example,
|
|
"technical" terms like "PGP"). "Did X *really* say U?" questions
|
|
can always be resolved by listening to the audiotape available as
|
|
tape JM414 from Teach'Em, 160 East Illinois St, Chicago, IL 60611,
|
|
1-800-225-3775, for $10 + $1 ($2 outside US) shipping and handling
|
|
+ 8.75% sales tax.
|
|
|
|
=================================================================
|
|
|
|
Welcome to this program from the John
|
|
Marshall Law School's fourth conference on computers, freedom and
|
|
privacy entitled, "Cyberspace Superhighways: Access, Ethics &
|
|
Control", held March 23rd through the 26th, 1994 at the Chicago
|
|
Palmer House Hilton.
|
|
On this cassette you will hear Data
|
|
Encryption -- who holds the keys? Now to our program.
|
|
|
|
BOB SMITH Willis Ware originally had been
|
|
slated to be moderator for this panel and Willis had a problem
|
|
and could not be with us and Robert Ellis Smith has agreed to fill
|
|
in and use his technology background to fill in for Willis. It
|
|
will take just a minute while we disengage from the T.V. hookup and
|
|
get back to the modern overhead projector.
|
|
My name is Bob Smith. I publish
|
|
Privacy Journal and actually I am moderating because Dave Banisar
|
|
did not want to be moderator. We will hear from the three
|
|
panelists with about three ten-minute presentations and then we
|
|
will open it up to questions.
|
|
The three ground rules for this
|
|
session: First, there will be no expansions of the metaphor of
|
|
highways. We will not talk about highway metaphors for the next
|
|
hour. Secondly, we will not accept as a defense that this issue is
|
|
too sensitive or too complicated for us to understand and that we
|
|
have to trust the government. And thirdly, a rule that I hope you
|
|
will make work. If you hear a point of jargon or a point of
|
|
technology that you don't understand, explanation -- not policy
|
|
disputes but if there is something you don't understand feel free
|
|
to raise your hand as a point of order. And if you can say it in
|
|
ten words or less like, I don't understand, we'll get you an
|
|
answer.
|
|
I think Senator Leahy provided a good
|
|
primer for cryptography and so I won't bother with that and we'll
|
|
get right into the nuts and bolts of this issue.
|
|
Our speakers are George Davida, who
|
|
is with the University of Wisconsin in Milwaukee and has been
|
|
involved in cryptography research for many years and was one of the
|
|
first academicians to feel the heavy hand of government in the
|
|
1980's in its effort to try to curtail research into cryptography.
|
|
That appears to be happening again in the 1990's so perhaps
|
|
Professor Davida can tell us something about his experiences
|
|
earlier on that same front.
|
|
Our second speaker will be Stuart
|
|
Baker, who is General Counsel of the National Security Agency. He
|
|
was a lawyer in private practice in Washington before joining NSA
|
|
and one of the things he promised to do is to tell us exactly what
|
|
NSA does and is because a lot of people don't know. It is
|
|
different from the National Security Council by the way.
|
|
Thirdly, our third speaker will be
|
|
David Banisar, who is with the Computer Professionals for Social
|
|
Responsibility office in Washington. He is trained as a lawyer and
|
|
has a background in computer science and has some strong feelings
|
|
about the cryptography debate.
|
|
We will now move to Professor Davida.
|
|
PROFESSOR DAVIDA I would like to talk about two issues
|
|
that concern me and I believe a number of people here. By the way,
|
|
I brought some copies of my paper in case you need one today. And
|
|
if I don't have enough you can always write to me at that address.
|
|
And I am also willing to put that on FTP for those of you who are
|
|
on Internet and you can pick up a poster file and print it if you
|
|
so wish.
|
|
As Robert said, in 1978 I had an
|
|
interesting experience with NSA. I was doing research at the time
|
|
in cryptography and one day I received a secrecy order by mail. It
|
|
was more or less like a postcard telling me that under the penalty
|
|
of three years in jail and $10,000 fine I am to talk to no one
|
|
about what I had done in that paper without reference to any
|
|
classified material.
|
|
At first my graduate student and I
|
|
laughed until we found out that it was deadly serious. We talked
|
|
to the Chancellor about it and he said, no way because in Wisconsin
|
|
there is a strong position of academic freedom and we are not
|
|
allowed actually to conduct research that's secret. So we decided
|
|
to resist the order and after a number of conversations between the
|
|
Chancellor and someone you might have heard about recently again,
|
|
Admiral Bobby Inman, and the then Commerce Secretary Juanita Kreps,
|
|
the order was lifted. But not before Admiral Inman tried to
|
|
convince the Chancellor that he should acquiesce to the order and
|
|
allow us to stay, but I am happy to say that the Chancellor said
|
|
that we could not put up with the order.
|
|
Shortly thereafter a group was formed
|
|
by the American Council on Education called Public Cryptography
|
|
Study Group, not to be confused with Public Key Cryptosystems. And
|
|
it is interesting that this group considered model legislation for
|
|
censorship at first. I objected to it rather vigorously and when
|
|
the press began to get involved in covering the meetings, they then
|
|
approved what they called voluntary prior restraint. I again
|
|
dissented from that report and the rest, as they say, is history.
|
|
Many people have asked, "why do you
|
|
oppose restraints?" Very simply, that privacy is just too important
|
|
to leave it just to agencies like NSA. I also felt that the ACE
|
|
recommendations were dangerous because they were later going to be
|
|
looked at as some kind of admission by allegedly knowledgeable
|
|
people that cryptography is an evil tool that will only be used by
|
|
terrorists and drug dealers. And it is interesting that Senator
|
|
Leahy himself refers to the struggle of the law enforcement with
|
|
crimes -- and I assume he is talking about drug dealers and what
|
|
have you. But someone should point out to him that they are not
|
|
using cryptography today so I don't know what the struggle is all
|
|
about. They may be struggling against criminals -- not because of
|
|
cryptography but simply because a crime is just a major problem.
|
|
I would also like to tell them that I don't think that the
|
|
intelligence agencies struggle when it comes to tapping ordinary
|
|
law abiding citizens. They do very well, thank you.
|
|
I also think that the realities are
|
|
very different because cryptography is extremely important for two
|
|
very critical applications. Now so far you mostly hear about one
|
|
of them which is privacy. But the other application that also
|
|
needs work is authenticity, or identification. These
|
|
are two extremely critical applications of cryptography. And what
|
|
is interesting is that the current proposals -- again, you only
|
|
hear about one of them -- actually constitute a double whammy --
|
|
because there are two proposals that are being put forth today.
|
|
You only hear about Clipper but what you do not hear about as much
|
|
is the other twin monster, which is the digital signature
|
|
standard. Basically what they are trying to do with this -- with
|
|
Clipper you lack privacy and with DSS you essentially lack the
|
|
signature, the identification schemes -- the two most important
|
|
operations/applications of cryptography.
|
|
So what will essentially happen is
|
|
that not only can you invade privacy with digital signatures which
|
|
will be essentially the new way of identifying yourselves to an
|
|
awful lot of systems and executables. They will actually be able
|
|
to deny your very existence if those systems are allowed to be only
|
|
government issued because it will be impossible in the systems of
|
|
the future not to use something like digital identification/
|
|
digital authentication schemes because there are no other effective
|
|
means. You all know about the silly paper systems we use for
|
|
identifications, and even high school students know how to fake
|
|
ID's to drink. So we will be moving toward digital signatures and
|
|
if there is only one digital signature it's essentially a proposal
|
|
to have just one government Bic pen. That is what they would like
|
|
us to have. One pen to sign our names with and sign our checks
|
|
with and authenticate ourselves with.
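((Illustrative note: Davida's two applications of cryptography can be made
concrete with a toy sketch. The Python fragment below uses textbook RSA with
deliberately tiny, hypothetical numbers; it is neither Skipjack nor the DSS
algorithm under discussion and is not remotely secure. It only shows that
encrypting for privacy and signing for authenticity are distinct operations
built on the same key pair.

    # Toy sketch of the two applications: privacy (encryption) and
    # authenticity (digital signatures).  Textbook RSA, tiny numbers.
    n, e, d = 3233, 17, 2753   # hypothetical key pair: n = 61*53, e public, d private

    def encrypt(m, pub_n, pub_e):
        # Privacy: anyone may encrypt to the key owner; only d recovers m.
        return pow(m, pub_e, pub_n)

    def decrypt(c, priv_n, priv_d):
        return pow(c, priv_d, priv_n)

    def sign(m, priv_n, priv_d):
        # Authenticity: only the holder of d can sign.  (Real schemes sign
        # a hash of the message, not the message itself.)
        return pow(m, priv_d, priv_n)

    def verify(m, sig, pub_n, pub_e):
        return pow(sig, pub_e, pub_n) == m

    assert decrypt(encrypt(42, n, e), n, d) == 42   # privacy: message recovered
    sig = sign(100, n, d)                           # "I authorize payment of 100"
    assert verify(100, sig, n, e)                   # genuine message verifies
    assert not verify(900, sig, n, e)               # altered message is rejected

The dispute in this panel is whether there should be a single
government-specified way of doing the sign/verify step for everyone, which is
the "one government Bic pen" Davida objects to.))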
|
|
Now again, as I said, privacy is one
|
|
application and I have raised a number of objections to it because
|
|
it has been again portrayed as a tool of crime and criminals and
|
|
drug dealers. But they are not the only ones who will be using
|
|
cryptography and more importantly, if we continue this policy they
|
|
will be the only ones who will have good security because we will
|
|
not have any security as to privacy. And as that saying
|
|
goes "if you outlaw privacy, only outlaws will have privacy". It
|
|
is very strange. I find myself wanting to go and join
|
|
organizations like the NRA all of a sudden. I really do.
|
|
There is also an interesting sort of
|
|
deception here going on with this so called escrow system. The
|
|
problem is that, how in the hell can you escrow privacy? Go look
|
|
at the definition of escrow -- it says that something of value held
|
|
in trust is given back. Can you give back privacy? That is
|
|
impossible. So I think that the very title of that is deceptive.
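((Illustrative note: what the escrow proposal deposits with the escrow agents
is key material, not "privacy" in the abstract. As publicly described, each
chip's unit key is split into two components held by two separate agents, and
both components must be disclosed, under the stated procedures, before an
intercepted session can be read. A rough Python sketch of the splitting idea
follows; the names and sizes are illustrative, not the fielded system.

    import secrets

    KEY_BYTES = 10                                  # Clipper unit keys were 80 bits

    def split(unit_key):
        # Two shares: either one alone is a uniformly random string that
        # reveals nothing about the key; XORing both recovers it.
        share_1 = secrets.token_bytes(len(unit_key))
        share_2 = bytes(a ^ b for a, b in zip(unit_key, share_1))
        return share_1, share_2

    def recombine(share_1, share_2):
        # Only an authority holding BOTH shares can rebuild the unit key.
        return bytes(a ^ b for a, b in zip(share_1, share_2))

    unit_key = secrets.token_bytes(KEY_BYTES)       # programmed into one chip
    agent_a_share, agent_b_share = split(unit_key)  # held by two escrow agents
    assert recombine(agent_a_share, agent_b_share) == unit_key

Davida's objection is separate from the mechanics: once the recombined key has
been used to read traffic, nothing is "given back" in the sense an ordinary
escrow implies.))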
|
|
|
|
Then I was amused, as some of you
|
|
might have been, with all the stories about bugging to look up a
|
|
recent case of my friend Bobby Inman again, standing in front of
|
|
television cameras saying that William Safire and Senator Dole were
|
|
conspiring to get him with the President. And the question is,
|
|
where is he getting this kind of data? Presumably he must because
|
|
he spent his whole life, by the way, being very careful about what
|
|
to say. You know, I can't imagine he is saying that without having
|
|
something to back up with what he was claiming. So when we talk
|
|
about bugging, just what do they do with all that data? Well, I
|
|
think you have seen an example of what possibly may have been dealt
|
|
with -- data that is intercepted.
|
|
Again, authenticity is another area
|
|
that I think people should pay attention to. The second most
|
|
important application of the use of identification, digital
|
|
signatures for proving who you are and yet again they are proposing
|
|
just one single big pen. I think that these two proposals jointly
|
|
amount to what I consider a digital dragnet. Thank you.
|
|
STUART BAKER: I have a friend who gives speeches a
|
|
lot and he likes to begin all his speeches by referring to country
|
|
and western songs that sum up the theme of his talk. When he talks
|
|
about U.S./Japan trade relations, he always starts out by referring
|
|
to that classic "you got the gold mine, I got the shaft." And I
|
|
thought about what David would have given as the country and
|
|
western song that I should probably sing here and I think in
|
|
relation to the Clipper Chip it would probably be "How can I miss
|
|
you if you won't go away?"
|
|
There is a reason why the Clipper
|
|
Chip won't go away and what I thought I would try to do very
|
|
quickly because I only have ten minutes before the lynching begins
|
|
is talk about why Key Escrow hasn't gone away by talking about some
|
|
of the myths that are pretty prevalent about Key Escrow. I am not
|
|
going to call it Clipper because there are a lot of products called
|
|
Clipper. This is the internal name, not something that was used
|
|
for the public. I don't object to people calling it Clipper but
|
|
there probably are people who have Clipper products who would
|
|
prefer that it not be called that.
|
|
Let me see if I can put the first one
|
|
up. [OH slide: Myth #1: Key escrow encryption will create a brave
|
|
new world of government intrusion into the privacy of Americans.]
|
|
I think this is pretty -- probably the classic opening statement
|
|
about Clipper. That this is the beginning of some kind of brave
|
|
new world in which everybody's privacy is at risk in a substantial
|
|
new way. There is a lot of emotion behind that argument but not a
|
|
lot of fact, because if you ask yourself if everybody in the United
|
|
States used key escrow encryption and only key escrow encryption,
|
|
which is not what the Administration has proposed by any means,
|
|
what would the world look like? Well, the world would look like
|
|
the world we live in today. It would be possible for the
|
|
government to intercept communications subject to a variety of
|
|
legal rules that make it very dangerous to go outside those rules.
|
|
And, in fact, it would be a more private world because other people
|
|
without authority would not be able to intercept and decrypt those
|
|
communications. That is important because, in fact, there is
|
|
somebody proposing a brave new world here and it is the people who
|
|
want people to go away and to have unreadable encryption installed
|
|
on all of the communications networks in the United States. That's
|
|
a new world and that is a world we don't understand. We don't live
|
|
in it today.
|
|
We don't know what it is going to be
|
|
like if criminals or terrorists or other people who are hostile to
|
|
society can use that sanctuary to communicate. We don't know what
|
|
it is like but it probably won't be as pleasant in terms of freedom
|
|
from crime and terror as the world we live today, which is not
|
|
exactly a comforting thought. It won't be a world in which the
|
|
government can do more than they do today. So if you ask yourself
|
|
well, how bad is it today, that's as bad as it can get under
|
|
Clipper.
|
|
[OH Slide: Myth #2: Unbreakable
|
|
encryption is the key to our future liberty]
|
|
Now the response to that, that you
|
|
hear from people, well, yeah but what if the Republicans get
|
|
elected? What if the Administration changes? This is a guarantee.
|
|
I don't want to have to rely on laws and procedures and escrow
|
|
agents. I don't trust the escrow agents, I don't trust the courts,
|
|
I don't trust the government, I don't trust anybody. I want to
|
|
trust my machine.
|
|
Now that is not an uncommon way of
|
|
thinking in the parts of this community. I said to somebody once,
|
|
this is the revenge of people who couldn't go to Woodstock because
|
|
they had too much trig homework. It's a kind of romanticism about
|
|
privacy and the kind of, you know, "you won't get my crypto key
|
|
until you pry it from my dead cold fingers" kind of stuff. I have
|
|
to say, you know, I kind of find it endearing.
|
|
The problem with it is that the
|
|
beneficiaries of that sort of romanticism are going to be
|
|
predators. PGP, you know, it is out there to protect freedom
|
|
fighters in Latvia or something. But the fact is, the only use
|
|
that has come to the attention of law enforcement agencies is a guy
|
|
who was using PGP so the police could not tell what little boys he
|
|
had seduced over the net. Now that's what people will use this for
|
|
-- not the only thing people will use it for but they will use it
|
|
for that and by insisting on having a claim to privacy that is
|
|
beyond social regulation we are creating a world in which people
|
|
like that will flourish and be able to do more than they can do
|
|
today.
|
|
[OH Slide: Myth #3: Encryption is the
|
|
key to preserving privacy in a digital world]
|
|
I'll move quickly. There is another
|
|
argument that I think is less romantic and that is the notion that
|
|
technically, because we are all going to be networked, we are all
|
|
going to be using wireless stuff -- we need encryption for privacy.
|
|
I am not going to say that does not fit but it is a little
|
|
oversold. Actually, I agreed with Professor Davida. Much of the
|
|
privacy problems that we see in an electronic world are not because
|
|
people are intercepting our communications, they're because we are
|
|
giving it away. But what we don't like is that there are people
|
|
now in a position to collate it all from public stuff that we
|
|
willingly gave up. Well, you know, we gave this information to get
|
|
a loan from one bank and before we know it, you know, our ex-
|
|
spouse's lawyer has got it. That's a problem, but encryption won't
|
|
solve it because you are going to have to give that information up
|
|
if you want the benefit that the bank has.
|
|
Similarly the most important use for
|
|
the protection for privacy, protection for data, is authentication
|
|
-- digital signatures as opposed to privacy. I won't say that
|
|
encrypting data for privacy purposes is irrelevant but it is
|
|
probably not the most important way of guaranteeing privacy in an
|
|
electronic age.
|
|
[OH Slide: Myth #4: Key Escrow won't
|
|
work. Crooks won't use it if it's voluntary. There must be a
|
|
secret plan to make key escrow encryption mandatory]
|
|
This will be familiar. You shouldn't
|
|
overestimate the I.Q. of crooks. When I was first starting out as
|
|
a lawyer I was in Portland, Maine and a guy walked into a downtown
|
|
bank and he said, he handed a note to the teller, it said, "Give me
|
|
all your money; I don't have a gun but I know where I can get one."
|
|
I'm sure if you sent him out to buy encryption he for sure would
|
|
buy the Clipper Chip.
|
|
I think this misstates the problem.
|
|
The notion that what the government is trying to do is to put in
|
|
everybody's hands this kind of encryption in the hopes that crooks
|
|
will be fooled into using it I think is to misstate the nature of
|
|
the concern. The concern is not so much what happens today when
|
|
people go in and buy voice scramblers; it is the prospect that in
|
|
five years or eight years or ten years every phone you buy that
|
|
costs $75 or more will have an encrypt button on it that will
|
|
interoperate with every other phone in the country and suddenly we
|
|
will discover that our entire communications network, sophisticated
|
|
as it is, is being used in ways that are profoundly anti-social.
|
|
That's the real concern, I think, that Clipper addresses. If we
|
|
are going to have a standardized form of encryption that is going
|
|
to change the world we should think seriously about what we are
|
|
going to do when it is misused.
|
|
[OH Slide: Myth #5: Industry must be
|
|
left alone for competitiveness reasons]
|
|
Are we interfering with the free
|
|
market? Are we affecting the competitiveness of U.S. industry
|
|
here? First, Clipper is an option. It is out there. People can
|
|
use it. They can make it. They can not use it. And they can not
|
|
make it. It's simply an additional option on the market. There
|
|
may well be people who want this.
|
|
I am a lawyer. I think in terms of
|
|
who is liable if something goes wrong. And I think that if it's
|
|
your business, and you are thinking about buying encryption and the
|
|
possibility that your employees will misuse it to rip-off your
|
|
customers, you ask yourself, well who is going to be liable if that
|
|
happens? You might think, "Geez, maybe I don't want to be in a
|
|
position where I can't actually make sure the police can come in
|
|
and check to see if people are misusing this encryption where I
|
|
have reason to believe that they are."
|
|
Second, and this is a point that gets
|
|
lost a lot: this is a standard for what the government is going to
|
|
buy because nobody in this room has to buy this thing. Now the
|
|
complaint is kind of remarkable from all the stand-on-your-own-two-
|
|
feet, free-market, nobody-tells-me-what-to-do, organizations that
|
|
we hear from. The fact is, that this is just what the government
|
|
is going to buy, and the people who are complaining that they don't
|
|
want to make it, or don't want to buy it, don't have to. What they
|
|
are really saying is, we would like the government to go on testing
|
|
equipment, telling us what the best stuff is so we can then go out
|
|
and sell it without doing our own research, doing our own
|
|
debugging, our own checks on this technology. I think if you think
|
|
of it from the government's point of view you see why we don't want
|
|
to do that. We probably -- there are very few institutions other
|
|
than government that are willing to devote both the kind of energy
|
|
and resources that it takes to eliminate the last few bugs in
|
|
encryption software or machinery. To go through and find every
|
|
possible attack and think about how to prevent it -- somebody once
|
|
said, the airport guy talking about encryption he said, well, I'll
|
|
take it if it is invisible, doesn't have any effect on the pilot,
|
|
and adds lift to my airplane. There is an attitude about
|
|
encryption that I think most of you have probably encountered in
|
|
the commercial world is, "Yeah, I want it if it is free." But
|
|
there is very little demonstrated inclination on the part of
|
|
industry to spend a lot of its own money to develop independent
|
|
encryption. And the fact is that a lot of the encryption that is
|
|
out there today was designed with government money, or endorsed by
|
|
government standards or otherwise supported by government
|
|
fortresses. But if the government is going to create encryption
|
|
and create markets and run the cost down, then we ought to be
|
|
designing and buying encryption that we are willing to see migrate
|
|
into the private sector without destroying the ability of law
|
|
enforcement to deal with it.
|
|
And, I guess, the last point, people
|
|
who don't want to sell to the government can make anything they
|
|
want. People who are willing to put their own money into designing
|
|
encryption can do it. This is just what the government funds.
|
|
AUDIENCE COMMENT: But you can't take it overseas. What
|
|
the government buys is (inaudible) technical for overseas.
|
|
BAKER: This is also something that we hear
|
|
a lot about and I'll deal with it quickly.
|
|
[OH Slide: Myth #6: NSA is a spy
|
|
agency. It has no business worrying about domestic encryption
|
|
policy]
|
|
Yeah, the NSA does indeed gather
|
|
signal intelligence in foreign countries. But we have a second
|
|
issue. Not only do we try to break people's codes but we make
|
|
codes for the federal government. That means we have as a
|
|
significant mission trying to design secure communications here
|
|
that the government is going to use. And we face the very real
|
|
concern that I described earlier, that if we design something and
|
|
it's good and it's terrific stuff and the price goes down because
|
|
the government has bought a lot of it, then other people are going
|
|
to use it. It may end up becoming the most common encryption in
|
|
the country. If that happens and people like this pedophile out in
|
|
California start using it, we have some responsibility for that and
|
|
therefore we have some responsibility to design and use encryption,
|
|
that (if it does migrate to the private sector) does not put law
|
|
enforcement out of business.
|
|
[OH Slide: Myth #7: The entire
|
|
initiative was done in secret. There was no opportunity for
|
|
industry or the public to be heard.]
|
|
This is my last one. Again, this was
|
|
true, I think or at least it was a reasonable thing to say in April
|
|
of '93 when the Clipper Chip first showed up in people's
|
|
newspapers. But since then the Administration has done an enormous
|
|
amount of public outreach listening to a variety of groups -- EFF,
|
|
CPSR, industry groups, holding hearings, organizing task forces to
|
|
listen to people. It is not that they weren't heard -- what I
|
|
expect people to say is, yes but you still didn't listen. We said
|
|
we don't like it. How come you still did it?
|
|
I think that the answer to that is
|
|
you have to ask yourself, what is the alternative that people will
|
|
propose. It is not enough in my view to simply say "Get rid of it.
|
|
What we want is unreadable encryption so that we have a guarantee
|
|
of privacy against some government that hasn't come to our country
|
|
in 15 years or a hundred years or two hundred years, and in the
|
|
same guarantee that criminals and other people who don't have
|
|
society's interest at heart will have a kind of electronic
|
|
sanctuary." That is not a very satisfying answer for people who
|
|
have to uphold the law as well as try to get the national
|
|
information infrastructure off the ground.
|
|
Thanks.
|
|
DAVE BANISAR: Well, first I'd like to say I'm not
|
|
sure what song you were referring to in your country and western
|
|
description, but I think if I had to choose a country and western
|
|
song it would probably be "Take This Job and Shove It."
|
|
Moving onto the high road from now,
|
|
I think what we have here is a really fundamental change in the way
|
|
the communication system is being looked at in the future.
|
|
Currently we have a situation where if somebody decides they need
|
|
a wiretap, which is an issue I'll get to in a minute, whether it is
|
|
useful or not, they go and they do an affirmative action. And the
|
|
communication system is essentially set up to communicate. I use
|
|
it to call.
|
|
These two proposals, digital
|
|
telephony which we haven't talked about here too much and Clipper,
|
|
change that around. They change it into a fundamental purpose for
|
|
the communication system now is going to be, let's make it available
|
|
for surveillance. Essentially, we are designing pretapped
|
|
telephones and then we have to work on the assumption that at only
|
|
authorized periods will they turn those on. This is a
|
|
fundamental change. It treats now every person as a criminal. We
|
|
are looking at them going -- well, I think that every person in
|
|
this room is a criminal so I will build the tap into their phone.
|
|
Perhaps next they will be building microphones into everybody's
|
|
desk chairs and only turning them on when they need them. Frankly,
|
|
in reality I don't know if the law enforcement has really made the
|
|
case for wire tapping. Just last week they busted the entire
|
|
Philadelphia mob. They got it by putting a microphone in the
|
|
lawyer's office. This book here, GangLand, it is all about how
|
|
they got Gotti. They put microphones on the street to get Gotti.
|
|
The FBI comes and they give us the four cases. They have the El-
|
|
Rukn people here in Chicago which I believe was more like a scam to
|
|
get some money out of the Libyan government. They have one
|
|
pedophile, they have a couple of drug dealers and so on and they
|
|
keep doing this.
|
|
I don't think they really made the
|
|
case. There's only in reality 800 or so wire taps a year. They
|
|
are only a part of the deal. A lot of busts, especially from
|
|
Mafia, are done with inside people with microphones, with a lot of
|
|
other technologies out there. The FBI has spent billions of
|
|
dollars in the last ten years modernizing. They have an amazing
|
|
computer system now, amazing DNA systems, amazing everything. They
|
|
are not behind the scenes anymore, or behind the ball anymore.
|
|
To give you a new example: There were
|
|
approximately a couple thousand arrests in 1992 that they say were
|
|
attributable to electronic surveillance and that includes bugs. So
|
|
it is hard to say how many of those were actually wire taps. In
|
|
1992 there were 14 million arrests in the United States. That's an
|
|
awful lot of arrests and an awful small number of those had to do
|
|
with electronic surveillance. Are we willing to revise our entire
|
|
communication system just for that very small number? It is a
|
|
question that needs to be asked.
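((Illustrative note: taking Banisar's figures as quoted -- roughly 2,000
arrests in 1992 attributed to electronic surveillance against roughly
14,000,000 total arrests -- the fraction is about 2,000 / 14,000,000, or
0.014%, on the order of one arrest in seven thousand.))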
|
|
Now we have a problem. I wish we
|
|
could wave my magic wand here and solve the problem. [Takes out
|
|
wand] You know, this is the magic wand that I can say crypto be
|
|
gone, or crypto be strong. I don't know. It's not working. Oh
|
|
well. So I have a couple solutions or a couple suggestions as they
|
|
may be.
|
|
First is to withdraw the Clipper
|
|
proposal. It's a bad idea. Nobody wants it. In the CNN/Time
|
|
Magazine poll 80% of the American public didn't want it. Industry
|
|
doesn't want it. Fifty-thousand people signed our CPSR Clipper
|
|
petition asking for its withdrawal. I haven't seen anybody in the
|
|
world who wants this thing -- well, save two, but I won't mention
|
|
them.
|
|
What should be done is to restart the
|
|
process. Back in 1989 NIST was basically ordered to start a new
|
|
process to make a new version of DES, or to replace DES
|
|
with something else. And they had a good idea. They wanted it to
|
|
be an open process. They wanted to look around, talk to people
|
|
like they did back with DES and they eventually got that from IBM.
|
|
They wanted a public algorithm that did both security and
|
|
authenticity. They wanted it available in hardware and software.
|
|
They wanted it to be a good strong standard for everybody. This
|
|
hasn't happened.
|
|
You know, withdraw the Clipper
|
|
proposal and start the process over. There's lots of people in
|
|
this room even who could come up with something very good but the
|
|
fact is that we have not been allowed to do it. We had, I guess,
|
|
nine or ten months after Clipper came out which had been designed
|
|
in secret for the last five years. In that time nobody has come
|
|
out and supported the thing and lots of people have had better
|
|
ideas. But they came back a couple weeks ago and came out with the
|
|
exact same proposal with one or two typos replaced. But that's
|
|
about it.
|
|
The second thing we need to do is
|
|
revise the law. We need to do this since NIST is the agency that
|
|
is supposed to be in charge of this. We should make NIST subject
|
|
to the same kind of rules that every other government agency has to
|
|
go by. Why should NIST have lower standards to develop these
|
|
crypto things which will affect all of our privacy than the FCC
|
|
does when they hand out a radio license; when the Environmental
|
|
Protection Agency does when they determine how much toxic waste we
|
|
can survive in? The basis for this, for any of you that are
|
|
lawyers in the room, is known as the Administrative Procedures Act.
|
|
It is very well established, it has been around 40 years. Every
|
|
other government agency, every other public government agency uses
|
|
it already and it works well. The things that go under this
|
|
rulemaking is that it is open. It is done in the open. There's no
|
|
communications behind the scenes. It's all done in the public eye.
|
|
The decision -- when they finally make a decision -- is based on
|
|
the public record. It is not based on something on a classified
|
|
study. And it is appealable. If we think that we've been screwed
|
|
we can appeal.
|
|
Finally, as we heard three or four
|
|
times today, we need an independent privacy commission. Simply
|
|
speaking, there is nobody in this government -- in the U.S.
|
|
government -- who is responsible for privacy. To look around and
|
|
say, wait a second, this isn't working. I mean, what kind of
|
|
government do we have that comes up with something on surveillance
|
|
and calls it the "Communication Privacy Improvement Act"? What we
|
|
need is a government agency that can look around and give an
|
|
independent assessment on what's going on. And it can't be shunted
|
|
aside or ignored or anything like that. We have to realize, and I
|
|
apologize for breaking Bob's ground rules, that we're building the
|
|
national information infrastructure without any guard rails. And
|
|
we need to think about it and get back.
|
|
Thank you.
|
|
BOB SMITH: Questions, short and sweet. We have
|
|
limited time.
|
|
CHARLES MARSON Charles Marson, lawyer of San
|
|
Francisco. I would like to ask a question of the General Counsel.
|
|
I have to say, this may be my one lifetime opportunity.
|
|
A lot of the Administration's case
|
|
for the Clipper depends on a reliance and a level of comfort with
|
|
present law. We are always told present law covers these things we
|
|
are not extending anything. Present law requires your agency, sir,
|
|
to apply to the foreign intelligence court for a warrant. CBS News
|
|
issued a report last month that said that -- I think it was 4,500
|
|
applications had been made to that court -- all appointed by Chief
|
|
Justice Rehnquist, and 4,500 have been granted. That is to say not
|
|
one has been denied. Now in terms of our comfort level with
|
|
present law will you tell us why it is that we should not conclude
|
|
that this court is nothing but a Fourth Amendment fig leaf and that
|
|
your agency is in fact free to tap anybody it wants.
|
|
STU BAKER There's an interesting element -- I
|
|
think you have to understand bureaucratic behavior in part here.
|
|
CHARLES MARSON My fear is that I do, sir.
|
|
[Laughter] A real tap whomever you please.
|
|
STU BAKER Let's bear in mind, these are all
|
|
Article III judges. I actually don't know that the figures you
|
|
gave are right. But these are Article III judges from all over the
|
|
country. They are used to seeing law enforcement wire taps and to
|
|
reviewing them carefully. Their whole life is sticking to the law.
|
|
CHARLES MARSON If they said yes all the time, who
|
|
cares?
|
|
STU BAKER Well, I -- let me offer an
|
|
alternative explanation for the record of the courts and the agency
|
|
in terms of FISA applications. And that is this. No one wants to
|
|
be the first general counsel whose application is turned down.
|
|
Nobody wants to get creative about what you can do and what you
|
|
can't do. And so the effect of putting into judicial review is not
|
|
so much that it is going to lead to judges rejecting a lot of stuff
|
|
as much as it will make the agency make sure that before it takes
|
|
something to the court, it is absolutely confident it has a case
|
|
that it can make, that the judge will accept as fitting within the
|
|
standards set by the statute. It's for the same reason that
|
|
prosecutors don't like to bring cases that they don't think they
|
|
can win. People do not like to try and fail and they consequently
|
|
are very careful about what they put forward. I think that in fact
|
|
is a more creditable explanation of the figures that you gave if
|
|
they are right than the explanation you gave which is that judges
|
|
don't care what the law is. I don't think that's true.
|
|
SPEAKER Could we move on to the next
|
|
question, thank you.
|
|
PHIL ZIMMERMANN That explanation reminds me of the
|
|
Doonesbury cartoon about grade inflation where some students sued
|
|
for not getting an "A" in this course and in the courtroom they
|
|
said that this university gave an "A" to all students. How is it
|
|
possible that the entire graduating class had an "A" average of 4.0
|
|
and they said, well, you know, it's just a great class. So I guess
|
|
all those guys that applied for the wiretap orders through that
|
|
judge, all those judges, absolutely all of them did everything
|
|
right. It's sort of a grade inflation for wire tap requests.
|
|
One thing that bothers me about this
|
|
process of Clipper ....
|
|
MODERATOR Your name please.
|
|
PHIL ZIMMERMANN I'm sorry. I'm Phil Zimmermann. I am
|
|
the author of PGP [applause]. I'm sorry, I didn't hear the part
|
|
about what is your name.
|
|
It seems to me that this Clipper
|
|
process has some kind of secret game plan that the government is
|
|
following through that we only find out about each step of it as it
|
|
unfolds. I saw on the net some news about some representative of
|
|
the U.S. government going -- it might have been from NSA -- talking
|
|
to people in Europe, other countries in Europe, about them getting
|
|
their own Clipper systems. Well, that seems like a public policy
|
|
thing that we should have been discussing openly here before
|
|
sending somebody over there to quietly do horizontal escalation and
|
|
get this Clipper thing glued in worldwide, planetwide before ....
|
|
thus making it harder to reverse later.
|
|
MODERATOR Could you phrase the question? The
|
|
line behind is getting restless.
|
|
ZIMMERMANN Okay, okay.
|
|
I think that this kind of secretive
|
|
agenda is not being treated like other public policy issues like
|
|
health care and things like that that are openly debated. It's
|
|
like we are being treated like an enemy foreign population to be
|
|
manipulated cynically. And so I would like somebody to respond to
|
|
that, whoever wants to respond to that -- why can't we be treated
|
|
like ...
|
|
MODERATOR Let's hear the response.
|
|
ZIMMERMANN Okay.
|
|
STU BAKER There isn't a secret plan.
|
|
AUDIENCE (Negative response from the
|
|
audience.)
|
|
STU BAKER But, all right, there will be --
|
|
we're not the only place that's worried about law enforcement and
|
|
criminal misuse of the communications system. Every country in the
|
|
world is going to be concerned about that -- it is no surprise.
|
|
Today France says we will tell you what you can use, what you can
|
|
export, what you import. Singapore, we've had lots of companies
|
|
say we're concerned about that.
|
|
ZIMMERMANN Singapore -- it's illegal to not
|
|
flush the toilet in Singapore. I didn't make that up, that's true.
|
|
It's possible to construct a society -- a crime-free society -- but
|
|
who wants to live in a society like that? We might be heading
|
|
toward Singapore. I'm glad you said Singapore -- I couldn't have
|
|
paid you money to say that -- I'm glad you said Singapore.
|
|
STU BAKER But look, Italy has just banned forms
|
|
of encryption on the phone system. The significance I think of the
|
|
Singapore example is that we shouldn't expect that as Asians get
|
|
richer they are going to say, oh well, let's adopt American views
|
|
about privacy. What's important about that, I think, is the view
|
|
that we get from a lot of people whose life has been open systems
|
|
and will have seen that standards are the key to new technological
|
|
advances, believe that if they could standardize encryption and sell
|
|
it everywhere in the world, it would sweep the world and whoever
|
|
had the best product would win. I think that reckons without the
|
|
law enforcement concerns that you will see in every country. And
|
|
you are already beginning to see other countries say we are not
|
|
going to tolerate unreadable encryption of all sorts proliferating
|
|
throughout our communications network. You are going to see more
|
|
of that. Not less. It won't happen here but it will happen in
|
|
other countries.
|
|
AUDIENCE Yes, worldwide.
|
|
MODERATOR Can we move onto the next question?
|
|
And we probably have time for only two more.
|
|
BLAKE SOBILOFF My name is Blake Sobiloff and I'm
|
|
with ACM SIGCAS and I'm trying to figure out some sort of
|
|
philosophical presupposition that you have -- the kind that frames
|
|
your approach to your objections to anti-Clipper individuals.
|
|
BAKER Most of the anti-Clipper individuals
|
|
I really like actually.
|
|
BLAKE SOBILOFF Okay, well, their position. Would it
|
|
be fair to characterize your position as one that assumes that a
|
|
desire for an unimpeachable privacy can be fairly well equated
|
|
with the desire to engage in lawless acts?
|
|
BAKER No, I think that's completely wrong.
|
|
The problem is that guaranteeing privacy to everybody is going to
|
|
guarantee it to some people who will misuse that kind of
|
|
technological sanctuary.
|
|
AUDIENCE (Negative response.)
|
|
BAKER All right, okay. Well, to continue
|
|
the poor song metaphor, if anyone is familiar with the Spin Doctors
|
|
rock group. Let me say that you are a fantastic Spin Doctor and I
|
|
do admire you for that but I'll keep my pocket full of kryptonite.
|
|
Thanks.
|
|
QUESTION Can I make a comment on that.
|
|
BAKER Yes.
|
|
QUESTION I think it is important to say
|
|
something about who asked NSA to be the guarantor of privacy.
|
|
Asking NSA to guarantee privacy is sort of like asking Playboy to
|
|
guard chastity belts.
|
|
BAKER I tried to address that briefly. Our
|
|
job is in fact to guarantee the privacy of U.S. government
|
|
communications when they're talking about whether to go to war, for
|
|
example. That's one of the things we do and it is one of our two
|
|
principal missions. We do guarantee privacy. Now I understand the
|
|
reaction but we do have a job to create encryption and to make it
|
|
as good as we possibly can.
|
|
AUDIENCE Not for my privacy.
|
|
BAKER My concern is that what we design is
|
|
very likely to be -- to find itself migrating into private sector
|
|
and if we design it in a way that is going to put law enforcement
|
|
out of business we haven't acted responsibly.
|
|
MODERATOR Next question.
|
|
HERB LIN My name is Herb Lin. I'm with the
|
|
National Academy of Sciences regarding the need for an independent
|
|
look at it. The U.S. Congress has asked the Academy to undertake
|
|
an independent assessment of national cryptography policy.
|
|
Descriptions of that study are out on the giveaway desk. I'll be
|
|
glad to talk to anybody about it.
|
|
MODERATOR Thank you. We've got one more.
|
|
(Unknown) My name is Barbolin (?) from GRC (?).
|
|
I have a question concerning the algorithm that is used in the
|
|
Clipper Chip, Skipjack. That algorithm is not being made public
|
|
and yet one of the very bases of scientific research is that the
|
|
work should be published and then reviewed by the community and
|
|
approved as the state-of-the-art develops. Yet it seems that the
|
|
NSA is reluctant to do that. There is a certain amount of conjecture
|
|
that in fact the algorithm contains a deliberately encoded weakness
|
|
that will allow the NSA, without access to the escrow keys, to be
|
|
able to intercept communication in their mission to monitor on-
|
|
shore and off-shore communications. There's a number of us in the
|
|
scientific community that are greatly concerned that that algorithm
|
|
is not being made public. I would like the counsel from NSA to
|
|
address that with a simple yes or no answer. Is that a problem?
|
|
And then I would like our university professor to comment on his
|
|
opinion in this matter.
|
|
BAKER I'll answer it yes or no if you'll
|
|
tell me exactly the question.
|
|
UNKNOWN Does it or does it not contain a
|
|
weakness that allows you to intercept the communications without
|
|
access to the escrow keys.
|
|
BAKER No.
|
|
MODERATOR I'm sorry, that has to be the last
|
|
question. We will conclude. I'm sorry, we have to stick to the
|
|
schedule. [Negative audience response.]
|
|
We'll conclude with another country song which is ....
|
|
GEORGE TRUBOW, CONF. CHAIR Let me explain to you what our
|
|
problem is. During the reception this room is going to be cleared
|
|
and turned into the dining room for our meal this evening and so
|
|
the hotel has a schedule; and if you want to give up the evening
|
|
reception and meal we could do that but that's why we've got to
|
|
close out. You want to go for a little longer. Okay, how about
|
|
this for a promise, we'll quit at six (pm) which will give us
|
|
another seven minutes. All right.
|
|
PROFESSOR DAVIDA I will comment just very briefly about
|
|
this issue of standards and algorithms.
|
|
I've worked for almost 20 years in
|
|
organizations like the IEEE Computer Society and we have addressed
|
|
issues like standards. It is important to understand what a
|
|
standard is. Standards' purposes are primarily to promote trust in
|
|
commerce and the products that you are actually engaging in, buying
|
|
or using. DES and other encryption standards deviate from that
|
|
substantially. These are not standards that set a boxing or weight
|
|
standard, or a packaging standard, which is what most electronic
|
|
standards and computer standards tend to be like. For example,
|
|
there is no standard that says you must use the Intel 8085 or
|
|
whatever. There is no standard that says you must use a particular
|
|
chip. The standards pertain to buses, number of bytes and what
|
|
have you. DES and other standards like that force us to adopt
|
|
something which is basically monopolistic. It is a specific
|
|
algorithm. So there are some fundamental faults with it. But as
|
|
for trusting an algorithm that somebody else designed, I stand by my
|
|
previous comment.
|
|
MODERATOR Thank you.
|
|
MIKE GODWIN I'm Mike Godwin with the Electronic
|
|
Frontier Foundation and I have a question, as you can imagine, for the
|
|
General Counsel of the NSA.
|
|
You said in myth number four that we can
|
|
anticipate -- and in fact NSA did anticipate that these
|
|
technologies would become available in five to ten years. People
|
|
would go buy telephones, have an encryption button and be able to
|
|
use this technology -- I think I am quoting you accurately -- in
|
|
profoundly anti-social ways. Isn't it true that many otherwise
|
|
acceptable technologies can be used by individuals in profoundly
|
|
anti-social ways including, say the printing press. Isn't it in
|
|
fact true that in a democratic society we make a decision to
|
|
empower individuals knowing upfront and openly that we do so taking
|
|
risk about society. Isn't that in fact the case in this country?
|
|
BAKER Yes. And first I should say, Mike, I
|
|
haven't met you but I've read your stuff and actually, is David
|
|
Sternlight here too?
|
|
Sure you take risks and you have to look
|
|
at each technology as it comes. Let's take a look at cars. Cars
|
|
have advantages and risks and how do we deal with that. We put
|
|
license plates on every car and everybody has to have a license
|
|
plate on their car even if they think it violates their First
|
|
Amendment Rights to do it.
|
|
MIKE GODWIN In fact, automobiles are a little bit
|
|
different because we do have explicit Constitutional guarantees
|
|
with regard to communications. We have implicit and explicit
|
|
guarantees with regard to privacy and it is a little bit different
|
|
from driving your Ford.
|
|
BAKER Well, actually there is a Constitutional
|
|
right to travel.
|
|
MIKE GODWIN There is a Constitutional right to travel,
|
|
that's correct. But we are talking -- it's still a false analogy.
|
|
This is a central right. You know, Hugo Black said that there is
|
|
a reason for the First Amendment to be a First Amendment.
|
|
BAKER This is why I never get on the net with
|
|
you, Mike.
|
|
MIKE GODWIN So I take it you've answered my question.
|
|
The reason -- the thing that really troubled me about your comments
|
|
is that you did talk about France and Italy and Singapore and it
|
|
seems to me worth pointing out that the theory of government that
|
|
we have in this country is a little bit different from the theory
|
|
of government in France, Italy and Singapore. (Applause)
|
|
BAKER Absolutely. I don't think that we will
|
|
ever have the same view of government that any of those places
|
|
have.
|
|
MIKE GODWIN I'm confident.
|
|
BAKER And I think the short answer is, yes, as
|
|
each technology comes along we have to evaluate the risks and the
|
|
rewards that come with it and try to figure out the way to get as
|
|
much good from it and as little bad from it. And the response is
|
|
going to be very variable depending on the technology. But you
|
|
can't set up a principle that says we will always do whatever seems
|
|
like the best technology today without regard for the social
|
|
consequences. We don't do that with guns, we don't do that with
|
|
cars, we don't do that with any kind of technology.
|
|
MODERATOR Can we go on to another question?
|
|
JOHN BRIMACOMBE Hi, my name is John Brimacombe. I'm a
|
|
European scientist and user of cryptography. I'd like to go
|
|
through something very quickly here. First, you know, people know
|
|
about cryptography in Europe. We know about all the algorithms.
|
|
Secondly, you know, scientists in Europe don't have brains so
|
|
defective that we can't implement them. And there is going to be
|
|
a big market for this sort of stuff out there in the world. Now,
|
|
we can do that work, we are doing that work, we like doing that
|
|
work. You are cutting yourselves off. My question is, why are you
|
|
screwing yourselves this way? My worry looking at your nice
|
|
salesmen of your shiny Clipper Chip coming to sell it to all my CEC
|
|
people. I'm worrying that you see this problem. You see
|
|
yourselves being put out of the market by these nice Europeans.
|
|
They say, okay, let's go and screw their market up to a Clipper.
|
|
MODERATOR No response?
|
|
BAKER No, I liked the speech.
|
|
MATT BLAZE Matt Blaze from Bell Labs. I have a
|
|
question that was originally for Senator Leahy but it could be
|
|
equally well directed to the NSA Counsel. Do you see any risks in
|
|
terms of risk assessment of the Clipper proposal to the fact that
|
|
the escrow procedures exist entirely within the purview of the
|
|
Executive Branch, the Attorney General in particular, and can be
|
|
changed essentially at will entirely within a single branch of
|
|
government?
|
|
BAKER I think that's a reasonable concern.
|
|
One of the interesting things is that we designed it so you decide
|
|
who you trust and that's where the keys go as a society. And we
|
|
didn't have much input into who holds the keys. This is almost a
|
|
litmus test though. It is kind of interesting when you ask, well
|
|
who do you trust, exactly? And often the answer is "Well, just not
|
|
those guys." And it is much harder when you ask the question,
|
|
"Well who would you trust?" I think Jerry Berman was quoted as
|
|
saying I don't care if it is Mother Theresa and the Pope who holds
|
|
the keys. There certainly are people who feel that way. There is
|
|
a lot of talk about whether, you know, should you have private
|
|
sector entities hold the keys and I have to say that one doesn't
|
|
...
|
|
MODERATOR I have to say through the escrow agency.
|
|
The procedures are written and under the authority of the --
|
|
entirely within the Attorney General.
|
|
BAKER The procedures don't change the fact
|
|
that we are all governed by laws that are already on the books that
|
|
make it a felony to do stuff without authority. And so the
|
|
procedures for withdrawing key are written down as Executive Branch
|
|
rules but the legal framework for that is set by Congress or by the
|
|
Fourth Amendment as a matter of fact.
|
|
EFREM LIPKIN I'm Efrem Lipkin that works in
|
|
community and I guess I'm a fossil from the '60's. My parents had
|
|
to deal with HUAC. I had the utterly surreal experience -- I was
|
|
in the Civil Rights Movement -- I had this surreal experience of
|
|
apparently a government agent tried to plant a copy of the Daily
|
|
Worker on me. And so my question is really for CPSR. Why, I
|
|
understand why the NSA says we don't have to worry about this
|
|
government. We haven't had any trouble with it recently. But why
|
|
doesn't CPSR point out all of the trouble we have had and how the
|
|
protection -- the privacy protection we want and that we
|
|
historically needed -- is from the government.
|
|
BANISAR Well, obviously, you haven't been
|
|
reading a whole lot of my press releases. We've been pointing out
|
|
a lot of the abuses and problems that have been going on. We have
|
|
also some deep concerns to pour off here a little bit about the
|
|
escrow procedures. At the end of each escrow procedure it mentions
|
|
that they are not enforceable so if they are violated it wouldn't
|
|
matter because this evidence can't be suppressed. Frankly -- I
|
|
guess somebody asked me today -- Mike Nelson from OSTP apparently
|
|
now is talking about putting the escrow key holders outside the
|
|
government. I frankly think that it wouldn't make a whole world of
|
|
difference whether Mother Theresa and the Pope held the keys then
|
|
if they are not enforceable.
|
|
MODERATOR Thank you, thanks to all the panelists for
|
|
coming. We'll conclude with another country song, "I've Enjoyed
|
|
About as Much of This as I Can Stand."
|
|
Just a moment please, there is a
|
|
related announcement on an equally high note I want to read this to
|
|
you and to my colleague here. To a dedicated advocate, gifted
|
|
journalist, generous friend and true champion of freedom, Robert
|
|
Ellis Smith, publisher, Privacy Journal, in recognition of 20
|
|
years in service to the cause of privacy protection. With warm
|
|
regards from friends and colleagues in celebrating the 20th year of
|
|
the publication of this fine journal.
|
|
ROBERT ELLIS SMITH I have a few words I would like to
|
|
say.
|
|
|
|
END OF TAPE
|
|
|
|
===================================================================
|
|
There endeth the transcript - CFP'94 Volunteers.
|
|
|
|
------------------------------
|
|
|
|
End of Computer Underground Digest #6.30
|
|
************************************
|
|
|