Computer underground Digest Sun May 22, 1994 Volume 6 : Issue 44
|
|
ISSN 1004-042X
|
|
|
|
Editors: Jim Thomas and Gordon Meyer (TK0JUT2@NIU.BITNET)
|
|
Archivist: Brendan Kehoe
|
|
Retiring Shadow Archivist: Stanton McCandlish
|
|
Shadow-Archivists: Dan Carosone / Paul Southworth
|
|
Ralph Sims / Jyrki Kuoppala
|
|
Ian Dickinson
|
|
Covey Editors: D. Bannaducci & S. Jones
|
|
|
|
CONTENTS, #6.44 (May 22, 1994)
|
|
|
|
File 1--EFF's Jerry Berman testimony - House Clipper/DigTel hearing 5/3/94
|
|
File 2--Whit Diffie testimony - Senate Clipper Hearing, May 3 1994
|
|
|
|
Cu-Digest is a weekly electronic journal/newsletter. Subscriptions are
|
|
available at no cost electronically.
|
|
|
|
CuD is available as a Usenet newsgroup: comp.society.cu-digest
|
|
|
|
Or, to subscribe, send a one-line message: SUB CUDIGEST your name
|
|
Send it to LISTSERV@UIUCVMD.BITNET or LISTSERV@VMD.CSO.UIUC.EDU
|
|
The editors may be contacted by voice (815-753-0303), fax (815-753-6302)
|
|
or U.S. mail at: Jim Thomas, Department of Sociology, NIU, DeKalb, IL
|
|
60115, USA.
|
|
|
|
Issues of CuD can also be found in the Usenet comp.society.cu-digest
|
|
news group; on CompuServe in DL0 and DL4 of the IBMBBS SIG, DL1 of
|
|
LAWSIG, and DL1 of TELECOM; on GEnie in the PF*NPC RT
|
|
libraries and in the VIRUS/SECURITY library; from America Online in
|
|
the PC Telecom forum under "computing newsletters;"
|
|
On Delphi in the General Discussion database of the Internet SIG;
|
|
on RIPCO BBS (312) 528-5020 (and via Ripco on internet);
|
|
and on Rune Stone BBS (IIRGWHQ) (203) 832-8441.
|
|
CuD is also available via Fidonet File Request from
|
|
1:11/70; unlisted nodes and points welcome.
|
|
|
|
UNITED STATES: etext.archive.umich.edu (141.211.164.18) in /pub/CuD/
|
|
ftp.eff.org (192.88.144.4) in /pub/Publications/CuD
|
|
aql.gatech.edu (128.61.10.53) in /pub/eff/cud/
|
|
world.std.com in /src/wuarchive/doc/EFF/Publications/CuD/
|
|
uceng.uc.edu in /pub/wuarchive/doc/EFF/Publications/CuD/
|
|
wuarchive.wustl.edu in /doc/EFF/Publications/CuD/
|
|
EUROPE: nic.funet.fi in pub/doc/cud/ (Finland)
|
|
ftp.warwick.ac.uk in pub/cud/ (United Kingdom)
|
|
|
|
JAPAN: ftp.glocom.ac.jp /mirror/ftp.eff.org/
|
|
|
|
COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
|
|
information among computerists and to the presentation and debate of
|
|
diverse views. CuD material may be reprinted for non-profit as long
|
|
as the source is cited. Authors hold a presumptive copyright, and
|
|
they should be contacted for reprint permission. It is assumed that
|
|
non-personal mail to the moderators may be reprinted unless otherwise
|
|
specified. Readers are encouraged to submit reasoned articles
|
|
relating to computer culture and communication. Articles are
|
|
preferred to short responses. Please avoid quoting previous posts
|
|
unless absolutely necessary.
|
|
|
|
DISCLAIMER: The views represented herein do not necessarily represent
|
|
the views of the moderators. Digest contributors assume all
|
|
responsibility for ensuring that articles submitted do not
|
|
violate copyright protections.
|
|
|
|
----------------------------------------------------------------------
|
|
|
|
Date: Fri, 6 May 1994 12:10:26 -0400 (EDT)
|
|
From: Stanton McCandlish <mech@EFF.ORG>
|
|
Subject: File 1--EFF's J Berman testimony - House Clipper/DigTel hearing 5/3/94
|
|
|
|
Testimony of
|
|
Jerry J. Berman, Executive Director
|
|
Electronic Frontier Foundation
|
|
|
|
before the
|
|
|
|
Committee on Science, Space and Technology
|
|
|
|
Subcommittee on Technology, Environment and Aviation
|
|
|
|
U.S. House of Representatives
|
|
|
|
|
|
Hearing on
|
|
|
|
Communications and Computer Surveillance, Privacy
|
|
and Security
|
|
|
|
|
|
May 3, 1994
|
|
|
|
Mr. Chairman and Members of the Committee
|
|
|
|
I want to thank you for the opportunity to testify today on
|
|
communications
|
|
and computer surveillance, privacy, and security policy. The Electronic
|
|
Frontier Foundation (EFF) is a public interest membership organization
|
|
dedicated to achieving the democratic potential of new communications
|
|
and computer technology and works to protect civil liberties in new
|
|
digital environments. EFF also coordinates the Digital Privacy and
|
|
Security Working Group (DPSWG), a coalition of more than 50 computer,
|
|
communications, and public interest organizations and associations
|
|
working on communications privacy issues. The Working Group has
|
|
strongly opposed the Administration's clipper chip and digital telephony
|
|
proposals.
|
|
EFF is especially pleased that this subcommittee has taken an
|
|
interest in these issues. It is our belief that Administration policy
|
|
developed in this area threatens individual privacy rights, will thwart
|
|
the development of the information infrastructure, and does not even
|
|
meet the stated needs of law enforcement and national security agencies.
|
|
A fresh and comprehensive look at these issues is needed.
|
|
|
|
|
|
I.Background on digital privacy and security policy
|
|
------------------------------------------------------
|
|
|
|
From the beginning of the 1992 Presidential campaign, President
|
|
Clinton and Vice President Gore committed themselves to support the
|
|
development of the National Information Infrastructure. They recognize
|
|
that the "development of the NII can unleash an information revolution
|
|
that will change forever the way people live, work, and interact with
|
|
each other." They also know that the information infrastructure can
|
|
only realize its potential if users feel confident about security
|
|
measures available.
|
|
If allowed to reach its potential, this information infrastructure
|
|
will carry vital personal information, such as health care records,
|
|
private communications among friends and families, and personal
|
|
financial transactions. The business community will transmit valuable
|
|
information such as plans for new products, proprietary financial data,
|
|
and other strategic communications. If communications in the new
|
|
infrastructure are vulnerable, all of our lives and businesses would be
|
|
subject to both damaging and costly invasion.
|
|
In launching its Information Infrastructure Task Force (IITF) the
|
|
Clinton Administration recognized this when it declared that:
|
|
|
|
The trustworthiness and security of communications channels and
|
|
networks are essential to the success of the NII.... Electronic
|
|
information systems can create new vulnerabilities. For example,
|
|
electronic files can be broken into and copied from remote locations,
|
|
and cellular phone conversations can be monitored easily. Yet these
|
|
same systems, if properly designed, can offer greater security than
|
|
less advanced communications channels. [_Agenda_for_Action_, 9]
|
|
|
|
Cryptography -- technology which allows encoding and decoding of
|
|
messages -- is an absolutely essential part of the solution to
|
|
information security and privacy needs in the Information Age. Without
|
|
strong cryptography, no one will have the confidence to use networks to
|
|
conduct business, to engage in commercial transactions electronically,
|
|
or to transmit sensitive personal information. As the Administration
|
|
foresees, we need
|
|
|
|
network standards and transmission codes that facilitate
|
|
interconnection and interoperation between networks, and ensure the
|
|
privacy of persons and the security of information carried....
|
|
[_Agenda_for_Action_, 6]
|
|
|
|
While articulating these security and privacy needs, the Administration
|
|
has also emphasized that the availability of strong encryption poses
|
|
challenges to law enforcement and national security efforts. Though the
|
|
vast majority of those who benefit from encryption will be law abiding
|
|
citizens, some criminals will find ways to hide behind new technologies.
|
|
|
|
|
|
II.Current cryptography policy fails to meet the needs of
|
|
-----------------------------------------------------------
|
|
the growing information infrastructure
|
|
---------------------------------------------
|
|
|
|
As a solution to the conflict between the need for user privacy
|
|
and the desire to ensure law enforcement access, the Administration has
|
|
proposed that individuals and organizations who use encryption deposit a
|
|
copy of their private key -- the means to decode any communications they
|
|
send -- with the federal government.
|
|
In our view, this is not a balanced solution but one that
|
|
undermines the need for security and privacy without resolving important
|
|
law enforcement concerns. It is up to the Congress to send the
|
|
Administration back to the drawing board.
|
|
|
|
A.Current Export Controls and New Clipper Proposal Stifle Innovation
|
|
-----------------------------------------------------------------------
|
|
|
|
Two factors are currently keeping strong encryption out of the
|
|
reach of United States citizens and corporations. First, general
|
|
uncertainty about what forms of cryptography will and will not be legal
|
|
to produce in the future. Second, export controls make it economically
|
|
impossible for US manufacturers that build products for the global
|
|
marketplace to incorporate strong encryption for either the domestic or
|
|
foreign markets. Despite this negative impact on the US market, export
|
|
controls are decreasingly successful at limiting the foreign
|
|
availability of strong encryption. A recent survey shows that of the
|
|
more than 260 foreign encryption products now available globally, over
|
|
80 offer encryption which is stronger than what US companies are allowed
|
|
to export. Export controls do constrain the US market, but the
|
|
international market appears to be meeting its security needs without
|
|
help from US industry. The introduction of Clipper fails to address the
|
|
general uncertainty in the cryptography market. Announcement of a key
|
|
escrow policy alone is not sufficient to get the stalled US cryptography
|
|
market back on track.
|
|
|
|
B.The secrecy of the Clipper/Skipjack algorithm reduces public trust
|
|
-----------------------------------------------------------------------
|
|
and casts doubt on the voluntariness of the whole system
|
|
-------------------------------------------------------------
|
|
|
|
Many parties have already questioned the need for a secret
|
|
algorithm, especially given the existence of robust, public-domain
|
|
encryption techniques. The most common explanation given for use of a
|
|
secret algorithm is the need to prevent users from bypassing the key
|
|
escrow system proposed along with the Clipper Chip. Clipper has always
|
|
been presented by the Administration as a voluntary option. But if the
|
|
system is truly voluntary, why go to such lengths to ensure compliance
|
|
with the escrow procedure?
|
|
|
|
C.Current plans for escrow system offer inadequate technical
|
|
---------------------------------------------------------------
|
|
security and insufficient legal protections for users
|
|
----------------------------------------------------------
|
|
|
|
The implementation of a nationwide key escrow system is clearly a
|
|
complex task. But preliminary plans available already indicate several
|
|
areas of serious concern:
|
|
|
|
1._No_legal_rights_for_escrow_users_: As currently written, the
|
|
escrow procedures insulate the government escrow agents from any legal
|
|
liability for unauthorized or negligent release of an individual's key.
|
|
This is contrary to the very notion of an escrow system, which
|
|
ordinarily would provide a legal remedy for the depositor whose
|
|
deposit is released without authorization. If anything, escrow agents
|
|
should be subject to strict liability for unauthorized disclosure of
|
|
keys.
|
|
|
|
2._No_stability_in_escrow_rules_: The Administration has
|
|
specifically declared that it will not seek to have the escrow
|
|
procedures incorporated into legislation or official regulations.
|
|
Without formalization of rules, users have no guaranty that subsequent
|
|
administrations will follow the same rules or offer the users the same
|
|
degree of protection. This will greatly reduce the trust in the system.
|
|
|
|
3._Fixed_Key_: A cardinal rule of computer security is that
|
|
encryption keys must be changed often. Since the Clipper keys are
|
|
locked permanently into the chips, the keys can never be changed. This
|
|
is a major technical weakness of the current proposal.
|
|
|
|
4._Less_intrusive,_more_secure_escrow_alternatives_are_available_:
|
|
The Clipper proposal represents only one of many possible kinds of key
|
|
escrow systems. More security could be provided by having more
|
|
than two escrow agents. And, in order to increase public trust, some
|
|
or all of these agents could be non-governmental agencies, with the
|
|
traditional fiduciary duties of an escrow agent.
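To illustrate the split-key idea: the Clipper plan divides each device
key into two components held by separate government agents, and the same
construction extends to any number of agents. The sketch below (a toy
illustration in Python, not the government's actual escrow procedure)
splits a key so that no group smaller than all of the agents can recover
it.

    import secrets

    def split_key(key, n_agents):
        # Give n_agents - 1 agents independent random shares; the last
        # agent holds the XOR of the key with all of those shares.  No
        # subset smaller than all n_agents learns anything about the key.
        shares = [secrets.token_bytes(len(key)) for _ in range(n_agents - 1)]
        last = key
        for s in shares:
            last = bytes(a ^ b for a, b in zip(last, s))
        return shares + [last]

    def rebuild_key(shares):
        key = shares[0]
        for s in shares[1:]:
            key = bytes(a ^ b for a, b in zip(key, s))
        return key

    key = secrets.token_bytes(10)        # an 80-bit key, as in Clipper
    shares = split_key(key, 4)           # e.g. four independent escrow agents
    assert rebuild_key(shares) == key    # all four must cooperate to recover it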
|
|
|
|
D.Escrow Systems Threaten Fundamental Constitutional Values
|
|
--------------------------------------------------------------
|
|
|
|
The Administration, Congress, and the public ought to have the
|
|
opportunity to consider the implications of limitations on cryptography
|
|
from a constitutional perspective. A delicate balance between
|
|
constitutional privacy rights and the needs of law enforcement has been
|
|
crafted over the history of this country. We must act carefully as we
|
|
face the constitutional challenges posed by new communication
|
|
technologies.
|
|
Unraveling the current encryption policy tangle must begin with
|
|
one threshold question: will there come a day when the federal
|
|
government controls the domestic use of encryption through mandated key
|
|
escrow schemes or outright prohibitions against the use of particular
|
|
encryption technologies? Is Clipper the first step in this direction?
|
|
A mandatory encryption regime raises profound constitutional questions.
|
|
In the era where people work for "virtual corporations" and
|
|
conduct personal and political lives in "cyberspace," the distinction
|
|
between _communication_ of information and _storage_ of information is
|
|
increasingly vague. The organization in which one works may constitute
|
|
a single virtual space, but be physically dispersed. So, the papers and
|
|
files of the organization or individual may be moved within the
|
|
organization by means of telecommunications technology. Instantaneous
|
|
access to encryption keys, without prior notice to the communicating
|
|
parties, may well constitute a secret search, if the target is a
|
|
virtual corporation or an individual whose "papers" are physically
|
|
dispersed.
|
|
Wiretapping and other electronic surveillance has always been
|
|
recognized as an exception to the fundamental Fourth Amendment
|
|
prohibition against secret searches. Even with a valid search warrant,
|
|
law enforcement agents must "knock and announce" their intent to search
|
|
a premises before proceeding. Failure to do so violates the Fourth
|
|
Amendment. Until now, the law of search and seizure has made a sharp
|
|
distinction between, on the one hand, _seizures_of_papers_ and other
|
|
items in a person's physical possession, and on the other hand,
|
|
_wiretapping_of_communications_. Seizure of papers or personal effects
|
|
must be conducted with the owner's knowledge, upon presentation of a
|
|
search warrant. Only in the exceptional case of wiretapping, may a
|
|
person's privacy be invaded by law enforcement without simultaneously
|
|
informing that person.
|
|
Proposals to regulate the use of cryptography for the sake of law
|
|
enforcement efficiency should be viewed carefully in the centuries old
|
|
tradition of privacy protection.
|
|
|
|
E.Voluntary escrow system will not meet law enforcement needs
|
|
----------------------------------------------------------------
|
|
|
|
Finally, despite all of the troubling aspects of the Clipper
|
|
proposal, it is by no means clear that it will even solve the problems
|
|
that law enforcement has identified. The major stated rationale for
|
|
government intervention in the domestic encryption arena is to ensure
|
|
that law enforcement has access to criminal communications, even if they
|
|
are encrypted. Yet, a voluntary scheme seems inadequate to meet this
|
|
goal. Criminals who seek to avoid interception and decryption of their
|
|
communications would simply use another system, free from escrow
|
|
provisions. Unless a government-proposed encryption scheme is
|
|
mandatory, it would fail to achieve its primary law enforcement purpose.
|
|
In a voluntary regime, only the law-abiding would use the escrow system.
|
|
|
|
III.Recent policy developments indicate that Administration policy is
|
|
----------------------------------------------------------------------
|
|
bad for the NII, contrary to the Computer Security Act, and
|
|
----------------------------------------------------------------
|
|
requires Congressional oversight
|
|
-------------------------------------
|
|
|
|
Along with the Clipper Chip proposal, the Administration announced
|
|
a comprehensive review of cryptography and privacy policy. Almost
|
|
immediately after the Clipper announcement, the Digital Privacy and
|
|
Security Working Group began discussions with the Administration on
|
|
issues raised by the Clipper proposal and by cryptography in general.
|
|
Unfortunately, this dialogue has been largely one-sided. EFF and many
|
|
other groups have provided extensive input to the Administration, yet
|
|
the Administration has not reciprocated -- the promised policy report
|
|
has not been forthcoming. Moreover, the National Security Agency and
|
|
the Federal Bureau of Investigation are proceeding unilaterally to
|
|
implement their own goals in this critical policy area.
|
|
Allowing these agencies to proceed unilaterally would be a grave
|
|
mistake. As this subcommittee is well aware, the Computer Security Act
|
|
of 1987 clearly established that neither military nor law enforcement
|
|
agencies are the proper protectors of personal privacy. When
|
|
considering the law, Congress asked, "whether it is proper for a super-
|
|
secret agency [the NSA] that operates without public scrutiny to involve
|
|
itself in domestic activities...?" The answer was a clear "no." Recent
|
|
Administration announcements regarding the Clipper Chip suggest that the
|
|
principle established in the 1987 Act has been circumvented.
|
|
As important as the principle of civilian control was in 1987, it
|
|
is even more critical today. The more individuals around the country
|
|
come to depend on secure communications to protect their privacy, the
|
|
more important it is to conduct privacy and security policy dialogues in
|
|
public, civilian forums.
|
|
The NII can grow into the kind of critical, national resource
|
|
which this Administration seeks to promote only if major changes are made
in current cryptography and privacy policy. In the absence of such
|
|
changes, digital technology will continue to rapidly render our
|
|
commercial activities and communications -- and, indeed, much of our
|
|
personal lives -- open to scrutiny by strangers. The Electronic
|
|
Frontier Foundation believes that Americans must be allowed access
|
|
to the cryptographic tools necessary to protect their own privacy.
|
|
We had hoped that the Administration was committed to making these
|
|
changes, but several recent developments lead us to fear that the effort
|
|
has been abandoned, leaving individual agencies to pursue their own
|
|
policy agendas instead of being guided by a comprehensive policy. The
|
|
following issues concern us:
|
|
|
|
*Delayed Cryptography Policy Report
|
|
---------------------------------------
|
|
|
|
The policy analysis called for along with the April 16, 1993
|
|
Presidential Decision Directive has not been released, though it was
|
|
promised to have been completed by early fall of 1993. We had hoped
|
|
that this report would be the basis for public dialogue on the important
|
|
privacy, competitiveness, and law enforcement issues raised by
|
|
cryptography policy. To date, none of the Administration's policy
|
|
rationale has been revealed to the public, despite the fact that
|
|
agencies in the Executive Branch are proceeding with their own plan.
|
|
|
|
*Escrowed Encryption Federal Information Processing Standard (FIPS)
|
|
-----------------------------------------------------------------------
|
|
approved against overwhelming weight of public comments
|
|
+-----------------------------------------------------------
|
|
|
|
The Presidential Decision Directive also called for consideration of a
|
|
Federal Information Processing Standard (FIPS) for key-escrow
|
|
encryption systems. This process was to have been one of several
|
|
forums whereby those concerned about the proposed key-escrow system
|
|
could voice opinions. EFF, as well as over 225 of our individual
|
|
members, raised a number of serious concerns about the draft FIPS in
|
|
September of 1993. EFF expressed its opposition to government
|
|
implementation of key-escrow systems as proposed. We continue to
|
|
oppose the deployment of Skipjack family escrow encryption systems
|
|
both because they violate fundamental First, Fourth, and Fifth
|
|
amendment principles, and because they fail to offer users adequate
|
|
security and flexibility.
|
|
|
|
Despite overwhelming opposition from over 300 commenters, the
|
|
Department of Commerce recently approved FIPS 185.
|
|
|
|
*Large-Scale Skipjack Deployment Announced
|
|
+---------------------------------------------
|
|
|
|
At the December 9, 1993 meeting of the Computer Systems Security and
|
|
Privacy Advisory Board, an NSA official announced plans to deploy from
|
|
10,000 to 70,000 Skipjack devices in the Defense Messaging System in
|
|
the near future. The exact size of the order was said to be dependent
|
|
only on budget constraints. The Administration is on record in the
|
|
national press promising that no large-scale Skipjack deployment would
|
|
occur until a final report of the Administration Task Force was
|
|
complete. Ten thousand units was set as the upper limit of initial
|
|
deployment. Skipjack deployment at the level planned in the Defense
|
|
Messaging System circumvents both the FIPS notice-and-comment process,
which has been left in a state of limbo, and the Administration's
promise of a comprehensive policy framework.
|
|
|
|
*New FBI Digital Telephony Legislation Proposed
|
|
+--------------------------------------------------
|
|
|
|
The FBI recently proposed a new "Digital Telephony" bill. After initial
|
|
analysis, we strongly oppose the bill, which would require all common
|
|
carriers to construct their networks to deliver to law enforcement
|
|
agencies, in real time, both the contents of all communications on their
|
|
networks and the "signaling" or transactional information.
|
|
|
|
In short, the bill lays the groundwork for turning the National
|
|
Information Infrastructure into a nation-wide surveillance system, to be
|
|
used by law enforcement with few technical or legal safeguards. This
|
|
image is not hyperbole, but a real assessment of the power of the
|
|
technology and inadequacy of current legal and technical privacy
|
|
protections for users of communications networks.
|
|
|
|
Although the FBI suggests that the bill is primarily designed to
|
|
maintain status quo wiretap capability in the face of technological
|
|
changes, in fact, it seeks vast new surveillance and monitoring tools.
|
|
|
|
Lengthy delays on the promised policy report, along with these
|
|
unilateral steps toward Clipper/Skipjack deployment, lead us to believe
|
|
that Administration policy is stalled by the Cold War-era national
|
|
security concerns that have characterized cryptography policy for the
|
|
last several decades.
|
|
EFF believes that it would be a disastrous error to allow national
|
|
information policy -- now a critical component of domestic policy -- to
|
|
be dictated solely by backward-looking national-security priorities and
|
|
unsubstantiated law-enforcement claims. The directions set by this
|
|
Administration will have a major impact on privacy, information
|
|
security, and the fundamental relationship between the government and
|
|
individual autonomy. This is why the Administration must take action--
|
|
and do so before the aforementioned agencies proceed further--to ensure
|
|
that cryptography policy is restructured to serve the
|
|
interests of privacy and security in the National Information
|
|
Infrastructure. We still believe the Administration can play the
|
|
leadership role it was meant to play in shaping this policy. If it does
|
|
not, the potential of the NII, and of fundamental civil liberties in the
|
|
information age, will be threatened.
|
|
|
|
IV.Congressional oversight of cryptography & privacy policy is
|
|
+---------------------------------------------------------------
|
|
urgently needed to right the balance between privacy,
|
|
+---------------------------------------------------------
|
|
competitiveness & law enforcement needs
|
|
+-------------------------------------------
|
|
|
|
All participants in this debate recognize that the need for
|
|
privacy and security is real, and that new technologies pose real
|
|
challenges for law enforcement and national security operations.
|
|
However, the solutions now on the table cripple the NII, pose grave
|
|
threats to privacy, and fail to even meet law enforcement objectives.
|
|
In our judgment, the Administration has failed, thus far, to articulate
|
|
a comprehensive set of policies which will advance the goals upon
|
|
which we all agree.
|
|
Congress must act now to ensure that cryptography policy is
|
|
developed in the context of the broader goal of promoting the
|
|
development of an advanced, interoperable, secure, information
|
|
infrastructure.
|
|
In order to meet the privacy and security needs of the growing
|
|
infrastructure, Congress should seek a set of public policies which
|
|
promote the widespread availability of cryptographic systems according
|
|
to the following criteria:
|
|
|
|
*Use Voluntary Standards to Promote Innovation and Meet
|
|
+----------------------------------------------------------
|
|
Diverse Needs:
|
|
+------------------
|
|
|
|
The National Information Infrastructure stretches to
|
|
encompass devices as diverse as super computers, handheld personal
|
|
digital assistants and other wireless communications devices, and plain
|
|
old telephones. Communication will be carried over copper wires, fiber
|
|
optic cables, and satellite links. The users of the infrastructure will
|
|
range from elementary school children to federal agencies. Encryption
|
|
standards must be allowed to develop flexibly to meet the wide-ranging
|
|
needs of all components of the NII. In its IITF Report, the Administration
|
|
finds that standards also must be compatible with the large installed
|
|
base of communications technologies, and flexible and adaptable enough
|
|
to meet user needs at affordable costs. [_AA_, 9] The diverse uses of
|
|
the NII require that any standard which the government seeks to promote
|
|
as a broadly deployed solution should be implementable in software as
|
|
well as hardware and based on widely available algorithms.
|
|
|
|
*Develop Trusted Algorithms and End-to-End Security:
|
|
+-------------------------------------------------------
|
|
|
|
Assuring current and future users of the NII that their communications
|
|
are
|
|
secure and their privacy is protected is a critical task. This means
|
|
that the
|
|
underlying algorithms adopted must have a high level of public trust and
|
|
the overall systems put in place must be secure.
|
|
|
|
*Encourage National and International Interoperability:
|
|
+----------------------------------------------------------
|
|
|
|
The promise of the NII is seamless national and international
|
|
communications of all types. Any cryptographic standard offered for
|
|
widespread use must allow US corporations and individuals to function as
|
|
part of the global economy and global communications infrastructure.
|
|
|
|
*Seek Reasonable Cooperation with Law Enforcement and National
|
|
+-----------------------------------------------------------------
|
|
Security Needs:
|
|
+-------------------
|
|
|
|
New technologies pose new challenges to law enforcement and national
|
|
security surveillance activities. American industry is committed to
|
|
working with law enforcement to help meet its legitimate surveillance
|
|
needs, but the development of the NII should not be stalled on this
|
|
account.
|
|
|
|
*Promote Constitutional Rights of Privacy and Adhere to Traditional
|
|
+----------------------------------------------------------------------
|
|
Fourth Amendment Search and Seizure Rules:
|
|
+----------------------------------------
|
|
|
|
New technology can either be a threat or an aid to protection of
|
|
fundamental privacy rights. Government policy should promote
|
|
technologies which enable individuals to protect their privacy and be
|
|
sure that those technologies are governed by laws which respect the
|
|
long history of constitutional search and seizure restraints.
|
|
|
|
*Maintain Civilian Control over Public Computer and
|
|
+------------------------------------------------------
|
|
Communications Security:
|
|
-----------------------------
|
|
|
|
In accordance with the Computer Security Act of 1987, development of
|
|
security and privacy standards should be directed by the civilian
agencies responsible under that Act, not by military or intelligence agencies.
|
|
|
|
V.Conclusion
|
|
---------------
|
|
|
|
Among the most important roles that the federal government has in
|
|
NII deployment are setting standards and guaranteeing privacy and
|
|
security. Without adequate security and privacy, the NII will never
|
|
realize its economic or social potential. Cryptography policy must, of
|
|
course, take into account the needs of law enforcement and national
|
|
security agencies, but cannot be driven by these concerns alone. The
|
|
Working Group, along with other industry and public interest
|
|
organizations, is committed to working with the Administration to solve
the privacy and security questions raised by the growing NII.
|
|
This must be done based on the principles of voluntary standards,
|
|
promotion of innovation, concern for law enforcement needs, and
|
|
protection of constitutional rights of privacy.
|
|
|
|
------------------------------
|
|
|
|
Date: Fri, 6 May 1994 12:07:04 -0400 (EDT)
|
|
From: Stanton McCandlish <mech@EFF.ORG>
|
|
Subject: File 2--Whit Diffie testimony - Senate Clipper Hearing, May 3 1994
|
|
|
|
|
|
Key Escrow: Its Impact and Alternatives
|
|
|
|
|
|
Testimony of
|
|
Dr. Whitfield Diffie
|
|
Distinguished Engineer
|
|
Sun Microsystems, Inc.
|
|
|
|
Before the Subcommittee on Technology and the Law
|
|
of the Senate Judiciary Committee
|
|
|
|
|
|
3 May 1994
|
|
|
|
Dr. Diffie is also testifying on behalf of the Digital Privacy and
|
|
Security Working Group, a group of more than 50 computer, communications
|
|
and public interest organizations and associations working on
|
|
communications privacy issues.
|
|
|
|
|
|
|
|
I would like to begin by expressing my thanks to the chairman, the
|
|
members of the committee, and the committee staff for the chance not
|
|
only of appearing before this committee, but of appearing in such
|
|
distinguished company. It is a pleasure to be able to present not
|
|
only my own concerns and those of Sun Microsystems, but to have the
|
|
opportunity of representing the Digital Privacy and Security Working
|
|
Group.
|
|
|
|
I think it is also appropriate to say a few words about my
|
|
experience in the field of communication security. I first began
|
|
thinking about cryptography while working at Stanford University in
|
|
the late summer of 1972. This subsequently brought me into contact
|
|
with Professor Martin E. Hellman of the Electrical Engineering
|
|
Department. Marty and I worked together throughout the mid-1970s and
|
|
discovered the family of techniques now known as public key
|
|
cryptography. It is these techniques that are directly responsible
|
|
for the issue before the committee today. Prior to public key
|
|
cryptography, any large scale cryptographically secure system required
|
|
trusted elements with the fundamental capability of decrypting any
|
|
message protected by the system. Public key cryptography eliminated
|
|
the need for network subscribers to place this level of trust in any
|
|
network element. In so doing, it potentially reduced the subscribers'
|
|
vulnerability to government wiretapping. It is this vulnerability
|
|
that the Escrowed Encryption Initiative seeks to reintroduce.
|
|
|
|
In 1978, I walked through the revolving door from academia to
|
|
industry and for a dozen years was `Manager of Secure Systems
|
|
Research' at Northern Telecom. In 1991, I took my present position
|
|
with Sun Microsystems. This has allowed me an inside look at the
|
|
problems of communication security from the viewpoints of both the
|
|
telecommunications and computer industries.
|
|
|
|
|
|
The Key Escrow Program
|
|
|
|
Just over a year ago, the Administration revealed plans for a
|
|
program of key escrow technology best known by the name of its
|
|
flagship product, the Clipper chip. The program's objective is to
|
|
promote the use of cryptographic equipment incorporating a special
|
|
back door or trap door mechanism that will permit the
|
|
Federal Government to decrypt communications without the knowledge or
|
|
consent of the communicating parties when it considers this necessary
|
|
for law enforcement or intelligence purposes. In effect, the privacy
|
|
of these communications will be placed in escrow with the Federal
|
|
Government.
|
|
|
|
The committee has asked me to address myself to this proposal and
|
|
in particular to consider three issues:
|
|
|
|
o Problems with key escrow, particularly in the area of privacy.
|
|
|
|
o The impact of the key escrow proposal on American business
|
|
both at home and abroad.
|
|
|
|
o Alternatives to key escrow.
|
|
|
|
|
|
Scope
|
|
|
|
In the course of discussing the key escrow program over the past
|
|
year, I have often encountered a piecemeal viewpoint that seeks to
|
|
take each individual program at face value and treat it independently
|
|
of the others. I believe, on the contrary, that it is appropriate to
|
|
take a broad view of the issues. The problem confronting us is to
|
|
assess the advisability of key escrow and its impact on our society.
|
|
This requires examining the effect of private, commercial, and
|
|
possibly criminal use of cryptography and the advisability and effect
|
|
of the use of communications intelligence techniques by law
|
|
enforcement. In doing this, I will attempt to avoid becoming bogged
|
|
down in the distinctions between the Escrowed Encryption Standard
|
|
(FIPS185) with its orientation toward telephone communications and the
|
|
CAPSTONE/TESSERA/MOSAIC program with its orientation toward
|
|
computer networks. I will treat these, together with the Proposed
|
|
Digital Signature Standard and to a lesser extent the Digital
|
|
Telephony Proposal, as a unified whole whose objective is to maintain
|
|
and expand electronic interception for both law enforcement and
|
|
national security purposes.
|
|
|
|
|
|
Privacy Problems of Key Escrow
|
|
|
|
When the First Amendment became part of our constitution in 1791,
|
|
speech took place in the streets, the market, the fields, the office, the
|
|
bar room, the bedroom, etc. It could be used to express intimacy,
|
|
conduct business, or discuss politics and it must have been recognized
|
|
that privacy was an indispensable component of the character of many
|
|
of these conversations. It seems that the right --- in the case of
|
|
some expressions of intimacy even the obligation --- of the
|
|
participants to take measures to guarantee the privacy of their
|
|
conversations can hardly have been in doubt, despite the fact that the
|
|
right to speak privately could be abused in the service of crime.
|
|
|
|
Today, telephone conversations stand on an equal footing with the
|
|
venues available in the past. In particular, a lot of political
|
|
speech --- from friends discussing how to vote to candidates planning
|
|
strategy with their aides --- occurs over the phone. And, of all the
|
|
forms of speech protected by the first amendment, political speech is
|
|
foremost. The legitimacy of the laws in a democracy grows out of the
|
|
democratic process. Unless the people are free to discuss the issues
|
|
--- and privacy is an essential component of many of these discussions
|
|
--- that process cannot take place.
|
|
|
|
There has been a very important change in two hundred years, however.
|
|
In the seventeen-nineties two ordinary people could achieve a high
|
|
degree of security in conversation merely by the exercise of a little
|
|
prudence and common sense. Giving the ordinary person comparable
|
|
access to privacy in the normal actions of the world today requires
|
|
the ready availability of complex technical equipment. It has been
|
|
thoughtlessly said, in discussions of cryptographic policy, that
|
|
cryptography brings the unprecedented promise of absolute privacy. In
|
|
fact, it only goes a short way to make up for the loss of an assurance
|
|
of privacy that can never be regained.
|
|
|
|
As is widely noted, there is a fundamental similarity between the
|
|
power of the government to intercept communications and its ability to
|
|
search premises. Recognizing this power, the fourth amendment places
|
|
controls on the government's power of search and similar controls have
|
|
been placed by law on the use of wiretaps. There is, however, no
|
|
suggestion in the fourth amendment of a guarantee that the government
|
|
will find what it seeks in a search. Just as people have been free to
|
|
protect the things they considered private, by hiding them or storing
|
|
them with friends, they have been free to protect their conversations
|
|
from being overheard.
|
|
|
|
The unease that most people feel in contemplating police use of
|
|
wiretaps is rooted in awareness of the abuses to which wiretapping can
|
|
be put. Unlike a search, it is so unintrusive as to be invisible to
|
|
its victim and this inherently undermines accountability.
|
|
Totalitarian regimes have given us abundant evidence that the use of
|
|
wiretaps and even the fear of their use can stifle free speech. Nor
|
|
is the political use of electronic surveillance a strictly foreign
|
|
problem. We have precedent in contemporary American history for its
|
|
use by the party in power in its attempts to stay in power.
|
|
|
|
The essence of the key escrow program is an attempt to use the buying
|
|
power and export control authority of government to promote standards
|
|
that will deny ordinary people ready options for true protection of
|
|
their conversations. In a world where more and more communication
|
|
takes place between people who frequently cannot meet face to face,
|
|
this is a dangerous course of action.
|
|
|
|
|
|
The objections raised so far apply to the principle of key escrow.
|
|
Objections can also be raised to details of the present proposal. These
|
|
deal with the secrecy of the algorithm, the impact on security of the
|
|
escrow mechanism, and the way in which the proposal has been put into
|
|
effect.
|
|
|
|
Secrecy of the SKIPJACK Algorithm
|
|
|
|
An objection that has been raised to the current key escrow
|
|
proposal is that the cryptographic algorithm used in the Clipper Chip
|
|
is secret and is not available for public scrutiny. One counter to
|
|
this objection is that the users of cryptographic equipment are
|
|
neither qualified to evaluate the quality of the algorithm nor, with
|
|
rare exceptions, interested in attempting the task. In a fundamental
|
|
way, these objections miss the point.
|
|
|
|
Within the national security establishment, responsibility for
|
|
communication security is well understood. It rests with NSA. In
|
|
industry, the responsibility is far more diffuse. Individual users
|
|
are not typically concerned with the functioning of pieces of
|
|
equipment. They acquire trust through a complex social web comprising
|
|
standards, corporate security officers, professional societies, etc.
|
|
A classified standard foisted on the civilian sector will have only
|
|
one element of this process, Federal endorsement.
|
|
|
|
One consequence of the use of a classified algorithm that is of
|
|
particular concern to industry is the fact that the algorithm is only
|
|
available in tamper resistant hardware. Software is one of the most
|
|
flexible and economical ways of building products known. In typical
|
|
computer engineering practice, the additional expense of implementing
|
|
functions in hardware is only undertaken when the speed of software is
|
|
not adequate for the task. Often in these cases, more expensive,
|
|
higher performance, hardware implementations interoperate with less
|
|
expensive, lower performance versions. Having a standard that can
|
|
only be implemented in hardware will increase costs and damage
|
|
interoperability.
|
|
|
|
Security Problems with Key Escrow
|
|
|
|
From the viewpoint of a user, any key escrow system diminishes
|
|
security. It puts potential for access to the user's communications
|
|
in the hands of an escrow agent whose intentions, policies, security
|
|
capabilities, and future cannot be entirely known. In the context of
|
|
modern secure telephone systems, the contrast between escrowed and
|
|
unescrowed communications is particularly stark. In the process of
|
|
setting up a secure call, modern secure telephones manufacture
|
|
cryptographic keys that will be used for the protection of one and only
|
|
one call and will be erased after the call is complete. Public key
|
|
cryptography has made it possible to do this in such a way that these
|
|
keys, once erased, can never be recovered. This gives the users
|
|
a degree of privacy similar to that in a face to face meeting. The
|
|
effect of key escrow is much like having a tape recorder on throughout
|
|
the meeting. Even if the tapes are very carefully protected, the
|
|
people whose words they hold can never be certain that they will not
|
|
someday be played to a much wider audience.
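To make the idea of one-call-only keys concrete, here is a toy sketch of
the sort of public key exchange involved (the numbers are far too small
for real use, and this is not any actual secure-telephone protocol): each
side invents a secret that exists only for the call, both derive the same
session key, and once those secrets are erased the key cannot be
reconstructed by anyone, the parties included.

    import hashlib
    import secrets

    # Toy Diffie-Hellman parameters -- illustrative only; a real secure
    # telephone would use a large, well-studied prime group.
    P = 4294967291          # a prime modulus (2**32 - 5)
    G = 5                   # a generator for the toy group

    def make_call_key():
        a = secrets.randbelow(P - 2) + 1   # caller's secret, for this call only
        b = secrets.randbelow(P - 2) + 1   # callee's secret, for this call only
        A = pow(G, a, P)                   # caller transmits A
        B = pow(G, b, P)                   # callee transmits B
        shared = pow(B, a, P)              # caller's view of the shared secret
        assert shared == pow(A, b, P)      # callee computes the same value
        # Hash the shared secret down to a session key for this one call.
        # Once a and b are discarded, nothing transmitted on the line
        # allows the key to be recomputed.
        return hashlib.sha256(str(shared).encode()).digest()

    session_key = make_call_key()          # a fresh key for every call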
|
|
|
|
|
|
There are also specific vulnerabilities associated with the
|
|
present proposal.
|
|
|
|
The Skipjack algorithm uses 80-bit keys. If it is as good as NSA
|
|
claims, cryptanalyzing it will require searching through all these
|
|
keys or doing about a million billion billion encryptions. This makes
|
|
it sixteen million times as hard to break as DES. A telephone
|
|
conversation would have to be valuable indeed to justify the expense
|
|
of such a computation and it is quite plausible that this is entirely
|
|
infeasible today.
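For a sense of scale, the arithmetic behind these figures:

    skipjack_keys = 2 ** 80    # 1,208,925,819,614,629,174,706,176 keys --
                               # roughly "a million billion billion"
    des_keys = 2 ** 56         # the DES key space
    print(skipjack_keys // des_keys)   # 16777216 -- about sixteen million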
|
|
|
|
The problem is that in creating the Law Enforcement Access Field,
|
|
or LEAF that implements key escrow, the Clipper chip also uses 80-bit
|
|
keys. This means that in order to be able to decode everything ever
|
|
encrypted by a Clipper chip it is only necessary to do a little more
|
|
than twice as much work as would be required to read any one message
|
|
--- one cryptanalysis to recover a Session Key followed by one to
|
|
recover the Device Unique Key. A third cryptanalysis is needed to
|
|
obtain the Family Key, but this need be done only once, since it is
|
|
the same in all chips.
|
|
|
|
The process is conceptually straightforward.
|
|
|
|
1. Starting with a set of messages encrypted with a particular
|
|
Clipper chip cryptanalyze the LEAF fields, by trying every
|
|
key, until a key is found that produces a well formed
|
|
plaintext from every LEAF. This works because the LEAF
|
|
specifically includes an authenticator designed to make well
|
|
formed LEAFs recognizable. Once the Family Key has been found
|
|
it can be used in attacking any Clipper Chip and this process
|
|
need not be repeated.
|
|
|
|
2. Pick a message and decrypt its LEAF with the family key.
|
|
Eighty bits of the result form a cryptogram whose plaintext
|
|
is the Session Key used to encrypt the message. Decrypt
|
|
this field with every key in turn. Try decrypting the message
|
|
with each resulting 80-bit quantity to see if it is the
|
|
correct session key. When the correct session key is discovered,
|
|
the key that produced it will be the correct Device Unique
|
|
Key.
|
|
|
|
3. The combination of the Family Key and the Device Unique Key
|
|
can now be used to read any message ever encrypted by
|
|
the Clipper chip under attack.
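A rough tally of the work these three steps represent, counting each
exhaustive search at its worst-case cost of 2**80 trials (a
simplification; a real search would stop early on average):

    one_message = 2 ** 80          # reading one message: a single session-key search
    family_key  = 2 ** 80          # step 1: done once, reusable against every chip
    per_chip    = 2 * 2 ** 80      # step 2: session-key search, then device-key search

    print((family_key + per_chip) // one_message)   # 3 -- first chip attacked
    print(per_chip // one_message)                  # 2 -- each chip thereafter,
                                                    # "a little more than twice"
                                                    # once the family key is known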
|
|
|
|
It might be argued that the scenario described above requires
|
|
knowing the SKIPJACK algorithm and the LEAF creation method, both of
|
|
which are classified. It is an article of faith, however, in
|
|
communications security that nothing that stays constant for a long
|
|
period of time can be counted on to remain secret. With the passage
|
|
of time, the chances that the chips will be reverse engineered
|
|
increase.
|
|
|
|
Irregularities in Adoption of the Standard
|
|
|
|
Finally, there are disturbing aspects to the development of the
|
|
key escrow FIPS. Under the Computer Security Act of 1987,
|
|
responsibility for security of civilian communications rests with the
|
|
National Institute of Standards and Technology. Pursuant to this
|
|
statute, the Escrowed Encryption Standard appeared as Federal
|
|
Information Processing Standard 185, under the auspices of the
|
|
Commerce Department. Apparently, however, authority over the secret
|
|
technology underlying the standard and the documents embodying this
|
|
technology, continues to reside with NSA. We thus have a curious
|
|
arrangement in which a Department of Commerce standard seems to be
|
|
under the effective control of a Department of Defense agency. This
|
|
appears to violate at least the spirit of the Computer Security Act
|
|
and strain beyond credibility its provisions for NIST's making use of
|
|
NSA's expertise.
|
|
|
|
|
|
Impact on Business
|
|
|
|
Business today is characterized by an unprecedented freedom and
|
|
volume of travel by both people and goods. Ease of communication,
|
|
both physical and electronic, has ushered in an era of international
|
|
markets and multinational corporations. No country is large enough
|
|
that its industries can concentrate on the domestic market to the
|
|
exclusion of all others. When foreign sales rival or exceed domestic
|
|
ones, the structure of the corporation follows suit with new divisions
|
|
placed in proximity to markets, materials, or labor.
|
|
|
|
Security of electronic communication is as essential in this
|
|
environment as security of transportation and storage have been to
|
|
businesses throughout history. The communication system must ensure
|
|
that orders for goods and services are genuine, guarantee that
|
|
payments are credited to the proper accounts, and protect the privacy
|
|
of business plans and personal information.
|
|
|
|
Two new factors are making security both more essential and more
|
|
difficult to achieve. The first is the rise in importance of
|
|
intellectual property. Since much of what is now bought and sold is
|
|
information varying from computer programs to surveys of customer
|
|
buying habits, information security has become an end in itself rather
|
|
than just a means for ensuring the security of people and property.
|
|
The second is the rising demand for mobility in communications.
|
|
Traveling corporate computer users sit down at workstations they have
|
|
never seen before and expect the same environment that is on the desks
|
|
in their offices. They carry cellular telephones and communicate
|
|
constantly by radio. They haul out portable PCs and dial their home
|
|
computers from locations around the globe. With each such action they
|
|
expose their information to threats of eavesdropping and falsification
|
|
barely known a decade ago.
|
|
|
|
Because this information economy is relentlessly global, no nation
|
|
can successfully isolate itself from international competition. The
|
|
communication systems we build will have to be interoperable with
|
|
those of other nations. A standard based on a secret American
|
|
technology and designed to give American intelligence access to the
|
|
communications it protects seems an unlikely candidate for widespread
|
|
acceptance. If we are to maintain our leading position in the
|
|
information marketplace, we must give our full support to the
|
|
development of open international security standards that protect the
|
|
interests of all parties fairly.
|
|
|
|
|
|
Potential for Excessive Regulation
|
|
|
|
The key escrow program also presents the specter of increased
|
|
regulation of the design and production of new computer and
|
|
communications products. FIPS185 states that `Approved
|
|
implementations may be procured by authorized organizations for
|
|
integration into security equipment.' This raises the question of
|
|
what organizations will be authorized and what requirements will be
|
|
placed upon them. Is it likely that people prepared to require that
|
|
surveillance be built into communication switches would shrink from
|
|
requiring that equipment make pre-encryption difficult as a condition
|
|
for getting `approved implementations'? Such requirements have been
|
|
imposed as conditions of export approval for security equipment.
|
|
Should industry's need to acquire tamper resistant parts force it to
|
|
submit to such requirements, key escrow will usher in an era of
|
|
unprecedented regulation of American development and manufacturing.
|
|
|
|
|
|
Alternatives to Key Escrow
|
|
|
|
It is impossible to address the issue of alternatives to key
|
|
escrow, without asking whether there is a problem, what the problem
|
|
is and what solution, if any, the problem requires.
|
|
|
|
In recent testimony before this committee, the FBI has portrayed
|
|
communications interception as an indispensable tool of police work
|
|
and complained that the utility of this tool is threatened by
|
|
developments in modern communications. This testimony, however, uses
|
|
the broader term `electronic surveillance' almost exclusively and
|
|
appears to include some cases in which the electronic surveillance
|
|
consisted of bugs rather than wiretaps. Although the FBI testimony
|
|
speaks of numerous convictions, it names not a single defendant,
|
|
court, case, or docket number. This imprecision makes adequate study
|
|
of the testimony impossible and leaves open two issues: the
|
|
effectiveness of communications interception in particular and that of
|
|
electronic surveillance in general.
|
|
|
|
On balance, it appears more likely that the investigative and
|
|
evidential utility of wiretaps is rising than that it is falling.
|
|
This is partly because criminals, like law abiding citizens, do more
|
|
talking on the phone these days. It is partly because modern
|
|
communication systems, like ISDN, provide much more information about
|
|
each call, revealing where it came from in real time even when it
|
|
originated a long way away. This detailed information about who
|
|
called whom, when, and for how long, that modern switches provide,
|
|
improves the PEN register and trap and trace techniques that police
|
|
use to map the extent of criminal conspiracies. It is unaffected by
|
|
any encryption that the callers may apply.
|
|
|
|
With respect to other kinds of electronic surveillance, the
|
|
picture for law enforcement looks even brighter. Miniaturization of
|
|
electronics and improvements in digital signal processing are making
|
|
bugs smaller, improving their fidelity, making them harder to detect,
|
|
and making them more reliable. Forms of electronic surveillance for
|
|
which no warrant is held to be necessary, particularly TV cameras in
|
|
public places, have become widespread. This creates a base of
|
|
information that was, for example, used in two distinct ways in the
|
|
Tylenol poisoning case of the mid-1980s.
|
|
|
|
Broadening the consideration of high tech crime fighting tools to
|
|
include vehicle tracking, DNA fingerprinting, individual recognition
|
|
by infrared tracing of the veins in the face, and database profiling,
|
|
makes it seem unlikely that the failures of law enforcement are due
|
|
to the inadequacy of its technical tools.
|
|
|
|
If we turn our attention to foreign intelligence, we see a similar
|
|
picture. Communications intelligence today is enjoying a golden age.
|
|
The steady migration of communications from older, less accessible,
|
|
media, both physical and electronic, has been the dominant factor.
|
|
The loss of information resulting from improvements in security has
|
|
been consistently outweighed by the increased volume and quality of
|
|
information available. As a result, the communications intelligence
|
|
product has been improving for more than fifty years, with no end in
|
|
sight. The rising importance of telecommunications in the life of
|
|
industrialized countries coupled with the rising importance of
|
|
wireless communications, can be expected to give rise to an
|
|
intelligence bonanza in the decades to come.
|
|
|
|
Mobile communication is one of the fastest growing areas of the
|
|
telecommunications industry and the advantages of cellular phones,
|
|
wireless local area networks, and direct satellite communication
|
|
systems are such that they are often installed even in applications
|
|
where mobility is not required. Satellite communications are in
|
|
extensive use, particularly in equatorial regions and cellular
|
|
telephone systems are being widely deployed in rural areas throughout
|
|
the world in preference to undertaking the substantial expense of
|
|
subscriber access wiring.
|
|
|
|
New technologies are also opening up new possibilities. Advances
|
|
in emitter identification, network penetration techniques, and the
|
|
implementation of cryptanalytic or crypto-diagnostic operations within
|
|
intercept equipment are likely to provide more new sources of
|
|
intelligence than are lost as a result of commercial use of
|
|
cryptography.
|
|
|
|
It should also be noted that changing circumstances change
|
|
appropriate behavior. Although intelligence continues to play a vital
|
|
role in the post cold war world, the techniques that were appropriate
|
|
against an opponent capable of destroying the United States within
|
|
hours may not be appropriate against merely economic rivals.
|
|
|
|
If, however, we accept that some measure of control over
|
|
the deployment of cryptography is needed, we must distinguish two
|
|
cases:
|
|
|
|
The use of cryptography to protect communications and
|
|
|
|
The use of cryptography to protect stored information.
|
|
|
|
|
|
It is good security practice in protecting communications to keep any
|
|
keys that can be used to decipher the communications for as short a
|
|
time as possible. Discoveries in cryptography in the past two decades
|
|
have made it possible to have secure telephones in which the keys last
|
|
only for the duration of the call and can never be recreated
|
|
thereafter. A key escrow proposal surrenders this advantage by
|
|
creating a new set of escrowed keys that are stored indefinitely and
|
|
can always be used to read earlier traffic.
|
|
|
|
With regard to protection of stored information, the situation is
|
|
quite different. The keys for decrypting information in storage must
|
|
be kept for the entire lifetime of the stored information; if they are
|
|
lost, the information is lost. An individual might consider
|
|
encrypting files and trusting the keys to memory, but no organization
|
|
of any size could risk the bulk of its files in this fashion. Some
|
|
form of key archiving, backup, or escrow is thus inherent in the use
|
|
of cryptography for storage. Such procedures will guarantee that
|
|
encrypted files on disks are accessible to subpoena in much the same
|
|
way that files on paper are today.
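A minimal sketch of what such key archiving can look like in practice,
using the symmetric recipes of the Python `cryptography' package (the
package, the key names, and the recovery arrangement here are
illustrative assumptions, not a prescribed design): the file key is
itself stored encrypted under a long-lived recovery key, so losing the
working copy of the file key does not lose the file.

    from cryptography.fernet import Fernet

    # A per-file data key, and a long-lived recovery key held by the
    # organization (or an escrow agent of its choosing).
    data_key = Fernet.generate_key()
    recovery_key = Fernet.generate_key()

    ciphertext = Fernet(data_key).encrypt(b"the organization's records")

    # Archive the data key itself, encrypted under the recovery key.
    wrapped_data_key = Fernet(recovery_key).encrypt(data_key)

    # Years later, the working copy of data_key is gone; the archive
    # still allows the file to be recovered (or subpoenaed).
    recovered_key = Fernet(recovery_key).decrypt(wrapped_data_key)
    assert Fernet(recovered_key).decrypt(ciphertext) == b"the organization's records"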
|
|
|
|
Many business communications, such as electronic funds transfers,
|
|
fall into an intermediate category. Although the primary purpose is
|
|
communication rather than storage, the transactions are of a formal
|
|
nature. In these cases, an escrow mechanism much like those in
|
|
current commercial use may be appropriate. In a high value
|
|
transaction, where the buyer and seller do not have an established
|
|
business relationship, either party may demand the use of a mutually
|
|
trusted escrow agent who will take temporary custody of both the goods
|
|
and the payment. In a similar fashion, either party to an encrypted
|
|
transaction might demand that only keys escrowed with a mutually
|
|
acceptable escrow agent be used.
|
|
|
|
What is most important here is that the laws, customs, and
|
|
practices governing electronic commerce and, in a broader context,
|
|
electronic society are just beginning to develop. It is likely that
|
|
escrow mechanisms will be among the tools employed. It is, however,
|
|
too early to say what form they should take. They will need to be
|
|
worked out as society gets more experience with the new communications
|
|
media. They should not be imposed by government before society's real
|
|
needs have been determined.
|
|
|
|
|
|
Conduct of the Key Escrow Initiative
|
|
|
|
In my experience, the people who support the key escrow initiative
|
|
are inclined to express substantial trust in the government. I find
|
|
it ironic therefore that in its conduct of this program, the
|
|
administration has followed a course that could hardly have been
|
|
better designed to provoke distrust. The introduction of mechanisms
|
|
designed to assure the government's ability to conduct electronic
|
|
surveillance on its citizens and limit the ability of the citizens to
|
|
protect themselves against such surveillance is a major policy
|
|
decision of the information age. It has been presented, however, as a
|
|
technicality, buried in an obscure series of regulations. In so
|
|
doing, it has avoided congressional consideration of either its
|
|
objectives or its budget. The underlying secrecy of the technology
|
|
has been used as a tool for doling out information piecemeal and
|
|
making a timely understanding of the issues difficult to achieve.
|
|
|
|
|
|
Suppose We Make a Mistake
|
|
|
|
In closing, I would like to ask a question. Suppose we make a
|
|
mistake?
|
|
|
|
o Suppose we fail to adopt a key escrow system and later
|
|
decide that one is needed?
|
|
|
|
o Suppose we adopt a key escrow system now when none is
|
|
needed?
|
|
|
|
Which would be the more serious error?
|
|
|
|
It is generally accepted that rights are not absolute. If private
|
|
access to high-grade encryption presented a clear and present danger to
|
|
society, there would be little political opposition to controlling it.
|
|
The reason there is so much disagreement is that there is so little
|
|
evidence of a problem.
|
|
|
|
If allowing or even encouraging wide dissemination of high-grade
|
|
cryptography proves to be a mistake, it will be a correctable mistake.
|
|
Generations of electronic equipment follow one another very quickly.
|
|
If cryptography comes to present such a problem that there is popular
|
|
consensus for regulating it, this will be just as possible in a decade
|
|
as it is today. If on the other hand, we set the precedent of
|
|
building government surveillance capabilities into our security
|
|
equipment, we risk entrenching a bureaucracy that will not easily
|
|
surrender the power this gives.
|
|
|
|
|
|
Recommendation
|
|
|
|
In light of these considerations, I would like to suggest that the
|
|
Federal standards-making process should be brought back into line with
|
|
the intent of the Computer Security Act of 1987. Congress should press
|
|
the National Institute of Standards and Technology, with the
|
|
cooperation of the National Security Agency, to declassify the
|
|
SKIPJACK algorithm and issue a revised version of FIPS 185 that
|
|
specifies the algorithm and omits the key escrow provisions. This would
|
|
be a proper replacement for FIPS 46, the Data Encryption Standard, and
|
|
would serve the needs of the U.S. Government, U.S. industry, and U.S.
|
|
citizens for years to come.
|
|
|
|
|
|
Notes
|
|
|
|
I have examined some aspects of the subjects treated here at
|
|
greater length in other testimony and comments and copies of these
|
|
have been made available to the committee.
|
|
|
|
"The Impact of Regulating Cryptography on the Computer and
|
|
Communications Industries" Testimony Before the House Subcommittee on
|
|
Telecommunications and Finance, 9 June 1993.
|
|
|
|
"The Impact of a Secret Cryptographic Standard on Encryption,
|
|
Privacy, Law Enforcement and Technology" Testimony Before the House
|
|
Subcommittee on Science and Technology, 11 May 1993
|
|
|
|
Letter to the director of the Computer Systems Laboratory at the
|
|
National Institute of Standards and Technology, commenting on the
|
|
proposed Escrowed Encryption Standard, 27 September 1993.
|
|
|
|
|
|
------------------------------
|
|
|
|
End of Computer Underground Digest #6.44
|
|
************************************