Computer underground Digest    Wed  Aug 11 1993   Volume 5 : Issue 60
                           ISSN  1004-042X

       Editors: Jim Thomas and Gordon Meyer (TK0JUT2@NIU.BITNET)
       Archivist: Brendan Kehoe
       Shadow-Archivists: Dan Carosone / Paul Southworth
                          Ralph Sims / Jyrki Kuoppala
                          Ian Dickinson
       Copy Editor: Etaoin Shrdlu, Senior

CONTENTS, #5.60 (Aug 11 1993)
File 1--CuD #5.59 Glitch & Some Belated Thanks from CuD to "helpers"
File 2--Computers, Freedom & Privacy, '94 -- Conference INFO
File 3--Congressional Fellowships for PhDs (telecommunications)
File 4--$50,000 Fine for "Software Theft" (Canadian News)
File 5--SKIPJACK Review (Encryption Background and Assessment)

Cu-Digest is a weekly electronic journal/newsletter. Subscriptions are
available at no cost electronically from tk0jut2@mvs.cso.niu.edu. The
editors may be contacted by voice (815-753-0303), fax (815-753-6302)
or U.S. mail at: Jim Thomas, Department of Sociology, NIU, DeKalb, IL
60115.

Issues of CuD can also be found in the Usenet comp.society.cu-digest
news group; on CompuServe in DL0 and DL4 of the IBMBBS SIG, DL1 of
LAWSIG, and DL1 of TELECOM; on GEnie in the PF*NPC RT
libraries and in the VIRUS/SECURITY library; from America Online in
the PC Telecom forum under "computing newsletters;"
On Delphi in the General Discussion database of the Internet SIG;
on the PC-EXEC BBS at (414) 789-4210; and on: Rune Stone BBS (IIRG
WHQ) (203) 832-8441 NUP:Conspiracy; RIPCO BBS (312) 528-5020
CuD is also available via Fidonet File Request from 1:11/70; unlisted
nodes and points welcome.
EUROPE: from the ComNet in LUXEMBOURG BBS (++352) 466893;
In ITALY: Bits against the Empire BBS: +39-461-980493

ANONYMOUS FTP SITES:
  UNITED STATES: ftp.eff.org (192.88.144.4) in /pub/cud
                 etext.archive.umich.edu (141.211.164.18) in /pub/CuD/cud
                 halcyon.com (202.135.191.2) in /pub/mirror/cud
                 aql.gatech.edu (128.61.10.53) in /pub/eff/cud
  AUSTRALIA:     ftp.ee.mu.oz.au (128.250.77.2) in /pub/text/CuD.
  EUROPE:        nic.funet.fi in pub/doc/cud. (Finland)
                 ftp.warwick.ac.uk in pub/cud (United Kingdom)

COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
information among computerists and to the presentation and debate of
diverse views. CuD material may be reprinted for non-profit as long
as the source is cited. Authors hold a presumptive copyright, and
they should be contacted for reprint permission. It is assumed that
non-personal mail to the moderators may be reprinted unless otherwise
specified. Readers are encouraged to submit reasoned articles
relating to computer culture and communication. Articles are
preferred to short responses. Please avoid quoting previous posts
unless absolutely necessary.

DISCLAIMER: The views represented herein do not necessarily represent
the views of the moderators. Digest contributors assume all
responsibility for ensuring that articles submitted do not
violate copyright protections.

----------------------------------------------------------------------

Date: Tue, 3 Aug 1993 18:31:44 CDT
From: CuD Moderators <cudigest@mindvox.phantom.com>
Subject: File 1--CuD #5.59 Glitch & Some Belated Thanks from CuD to "helpers"

By now, those who receive CuD from the mailing list are aware of a
major glitch that occurred in #5.59. The good news is that the glitch
was *NOT* our error (or even a human error). It was a software glitch
in the mailer, apparently resulting from an overload. It has, we hope,
been corrected, but we will likely move to another distribution site
to accommodate the growing list. We estimate that CuD currently has a
readership of about 80,000, only about 1.8 percent of whom are
on the mailing list, so the glitch did not affect most readers.
Nonetheless, we are embarrassed and regret any inconvenience it
caused.

We're also pleased with the CuD readers' response when they discovered
the glitch. Only one displayed the mildest of pique. The rest ranged
from a matter-of-fact "oops" to floor-rolling funny lines. Proving, of
course, what we've suspected all along: CuD readers are a bright,
gentle, and forgiving lot. The best line comes from Scott Best:

     Of course, if "glitch"'s keep happening, they soon become
     "snafu"'s. And snafus become bugs. And, finally, bugs will
     ultimately become performance features.

And, while we're at it:

Gordon Meyer and I put out CuD, and--if we run a good issue--we
generally receive a few accolades. But CuD is actually a collective
enterprise that depends heavily on readers for contributions and other
services. Without them, we couldn't continue. Some belated thanks to
people who have been helpful over the past few months:

1. Special enthusiastic gratitudinous thanks to the folks at Central
Michigan University who provided us with a site for mailing out CuDs.
Their patience and assistance have been invaluable in reducing mailing
time and the load on NIU's system. There have been fewer bounces,
faster delivery, and (we hope) better service.

2. Thanks to Mike Stack, Mike Preis, and the other computer gurus at
NIU who have been helpful in redirecting mail, patient and calm
despite the habitually crashed mailer (prior to the mailserv switch),
and have remained low-key and supportive despite the strain we've
placed on the system. Thanks also to Neil Rickert, just on general
principle. Someday, the NIU administrators may find it in their hearts
to replace WYLBUR, an archaic operating system, but until then, these
guys--along with Phil Rider and others--nurse us along with humor and
invariable magic.

3. Thanks to everybody who sends in articles. We often hear that
somebody didn't send something--such as a news blurb or other
info--because they assumed somebody else would do it. WHEN IN DOUBT,
SEND! It's better that we have multiple copies of something than
none--so keep the articles and news blurbs coming. We're especially
in need of local or regional "computer crime" cases that we might not
see in the Chicago or New York papers.

4. Thanks to all the people who complimented us (as well as to those
who offered constructive criticism). And thanks to those who felt
like flaming but hit the delete key instead.

5. Thanks to the BBS readers who lack Net access but who nonetheless
send in their contributions via disk or call with information.
Sometimes we can't run it, but the information is crucial to keeping
us informed about the BBS world.

6. We're indebted to a number of individuals and groups who've been
supportive and helpful over the years. These include the LOD crowd,
PHRACK folk, Pat and Bruce at Mindvox (telnet mindvox.phantom.com),
John McMullen for allowing reprints of his piece, Joe Abernathy for
the same, Netta Gilboa of Gray Areas (one of the most promising new
'Zines to come along), Jack Rickard of Boardwatch (an essential 'Zine
for anybody interested in the BBS world), Dark Adept, Kim Clancy, Dr.
Ripco, and many, many others. And, of course, Chenko Kaninovich and
Maggie T. Katt. These, and others we don't have space to name
(short-term memory loss notwithstanding), have been invaluable over the
years, and we're grateful.

7. Thanks to Bill Cook and the Secret Service, without whom there
would be no Cu-Digest.

There are others we've probably forgotten to acknowledge, so thanks to
all of them. And, of course, thanks to Pat Townson, CuD's symbolic
progenitor.

------------------------------

Date: Tue, 3 Aug 1993 18:31:44 CDT
From: CuD Moderators <cudigest@mindvox.phantom.com>
Subject: File 2--Computers, Freedom & Privacy, '94 -- Conference INFO

                "Computers, Freedom & Privacy '94"
                 George B. Trubow, General Chair
             Timothy R. Rabel, Conference Coordinator

                    John Marshall Law School
                    315 South Plymouth Court
                       Chicago, IL 60604

       e-mail = cfp94@jmls.edu      voice = (312) 987-1419
                      fax = (312) 427-8307
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*


             Conference Announcement and Call for Papers
               Computers, Freedom, and Privacy 1994
                        23-26 March 1994

                           Announcement

The fourth annual conference, "Computers, Freedom, and Privacy,"
will be held in Chicago, Il., March 23-26, 1994. This conference
will be jointly sponsored by the Association for Computing Machinery
(ACM) and The John Marshall Law School. George B. Trubow, professor
of law and director of the Center for Informatics Law at The John
Marshall Law School, is general chairman of the conference.

The advance of computer and communications technologies holds
great promise for individuals and society. From conveniences for
consumers and efficiencies in commerce to improved public health and
safety and increased knowledge of and participation in government and
community, these technologies are fundamentally transforming our
environment and our lives.

At the same time, these technologies present challenges to the
idea of a free and open society. Personal privacy is increasingly at
risk from invasions by high-tech surveillance and monitoring; a
myriad of personal information data bases expose private life to
constant scrutiny; new forms of illegal activity may threaten the
traditional barriers between citizen and state and present new tests
of Constitutional protection; geographic boundaries of state and
nation may be recast by information exchange that knows no boundaries
as governments and economies are caught up in global data networks.

Computers, Freedom, and Privacy '94 will present an assemblage
of experts, advocates, and interested parties from diverse
perspectives and disciplines to consider the effects on freedom and
privacy resulting from the rapid technological advances in computer
and telecommunication science. Participants come from the fields of
computer science, communications, law, business and commerce,
research, government, education, the media, health, public advocacy,
and consumer affairs, and a variety of other backgrounds. A series of
pre-conference tutorials will be offered on March 23, 1994, with the
conference program beginning on Thursday, March 24, and running
through Saturday, March 26, 1994.

The Palmer House, a Hilton hotel located at the corner of State
Street and Washington Ave. in Chicago's "Loop," and only about a
block from The John Marshall Law School buildings, will be the
conference headquarters. Room reservations should be made directly
with the hotel, mentioning The John Marshall Law School or "CFP'94"
to get the special conference rate of $99.00, plus tax.

                    The Palmer House Hilton
             17 E. Monroe St., Chicago, IL 60603
       Tel: 312-726-7500; 1-800-HILTONS; Fax: 312-263-2556

             Call for Papers and Program Suggestions

The emphasis at CFP'94 will be on examining the many potential
uses of new technology and considering recommendations for dealing
with them. Specific suggestions to harness the new technologies so
society can enjoy the benefits while avoiding negative implications
are solicited.

Proposals are requested from anyone working on a relevant paper,
or who has an idea for a program presentation that will demonstrate
new computer or communications technology and suggest what can be
done with it. Any proposal must: state the title of the paper or
program; describe the theme and content in a short paragraph; set out
the credentials and experience of the author or suggested speakers;
and should not exceed two pages. If an already completed paper is
being proposed for presentation, then a copy should be included with
the proposal.

                Student Papers and Scholarships

It is anticipated that announcement of a student writing
competition for CFP'94 will be made soon, together with information
regarding the availability of a limited number of student
scholarships for the conference.

                           Timetables

Proposals for papers and programs are being accepted at this
time. It is intended that program committees will be finalized by
August 1, 1993. Proposals must be received by October 1, 1993.

                         Communications

Conference communications should be sent to:
        CFP'94
        The John Marshall Law School
        315 S. Plymouth Ct.
        Chicago, IL 60604

(Voice: 312-987-1419; Fax: 312-427-8307; E-mail: CFP94@jmls.edu)

------------------------------

Date: Mon, 2 Aug 1993 18:29:01 CDT
From: CuD Moderators <tk0jut2@mvs.cso.niu.edu>
Subject: File 3--Congressional Fellowships for PhDs (telecommunications)

CONGRESSIONAL FELLOWSHIPS for Ph.D.-level scholars of any
discipline who have a demonstrated interest in telecommunications
and in public policy. Candidates must show promise of making a
significant contribution to the public's understanding of the
political process. Ten-month program of practical experience on
Capitol Hill begins November 1994 and concludes August 1995.
Application deadline: DECEMBER 1, 1993. Information: Director,
Congressional Fellowship Program, American Political Science
Association, 1527 New Hampshire Ave., N.W., Washington, DC 20036.
Ph: 202-483-2512.

------------------------------

Date: Tue, 10 Aug 93 16:34:27 CDT
From: Mitch Pravatiner <U15289@UICVM.BITNET>
Subject: File 4--$50,000 Fine for "Software Theft" (Canadian News)

The following story from the July 24 _Vancouver Sun_ business section
may be of interest, especially the final paragraph.

                $50,000 FINE FOR SOFTWARE THEFT

TORONTO--The first corporation in Canada to be charged with software
theft has pleaded guilty and been fined $50,000.

Rexcan Circuits Inc., of Belleville, Ont., has also agreed to submit
to a software audit by the Canadian Alliance Against Software Theft, a
computer industry association.

Related charges against two senior Rexcan executives were dropped.

Rexcan, a circuit board manufacturer employing 200, is a subsidiary of
Kuriko Electrical Industry Co. Ltd of Japan.

Software theft occurs when a single computer disk is used to load
software into more than one computer.

------------------------------

Date: Mon, 02 Aug 1993 15:23:28 -0400 (EDT)
From: denning@CS.GEORGETOWN.EDU (Dorothy Denning)
Subject: File 5--SKIPJACK Review (Encryption Background and Assessment)

                         SKIPJACK Review

                         Interim Report

                    The SKIPJACK Algorithm


          Ernest F. Brickell, Sandia National Laboratories
             Dorothy E. Denning, Georgetown University
          Stephen T. Kent, BBN Communications Corporation
                      David P. Maher, AT&T
               Walter Tuchman, Amperif Corporation

                         July 28, 1993

                        (copyright 1993)


Executive Summary

The objective of the SKIPJACK review was to provide a mechanism whereby
persons outside the government could evaluate the strength of the
classified encryption algorithm used in the escrowed encryption devices
and publicly report their findings. Because SKIPJACK is but one
component of a large, complex system, and because the security of
communications encrypted with SKIPJACK depends on the security of the
system as a whole, the review was extended to encompass other
components of the system. The purpose of this Interim Report is to
report on our evaluation of the SKIPJACK algorithm. A later Final
Report will address the broader system issues.

The results of our evaluation of the SKIPJACK algorithm are as
follows:

1. Under an assumption that the cost of processing power is halved
   every eighteen months, it will be 36 years before the cost of
   breaking SKIPJACK by exhaustive search will be equal to the cost
   of breaking DES today. Thus, there is no significant risk that
   SKIPJACK will be broken by exhaustive search in the next 30-40
   years.

2. There is no significant risk that SKIPJACK can be broken through a
   shortcut method of attack.

3. While the internal structure of SKIPJACK must be classified in
   order to protect law enforcement and national security objectives,
   the strength of SKIPJACK against a cryptanalytic attack does not
   depend on the secrecy of the algorithm.



1. Background

On April 16, the President announced a new technology initiative aimed
at providing a high level of security for sensitive, unclassified
communications, while enabling lawfully authorized intercepts of
telecommunications by law enforcement officials for criminal
investigations. The initiative includes several components:

   A classified encryption/decryption algorithm called "SKIPJACK."

   Tamper-resistant cryptographic devices (e.g., electronic chips),
   each of which contains SKIPJACK, classified control software, a
   device identification number, a family key used by law enforcement,
   and a device unique key that unlocks the session key used to
   encrypt a particular communication.

   A secure facility for generating device unique keys and programming
   the devices with the classified algorithms, identifiers, and keys.

   Two escrow agents that each hold a component of every device unique
   key. When combined, those two components form the device unique
   key.

   A law enforcement access field (LEAF), which enables an authorized
   law enforcement official to recover the session key. The LEAF is
   created by a device at the start of an encrypted communication and
   contains the session key encrypted under the device unique key
   together with the device identifier, all encrypted under the family
   key. (A short sketch of this flow follows this list.)

   LEAF decoders that allow an authorized law enforcement official to
   extract the device identifier and encrypted session key from an
   intercepted LEAF. The identifier is then sent to the escrow
   agents, who return the components of the corresponding device
   unique key. Once obtained, the components are used to reconstruct
   the device unique key, which is then used to decrypt the session
   key.
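
To make that flow concrete, here is a minimal Python sketch of the
steps listed above. It is only an illustration: the real formats and
algorithms are classified, "toy_encrypt" below merely stands in for
SKIPJACK, and the idea that the two escrowed components combine by XOR
is our assumption, not a statement from the report.

    # Toy LEAF walkthrough -- illustrative only, NOT the real format.
    import hashlib, secrets

    def toy_encrypt(key: bytes, data: bytes) -> bytes:
        # Stand-in keyed stream cipher (XOR with a SHA-256 keystream);
        # it is its own inverse, so decryption is the same call.
        stream = hashlib.sha256(key + b"ks").digest()
        return bytes(d ^ stream[i % 32] for i, d in enumerate(data))

    device_id  = b"DEV-0001"
    family_key = secrets.token_bytes(10)   # known to LEAF decoders
    unique_key = secrets.token_bytes(10)   # burned into the device
    escrow_1   = secrets.token_bytes(10)   # held by escrow agent 1
    escrow_2   = bytes(a ^ b for a, b in zip(escrow_1, unique_key))

    # Device side: wrap the session key, append the identifier,
    # encrypt everything under the family key.
    session_key = secrets.token_bytes(10)
    leaf = toy_encrypt(family_key,
                       toy_encrypt(unique_key, session_key) + device_id)

    # Law enforcement side: decode the LEAF, fetch the two escrowed
    # components, rebuild the device unique key, recover the session key.
    inner = toy_encrypt(family_key, leaf)
    wrapped, ident = inner[:10], inner[10:]
    rebuilt = bytes(a ^ b for a, b in zip(escrow_1, escrow_2))
    assert ident == device_id
    assert toy_encrypt(rebuilt, wrapped) == session_key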

This report reviews the security provided by the first component,
namely the SKIPJACK algorithm. The review was performed pursuant to
the President's direction that "respected experts from outside the
government will be offered access to the confidential details of the
algorithm to assess its capabilities and publicly report their
finding." The Acting Director of the National Institute of Standards
and Technology (NIST) sent letters of invitation to potential
reviewers. The authors of this report accepted that invitation.

We attended an initial meeting at the Institute for Defense Analyses
Supercomputing Research Center (SRC) from June 21-23. At that meeting,
the designer of SKIPJACK provided a complete, detailed description of
the algorithm, the rationale for each feature, and the history of the
design. The head of the NSA evaluation team described the evaluation
process and its results. Other NSA staff briefed us on the LEAF
structure and protocols for use, generation of device keys, protection
of the devices against reverse engineering, and NSA's history in the
design and evaluation of encryption methods contained in SKIPJACK.
Additional NSA and NIST staff were present at the meeting to answer our
questions and provide assistance. All staff members were forthcoming
in providing us with requested information.

At the June meeting, we agreed to integrate our individual evaluations
into this joint report. We also agreed to reconvene at SRC from July
19-21 for further discussions and to complete a draft of the report.
In the interim, we undertook independent tasks according to our
individual interests and availability. Ernest Brickell specified a
suite of tests for evaluating SKIPJACK. Dorothy Denning worked at NSA
on the refinement and execution of these and other tests that took into
account suggestions solicited from Professor Martin Hellman at Stanford
University. NSA staff assisted with the programming and execution of
these tests. Denning also analyzed the structure of SKIPJACK and its
susceptibility to differential cryptanalysis. Stephen Kent visited NSA
to explore in more detail how SKIPJACK compared with NSA encryption
algorithms that he already knew and that were used to protect
classified data. David Maher developed a risk assessment approach
while continuing his ongoing work on the use of the encryption chip in
the AT&T Telephone Security Device. Walter Tuchman investigated the
anti-reverse engineering properties of the chips.

We investigated more than just SKIPJACK because the security of
communications encrypted with the escrowed encryption technology
depends on the security provided by all the components of the
initiative, including protection of the keys stored on the devices,
protection of the key components stored with the escrow agents, the
security provided by the LEAF and LEAF decoder, protection of keys
after they have been transmitted to law enforcement under court order,
and the resistance of the devices to reverse engineering. In addition,
the success of the technology initiative depends on factors besides
security, for example, performance of the chips. Because some
components of the escrowed encryption system, particularly the key
escrow system, are still under design, we decided to issue this Interim
Report on the security of the SKIPJACK algorithm and to defer our Final
Report until we could complete our evaluation of the system as a
whole.


2. Overview of the SKIPJACK Algorithm

SKIPJACK is a 64-bit "electronic codebook" algorithm that transforms a
64-bit input block into a 64-bit output block. The transformation is
parameterized by an 80-bit key, and involves performing 32 steps or
iterations of a complex, nonlinear function. The algorithm can be used
in any one of the four operating modes defined in FIPS 81 for use with
the Data Encryption Standard (DES).
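
The four FIPS 81 modes are electronic codebook (ECB), cipher block
chaining (CBC), cipher feedback (CFB), and output feedback (OFB). As a
sketch of how such a mode wraps any 64-bit block cipher, here is CBC in
Python; the toy block function below is our stand-in, since SKIPJACK
itself is classified.

    # CBC over an abstract 64-bit block cipher -- a sketch, not SKIPJACK.
    import hashlib

    BLOCK = 8  # 64-bit blocks, as in SKIPJACK and DES

    def toy_block(key: bytes, block: bytes) -> bytes:
        # Illustrative keyed permutation: XOR with a key-derived mask.
        mask = hashlib.sha256(key).digest()[:BLOCK]
        return bytes(b ^ m for b, m in zip(block, mask))

    def cbc_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
        assert len(plaintext) % BLOCK == 0 and len(iv) == BLOCK
        out, prev = [], iv
        for i in range(0, len(plaintext), BLOCK):
            chained = bytes(p ^ c for p, c in
                            zip(plaintext[i:i + BLOCK], prev))
            prev = toy_block(key, chained)  # each block chains forward
            out.append(prev)
        return b"".join(out)

    print(cbc_encrypt(b"key", bytes(BLOCK), b"sixteen byte msg").hex())

In ECB, by contrast, each 64-bit block is encrypted independently; that
is the mode the "electronic codebook" phrase above refers to.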

The SKIPJACK algorithm was developed by NSA and is classified SECRET.
It is representative of a family of encryption algorithms developed in
1980 as part of the NSA suite of "Type I" algorithms, suitable for
protecting all levels of classified data. The specific algorithm,
SKIPJACK, is intended to be used with sensitive but unclassified
information.

The strength of any encryption algorithm depends on its ability to
withstand an attack aimed at determining either the key or the
unencrypted ("plaintext") communications. There are basically two
types of attack, brute-force and shortcut.


3. Susceptibility to Brute Force Attack by Exhaustive Search

In a brute-force attack (also called "exhaustive search"), the
adversary essentially tries all possible keys until one is found that
decrypts the intercepted communications into a known or meaningful
plaintext message. The resources required to perform an exhaustive
search depend on the length of the keys, since the number of possible
keys is directly related to key length. In particular, a key of length
N bits has 2^N possibilities. SKIPJACK uses 80-bit keys, which means
there are 2^80 (approximately 10^24) or more than 1 trillion trillion
possible keys.

An implementation of SKIPJACK optimized for a single processor on the
8-processor Cray YMP performs about 89,000 encryptions per second. At
that rate, it would take more than 400 billion years to try all keys.
Assuming the use of all 8 processors and aggressive vectorization, the
time would be reduced to about a billion years.

A more speculative attack using a future, hypothetical, massively
parallel machine with 100,000 RISC processors, each of which was
capable of 100,000 encryptions per second, would still take about 4
million years. The cost of such a machine might be on the order of $50
million. In an even more speculative attack, a special-purpose machine
might be built using 1.2 billion $1 chips with a 1 GHz clock. If the
algorithm could be pipelined so that one encryption step were performed
per clock cycle, then the $1.2 billion machine could exhaust the key
space in 1 year.

Another way of looking at the problem is by comparing a brute-force
attack on SKIPJACK with one on DES, which uses 56-bit keys. Given that
no one has demonstrated a capability for breaking DES, DES offers a
reasonable benchmark. Since SKIPJACK keys are 24 bits longer than DES
keys, there are 2^24 times more possibilities. Assuming that the cost
of processing power is halved every eighteen months, it will not
be for another 24 * 1.5 = 36 years before the cost of breaking
SKIPJACK is equal to the cost of breaking DES today. Given the lack of
demonstrated capability for breaking DES, and the expectation that the
situation will continue for at least several more years, one can
reasonably expect that SKIPJACK will not be broken within the next
30-40 years.
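
These estimates are easy to check. The sketch below redoes the
arithmetic in Python; the only interpretive step is reading "one
encryption step per clock cycle" as 32 clock cycles per trial key,
matching the 32 steps described in Section 2.

    # Re-deriving the exhaustive-search figures quoted above.
    KEYS = 2 ** 80
    YEAR = 365.25 * 24 * 3600                    # seconds per year

    cray_1   = KEYS / 89_000 / YEAR              # one Cray YMP processor
    parallel = KEYS / (100_000 * 100_000) / YEAR # 100,000 RISC CPUs
    chips    = KEYS / (1.2e9 * 1e9 / 32) / YEAR  # 1.2e9 chips, 32 cycles/key

    print(f"{cray_1:.3g}")    # ~4.3e11 -> "more than 400 billion years"
    print(f"{parallel:.3g}")  # ~3.8e6  -> "about 4 million years"
    print(f"{chips:.3g}")     # ~1.0    -> "1 year"
    print(24 * 1.5)           # 36 years until breaking SKIPJACK costs
                              # what breaking DES costs today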

Conclusion 1: Under an assumption that the cost of processing power
is halved every eighteen months, it will be 36 years before the cost of
breaking SKIPJACK by exhaustive search will be equal to the cost of
breaking DES today. Thus, there is no significant risk that SKIPJACK
will be broken by exhaustive search in the next 30-40 years.

4. Susceptibility to Shortcut Attacks

In a shortcut attack, the adversary exploits some property of the
encryption algorithm that enables the key or plaintext to be determined
in much less time than by exhaustive search. For example, the RSA
public-key encryption method is attacked by factoring a public value
that is the product of two secret primes into its primes.

Most shortcut attacks use probabilistic or statistical methods that
exploit a structural weakness, unintentional or intentional (i.e., a
"trapdoor"), in the encryption algorithm. In order to determine
whether such attacks are possible, it is necessary to thoroughly
examine the structure of the algorithm and its statistical properties.
In the time available for this review, it was not feasible to conduct
an evaluation on the scale that NSA has conducted or that has been
conducted on the DES. Such a review would require many man-years of
effort over a considerable time interval. Instead, we concentrated on
reviewing NSA's design and evaluation process. In addition, we
conducted several of our own tests.

4.1 NSA's Design and Evaluation Process

SKIPJACK was designed using building blocks and techniques that date
back more than forty years. Many of the techniques are related to work
that was evaluated by some of the world's most accomplished and famous
experts in combinatorics and abstract algebra. SKIPJACK's more
immediate heritage dates to around 1980, and its initial design to
1987.

SKIPJACK was designed to be evaluatable, and the design and evaluation
approach was the same as that used with algorithms that protect the
country's most sensitive classified information. The specific structures
included in SKIPJACK have a long evaluation history, and the
cryptographic properties of those structures had many prior years of
intense study before the formal process began in 1987. Thus, an
arsenal of tools and data was available. This arsenal was used by
dozens of adversarial evaluators whose job was to break SKIPJACK. Many
spent at least a full year working on the algorithm. Besides highly
experienced evaluators, SKIPJACK was subjected to cryptanalysis by less
experienced evaluators who were untainted by past approaches. All
known methods of attack were explored, including differential
cryptanalysis. The goal was a design that did not allow a shortcut
attack.

The design underwent a sequence of iterations based on feedback from
the evaluation process. These iterations eliminated properties which,
even though they might not allow successful attack, were related to
properties that could be indicative of vulnerabilities. The head of
the NSA evaluation team confidently concluded, "I believe that SKIPJACK
can only be broken by brute force -- there is no better way."

In summary, SKIPJACK is based on some of NSA's best technology.
Considerable care went into its design and evaluation in accordance
with the care given to algorithms that protect classified data.

4.2 Independent Analysis and Testing

Our own analysis and testing increased our confidence in the strength
of SKIPJACK and its resistance to attack.

4.2.1 Randomness and Correlation Tests

A strong encryption algorithm will behave like a random function of the
key and plaintext, so that it is impossible to determine any of the key
bits or plaintext bits from the ciphertext bits (except by exhaustive
search). We ran two sets of tests aimed at determining whether
SKIPJACK is a good pseudorandom number generator. These tests were
run on a Cray YMP at NSA. The results showed that SKIPJACK behaves
like a random function and that ciphertext bits are not correlated with
either key bits or plaintext bits. Appendix A gives more details.
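
As a small illustration of what such a test looks like, the Python
sketch below runs one chi-square trial of the kind described in
Appendix A (Test 1), using a stand-in keyed function since SKIPJACK
itself is classified. The critical value 310.5 is the 99% point of the
chi-square distribution with 255 degrees of freedom.

    # One chi-square randomness trial on 8 output bits -- a sketch.
    import hashlib, secrets

    def toy_cipher(key: bytes, msg: bytes) -> bytes:
        return hashlib.sha256(key + msg).digest()[:8]  # 64-bit stand-in

    def chi_square_trial(key: bytes, n: int = 10_000) -> float:
        counts = [0] * 256
        for _ in range(n):
            out = toy_cipher(key, secrets.token_bytes(8))
            counts[out[0]] += 1        # mask = 8 fixed output bits
        expected = n / 256
        return sum((c - expected) ** 2 / expected for c in counts)

    stat = chi_square_trial(secrets.token_bytes(10))
    # A truly random function exceeds 310.5 in only about 1% of trials.
    print(stat, "fail" if stat > 310.5 else "pass")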

4.2.2 Differential Cryptanalysis

Differential cryptanalysis is a powerful method of attack that exploits
structural properties in an encryption algorithm. The method involves
analyzing the structure of the algorithm in order to determine the
effect of particular differences in plaintext pairs on the differences
of their corresponding ciphertext pairs, where the differences are
represented by the exclusive-or of the pair. If it is possible to
exploit these differential effects in order to determine a key in less
time than with exhaustive search, an encryption algorithm is said to be
susceptible to differential cryptanalysis. However, an actual attack
using differential cryptanalysis may require substantially more chosen
plaintext than can be practically acquired.
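
The "differential effects" in question are usually tabulated. The
sketch below builds a difference distribution table for a small 4-bit
S-box (one row of DES's S1, standing in for SKIPJACK's classified
internals): entry [dx][dy] counts inputs x with S(x) ^ S(x ^ dx) == dy,
and unusually large entries are exactly what a differential attack
exploits.

    # Difference distribution table for a toy 4-bit S-box -- a sketch.
    SBOX = [14, 4, 13, 1, 2, 15, 11, 8, 3, 10, 6, 12, 5, 9, 0, 7]

    table = [[0] * 16 for _ in range(16)]
    for dx in range(16):
        for x in range(16):
            table[dx][SBOX[x] ^ SBOX[x ^ dx]] += 1

    # Ignore the trivial dx = 0 row; the worst-case count measures how
    # far the S-box is from the uniform behavior a designer wants.
    print(max(max(row) for row in table[1:]))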

We examined the internal structure of SKIPJACK to determine its
susceptibility to differential cryptanalysis. We concluded it was not
possible to perform an attack based on differential cryptanalysis in
less time than with exhaustive search.

4.2.3 Weak Key Test

Some algorithms have "weak keys" that might permit a shortcut
solution. DES has a few weak keys, which follow from a pattern of
symmetry in the algorithm. We saw no pattern of symmetry in the
SKIPJACK algorithm which could lead to weak keys. We also
experimentally tested the all "0" key (all 80 bits are "0") and the all
"1" key to see if they were weak and found they were not.

4.2.4 Symmetry Under Complementation Test

The DES satisfies the property that for a given plaintext-ciphertext
pair and associated key, encryption of the one's complement of the
plaintext with the one's complement of the key yields the one's
complement of the ciphertext. This "complementation property" shortens
an attack by exhaustive search by a factor of two since half the keys
can be tested by computing complements in lieu of performing a more
costly encryption. We tested SKIPJACK for this property and found that
it did not hold.
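
To see why such a property matters, the toy 8-bit Feistel cipher below
(our construction, not SKIPJACK or DES) exhibits the complementation
property: because the key enters each round only through an XOR, as in
DES, complementing both key and plaintext complements the ciphertext,
so an exhaustive search can test keys in complement pairs and halve its
work.

    # Toy Feistel cipher with the DES-style complementation property.
    MASK = 0xF

    def f(x: int) -> int:                # arbitrary nonlinear round function
        return (x * x + 3) & MASK

    def encrypt(key: int, m: int, rounds: int = 4) -> int:
        L, R = m >> 4, m & MASK          # two 4-bit halves
        for i in range(rounds):
            k = (key >> (4 * i)) & MASK  # round keys are nibbles of the key
            L, R = R, L ^ f(R ^ k)       # f sees R ^ k: complements cancel
        return (L << 4) | R

    m, key = 0x5A, 0xB3C1
    c = encrypt(key, m)
    # Complementing key and plaintext complements the ciphertext:
    assert encrypt(key ^ 0xFFFF, m ^ 0xFF) == c ^ 0xFF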

4.2.5 Comparison with Classified Algorithms

We compared the structure of SKIPJACK to that of NSA Type I algorithms
used in current and near-future devices designed to protect classified
data. This analysis was conducted with the close assistance of the
cryptographer who developed SKIPJACK and included an in-depth
discussion of design rationale for all of the algorithms involved.
Based on this comparative, structural analysis of SKIPJACK against
these other algorithms, and a detailed discussion of the similarities
and differences between these algorithms, our confidence in the basic
soundness of SKIPJACK was further increased.

Conclusion 2: There is no significant risk that SKIPJACK can be broken
through a shortcut method of attack.


5. Secrecy of the Algorithm

The SKIPJACK algorithm is sensitive for several reasons. Disclosure of
the algorithm would permit the construction of devices that fail to
properly implement the LEAF, while still interoperating with legitimate
SKIPJACK devices. Such devices would provide high-quality
cryptographic security without preserving the law enforcement access
capability that distinguishes this cryptographic initiative.
Additionally, the SKIPJACK algorithm is classified SECRET NOT
RELEASABLE TO FOREIGN NATIONALS. This classification reflects the high
quality of the algorithm, i.e., it incorporates design techniques that
are representative of algorithms used to protect classified
information. Disclosure of the algorithm would permit analysis that
could result in discovery of these classified design techniques, and
this would be detrimental to national security.

However, while full exposure of the internal details of SKIPJACK would
jeopardize law enforcement and national security objectives, it would
not jeopardize the security of encrypted communications. This is
because a shortcut attack is not feasible even with full knowledge of
the algorithm. Indeed, our analysis of the susceptibility of SKIPJACK
to a brute-force or shortcut attack was based on the assumption that
the algorithm was known.

Conclusion 3: While the internal structure of SKIPJACK must be
classified in order to protect law enforcement and national security
objectives, the strength of SKIPJACK against a cryptanalytic attack
does not depend on the secrecy of the algorithm.

++++++++++++++++++++++
Appendix in LaTeX
++++++++++++++++++++++
\documentstyle{article}
\textheight 8.25in
\topmargin -.25in
\textwidth 6.5in
\oddsidemargin 0in
\begin{document}
\parskip .25in
\large
\raggedright
\setcounter{page}{8}
\centerline{\bf Appendix A}

{\bf A.1 Cycle Structure Tests}

The first set of tests examined the cycle structure of SKIPJACK. Fix
a set of keys, ${\cal K}$, a plaintext, $m$, and a function $h\; : \;
{\cal M} \longrightarrow {\cal K}$, where ${\cal M}$ is the set of all
64 bit messages. Let $f \; : \; {\cal K} \longrightarrow {\cal K}$ be
defined as $f(k) = h ( SJ(k,m))$ (where $SJ(k,m)$ denotes the SKIPJACK
encryption of plaintext $m$ with key $k$). Let $N = |{\cal K}|$. The
expected cycle length of $f$ is $\sqrt{\pi N /8}$. We chose sets of
${\cal K}$ with $N \; = \; 2^{10}, 2^{16}, 2^{24}, 2^{32},
2^{40}, 2^{48}, 2^{56}$. For all of these $N$, the mean of the cycle
lengths computed across all experiments was close to the expected
cycle length (within an expected relative error of $1/\sqrt{j}$ for
$j$ experiments).
We did not do this test with larger sets of keys because of the time
constraints.
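
% A worked check of the formula (our addition): the expected cycle
% lengths in the table below can be reproduced directly.
For example, with $N = 2^{10}$ the formula gives $\sqrt{\pi N / 8} =
\sqrt{128\pi} \approx 20.1$, and with $N = 2^{16}$ it gives
$\sqrt{8192\pi} \approx 160.4$, matching the expected cycle lengths in
the table below.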

\begin{center}
\begin{tabular}{lrrrrr}
$N$ & \# of exps & Mean cycle len & Expec cycle len &
Rel Err & Expec rel err \\
\hline
$2^{10}$ & 5000 & 20.4 & 20.1 & .019 & .014 \\
$2^{16}$ & 3000 & 164.7 & 160.4 & .027 & .018 \\
$2^{24}$ & 2000 & 2576.6 & 2566.8 & .004 & .022 \\
$2^{32}$ & 2000 & 40343.2 & 41068.6 & .018 & .022 \\
$2^{40}$ & 1000 & 646604.9 & 657097.6 & .016 & .032 \\
$2^{48}$ & 10 & 8,980,043 & 10,513,561 & .145 & .316 \\
$2^{56}$ & 1 & 28,767,197 & 168,216,976 & .829 & 1 \\
\end{tabular}
\end{center}

{\bf A.2 Statistical Randomness and Correlation Tests}

The second set of tests examined whether there were any correlations
between the input and output of SKIPJACK, or between a key and the
output. We also looked for nonrandomness in functions of the form
$SJ(k,m) \oplus SJ(k,m \oplus h)$ and functions of the form $SJ(k,m) \oplus
SJ(k \oplus h , m)$ for all $h$ of Hamming weight 1 and 2 and for some
randomly chosen $h$. All results were consistent with these functions
behaving like random functions.

Given a set of $N$ numbers of $k$-bits each, a chi-square test will
test the hypothesis that this set of numbers was drawn (with
replacement) from a uniform distribution on all of the $2^k$, $k$-bit
numbers. We ran the tests using a 99\% confidence level. A truly
random function would pass the test approximately 99\% of the time.
The test is not appropriate when $N/2^k$ is too small, say $\leq 5$.
Since it was infeasible to run the test for $k = 64$, we would pick 8
bit positions, and generate a set of $N= 10,000$ numbers, and run the
test on the $N$ numbers restricted to those 8 bit positions (thus
$k=8$). In some of the tests, we selected the 8 bits from the output
of the function we were testing, and in others, we selected 4 bits
from the input and 4 from the output.

Some of the tests were run on both the encryption and decryption
functions of SKIPJACK. The notation $SJ^{-1}(k,m)$ will be used to
denote the decryption function of SKIPJACK with key $k$ on message
$m$.

{\bf Test 1: Randomness test on output. } In a single test: Fix $k$,
fix mask of 8 output bits, select 10,000 random messages, run
chi-square on the 10,000 outputs restricted to the mask of 8 output
bits. Repeat this single test for 200 different values of $k$ and 50
different masks, for a total of 10,000 chi-square tests. We found
that .87\% of the tests failed the 99\% confidence level chi-square
test. This is within a reasonable experimental error of the expected
value of 1\%. On the decryption function, there were only .64\% of
the tests that failed. This was on a much smaller test set.

\begin{center}
\begin{tabular}{|c|c|c|c|c|}
\hline
\# $k$ & \# masks & function, $f(m)$ & mask & \% failed \\
\hline
200 & 50 & $SJ(k,m)$ & 8 of $f(m)$ & .87 \\
\hline
25 & 50 & $SJ^{-1}(k,m)$ & 8 of $f(m)$ & .64 \\
\hline
\end{tabular}
\end{center}

{\bf Test 2: Correlation test between messages and output.}
Single test: Fix $k$, fix mask of 4 message bits and 4 output bits,
select 10,000 random messages, run chi-square.

\begin{center}
\begin{tabular}{|c|c|c|c|c|}
\hline
\# $k$ & \# masks & function, $f(m)$ & mask & \% failed \\
\hline
200 & 1000 & $SJ(k,m)$ & 4 of $m$, 4 of $f(m)$ & 1.06 \\
\hline
25 & 1000 & $SJ^{-1}(k,m)$ & 4 of $m$, 4 of $f(m)$ & 1.01 \\
\hline
\end{tabular}
\end{center}

{\bf Test 3: Randomness test on the xor of outputs, given a fixed xor of
inputs. }
Single test: Fix $k$, fix mask of 8 output bits, select 10,000 random
messages.
Let ${\cal H}$ be the union of all 64 bit words of Hamming
weight 1 (64 of these), all 64 bit words of Hamming weight 2 (2016 of
these), and some randomly chosen 64 bit words (920 of these).
Repeat this single test for all $h \in {\cal H}$, 50 different masks,
and 4 different values
of $k$.

\begin{center}
\begin{tabular}{|c|c|c|c|c|c|}
\hline
\# $k$ & \# masks & \# $h$ & function, $f(m)$ & mask & \% failed \\
\hline
4 & 50 & 3000 & $SJ(k,m) \oplus SJ(k,m \oplus h)$ & 8 of $f(m)$ & .99 \\
\hline
\end{tabular}
\end{center}


{\bf Test 4: Correlation test between message xors and output xors. }
Single test: Fix $k$, fix mask of 4 bits of $h$ and 4 bits of output,
select 10,000 random $(m,h)$ pairs.

\begin{center}
\begin{tabular}{|c|c|c|c|c|}
\hline
\# $k$ & \# masks & function, $f(m,h)$ & mask & \% failed \\
\hline
200 & 1000 & $SJ(k,m) \oplus SJ(k,m \oplus h)$ & 4 of $h$, 4 of $f(m,h)$
& .99 \\
\hline
25 & 1000 & $SJ^{-1}(k,m) \oplus SJ^{-1}(k,m \oplus h)$ & 4 of $h$, 4 of
$f(m,h)$ & 1.02 \\
\hline
\end{tabular}
\end{center}

{\bf Test 5: Correlation test between messages and output xors.}
Single test: Fix $k$, fix mask of 4 bits of $m$ and 4 bits of output
xor, select 10,000 random messages. Let ${\cal H}$ be the union of all
64 bit words of Hamming weight 1 (64 of these), some of the 64 bit
words of Hamming weight 2 (100 of these), and some randomly chosen 64
bit words (100 of these).

\begin{center}
\begin{tabular}{|c|c|c|c|c|c|}
\hline
\# $k$ & \# masks & \# $h$ & function, $f(m)$ & mask & \% failed \\
\hline
2 & 1000 & 264 & $SJ(k,m) \oplus SJ(k,m \oplus h)$ & 4 of $m$, 4 of $f(m)$
& .99 \\
\hline
\end{tabular}
\end{center}

{\bf Test 6: Correlation test between keys and output.}
Single test: Fix $m$, fix mask of 4 key bits and 4 output bits,
select 10,000 random keys.

\begin{center}
\begin{tabular}{|c|c|c|c|c|}
\hline
\# $m$ & \# masks & function, $f(k)$ & mask & \% failed \\
\hline
200 & 1000 & $SJ(k,m)$ & 4 of $k$, 4 of $f(k)$ & 1.00 \\
\hline
25 & 1000 & $SJ^{-1}(k,m)$ & 4 of $k$, 4 of $f(k)$ & 1.02 \\
\hline
\end{tabular}
\end{center}

{\bf Test 7: Randomness test on the xor of outputs, given a fixed xor of
keys. }
Single test: Fix $m$, fix mask of 8 output bits, select 10,000 random
keys.
Let ${\cal H}$ be the union of all 80 bit words of Hamming
weight 1 (80 of these), all 80 bit words of Hamming weight 2 (3160 of
these), and some randomly chosen 80 bit words (760 of these).
Repeat this single test for all $h \in {\cal H}$, 50 different masks,
and 2 different values
of $m$.

\begin{center}
\begin{tabular}{|c|c|c|c|c|c|}
\hline
\# $m$ & \# masks & \# $h$ & function, $f(k)$ & mask & \% failed \\
\hline
2 & 50 & 4000 & $SJ(k,m) \oplus SJ(k\oplus h,m )$ & 8 of $f(k)$ & .99 \\
\hline
\end{tabular}
\end{center}


{\bf Test 8: Correlation test between key xors and output xors. }
Single test: Fix $m$, fix mask of 4 bits of $h$ and 4 bits of output,
select 10,000 random $(k,h)$ pairs.

\begin{center}
\begin{tabular}{|c|c|c|c|c|}
\hline
\# $m$ & \# masks & function, $f(k,h)$ & mask & \% failed \\
\hline
200 & 1000 & $SJ(k,m) \oplus SJ(k\oplus h,m )$ & 4 of $h$, 4 of $f(k,h)$
& 1.02 \\
\hline
25 & 1000 & $SJ^{-1}(k,m) \oplus SJ^{-1}(k\oplus h,m )$ & 4 of $h$, 4
of $f(k,h)$ & 1.1 \\
\hline
\end{tabular}
\end{center}
\end{document}

------------------------------

End of Computer Underground Digest #5.60
************************************