                             ==Phrack Inc.==

               Volume 0x0c, Issue 0x41, Phile #0x06 of 0x0f

|=-----------------------------------------------------------------------=|
|=--------=[ The only laws on the Internet are assembly and RFCs ]=------=|
|=-----------------------------------------------------------------------=|
|=-------------------=[ By julia@winstonsmith.info ]=--------------------=|
|=-----------------------------------------------------------------------=|

------[ Index

0 - starting point, our world and the Internet
1 - What's an incoherent law?
2 - DEEP FOCUS, AND REAL TARGET: RIPA III
3 - Elettra
4 - THE ANONYMOUS IDENTITY: julia@winstonsmith.info
5 - REFERENCES

------[ 0. starting point, our world and the Internet

2008: during the past ten years the Internet has accrued value in a way
that was not predictable. A lot of players are trying to gain power over
it: the recording and film industries, governments, software vendors.

Eight or ten years ago our feeling for the net was the pleasure of
hacking; now hacking techniques are recycled by security vendors into
security services. Once the release of a crack was proof of value; now it
only represents the risk of getting busted. A lot has changed, and for
some this is the opportunity to reconsider, and eventually change, both
their way of acting and their priorities.
The spirit has changed: can the energies be redirected?
Is there still space for fun, and where?
The game is hard now. The tools in our hands have changed and the threats
are much more real. Hacking is a game, but the risk level now has to be
weighed carefully, to protect ourselves and the people near us, and to
avoid jail or being exploited.

The media industry, political entities and vendors are trying every
possibility to set foot wherever they can get even a minimum of control
over users. This is because control, even partial control, means power.

We can envision how the Internet, its operating systems and its software
operate. We clearly know what is going on, and why things work in one
specific way and not otherwise.

Those who are trying to get their hands on the network and on Internet
users often (and fortunately) do not have the skills necessary to
understand the concepts on which the network is based. For them, the fact
that the Internet society is based on technical rules before moral ones
is somehow the less natural thing to believe.

So it happens that computers and the Internet are subject to legislation,
control and governmental restrictions, whilst the Internet and computers
are by definition the same all over the world, and the Internet is made
by all of its users equally.

Whether the talk is about computers, hard drives, forensic analysis,
censorship or wiretapping, the items are always the same.
We can still understand this, while politicians are on average 50 years
old and have spent more or less 30 of those years on their political
careers. Information technology has only been seriously developed in the
last 15 years, so it is normal that politicians understand nothing of IT
matters.

Other than that, political choices must enjoy the support of the people,
but the people have no better comprehension of IT matters than the
politicians. This explains why no one has yet seen a law that makes real
sense from a technological point of view and still achieves its purposes
(which usually consist of an increased sense of security).

We think that we are the ones who can demonstrate that the Internet
follows different laws. That's because we are the generation born with
the Internet, and we speak its language enthusiastically.
We, as developers, can neither write laws nor directly challenge them,
but we can develop software demonstrating, from a practical point of
view, how wrong these laws are.
It seems to us that this is the most effective and least painful way to
show politicians their mistakes and to give citizens their freedom,
ruled on the Internet by mathematical logic.

As the Hacker Manifesto tells it:

"I made a discovery today. I found a computer. Wait a second, this is
cool. It does what I want it to. If it makes a mistake, it's because I
screwed it up. Not because it doesn't like me... Or feels threatened by
me... Or thinks I'm a smart ass... Or doesn't like teaching and shouldn't
be here..."

Mentor's ideas in 1986, and our ideas since then, are the same for all
Internet users even now. If this freedom of action is disturbing to
someone, it is our duty to remind them that this freedom cannot be
erased.

The incentives to hack have changed, and exposure may have become
uncomfortable... so an anonymous identity rises: julia@winstonsmith.info,
dedicated to the spread of software written to demonstrate the
inconsistency of the laws that try to control and limit users and the
Internet.

------[ 1. What's an incoherent law?

As the tendency to create inadequate laws to control the Internet and its
users increases, we felt the need to question this modus operandi.

The goal is to create software designed to demonstrate that most laws
that restrict and control the Internet are inadequate (unnecessary,
counterproductive and risky).

You, more than any other community, have the chance to demonstrate that
any law trying to:

- regulate the Internet or computer use from a governmental point of view
- control users' communications (by filtering and limiting them)

is not enforceable from a scientific point of view, because such laws are
mere transpositions of real-life laws onto the digital dimension.

In general, our line of action can rely on this logic:
- A law is promulgated (data retention, search profiling, forbidden
  publishing...)
- We analyze two aspects:
  1) the logical structure of the law, in order to understand its bases
  2) the technical implementation of the law
- From (1) we can see and develop ideas the law did not consider
- From (2) we can develop technical solutions
- The news that the law is a failure has to be spread. In fact, our
  knowledge has no impact on politics, the audience or users' awareness
  if it stays in our hands.

One example of law-reversing-and-attack could be:

1) Human side: a state deploys a law that forbids speaking about
   cryptography.
2) Tech side: being the domain owner of a website speaking of
   cryptography lands you in jail.
3) Tech implementation: once the police are notified of the existence of
   a server inside the country's boundaries serving forbidden content, an
   email is sent asking for the pages to be removed; otherwise, in a
   short time, the domain owner will be busted and the server unplugged.

Human countermeasure:

1a) Migrate to a free blog/website outside your country.

Hacker countermeasures:

1a) Use a Tor hidden service, reachable through the Internet with a proxy
    in another country [1].
1b) Use a Freenet website [2].
1c) Post your content to a mailing list through an anonymous remailer [3].
1d) Publish your content via a peer-to-peer network, using a digital
    signature for trusted downloads, with the first node publishing the
    data outside the country.
1e) Use a self-decrypting JavaScript site, able to protect the session
    layer and to cut off crawlers that don't support JavaScript
    (-> open source intelligence too), on some free blog outside your
    country [4].
1f) Use a distributed server like "project R*" from "Autistici/Inventati"
    [5].
1g) Use one of the other infinite solutions, because we should move
    among the RFCs, spreading software automatically, and always find new
    ways to bypass the letter of the law.

For mere users the Internet is only "the web", whilst we have more
possibilities. Yet we fail to keep our servers online, or to publish our
information without problems, because we are not using the best solutions
in strong encryption and network distribution.

------[ 2. DEEP FOCUS, AND REAL TARGET: RIPA III

RIPA Part III (investigation of electronic data protected by encryption -
power to require disclosure [6]).

In practical terms, it gives a UK investigator the power to demand the
password protecting an encrypted file: in case of refusal, or of
inability to produce the password, the user is punishable with up to two
years' imprisonment.

Compared with real life, it is the equivalent of a law requiring a
suspect to open his safe for investigators. But computers and the
Internet move on other schemes.
In addition, this law specifically makes the UK less secure.
Let's see why:

- A person possessing an encrypted archive containing secrets that would
  incur a punishment of more than two years will accept the two years'
  imprisonment instead of revealing the password.

- A person whose encrypted archive contains no incriminating secrets,
  only sensitive political or personal information, will hand it over to
  the police (or, more often, to the private consultants who actually
  carry out the forensic examination).

- Theoretical security is not achieved by delegating the power of control
  to an institution, because if that institution is corrupted it becomes
  the gateway for every sort of abuse. Theoretical security can be
  achieved by preventing future crime, not by applying controls in order
  to act as a deterrent.

Following the logic presented in point (2) of the previous section, a
safer way around RIPA III is to use a more sophisticated form of
protection.

Encryption models usually define actors (Alice & Bob) and scenarios (with
Mallory, Eve and the family of attackers). The encryption model that
hides both the data and the very presence of encrypted data is
steganography [7].
In our scenario it is enough that the existence of hidden data cannot be
demonstrated, and cryptography offering "plausible deniability" is our
solution.

Deniable cryptography allows a user to protect his data by plausibly
denying the existence of hidden data inside an encrypted file. This kind
of protection is very useful when the user is compelled to give up the
password by violent or intimidating methods [8]. Different forms of
deniable cryptography are currently in use (a toy sketch of the last one
follows the list):

- TrueCrypt [9] implements full disk encryption. The image of a second
  encrypted disk is hidden inside the free sectors of a container
  filesystem. Since a TrueCrypt disk is first filled with random data, it
  is not possible to differentiate between free sectors holding random
  data and sectors holding the encrypted data of the hidden volume. As a
  consequence, the TrueCrypt user can plausibly deny the existence of the
  hidden disk.
- OTR (Off-the-Record messaging) [10] is a cryptographic protocol that
  provides strong encryption for instant messaging. After authenticating
  the peer (via key fingerprint comparison) it protects messages with
  shared-key MACs instead of digital signatures. Since both parties can
  compute the MAC, the sender of a message can plausibly deny having sent
  it: the other party could have forged it.
- 2c2/4c [11]: takes two input files and generates one output.
  Depending on the password used, it decrypts one file or the other.

The goal of deniable cryptography is to deny the very existence of a
piece of information. Steganography has the same goal (steganalysis is
effective only after the existence of hidden data has been proven), but
it is more ambitious in hiding the data from the adversary, because it
carefully chooses a container that prevents the analyst from realizing
that the container covers hidden information. In the case of an encrypted
file, the attacker already knows it could contain valuable information;
the aim then becomes to deny the existence of the data inside the exposed
container: in such a situation, deniable cryptography has a higher
signal-to-coverdata ratio than steganography.

------[ 3. Elettra

Elettra can generate archives of multiple files where a different file is
extracted depending on the password provided.
This is because the password used for encryption is not only the
"information required to decrypt" one file in the archive, it is also the
"information required to find" that file in the archive.
Every file is encrypted with its own password. Every password unlocks a
single file. Since Elettra can add random padding to an archive, it is
impossible to determine how many files are contained in it.

Plausible deniability consists in allowing the user to deny the existence
of every file except the one he revealed the password to.

Elettra bases its security on mathematical principles derived from
reverse-engineering RIPA and its possible interpretations.

Elettra is a command line program developed for POSIX systems (tested
under Linux, Cygwin and MacOSX). A GUI wrapper written in wxWidgets has
also been developed, and both programs, with their gpg signatures, are
available at:

https://www.winstonsmith.info/julia/elettra

Despite the fact that the GUI was coded in a tenth of the time spent on
Elettra, it helps a dramatically wider range of users to understand and
use the program. Usability and ease of distribution have rarely been
goals for hackers, but this time we want to highlight and spread a way of
action/reaction made possible by open source technologies and by a
network able to communicate content quickly. The GUI is a necessary
compromise between features and usability :)

How to use Elettra:

user@linz:~/elettra/src/build$ ./elettra
./elettra by julia@winstonsmith.info, http://www.winstonsmith.info/julia
You should improve the quality of life, using privacy enhancing technology!
./elettra encrypt outputfile [size increment]% plainfile[::password]
./elettra decrypt cipherfile [password] [output directory]
./elettra checkpass password(s)
./elettra example (show examples of use)
 - passwords, if not available, is ask with echo off

Elettra in encrypt mode:

user@linz:/tmp$ ./elettra encrypt output 10% file1::passwd1 file2::passwd2
user@linz:/tmp$ ls -l file1 file2 output
-rw-r--r-- 1 user user 7132 Jan 15 18:35 file1
-rw-r--r-- 1 user user 36287 Jan 15 18:35 file2
-rw-r--r-- 1 user user 29027 Jan 17 10:35 output

Further generations of the /tmp/output file, with the same files and
passwords:

-rw-r--r-- 1 user user 30744 Jan 17 10:36 output
-rw-r--r-- 1 user user 32018 Jan 17 10:36 output
-rw-r--r-- 1 user user 29533 Jan 17 10:36 output

One goal of the algorithm is that the outputs differ even for the same
input. In practice you can keep the smallest output.

Here are some outputs with a padding of 100%:

-rw-r--r-- 1 user user 65198 Jan 17 11:43 output
-rw-r--r-- 1 user user 54336 Jan 17 11:43 output
-rw-r--r-- 1 user user 57579 Jan 17 11:43 output
-rw-r--r-- 1 user user 64938 Jan 17 11:43 output
-rw-r--r-- 1 user user 67284 Jan 17 11:43 output
-rw-r--r-- 1 user user 29219 Jan 17 11:43 output
-rw-r--r-- 1 user user 48946 Jan 17 11:43 output
-rw-r--r-- 1 user user 37260 Jan 17 11:43 output

and some with 1000%:

-rw-r--r-- 1 user user 247351 Jan 17 11:43 output
-rw-r--r-- 1 user user 109079 Jan 17 11:43 output
-rw-r--r-- 1 user user 303188 Jan 17 11:43 output
-rw-r--r-- 1 user user 301261 Jan 17 11:44 output
-rw-r--r-- 1 user user 290419 Jan 17 11:44 output
-rw-r--r-- 1 user user 288720 Jan 17 11:48 output
-rw-r--r-- 1 user user 114376 Jan 17 11:48 output
-rw-r--r-- 1 user user 169173 Jan 17 11:48 output
-rw-r--r-- 1 user user 197720 Jan 17 11:48 output
-rw-r--r-- 1 user user 114376 Jan 17 11:48 output
-rw-r--r-- 1 user user 266452 Jan 17 11:48 output

The third argument is the filename of the archive that is going to be
created.
The fourth argument (optional) is the amount of random padding that will
be inserted at the beginning or appended at the end of the compressed
archive.

Padding can vary between 10% and 1000%.

How encryption works in Elettra:

Elettra has five commands: encrypt, decrypt, checkpass, help and example.

It is executed as: elettra command [args]

We want to encrypt the files /tmp/ls-manpage and /tmp/ps-manpage.
Two files = two passwords; we use "weirdness" and "foxnewsshower", linked
in this order:
ls-manpage (weirdness)
ps-manpage (foxnewsshower)

$ ./elettra encrypt /dev/shm/output 15% /tmp/ls-manpage::weirdness \
    /tmp/ps-manpage::foxnewsshower

The sizes of our source files are:

$ ls -l /tmp/ls-manpage /tmp/ps-manpage
-rw-r--r-- 1 user user 7132 Jan 8 05:57 /tmp/ls-manpage
-rw-r--r-- 1 user user 36287 Jan 8 05:57 /tmp/ps-manpage

The command line specifies 15% of random padding. The required args for
the "encrypt" command are the output file, the source files and the
passwords. If the passwords are not given on the command line, they are
prompted for interactively.

Before encryption, gzip compression is applied; the output file is:

$ ls -l /dev/shm/output
-rw-r--r-- 1 user user 42615 Jan 8 06:13 /dev/shm/output

Now we have an encrypted archive. The Elettra decryption routine takes a
password and, optionally, a destination directory:

$ ./elettra decrypt /dev/shm/output weirdness /dev/shm/
$ ls -l /dev/shm/
-rw-r--r-- 1 user user 7132 Jan 8 06:32 ls-manpage
-rw-r--r-- 1 user user 42615 Jan 8 06:13 output

If you want to check your passwords, use the "checkpass" command:

./elettra checkpass actresss weirdness shoeless
password(s) combinations work ok, with password block of 512 bytes, use it.

If the checkpass or encrypt commands receive a bad password sequence, the
user is notified.

This is how an Elettra output file looks:

RRRRRRRRRRRRRRRRRRRRRRRRRRRRRRKKKKKKKKKKKKKKKKKKKKKKKKCCCCRRRRRRRRRRRRRRR
RRRRRRRRRRRRRRRRRRRRRRKKKKKKKKKKKKKKKKKKKKKKKKCCCCRRRRRRRRRRRRRRRRRRRRRRR
RRRRRRKKKKKKKKKKKKKKKKKKKKKKKKCCCCRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRR
/-- end of initial keyblock, start of data section --/
RRrrrrccccllllddddddddddddddddddddddddddddddddddddddddddddddddddddddddddd
dddddddddddddddddddddddddddddddffffFILE1PPPPPRRRRRRRRRRRRRRRRRRRRRRrrrrcc
ccllllddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddd
dddddddddddddddddddddddffffFILE2PPPPPRRRRRRRRRRRRRRRrrrrccccllllddddddddd
ddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddd
ddddddddffffFILE3PPPPPRRRRRRRRRRRR

K = Key
C = checksum of password
R = random padding (before the entry point)
r = random encrypted bytes (used in AES CBC)
c = checksum of key
l = length of the compressed file
d = compressed data
f = length of filename
FILE1, FILE2, FILE3 = names of the decrypted files
P = encrypted padding to fill the minimum AES-128 block up to its end

Random padding is generated with cyclic SHA256 functions.

K and C sit in the first segment, called the "initial keyblock"; the
other components form the data section of the Elettra archive.
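
As a rough sketch of what padding from "cyclic SHA256 functions" can look
like (our assumption about the construction; Elettra's actual generator
may differ):

import hashlib, os

def sha256_padding(n):
    # Hash a fresh random seed, emit the digest, re-hash, repeat:
    # each cycle yields 32 more bytes of padding.
    state = os.urandom(32)
    out = b""
    while len(out) < n:
        state = hashlib.sha256(state).digest()
        out += state
    return out[:n]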

Here's how the file is created:

encrypt takes OUTPUT PADDING FILE[1..N] PASSWORD[1..N] arguments

for n in [1..N]:
    HASHp_n = hash(PASSWORD_n)

Each of these hashes is used to obtain a unique number dependent on the
password. This unique number (taken modulo the initial keyblock size) is
named the "entry point", so we obtain an entry point for each file in the
archive. It is called an entry point because it identifies the point
where reading will start at decryption time.
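
In other words (a sketch, with SHA-256 standing in for whatever hash
Elettra actually uses):

import hashlib

def entry_point(password, keyblock_size):
    # Password hash reduced modulo the keyblock size: the offset where
    # the 32-byte password struct block for this password will live.
    h = int.from_bytes(hashlib.sha256(password).digest(), "big")
    return h % keyblock_size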

initial keyblock:

 /----------------- keyblock length: 512 + (x * 256) ----------------/

 +--------------+--+----+--+--------+--+----------------------+--+---+
 |              |  |    |  |        |  |                      |  |   |
 |    Random    |  | R  |  |   R    |  |        Random        |  |   |
 |              |  |    |  |        |  |                      |  |   |
 +--------------+--+----+--+--------+--+----------------------+--+---+
                     ^         ^
                     |         |
   entry point ------+         |
                               +-- password block struct (32 byte)

The keyblock length is not hardcoded: its value is chosen adaptively, in
such a way as to avoid password struct block collisions.
A collision occurs if two or more password struct blocks overlap.
Obviously a collision would disrupt the information contained in the
initial keyblock, and as such it has to be avoided.

The password struct blocks are encrypted with PASSWORD[1..N], and each
contains an unsigned int checksum and a 28-byte KEY[1..N].

This means that each password computation identifies an entry point
inside the initial keyblock where the key resides, encrypted in 28 of the
32 bytes of the password struct block. The other data in the initial
keyblock are random.

This is the algorithm shown in pseudocode:

for x in [1..12]:
{
    size = (random_between(1, 12) * 256) + 512
    size = size + (N * 256)

    create keyblock[size]

    for n in [1..N]:
        add_password_hash_to_keyblock(HASHp_n)

    if keyblock[size] has collision
        continue
    else
        use size as good value
}
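
A runnable version of that collision search (our reading of the
pseudocode above; wrap-around at the end of the keyblock is ignored for
brevity):

import random

def find_keyblock_size(password_hashes, tries=12):
    # password_hashes: one hash-derived integer per password (HASHp_n).
    n = len(password_hashes)
    for _ in range(tries):
        size = random.randint(1, 12) * 256 + 512 + n * 256
        # Each password struct block occupies 32 bytes starting at
        # HASHp_n % size; reject this size if any two blocks overlap.
        spans = sorted((h % size, h % size + 32) for h in password_hashes)
        if all(spans[i][1] <= spans[i + 1][0] for i in range(n - 1)):
            return size
    raise RuntimeError("no collision-free keyblock size found")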

When a keyblock size is found, we continue to the next step: the creation
of the data section.

for n in [1..N]:
{
    GZ-FILE-LEN_n = gzip(FILE_n)
}

totalsize = keyblock size
for n in [1..N]:
{
    MIN-EP_n = totalsize
    totalsize = totalsize + (GZ-FILE-LEN_n * PADDING%)
    MAX-EP_n = totalsize
    totalsize = totalsize + GZ-FILE-LEN_n
}

In this step we obtain the total length of the Elettra archive to build,
filling in the two arrays MIN-EP[1..N] and MAX-EP[1..N] used below.

Suppose we encrypt two elements:

        /------ GZ-FILE-LEN 1 -----/        /----- GZ-FILE-LEN 2 ----/
+-------+--------------------------+-------+-------------------------+
|       |                          |       |                         |
|  R1   |         GZ-FILE1         |  R2   |        GZ-FILE2         |
|       |                          |       |                         |
+-------+--------------------------+-------+-------------------------+
^       ^                          ^       ^
|       +--- MAX-EP1      MIN-EP2 -+       |
+- MIN-EP1                                 +-- MAX-EP2

R1 and R2 are two padding blocks. The dimension of these blocks depends
on the padding % value, the size of the compressed file and a bit of
randomness.

The entry point identifying FILE_n's position MUST fall between MIN-EP_n
and MAX-EP_n.
This second entry point is derived from KEY_n, so the keys KEY[1..N] are
chosen in order to fulfill this requirement, following the algorithm
reported below:

for n in [1..N]:
{
    for x in [1..10000]:
    {
        KEY_n = random()
        HASHk_n = HASH(KEY_n)
        EP_n = HASHk_n % totalsize
        if (MIN-EP_n < EP_n < MAX-EP_n)
            return KEY_n
    }
}
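
The same rejection sampling in runnable form (again with SHA-256 and a
28-byte random key standing in for Elettra's actual choices):

import hashlib, os

def pick_key(min_ep, max_ep, totalsize, max_tries=10000):
    # Draw random keys until the key's hash, reduced modulo the archive
    # length, lands inside this file's allowed window (MIN-EP, MAX-EP).
    for _ in range(max_tries):
        key = os.urandom(28)
        h = int.from_bytes(hashlib.sha256(key).digest(), "big")
        if min_ep < h % totalsize < max_ep:
            return key
    raise RuntimeError("no suitable key found")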

    /----- GZ-FILE-LEN 1 -----/         /----- GZ-FILE-LEN 2 -----/
+---+-------------------------+----+---+-------------------------+---+
|   |                         |    |   |                         |   |
| R |        GZ-FILE1         | R  | R |        GZ-FILE2         | R |
| 1 |                         | 2  | 3 |                         | 4 |
+---+-------------------------+----+---+-------------------------+---+
    ^                              ^   ^                             ^
    |                              |   |                             |
    +-- EP1              MIN-EP2 --+   +-- EP2                       + B

R1 is the first block of random data (from MIN-EP1 to EP1).
R2 is the post-file padding (from EP1 + GZ-FILE-LEN_1 to MIN-EP2).

Every file is written between two random sequences of padding blocks.
The length of each padding sequence is random. Every length is plausible
before and after a given file, because the attacker doesn't know the
padding percentage requested at encryption time.

Now the algorithm has retrieved:

1) The length of the initial keyblock
2) The entry points derived from the passwords
3) The keys
4) The padding sections
5) The internal structs used for keeping file names, file lengths and
   checksums, saved just after EP_n, before the file data

Now it is simply a matter of opening a file, writing the initial keyblock
and the data section, and saving.

Decryption sequence for Elettra:

1) The password and the input file are given.
2) The password is hashed and this value (referred to as HASHp) is used
   to search for the initial keyblock size (a sketch of this search
   follows the list). The possible values are obtained with the
   following algorithm:

   for x in [1..80]:
       try_size = 512 + (256 * x)

   When the password is able to decrypt the first 32 bytes pointed to by
   HASHp modulo try_size, and the internal checksum verifies, the initial
   keyblock size is identified.
3) Read the key from the initial keyblock, decrypt it and evaluate its
   hash modulo the total file length.
4) Decrypt the first 32 bytes at that position. If the checksum matches,
   decrypt the file length and the data, decompress it and rename it with
   its original name.
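
A sketch of step 2, the keyblock size search (the cipher, key derivation
and checksum details below are our assumptions, with a toy SHA-256
keystream replacing AES):

import hashlib

def toy_keystream(key, n):
    out, state = b"", key
    while len(out) < n:
        state = hashlib.sha256(state).digest()
        out += state
    return out[:n]

def locate_keyblock(archive, password):
    # Try every admissible keyblock size until the 32 bytes at
    # (HASHp mod try_size) decrypt to a password struct block whose
    # checksum verifies. Struct layout assumed here: 4-byte checksum
    # of the key, followed by the 28-byte key.
    hp = int.from_bytes(hashlib.sha256(password).digest(), "big")
    pkey = hashlib.sha256(b"pw:" + password).digest()
    for x in range(1, 81):
        try_size = 512 + 256 * x
        off = hp % try_size
        blk = archive[off:off + 32]
        pt = bytes(a ^ b for a, b in zip(blk, toy_keystream(pkey, 32)))
        chk, key = pt[:4], pt[4:]
        if chk == hashlib.sha256(key).digest()[:4]:
            return try_size, key
    return None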

Conclusions about the algorithm:

An analyst who has to examine files encrypted with Elettra cannot make
any assumption a priori, since the algorithm aims at behaving randomly in
order to make any output plausible. Every Elettra output file could
contain one or more files, and it is plausible to assume that only one
file has been encrypted, because the padding before and after the
revealed file looks like plausible random padding.

The security of this algorithm is based on the property of encrypted data
to appear fully random under statistical analysis. An attack able to tell
compressed-and-encrypted data apart from random data could break Elettra.

An example of plausible deniability:

The analyst has found a 1.4Mb file encrypted with Elettra. Using the
password provided by the user, he extracts a 2Mb .pdf file.
Then the following is plausible:

A: the user ran Elettra with a 40% proportional padding. The file was 2Mb
and has been compressed down to 1Mb. Before the beginning and after the
end of the file, a total of 400k of random padding has been added.

But it is just as likely that:

B: the user used a 2Mb .pdf as cover file plus a 200k file of secret
data. The compressed size of the .pdf was 1Mb, but the other file could
not be compressed any further, so its size stayed 200k. From the 1Mb
compressed .pdf + 200k of secret file and a roughly 16% proportional
padding, the resulting 1.4Mb file has been created.
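
A quick sanity check of the two stories' arithmetic:

# Case A: one 1 Mb compressed file plus 40% of padding.
assert 1_000_000 + 40 * 1_000_000 // 100 == 1_400_000
# Case B: 1 Mb + 200 kb of compressed data plus roughly 16% of padding.
assert abs(1.16 * (1_000_000 + 200_000) - 1_400_000) < 10_000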

Both cases are plausible: either way the analyst has a password that
extracts a 2Mb .pdf file that compresses to 1Mb. The analyst could then
inspect which part of the file was decrypted, but the position of an
encrypted file in the archive gives no information, since it is plausible
that both before and after each file in the archive there is random data.

Elettra has been developed with these attacks and countermeasures in
mind:

1) Files are encrypted with AES-128 and the random padding is the output
   of SHA functions, so it is mathematically impossible to say whether
   the data is encrypted or just random noise;
2) It is not possible to make assumptions based on the final size of the
   archive (e.g. verifying whether the padding is a whole quantity or a
   fraction is useless, because the proportional increase of dimension
   supplied by the user is not used as-is: a new value is derived from
   it);
3) Checksums used to verify the integrity of passwords are implemented,
   and are checked before the decryption of files;
4) In algorithms that work in CBC mode, the first bytes are initialized
   with random data to make the cryptography stronger;
5) The probability distribution of the random data is the same at the
   beginning and at the end of the encrypted file;
6) The minimum password length is 6 bytes;
7) Disclosing a password or a key does not give the attacker any
   information useful for further attacks.

Elettra counts more or less 1600 lines of source code. Another coder
could have found other ways to accomplish the same task, maybe even less
complex than the one presented in this article, which requires
remembering multiple different passwords. In this kind of program there
is always room for improvement. The important thing to think about is
that in a couple of weeks somebody could develop something unforeseen and
unexpected by the laws, and still perfectly legal.

------[ 4. THE ANONYMOUS IDENTITY: julia@winstonsmith.info

The name Julia comes from the novel "1984", as does the name of the whole
Winston Smith Project. Julia shows Winston the way, the tools and the
motives for freedom. Thus this Julia, just when the Internet is fighting
for its freedom, wants to be the one who demonstrates how the laws
written by politicians are incoherent, inconsistent, unnecessary.

Spreading techniques aimed at the protection of human rights on the
Internet has a goal unrelated to personal visibility. On the contrary, in
some cases personal safety can be endangered by the spreading of a
technique. Thus, other ways of spreading have to be evaluated.

We created an anonymous group identity to have a single reference point
(a name, a keyword, an ideal). This experiment brings us some advantages:

- Visibility is obtained as a collective effort, and can survive any
  single piece of work
- Coders minimize legal and personal risks
- Both individual rivalry and the preservation of one's identity inside
  the group are diminished
- At the moment there is no central blog or web site, as centralizing
  means being exposed to the risk of being censored, attacked, or of
  drawing unneeded attention
- The collective identity identifies itself through a digital signature,
  and media can be distributed using p2p, web sites, blogs, usenet.
  The more choices we have, the more options we can consider.
- We are a network made up of people around the world with the skills,
  the knowledge and the ability to deeply understand the problems related
  to our countries.

Those who want to team up with julia@winstonsmith.info can write an email
message (we strongly suggest using an anonymous remailer, or Tor and a
throwaway email address).

The anonymous identity will not have a web site to express itself;
Elettra is served from the URL
http://www.winstonsmith.info/julia/elettra.
Please note that this is done just for convenience, and implies no
relationship with the anonymous identity.

The only way to verify the authenticity of the source is the digital
signature on the media, using the following key:

-----BEGIN PGP PUBLIC KEY BLOCK-----
Version: GnuPG v1.4.7 (Darwin)

mQGiBEdqIL0RBAChcjI1XSCY6uBj8tt822t3QAIrbUgIL1f+fknclPHPQqjyv+DI
H793WaP2TlJ0mPNJqK2D8pyhO1l8MMzZIzNq+86zblogLklYUo68LbznUPJYNl0f
5Idg6DoNHO7JyXxU1aKq15sLD92izRX5g6Jx7V14DTP/gIB+vZjtcykBZwCgmqC1
YZv/KKVtoSyX/QR0YdJk5ecEAJPurJEm82wshma7RxuOL5UDBhRR4WUBquYa5L35
rTeswSZ/5MFAX4G3VWNb28RZMcDKrd2XIbPA/NI8uVNPEmtmdrF4bA7IGYYPmwuz
SsL3MN0YcDdh8slrqNBuBFNsH95xm4FQKWc+rPPYvZVSsLBosJz9OXPJJYVh61X8
KDSzA/9bovS6D8e02en5t3XScUSBdU4GCHqqgRLpbfTECSXm2KhA2TtnSQ84lqCL
eKs4i955xmF6vQ3bZIATpohSPBz/CdvVPcwNIffVxAwX4bDJDdkXkvd2prWibBJ4
VSzcNVfyvRgYGbrTjq7Aok1f3d/GCQz0oMzGLhs8ZY0xkRNJD7Q0anVsaWEgKGFu
b255bW91cyBpZGVudGl0eSkgPGp1bGlhQHdpbnN0b25zbWl0aC5pbmZvPohgBBMR
AgAgBQJHaiC9AhsDBgsJCAcDAgQVAggDBBYCAwECHgECF4AACgkQ83466PEEGF9S
BgCeIrFkSGSUlOYhXZCNcGmmMrB1h1wAnRUL6+VOQ0SxbYTnTpDIMgGwA3byuQL1
BEdqIL0QC6C3T5hpVjgCTHUjbhu/gql/hHQV0u6av5fDGAYZmQPcDRYb5FP76+Kp
6DDLsSDo95DG1STO8QjRNrrz8tOftC6+F4kMxh1KvcyWEeam8GxYMytpQwDN2Nqr
J3tV9Q24Nv4wUd7vHNqBlcJKeWyQGxBzebelBgAOyMb4YsIGEZJgR4F+o1R2jQYW
rcNPp11aJtxyl2dApaHulzEjMCIDNKnJpbi4lLuqGVaht5NMypxsRnclb1Nw87VY
jrhyJNGT4tojm7ERzJjNLUenTgda788ivWIse68t5WHh7BIGMyNiMKMGjI2R81ei
c+M+O/wL59eh6EGv7V3nZz6qB07f8i3fsNgUF1YFrB2nHtEQTWvb62oGZG+OP5Fp
tGyNhH3HN1VhBRg0eGSEZCGFHZU2chGyOPMqynfsf7o8dkNi8+Ydd0eIn3TlH6of
Xy/TqlQAWS5BUhBX7CEpiO1XPh07o2FLyRPQiElSJ3inCh4sOde/vuFp7RMAAwYL
n3i13almOtHB4qz0J4h/J8TgDaHYmAidYRpr9m6LpKysomHNrtj2U0Am2DmjI25H
LvkOECX9x9yp98WGlzOllZA++YnCSpuG4b03VsLPqmD/r/VdcXrli5cs+UB7O8L2
2P19L+RO89+SieDEKHrKbfkkM3w4OJ+5/mfxejNfoRh0/GBJgoWGj+h/dChmvE7O
alCFDJ5q8Q1QyHMNbMuZxfub+TnpINeHkwiMeaFZRcmaBtjb7T3J+EPf0dUtxXBQ
0F8RzypLEI8FLV/SU+pkynCkp2o6wnRVs8Lms6xci1WE1asr+2Xp9vLN4ppIfo6x
reYKegPcFAw21UuBx6c7OKzEwRFB0OUSGS1Mdzt0ekq2j6Axk5WVShsDcdW+SI5N
fKKqSCWSQE9dbekHUXpBkkbI85uJ2F6QOtMFEJGlw5XTAvJyuamVqXyq6SE5AyVL
bQ9bfCtizrCOn3h547m7nm6RQ+3JfnCVjJqB9eFtP6WFIsDKKIhJBBgRAgAJBQJH
aiC9AhsMAAoJEPN+OujxBBhf16YAnRJLQTTY6JiJGDJG4f2JJFUxereAAJ9hXs0P
/yO+HtkGHnfSuwoaRvSQdw==
=+vKf
-----END PGP PUBLIC KEY BLOCK-----

Whoever shares the objectives of the Julia project can contact the
address julia@winstonsmith.info (anonymously or not).

Future objectives are not clear, still.
It would be interesting to put together a list of all the world's laws
and the technologies which render them useless. Somewhere this activity
may be seen as incitement to a misdemeanor; that's why we chose an
anonymous identity and no fixed media to publish material on.

Julia was not born from the will of a single person, but because we felt
other people's need to share and provide information and technologies
that can help build freedom of expression, privacy rights and
technological awareness. Julia is just a landmark; whoever shares the
same aims can pursue them without it.

No vendor and no state, however much we may respect it, can rule the
Internet. Open technologies belong to everybody.

------[ 5. REFERENCES

[1]  http://www.torproject.org/docs/tor-hidden-service.html.en
[2]  http://en.wikipedia.org/wiki/Freenet
[3]  http://www.andrebacard.com/remail.html
[4]  http://pajhome.org.uk/crypt/sda/index.html
[5]  http://www.autistici.org/en/who/rplan/index.html
     http://dev.autistici.org/orangebook/
[6]  http://www.opsi.gov.uk/acts/acts2000/ukpga_20000023_en_8
[7]  http://www.cl.cam.ac.uk/~rja14/Papers/jsac98-limsteg.pdf
[8]  http://en.wikipedia.org/wiki/Rubber-hose_cryptanalysis
[9]  http://en.wikipedia.org/wiki/TrueCrypt
[10] http://en.wikipedia.org/wiki/Off-the-Record_Messaging
[11] http://lcamtuf.coredump.cx/soft/2c2.tgz