[tor-talk] Leave Your Cellphone at Home
Interview with Jacob Appelbaum
From OCCUPY Gazette 4, out May 1.
Earlier this year in Wired, writer and intelligence expert James Bamford described the National Security Agency's plans for the Utah Data Center. A nondescript name, but it has another: the First Intelligence Community Comprehensive National Cyber-security Initiative Data Center. The $2 billion facility, scheduled to open in September 2013, will be used to intercept, decipher, analyze, and store the agency's intercepted communications -- everything from emails, cell phone calls, Google searches, and Tweets, to retail transactions. How will all this data be stored? Imagine, if you can, 100,000 square feet filled with row upon row of servers, stacked neatly on racks. Bamford projects that its processing capacity may aspire to yottabytes, or 10^24 bytes, a scale for which no neologism of higher magnitude has yet been coined.
To store the data, the NSA must first collect it, and here Bamford relies on a man named William Binney, a former NSA crypto-mathematician, as his main source. For the first time since leaving the NSA in 2001, Binney went on the record to discuss Stellar Wind, which we all know by now as the warrantless wiretapping program, first approved by George Bush after the 2001 attacks on the twin towers. The program allowed the NSA to bypass the Foreign Intelligence Surveillance Court, in charge of authorizing eavesdropping on domestic targets, permitting the wholesale monitoring of millions of American phone calls and emails. In his thirty years at the NSA, Binney helped to engineer its automated system of networked data collection which, until 2001, was exclusively directed at foreign targets. Binney left when the organization started to use this same technology to spy on American citizens. He tells of secret electronic monitoring rooms in major US telecom facilities, controlled by the NSA, and powered by complex software programs examining Internet traffic as it passes through fiber-optic cables. (At a local event last week, Binney circulated a list of possible interception points, including 811 10th Avenue, between 53rd & 54th St., which houses the largest New York exchange of AT&T Long Lines.) He tells of software, created by a company called Narus, that parses US data sources: any communication arousing suspicion is automatically copied and sent to the NSA. Once a name enters the Narus database, all phone calls, emails, and other communications are automatically routed to the NSA's recorders.
The NSA wasn't the only intelligence-gathering agency to have its domestic surveillance powers expanded in the wake of September 11th. The USA PATRIOT Act, for instance, allows the FBI to spy on US citizens without demonstrating probable cause that its targets are engaged in criminal activities. Under Section 215 of the Act, the now infamous National Security Letters -- which formerly required that the information being sought pertain to a foreign power or agent of a foreign power -- can compel the disclosure of sensitive information held by banks, credit companies, telephone carriers, and Internet Service Providers, among many others, about US citizens. The recipient of an NSL is typically gagged from disclosing the fact or nature of the request.
It's no secret that, whereas the Fourth Amendment protects against unreasonable search and seizure, concerns over "national security" have occasioned its disregard and the violation of the privacy rights of even the most ordinary citizens. Activists have all the more reason to worry, repeatedly turning up as the subjects of terrorist investigations. For instance, in 2006 the ACLU revealed that the Pentagon was secretly conducting surveillance of protest activities, antiwar organizations, and groups opposed to military recruitment policies, including Quakers and student organizations. Relying on sources from the Department of Homeland Security, local police departments, and FBI Joint Terrorism Task Forces, the Pentagon collected, stored, and shared this data through the Threat and Local Observation Notice (TALON) database, designed to track terrorist threats. Or take Scott Crow, a self-described anarchist and veteran organizer in the global justice movement, who, as the New York Times reported last year, is one of dozens of political activists across the country to have come under scrutiny from the FBI's increased counterterrorism operations. The FBI set up a video camera outside his house, monitored guests as they came and went, tracked his emails and phone conversations, and picked through his trash to identify his bank and mortgage companies, presumably to send them subpoenas. Others to have been investigated include animal rights activists in Virginia and liberal Roman Catholics in Nebraska. When, in 2008, President Obama took the reins from George W. Bush, there was an expectation that much, or at least some, of this activity would be curbed. Yet, as Bamford's article attests, the government's monitoring and collection of our digital data remain steadfast.
When the Occupy protests started in mid-September of last year, I relied on data-generating technologies more than I ever had before. Within a few weeks I had joined multiple OWS-related listservs; I'd started following Twitter with unprecedented commitment; I spent more hours on Facebook than I care to acknowledge. I doubt I am the only one. At the same time, there was a widespread sense of precaution -- just because we were engaging in legal activities, covered by our First Amendment rights, no one, it seemed, should presume herself exempt from the possibility of surveillance. Sensitive conversations took place in loud bars, never over email. Text messages were presumed unsafe. In meetings, cell phone batteries were removed on occasion. Nevertheless, it was easy to feel unimportant (why would anyone watch me?) and equally easy to let standards relax -- especially when it meant reclaiming conveniences that, once enjoyed, were difficult to give up. Leaving a trail of potentially incriminating digital data seemed inevitable. But how bad could it really be? And was there no way to use these same tools while safeguarding our privacy?
In late April, I sat down with the independent security researcher, hacker, and privacy advocate Jacob Appelbaum, who knows a thing or two about the surveillance state. Appelbaum is one of the key members of the Tor project, which relies on a worldwide volunteer network of servers to reroute Internet traffic across a set of encrypted relays. Doing so conceals a user's location, and protects her from a common form of network surveillance known as traffic analysis, used to infer who is talking to whom over a public network. Tor is both free (as in freedom) and free of charge. Appelbaum is also the only known American member of the international not-for-profit WikiLeaks.
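Tor's layered relay design can be illustrated with a toy sketch: the client wraps a message in one layer of encryption per relay, and each relay peels off exactly one layer, so no single relay sees both who sent the message and what it says. Everything here -- the XOR "cipher," the key names, the three-hop list -- is invented for illustration and bears no resemblance to Tor's real cryptography.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream derived from SHA-256(key || counter). NOT a real cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(data: bytes, key: bytes) -> bytes:
    # XOR with the keystream; applying the same key twice undoes the layer.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The client encrypts for the exit relay first and the entry relay last.
relays = [b"entry-key", b"middle-key", b"exit-key"]
message = b"hello from an anonymous client"

onion = message
for key in reversed(relays):
    onion = xor_layer(onion, key)

# Each relay peels exactly one layer; only after the last hop is the
# plaintext recovered.
for key in relays:
    onion = xor_layer(onion, key)
assert onion == message
```

The point of the sketch is the ordering: the entry relay learns your address but sees only ciphertext, while the exit relay sees the traffic but not its origin.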
Resnick: The recent article in Wired describes where and how the NSA plans to store its share of collected data. But as the article explains, the Utah facility will have another important function: cryptanalysis, or code-breaking, as much of the data cycling through will be heavily encrypted. It also suggests that the Advanced Encryption Standard (AES), expected to remain durable for at least another decade, may be cracked by the NSA in a much shorter time if they've built a secret computer that is considerably faster than any of the machines we know about. But more to the point -- is our encrypted data safe?
Appelbaum: Some of it is as safe as we think it can be, and some of it is not safe at all. The number one rule of "signals intelligence" is to look for plain text, or signaling information -- who is talking to whom. For instance, you and I have been emailing, and that information, that metadata, isn't encrypted, even if the contents of our messages are. This "social graph" information is worth more than the content. So, if you use SSL encryption to talk to the OWS server, for example, great, they don't know what you're saying. Maybe. Let's assume the crypto is perfect. They see that you're in a discussion on the site, they see that Bob is in a discussion, and they see that Emma is in a discussion. So what happens? They see an archive of the website, maybe they see that there were messages posted, and they see that the timing of the messages correlates to the time you were all browsing there. They don't need to break the crypto to know what was said and who said it.
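The timing correlation Appelbaum describes can be sketched in a few lines: given nothing but connection timestamps (no message content at all), an observer can cluster the users who were active in the same window and infer who was in the discussion together. The log entries below are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical connection log: (user, time seen connecting to the server).
# No content is needed -- timestamps alone suggest who was talking together.
log = [
    ("you",  datetime(2012, 4, 20, 21, 0, 5)),
    ("bob",  datetime(2012, 4, 20, 21, 0, 40)),
    ("emma", datetime(2012, 4, 20, 21, 1, 10)),
    ("dana", datetime(2012, 4, 21, 9, 30, 0)),
]

def correlated_users(log, window=timedelta(minutes=5)):
    """Group users whose connections fall within the same time window."""
    events = sorted(log, key=lambda e: e[1])
    groups, current = [], [events[0]]
    for event in events[1:]:
        if event[1] - current[-1][1] <= window:
            current.append(event)
        else:
            groups.append([user for user, _ in current])
            current = [event]
    groups.append([user for user, _ in current])
    return groups

print(correlated_users(log))
# -> [['you', 'bob', 'emma'], ['dana']]
```

Real traffic analysis is far more sophisticated, but the principle is the same: encryption hides the "what," not the "who" or "when."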
Resnick: And this type of surveillance is called --?
Appelbaum: Traffic analysis. It's as if they are sitting outside your house, watching you come and go, as well as the house of every activist you deal with. Except they're doing it electronically. They watch you, they take notes, they infer information from the metadata of your life, which implies what it is that you're doing. They can use it to figure out a cell of people, or a group of people, or whatever they call it in their parlance where activists become terrorists. And it's through identification that they move into specific targeting, which is why it's so important to keep this information safe first.
For example, they see that we're meeting. They know that I have really good operational security. I have no phone. I have no computer. It would be very hard to track me here unless they had me physically followed. But they can still get to me by way of you. They just have to own your phone, or steal your recorder on the way out. The key thing is that good operational security has to be integrated into all of our lives so that observation of what we're doing is much harder. Of course it's not perfect. They can still target us, for instance, by sending us an exploit in our email, or a link in a web browser that compromises each of our computers. But if they have to exploit us directly, that changes things a lot. For one, the NYPD is not going to be writing exploits. They might buy software to break into your computer, but if they make a mistake, we can catch them. But it's impossible to catch them if they're in a building somewhere reading our text messages as they flow by, as they go through the switching center, as they write them down. We want to raise the bar so much that they have to attack us directly, and then in theory the law protects us to some extent.
Resnick: So if I were arrested, and the evidence presented came from a
targeted attack on my computer, and I knew about the attack, I would have
some kind of legal recourse?
Appelbaum: Well, that's an interesting question. What is the legal standard for breaking into someone's computer because they were at a protest? Congratulations, take that to the Supreme Court; you might be able to make some really good law. I think the answer is that it's a nationally newsworthy incident -- nobody knows the cops break into people's computers. When the cops break into someone's house, the Fourth Amendment is super clear about that -- it can't be done without a warrant.
Resnick: In January of last year, it was reported that the records for your Twitter account -- along with those of Julian Assange, Private Bradley Manning, Dutch hacker Rop Gonggrijp, and Icelandic lawmaker Birgitta Jónsdóttir -- were subpoenaed by the US government. What is perhaps most notable in this case is not that the accounts were subpoenaed, but that the orders, usually gagged and carried out in secret, became public knowledge. Twitter contested the secrecy order and won the right to notify you. Several months later, the Wall Street Journal revealed that Google and the Internet service provider Sonic.net had received similar orders to turn over your data.
Appelbaum: Twitter notified me. But as for Google and Sonic.net, I read about it in the Wall Street Journal like everybody else. So now I can talk about it because it was in a public newspaper. Those are "2703(d) administrative subpoenas," and they asked for IP addresses, and the email addresses of the people I communicated with, among other things. The government asserts that it has the right to get that metadata, that "signaling" or relationship information, without a warrant. They get to gag the company, and the company can't fight it, because it's not their data; it's my data, or it's data about me, so they have no Constitutional standing. And the government asserts that I have no expectation of privacy because I willingly disclosed it to a third party. And in fact my Twitter data was given to the government -- no one has really written about that yet. We are still appealing, but we lost the stay, which means Twitter had to disclose the data to the government, and whether or not they can use it is pending appeal. Once they get the data, it's not like it's private or secret -- and even if they can't use it as evidence, they can still use it in their investigations.
Resnick: In January of this year, the Twitter account of writer and OWS protester Malcolm Harris was subpoenaed by the Manhattan District Attorney's Office. I think it's safe to assume these incidents are not anomalies. In which case, is there a way to use social media sites like Twitter without putting our private data at risk? Because these sites can be very useful tools, of course.
Appelbaum: In the case of something like Twitter, you can use Tor on an Android phone -- we have a version of Tor for Android called Orbot -- together with Twitter, and that's essentially the best you're going to do. And even that isn't particularly great. Twitter keeps a list of IP addresses where you've logged in, but if you use Tor, it won't know you are logging in from your phone. It's powerful, but the main problem is that it's kind of complicated to use. On your computer, you can use the Tor Browser, and when you log into Twitter, you're fine, no problem at all -- your IP address will trace back to Tor again. So now when the government asserts that you have no expectation of privacy, you can say: all right, well, I believe I have an expectation of privacy, which is why I use Tor. I signal that. As for the private messaging capability of Twitter -- don't use it for sensitive stuff. Twitter keeps a copy of all its messages.
Resnick: During the perceived wave of Internet activism throughout the 2009 Iranian election protests, a new piece of proprietary software called Haystack received a lot of media attention. Haystack promised Iranian activists tightly encrypted messages, access to censored websites, and the ability to obfuscate Internet traffic. You later tested the software and demonstrated its claims to be false. For those of us who don't have your technical skill set, how can we assess whether a particular tool is safe to use, especially if it's new?
Appelbaum: First, is the source code available? Second, if the claims are just too good to be true, they probably are. There's a thing called snake-oil crypto, or snake-oil software, where the product promises the moon and the sun. When a developer promises that a piece of proprietary software is super secure and only used by important people, it's sketchy. Third, are the people working on it part of a community that has a reputation for accomplishing these things? That's a hard one, but ask someone you know and trust. How would you go on a date with someone? How would you do an action with someone? Transitive trust is just as important in these situations.
Another thing to look at is whether it's centralized or decentralized. For example, Haystack was centralized, whereas Tor is decentralized. Also, how is it sustained? Will it inject ads into your web browser, like AnchorFree, the producer of the Hotspot Shield VPN? Or is it like Riseup.net, whose VPN service monetizes not through your traffic, but through donations and solidarity and mutual aid? If they can inject ads, that means they can inject a back door. That's super sketchy -- if they do that, that's bad news. So you want to be careful about that.
Finally, remember: the truth is like a bullet that pierces through the armor.
Resnick: What should we know about cell phones? It's hard to imagine going to a protest without one. But like all networked technologies, surely they are vulnerable to surveillance?
Appelbaum: Cell phones are tracking devices that make phone calls. It's sad, but it's true. Which means software solutions don't always matter. You can have a secure set of tools on your phone, but it doesn't change the fact that your phone tracks everywhere you go. And the police can potentially push updates onto your phone that backdoor it and allow it to be turned into a microphone remotely, and do other stuff like that. The police can identify everybody at a protest by bringing in a device called an IMSI catcher. It's a fake cell phone tower that can be built for 1,500 bucks. Once it's nearby, everybody's cell phones will automatically jump onto the tower, and since each phone's unique identifier is exposed, all the police have to do is go to the phone company and ask for that person's information.
Resnick: So phones are tracking devices. They can also be used for surreptitious recording. Would taking the battery out disable this kind of monitoring?
Appelbaum: Maybe. But iPhones, for instance, don't have a removable battery; they power off via the power button. So if I wrote a backdoor for the iPhone, it would play an animation that looked just like a black screen. And then, when you pressed the button to turn it back on, it would pretend to boot. Just play two videos.
Resnick: And how easy is it to create something like that?
Appelbaum: There are weaponized toolkits sold by companies like FinFisher
that enable breaking into BlackBerries, Android phones, iPhones, Symbian
devices and other platforms. And with a single click, say, the police can own
a person, and take over her phone.
Resnick: Right -- in November of last year, the Wall Street Journal first reported on this new global market for off-the-shelf surveillance technology, and created a "Surveillance Catalog" on its website, which includes documents obtained from attendees of a secretive surveillance conference held near Washington, D.C. WikiLeaks has also released documents on these companies. The industry has grown from almost nothing to a retail market worth $5 billion per year. And whereas the companies making and selling this gear say it is available only to governments and law enforcement and is intended to catch criminals, critics say the market represents a new sort of arms trade supplying Western governments and repressive nations alike.
Appelbaum: It's scary because [accessing these products is so] easy. But when a company builds a backdoor, and sells it, and says trust us, only good guys will use it -- well, first of all, we don't know how to secure computers, and anybody that says otherwise is full of shit. If Google can get owned, and Boeing can get owned, and Lockheed Martin can get owned, and engineering and communication documents from Marine One can show up on a file-sharing network, is it realistic to assume that perfect security is possible? Knowing this is the case, the right thing is to not build any backdoors. Or to assume these backdoors are all abused, and to bypass them so that the data acquired is very uninteresting. Like encrypted phone calls between two people -- it's true they can wiretap the data, but they'll just get noise.
When Hillary Clinton and the State Department say they want to help people abroad fight repressive governments, they paint Internet freedom as something they can enable with $25 million. Whereas in reality the FBI makes sure that our communications tech isn't secure. This makes it impossible for people like me to help people abroad overthrow their governments, because our government has ensured that all their technology is backdoor ready. In effect, they try to legitimize state surveillance here, and there they try to make it illegitimate. They say, "In over-there-a-stan, surveillance is oppressive. But over here, it's okay; we have a lawful process." (Which is not necessarily a judicial process. For example, Eric Holder and the drones . . . sounds like a band, right?)
Resnick: Okay, so one thing I've heard more than once at meetings when security culture comes up is that . . . well, there's a sense that too much precaution grows into (or comes out of) paranoia, and paranoia breeds mistrust -- and all of it can be paralyzing and lead to a kind of inertia. How would you respond to something like that?
Appelbaum: The people who say that -- if they're not cops, they're feeling unempowered. The first response people have is: whatever, I'm not important. And the second is: they're not watching me, and even if they were, there's nothing they could find, because I'm not doing anything illegal. But the thing is, taking precautions with your communications is like safe sex, in that you have a responsibility to other people to be safe -- your transgressions can fuck other people over. The reality is that by the time you find out, it will be too late. It's not about doing a perfect job; it's about recognizing you have a responsibility to do that job at all, and doing the best job you can manage, without it breaking down your ability to communicate, without it ruining your day, and understanding that sometimes it's not safe to undertake an action, even if other times you would. That's the education component.
So security culture stuff sounds crazy, but the technological capabilities of the police, especially with these toolkits for sale, are vast. And you can thwart that by taking all the phones at a party, putting them in a bag, putting the bag in the freezer, and turning on music in the other room -- true, someone in the meeting might be a snitch, but at least there's no audio recording of you.
Part of informed consent is understanding the risks you are taking as you decide whether to participate in something. That's what makes us free -- the freedom to question what we're willing to do. And of course it's fine to do that. But it's not fine to say: I don't believe there's a risk, you're being paranoid, I'm not a target. When people say that they don't want to take precautions, we need to show them how easy it is to do it. And to insist that not doing it is irresponsible, and most of all, that these measures are effective to a degree, and worth doing for that reason. And it's not about perfection, because perfection is the enemy of "good enough."
I would encourage people to think about the activity they want to engage in, and then say: Hey, this is what I want to do. Work together collaboratively to figure out how to do that safely and securely, but also easily, without needing to give someone a technical education. Because that's a path of madness. And if people aren't willing to change their behaviors a little bit, you just can't work with them. I mean, that's really what it comes down to. If people pretend that they're not being oppressed by the state when they are literally being physically beaten, and forced to give up retinal scans, that's fucking ridiculous. We have to take drastic measures for some of these things.
The FBI has this big fear that they're going to "go dark," which means that all the ways they currently obtain information will disappear. Well, America started with law enforcement in the dark; once, we were perceived to be innocent until proven guilty. And just because surveillance is expanding, and continues to expand, doesn't mean we shouldn't push back. If you haven't committed a crime, they should have no reason to get that information about you, especially without a warrant.
Resnick: Are there any other tools or advice you would suggest to an
activist, or anyone for that matter?
Appelbaum: Well, it's important to consider the whole picture of all the electronic devices that we have. First, you should use Tor and the Tor Browser for web browsing. Know that your home Internet connection is probably not safe, particularly if it's tied to your name. If you use a Mac or Windows operating system, be especially careful. For instance, there's a program called Evilgrade that makes it easy for attackers to install a backdoor on a computer by exploiting weaknesses in the auto-update feature of many software programs. So if you have Adobe's PDF reader, and you're downloading and installing the update from Adobe, well, maybe you'll get a little extra thing, and you're owned. And the cops have a different but better version of that software. Which is part of why I encourage people to use Ubuntu or Debian or another Linux distribution instead of proprietary systems like a Mac or whatever. Because there are exploits for everything. If you're in a particularly sensitive situation, use a live bootable CD called Tails -- it gives you a Linux desktop where everything routes over Tor with no configuration. Or, if you're feeling multilingual, host stuff in another country. Open an email account in Sweden, and use Tails to access it. Most important is to know your options. A notepad next to a fireplace is a lot more secure than a computer in some ways, especially a computer with no encryption. You can always throw the notepad in the fireplace, and that's that.
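The Evilgrade-style attack works because many updaters fetch and run code without authenticating it. A minimal countermeasure, sketched below, is to refuse any downloaded update whose digest doesn't match a known-good value obtained out of band (e.g., from the vendor's signed release notes). The file path and expected hash here are hypothetical.

```python
import hashlib
import hmac

def sha256_of(path: str) -> str:
    """Hash a downloaded file in chunks so large files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_update(path: str, expected_hex: str) -> bool:
    """Refuse to install an update whose digest doesn't match the published one."""
    # compare_digest does a constant-time comparison of the two strings.
    return hmac.compare_digest(sha256_of(path), expected_hex)
```

Real update systems go further (signatures, certificate pinning), but even this check defeats the naive "swap the download in transit" attack the interview describes.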
For email, using Riseup.net is good news. The solutions they offer are integrated with Tor as much as possible. They're badass. Because of the way they run the system, I'm pretty sure that the only data they have is encrypted. And I'd like to think that what little unencrypted data they do have, they will fight tooth and nail to protect. Whereas, yes, you can use Tor and Gmail together, but it's not as integrated -- when you sign in, Gmail doesn't ask if you want to route this over Tor. But also, Google inspects your traffic as a method of monetization. I'd rather give Riseup fifty dollars a month for the equivalent service of Gmail, knowing their commitment to privacy, and also knowing that they would tell the cops to go fuck themselves. There's a lot of value in that.
For chatting, use software with Off-the-Record Messaging (OTR) -- not Google's "go off the record," but the actual encryption software -- which allows you to have an end-to-end encrypted conversation. And configure it to work with Tor. You can bootstrap a secure communication channel on top of an insecure one. On a Mac, use Adium -- it comes with OTR, but you still have to turn it on. When you chat with people, click verify and read the fingerprint to each other over the telephone. You want to do this because there could be a "man in the middle" relaying the messages, which means that you are both talking to a third party, and that third party is recording it all.
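The fingerprint check can be sketched as follows: each party derives a short digest of the public key they see and reads it aloud over an independent channel; if a man in the middle has substituted his own key, the spoken fingerprints won't match what the screens display. The key bytes and the 40-hex-digit grouping below are invented for illustration and are not OTR's actual fingerprint format.

```python
import hashlib

def fingerprint(public_key: bytes) -> str:
    """Short, human-readable digest of a key, grouped for reading aloud."""
    digest = hashlib.sha256(public_key).hexdigest()[:40].upper()
    return " ".join(digest[i:i + 8] for i in range(0, 40, 8))

# Toy stand-ins for real OTR public keys.
alice_key = b"alice-public-key-bytes"
mitm_key = b"attacker-public-key-bytes"

# A substituted key produces a different fingerprint, which the
# out-of-band phone comparison would catch.
assert fingerprint(alice_key) != fingerprint(mitm_key)
```

The security of the check rests entirely on the second channel: an attacker who controls the chat connection cannot also fake your voices on the phone.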
As for your cell phone, consider it a tracking device and a monitoring device, and treat it appropriately. Be very careful about using cell phones, and consider especially the patterns you make. If you pull the battery, you've generated an anomaly in your behavior, and perhaps that's when they trigger people to go physically surveil you. Instead, maybe don't turn it off; just leave it at home. Because, as I said earlier, in a world with lots of data retention, our data trails tell a story about us, and even if the story is made of truthful facts, it's not necessarily the truth. On a cell phone, you can install software like OStel, which allows you to make encrypted voice-over-Internet calls, or PrivateGSM -- it's not free, but it's available for BlackBerries, Android phones, iPhones, and so on. Which means that if they want to intercept your communication, they have to break into your phone. It's not perfect. Gibberbot for Android allows you to use Tor and Jabber -- which is like Google Chat -- with OTR automatically configured. You type in your Jabber ID, it routes over Tor, and when you chat with other people, it encrypts the messages end-to-end, so even the Jabber server can't see what's being said. And there are a lot of tools like that to choose from.
Another thing to consider is the mode in which we meet. If we want to edit something collaboratively, there's a program called Etherpad. And there's a social networking application called Crabgrass, hosted at we.riseup.net. It's like a private Facebook. Riseup still has a lot of the data, but it's private by default. So it's secure, short of being hacked, which is possible, or short of some legal process. And if you use it in a Tor browser, and never reveal information about yourself, you're in really good shape. Unlike Facebook, which is like the Stasi, but crowdsourced. And I mean that in the nicest way possible. I once had a Facebook account -- it's fun and a great way to meet people. But it is not safe for political organizing, especially when you're part of a minority, or when you're not part of a minority, but you are part of the disempowered majority.
As a final thought, I'd say just to remember that a big part of this is social behavior and not technology per se. And a big part of it is accepting that while we may live in a dystopian society right now, we don't always have to. That's the tradeoff, right? Because what is OWS working toward? The answer is: something different. And if we want an end to social inequality, the surveillance state is part of what we have to change. If we make it worthless to surveil people, we will have done this. So it needs to be the case that what we do doesn't hang us for what we wish to create.
Image: NSA headquarters, Ft. Meade, MD.