PRIVACY Forum Digest     Friday, 2 April 2004     Volume 13 : Issue 02

               ( http://www.vortex.com/privacy/priv.13.02 )

            Moderated by Lauren Weinstein (lauren@vortex.com)         
              Vortex Technology, Woodland Hills, CA, U.S.A.
                         http://www.vortex.com 
        
                       ===== PRIVACY FORUM =====              

    -------------------------------------------------------------------
                 The PRIVACY Forum is supported in part by      
          the ACM (Association for Computing Machinery) Committee     
             on Computers and Public Policy, and Telos Systems.
                                 - - -
             These organizations do not operate or control the     
          PRIVACY Forum in any manner, and their support does not
           imply agreement on their part with nor responsibility   
        for any materials posted on or related to the PRIVACY Forum.
    -------------------------------------------------------------------


CONTENTS 
        Risks in Google's New "Gmail" Service (Lauren Weinstein)
        Balancing security and privacy by splitting behavior and identity 
           (Kurt D Fenstermacher)


 *** Please include a RELEVANT "Subject:" line on all submissions! ***
            *** Submissions without them may be ignored! ***

-----------------------------------------------------------------------------
The Internet PRIVACY Forum is a moderated digest for the discussion and
analysis of issues relating to the general topic of privacy (both personal
and collective) in the "information age" of the 1990s and beyond.  The
moderator will choose submissions for inclusion based on their relevance and
content.  Submissions will not be routinely acknowledged.

All submissions should be addressed to "privacy@vortex.com" and must have
RELEVANT "Subject:" lines; submissions without appropriate and relevant
"Subject:" lines may be ignored.  Excessive "signatures" on submissions are
subject to editing.  Subscriptions are via an automatic list server system;
for subscription information, please send a message consisting of the word
"help" (quotes not included) in the BODY of a message to:
"privacy-request@vortex.com".  Mailing list problems should be reported to
"list-maint@vortex.com". 

All messages included in this digest represent the views of their
individual authors and all messages submitted must be appropriate to be
distributed and archived without limitations. 

The PRIVACY Forum archive, including all issues of the digest and all
related materials, is available via anonymous FTP from site "ftp.vortex.com",
in the "/privacy" directory.  Use the FTP login "ftp" or "anonymous", and
enter your e-mail address as the password.  The typical "README" and "INDEX"
files are available to guide you through the archive.  PRIVACY Forum
materials may also be obtained automatically via e-mail through the list
server system.  Please follow the instructions above for obtaining the list
server "help" information, which includes details on the "index" and "get"
commands used to access the PRIVACY Forum archive.

All PRIVACY Forum materials are available through the Internet Gopher system
via a gopher server on site "gopher.vortex.com/".  Access to PRIVACY Forum
materials is also available through the Internet World Wide Web (WWW) via
the Vortex Technology WWW server at the URL: "http://www.vortex.com";
full keyword searching of all PRIVACY Forum files is available via
WWW access.
-----------------------------------------------------------------------------

VOLUME 13, ISSUE 02

     Quote for the day:

       "I'm a scientist. 
        All scientists are poor. 
        It's a law."

                -- Jason Eldridge (Jim Hutton)
                   "The Honeymoon Machine" (MGM; 1961)

----------------------------------------------------------------------

Date:    Fri, 02 Apr 2004 07:48:14 PST
From:    Lauren Weinstein <lauren@vortex.com>
Subject: Risks in Google's New "Gmail" Service

Google (or ISPs) getting into the business of routinely scanning users'
e-mail for "interesting" keywords is of staggering import, even if the
reason is "merely" to insert ads (or spam control, for that matter, though
Google's plan to act as a massive long-term e-mail repository ups the risk
ante considerably over e-mail pass-through ISPs).
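
To make concrete what routine keyword scanning means in practice, here is a
purely illustrative Python sketch -- the keywords, ads, and function names
are invented for the example, and this is emphatically not a description of
Google's actual mechanism:

    # Hypothetical keyword-to-ad matching; a real system would be far
    # more elaborate, but the point is that it must read the message.
    AD_KEYWORDS = {
        "mortgage": "Refinance today!",
        "vacation": "Cheap flights, book now",
        "lawyer":   "Free legal consultation",
    }

    def match_ads(message_body):
        """Return ads whose trigger keyword appears in the message text."""
        words = {w.strip(".,!?").lower() for w in message_body.split()}
        return [ad for kw, ad in AD_KEYWORDS.items() if kw in words]

    print(match_ads("We should talk to a lawyer about the mortgage."))
    # -> ['Refinance today!', 'Free legal consultation']

Even a matcher this crude has to read every word of every message; more
sophisticated scanning only deepens that access.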

What would Google's legal responsibilities and actions be if they "stumbled"
across discussions of apparently illegal activity (everything from overdue
library books to adultery to murder...), or terrorism, or illicit
pornography?  Since they've apparently opened the surveillance box, it's
quite possible they'd be legally required to report everything that might
even potentially fall into questionable categories.  

This of course would include all the false alarms that would be generated by
innocent messages that only looked suspicious but really weren't, not to
mention purposely faked messages spiked with likely nasty keywords to try to
upset the system.  Even with the best of motives, do we really want Google
or ISPs becoming the commercial equivalent of Total Information Awareness?
We all want to prevent crime and terrorism, but is the creation of massive
surveillance machines in the guise of free e-mail services the proper way to
do so in our society?

And what of the proprietary information that will inevitably find its way
into Google's scannable e-mail treasure chest?  "Innocent" scanning could
reveal all sorts of goodies.  (I've thought in the past about all those new
product names and future trademarks that first drop into Google's logs when
initial searches are performed...)  Can we trust Google not to abuse this
potentially lucrative power?  For now the answer is probably yes, but market
forces make the future anything but certain.

Don't get me wrong.  I like Google -- a lot.  I think overall they've got a
good attitude, and a superb search engine (though the privacy implications
of their search logs have long been a matter of concern, as I noted).  But I
fear that they have not fully thought through the ramifications of their new
e-mail project, and how it can, even with the best of intentions, be rapidly
turned to the Dark Side.  That risk won't only result from Google's
decisions, but also from actions by government, lawyers, law enforcement,
courts, and even ISPs and Google's competitors.

E-mail is arguably the most sensitive form of Internet communication, and
deserves the highest possible levels of protection.  Mere trust or good
faith aren't enough.  

In the classic (and highly recommended) satirical film "The President's
Analyst," the protagonists gradually come to the realization that every
phone call in the country is being tapped.  The 1967 film has been
prescient in numerous ways, and doesn't seem quite so funny anymore.

Centralized scanning of e-mail (even for ostensibly innocent commercial
purposes), the push for expanded surveillance of conventional and VoIP
telephone systems, and many other moves, together point towards a 
future where all use of telecommunications is monitored through 
close alliances of commercial enterprises and government, and 
where encryption will be banned or tightly controlled.

Even if one assumes completely benign motives on the part of these 
firms and governments today, what of the future?  Will the 
incredibly powerful and pervasive monitoring infrastructures 
now being woven always be in the hands of such trustworthy entities?

History suggests that we have a lot to worry about in these regards.

--Lauren--
Lauren Weinstein
lauren@pfir.org or lauren@vortex.com or lauren@privacyforum.org
Tel: +1 (818) 225-2800
http://www.pfir.org/lauren
Co-Founder, PFIR - People For Internet Responsibility - http://www.pfir.org
Co-Founder, Fact Squad - http://www.factsquad.org
Co-Founder, URIICA - Union for Representative International Internet
                     Cooperation and Analysis - http://www.uriica.org
Moderator, PRIVACY Forum - http://www.vortex.com
Member, ACM Committee on Computers and Public Policy

------------------------------

Date:    Wed, 31 Mar 2004 15:57:24 MST
From:    "Kurt D Fenstermacher" <KurtF@Eller.Arizona.edu>
Subject: Balancing security and privacy by splitting behavior and identity 

Hello,

I have been working with a colleague in public policy (Chris Demchak) on
ideas for balancing security and privacy in an open society. The key idea in
our current thinking is to separate monitoring of behavior from the tracking
of identity. (We've labeled it the BIK framework, for Behavior-Identity
Knowledge.) By enabling law enforcement and intelligence agencies to record
behavior, they can focus on what ultimately concerns them: actions.
Information on identity would be maintained separately and could not be
linked to behavior without cause.

Behavioral databases would have identifiers, but the identifying information
would be encrypted with a private key. Agencies allowed to maintain
behavioral information could petition (I imagine something like a search
warrant) for the key to decrypt identifying information associated with
suspicious behavior.
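
A minimal sketch of how this split might look in code follows.  Purely for
illustration, an escrowed symmetric key (the Python "cryptography" package's
Fernet recipe) stands in for whatever key escrow scheme would actually be
used, the warrant process is reduced to a single boolean flag, and the
database and function names are invented for the example:

    # Illustrative only; the escrow key would in practice be held by a
    # court or other third party, never by the monitoring agency itself.
    from cryptography.fernet import Fernet

    escrow_key = Fernet.generate_key()   # held by the escrow authority
    escrow = Fernet(escrow_key)

    behavior_db = {}   # pseudonymous ID -> list of observed actions
    identity_db = {}   # pseudonymous ID -> encrypted identifying information

    def record(pseudonym, action, identity):
        """Log an action under a pseudonym; store identity only encrypted."""
        behavior_db.setdefault(pseudonym, []).append(action)
        if pseudonym not in identity_db:
            identity_db[pseudonym] = escrow.encrypt(identity.encode())

    def unmask(pseudonym, warrant_granted):
        """Decrypt identity only once the escrow key has been released."""
        if not warrant_granted:
            raise PermissionError("identity cannot be linked without cause")
        return escrow.decrypt(identity_db[pseudonym]).decode()

    record("subj-041", "entered parking garage 22:14", "Jane Doe")
    print(behavior_db["subj-041"])        # behavior is always available
    print(unmask("subj-041", True))       # identity only on a "warrant"

Analysis would run over behavior_db alone; the entries in identity_db are
opaque without the escrowed key, which is exactly the property this proposal
depends on.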

We're writing to ask Privacy Forum readers for feedback on the overall
notion of tracking behavior and identity separately, and for ideas on
cryptographic techniques (e.g., key escrow systems) that would enable the
idea sketched above.  I've included some more detail below from a paper that
we had accepted for an upcoming conference, as well as links to the paper
and the key diagram.

BIK Framework diagram:
http://eller.arizona.edu/~kurtf/writing/BIK-diagram.png

   From the paper:

While both sides of this debate have entrenched themselves, we argue that
the question is not, "How can citizens enjoy total privacy?", but instead,
"In an open society, what is the right balance of security and privacy?"  We
begin by proposing that the usual notion of privacy - the inability of
others to know what we do - confounds two simpler notions: knowing what we
do (behavior) and who we are (identity). By separating behavior and
identity, we propose a compromise that enables effective security policies
while protecting the rights of the individual. 

Fig. 1 ( http://eller.arizona.edu/~kurtf/writing/BIK-diagram.png )
illustrates the separation of knowledge about behavior (the horizontal axis)
and knowledge about identity (the vertical axis). Recent government policies
and many government officials argue that we should value security over
privacy, and this weighting places such policies in the upper right-hand
region, where there is extensive knowledge of individuals' behavior and
identity.  Privacy advocates look to shield individuals from prying eyes and
advocate that knowledge of either behavior or identity is unacceptable. The
debate to date is captured in the diagonal line labeled "Security-privacy
debate line", where advocates try to push along the line toward their
position.

We argue new thinking is needed on both sides and that an optimal balance of
security and privacy lies not on the line of the current debate, but below
it. The preferred policy region (in the lower right of Fig. 1) favors
knowledge of behavior over knowledge of identity. Because it is ultimately
actions that concern security personnel, they can capture the most relevant
information by monitoring behavior. Privacy advocates can be assured that
institutional safeguards will ensure that monitoring organizations cannot
associate an identity with observed behavior without a reasonable suspicion
of a past or future crime. 

While we argue that the default policy should be that security organizations
cannot associate extensive data on identity with deep knowledge of behavior,
there must be some provision for doing so. We propose that security
organizations must meet a minimal threshold to obtain identifying
information. In addition, we argue that because errors are inevitable,
organizations that can join data on identity and behavior must support rapid
procedures for validation and appeal.

As an example, we consider the ubiquitous video cameras that pervade
American life today. By themselves, video cameras enable security personnel
in stores, parking garages, and office buildings to monitor behavior, but a
standard closed-circuit television (CCTV) system does not reveal identity.

However, a face recognition system (see
http://doi.acm.org/10.1145/954339.954342 ) that attempts to match images from
those same video cameras against a database of known faces threatens
anonymity, but does not itself monitor behavior.
During Super Bowl 35 in Tampa, Florida, officials used a face recognition
system to scan the crowd in attendance and identified 19 petty criminals. 

  [ Note that face recognition systems have generally shown extremely poor
    performance when it comes to identifying individuals from large groups
    of random people.  In the Super Bowl case, since nobody was stopped or
    questioned, the "hits" were apparently not really confirmed.  Many other
    tests of such systems have been suspended after dismal results.

                 -- PRIVACY Forum Moderator ]

By linking a CCTV system with a face recognition system, an organization ties
identity and behavior together. While this potentially offers the greatest
value to law enforcement, it is also fraught with the most danger for the
average citizen.

In the following sections, we discuss the conflict in policies recently
implemented in the United States and explore these policies in the context
of our behavior-identity knowledge (BIK) framework.

For more details, the complete paper is at:
http://eller.arizona.edu/~kurtf/writing/Balancing-security-privacy-ISI-2004.pdf

Thanks for your time and we welcome your comments,
Kurt

Kurt D. Fenstermacher          MIS Department
Eller College of Business      Computer Science Department (By courtesy)
  and Public Administration    Univ. of Arizona
1130 E Helen St                Tucson, AZ 85719
Voice: (520) 621-4016          Fax: (520) 621-2433
Email: KurtF@Eller.Arizona.edu Web: http://eller.arizona.edu/~kurtf/

   [ I converted the Roman numerals on the Super Bowl reference
     to avoid triggering brain-damaged spam filters...

     We've hashed (no pun intended) over this identity ground before, and
     the results are always the same.  Systems that allow for the mass
     collection and mining of personal data, and then try to depend on
     somehow "protecting" individual identities -- to put it bluntly -- are
     fool's gold.

     Once the information has been collected, it takes only a failure
     of the encryption/protection algorithms, or even more likely
     a change in policy, to render all of the "protections" moot.
     The assumption that the ability to reconnect data with individual
     identities will only be used responsibly is a tenuous one at best.

     As I mentioned in the previous article about Google's "Gmail," there is
     no guarantee of powers-that-be remaining benign when it comes to these
     systems.  With the flip of a virtual switch these surveillance
     infrastructures could become very personal and totally devoid of even a
     modicum of protections.  

     The only sure way to protect personal data from such abuse is not to
     collect it en masse in the first place.  Does this mean that we choose
     not to use some potential tools against terrorists and criminals?
     Indeed.  It means we make a conscious decision not to create the
     societies envisioned in "1984" -- or "Fahrenheit 451" -- or "Brazil"
     -- all of which were no doubt developed with the best of intentions.

     In those fictional societies, the terrorists really were the winners in
     the police states they helped to create.  We don't want terrorism to
     win in our real world.

     The critical balance between liberties and security cannot be
     maintained through technological trickery, but only through
     dedication to the proposition that we will not allow the values
     of our societies to be effectively destroyed by our own hands,
     in an attempt to stave off our enemies.

         -- PRIVACY Forum Moderator ]

------------------------------

End of PRIVACY Forum Digest 13.02
************************

