
This is new / This is not new


Overview

  • Show the exercise graphic (for participants)
  • Show the census and Clearview articles (for participants)
  • Show the timeline graphic (for participants)
  • Talk about facilitator slides
  • Talk about facilitator handout
  • Talk about answer key for news articles (facilitator first, then participants)

Exercise handout and news articles (for participants)

Jan. 9, 2020, NBC News — “Many Latinos believe a citizenship question will be asked on the 2020 census and are less likely to participate, a national Latino leader told Congress on Thursday. […] Last June, the Supreme Court ruled the administration could not add the question about citizenship to the census. The court found the administration’s justification for adding the question was ‘contrived.’ […] ‘The 2020 census could leave communities across the country undercounted, underrepresented and underfunded,’ Maloney said.”

Jan. 21, 2020, CNET.com — “What if a stranger could snap your picture on the sidewalk, then use an app to quickly discover your name, address and other details? A startup called Clearview AI has made that possible, and its app is being used by hundreds of law enforcement agencies in the US, including the FBI, according to a Saturday report in The New York Times. […] The app, says the Times, works by comparing a photo to a database of more than 3 billion pictures that Clearview says it’s scraped off Facebook, Venmo, YouTube and other sites. It then serves up matches, along with links to the sites where those database photos originally appeared. A name might easily be unearthed, and from there other info could be dug up online. The size of the Clearview database dwarfs others in use by law enforcement. The FBI’s own database, which taps passport and driver’s license photos, is one of the largest, with over 641 million images of US citizens.”

Printable version here


Timeline graphic (for participants)


Printable version here


Slides (for facilitator)

Current slides here (slides 1-10)

TODO


Exercise handout (for facilitator)

Exercise: This is new / This is not new

In what new skin will the old snake come forth?
—Frederick Douglass (Black abolitionist and reformer)

History is not the past. It is the stories we tell about the past. How we tell these stories—triumphantly or self-critically, metaphysically or dialectically—has a lot to do with whether we cut short or advance our evolution as human beings.
—Grace Lee Boggs (Chinese-American activist and revolutionary)

Overview

Imagine you see an alarming headline about surveillance in The New York Times or on CNN. How do you tell apart what’s really new about the news from what’s just old news?

This exercise helps participants work together to recognize the “new skin” (as Frederick Douglass put it) of old surveillance narratives that reinforce age-old unbalanced power dynamics between different groups of people.

There are no distinct events that mark the exact “moment of being watched,” so we will refer to a number of snapshots in the American history of slavery, racism, indigenous erasure and xenophobia that play into the history of surveillance, criminalization, and disinvestment. 

Goals

  1. Equip participants with the mental tools and background to break down new surveillance stories by understanding their historical context
  2. Build a narrative understanding of how power dynamics and the values of oppression operate in surveillance thinking
  3. Identify the impacts of surveillance and points of intervention

Supplies

  • Printouts of news events
  • Printout of timeline
  • Instructions to the participants

Instructions

  1. Participants split into two groups.
  2. Each group is given a news article, along with a framework for understanding what is not new (a timeline of surveillance events) and what is new about our circumstances today (e.g. information technology).
  3. Each group breaks down the news article, identifying resonances with historical events, aspects that are unprecedented, and points of intervention.
  4. The two groups reconvene and discuss their findings, with an eye toward analyzing news articles in the wild.

Discussion Questions

This is Not New:

This is New:

  1. On your linked narratives, what has changed in method—especially with data availability and data sharing?
  2. What interventions can you identify?

Themes and Language to Hold

Communities that have experienced long-held injustices and barriers to resources have been loudly calling attention to issues of access, criminalization, and the logics of oppression beyond surveillance for a while. We defer to those who have been fighting in this realm and highlight the language used in their counter-narrative work.

“Power Not Paranoia”
(A framework for collective strength against the surveillance state)
Our Data Bodies, Stop LAPD Spying Coalition 

“Detroiters Want to Be Seen, Not Watched”
(A special surveillance issue inspired by Our Data Bodies against Detroit surveillance)
Tawana “Honeycomb” Petty, Detroit Community Technology Project

  1. 2020 Census Citizenship Question: Latinos, Asian Americans still fear 2020 census over citizenship question, witnesses tell Congress (NBC News, Jan 2020)
  2. Clearview app lets strangers find your name, info with snap of a photo, report says (CNET.com, Jan 2020)

(Old) Motives: What narratives are we fighting?

Racism: Slavery, Lantern Law, Plantation Management, Jim Crow, Redlining, Welfare Queen, War on Drugs, Broken Windows, racial profiling: Stop and Frisk

“Threats to Capital”: Red Scare: Dismantling of unions, surveillance of workers, Welfare Queen myth, criminalization of poverty: Broken Windows, War on Drugs

Xenophobia: Anti-immigration border enforcement, racial profiling (AZ SB 1070), Japanese Internment / Yellow Peril, War on Terror: NYPD Surveillance of Muslim Communities, War on Drugs, ICE workplace raids

“State Enemies”: Red Scare (anti-labor), Japanese Internment / Yellow Peril, War on Terror: NYPD Surveillance of Muslim Communities

(New) Methods: What are the big data implications of surveillance?

Volume: Sheer volume of data is what enables modern-day facial recognition and autonomous vehicles, and the storage and linkage of large databases provides new modes of identifying and tracking individuals, whether for an ad or a police investigation (see the sketch after these definitions).

Velocity: The speed at which data gives access to actionable and identifiable information is at a different scale than in the past.

Variety: New private and public data sources, especially in the form of social media, digital trails, and mobile phone data, are increasingly tied to our identities.

Veracity: Use of data gives an appearance of objectivity, even if the processing is flawed or the data itself is biased.
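
To make the volume and velocity points concrete, here is a minimal sketch of how linking two databases re-identifies individuals. It is our own illustration, not drawn from the source articles; the datasets, field names, and records are entirely invented:

```python
# Minimal sketch (illustrative only): how joining two "harmless" databases
# re-identifies individuals. All datasets, field names, and records here
# are invented for the example.
import pandas as pd

# A public-roll-style dataset: names tied to birth dates and addresses.
residents = pd.DataFrame({
    "name": ["A. Rivera", "B. Chen", "C. Okafor"],
    "birth_date": ["1980-02-11", "1975-07-30", "1992-12-05"],
    "address": ["12 Elm St", "98 Oak Ave", "4 Pine Rd"],
})

# A nominally anonymous dataset keyed only by birth date and address.
visits = pd.DataFrame({
    "birth_date": ["1980-02-11", "1992-12-05"],
    "address": ["12 Elm St", "4 Pine Rd"],
    "service_visited": ["immigration legal aid", "free health clinic"],
})

# One fast join re-attaches names to the "anonymous" records (volume),
# and a filter instantly turns the linked table into a list (velocity).
linked = visits.merge(residents, on=["birth_date", "address"])
print(linked[linked["service_visited"] == "immigration legal aid"])
```

The same join-then-filter pattern appears at national scale in the IAFIS/IDENT joint database described in the timeline below.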

Detailed timeline (for facilitator)

Timeline events

  • 1712: New York City Slave Revolt: With New York’s dense population, enslaved people were able to occupy public space together. A group of twenty-three slaves incited a revolt to inspire further rebellion, killing nine slave owners and injuring six. 
  • 1713: The Lantern Law: As part of a crackdown in the city, the Common Council of the City of New York passed “A Law for Regulating Negro & Indian Slaves in the Night Time,” requiring all slaves older than 14 to illuminate themselves with a lantern after sunset and deputizing private citizens to bring violators to the gaol (jail) for punishment.
  • 1831: Tice Davids Escape: First documented escape from slavery via the “Underground Railroad,” a network of abolitionists who aided enslaved people escaping to the North.
  • 1850: Fugitive Slave Act: Passed as part of the Compromise of 1850 to appease southern slave owners, the act marked escaped enslaved people for re-capture and mandated the federal government’s role in returning them to slavery.
  • 1890: The US Census Bureau puts out a call for new, faster methods to conduct the next census. Herman Hollerith automates the census with punch cards, inspired by train conductors who punched fare cards to record the physical characteristics of riders as part of fare enforcement. Hollerith’s tabulating machine company was a predecessor of IBM.
  • 1933: German Census
    • IBM, Hollerith’s corporate successor, helps the Nazi party conduct a national census used to target Jews
    • Hollerith Machine
  • 1942: Japanese Internment 
  • 1969: Red Squads, COINTELPRO
    • As part of widespread domestic surveillance of social and political movements by the NYPD in the form of “Red Squads,” the NYPD indicts the “Panther 21,” members of the Black Panther Party, on over a hundred unfounded conspiracy charges
    • The NYPD worked in concert with the FBI’s COINTELPRO operations, projects that ran from 1956 to 1971 to infiltrate social movements
    • The trial revealed the extent of pervasive infiltration by police and the FBI and sparked a subsequent lawsuit in 1971
    • Joy James, ed. (2007). Warfare in the American Homeland: Policing and Prison in a Penal Democracy. Duke University Press. p. 74. ISBN 978-0822339236.
  • 1971: Handschu Case
    • After the “Panther 21” are acquitted, their lawyers sue and end the practice of social and political surveillance normalized by the NYPD Red Squads. The “Handschu Agreement” establishes oversight limiting domestic spying by the NYPD
    • Matt Apuzzo and Adam Goldman (2013). Enemies Within: Inside the NYPD’s Secret Spying Unit and bin Laden’s Final Plot Against America. Simon & Schuster. p. 44. ISBN 978-1476727936.
  • 1985: San Francisco passes the “City of Refuge” resolution
  • 2001: Congress passes the PATRIOT Act
  • 2008: Secure Communities
    • The joint systems were first called for under the USA PATRIOT Act of 2001 and the Enhanced Border Security and Visa Entry Reform Act of 2002
    • Under Secure Communities, local & state corrections agencies gain access to a joint database linking the FBI’s Integrated Automated Fingerprint Identification System (IAFIS), which contains 66 million criminal records, and the Department of Homeland Security’s (DHS) Automated Biometric Identification System (IDENT), which holds over 90 million immigration records.
    • Secure Communities was created in 2008, when the technology for the joint database became feasible. ICE received funding from Congress to prioritize and deport foreign nationals who had been “convicted of a crime, sentenced to imprisonment, and who may be deportable.”
    • Secure Communities justified data sharing and access that continued into the Trump administration’s deportation policies, which increased the number of raids, the length of detention time, and the number of immigrant detainees without criminal records.
    • Unanswered Questions Surround ICE’s Secure Communities Program 
  • 2013: NYPD Spying on Muslims

Sample answer keys (for facilitator and participants)

Example deconstruction: 2020 census

2020 Census Citizenship Question: Latinos, Asian Americans still fear 2020 census over citizenship question, witnesses tell Congress

What’s happening?

The US federal government (particularly President Donald Trump) wanted to ask a citizenship question on the 2020 census: “Is this person a citizen of the United States?” Latino and Asian American communities fear that this creates a hostile environment toward immigrants. This could lead to under-representation and under-resourcing of those communities. The 2020 census is kicking off this month (Jan 2020).

What’s not new about it?

Census data collection has always been a tricky question because of the tension between the different ways a community may be seen. The census has been the basis of American representation and taxation since the 1787 Constitutional Convention that created the United States Constitution and, with it, the “three-fifths compromise” of counting enslaved people as three-fifths of a person.

While you want your community to be counted for representation or resources, you don’t want a “knock on the door” for, say, targeted persecution.

One dangerous precedent was set by the actions of the Census Bureau in the 1940s during World War II. As Scientific American writes:

Despite decades of denials, government records confirm that the U.S. Census Bureau provided the U.S. Secret Service with names and addresses of Japanese-Americans during World War II.

The Census Bureau surveys the population every decade with detailed questionnaires but is barred by law from revealing data that could be linked to specific individuals. The Second War Powers Act of 1942 temporarily repealed that protection to assist in the roundup of Japanese-Americans for imprisonment in internment camps in California and six other states during the war. 

In general, we can see this “old snake” of targeted oppression surface in other census events as early as a century ago, including the 1890 automated census with Hollerith machines and the 1933 German census.

Moreover, even as the means of measurement grow more sophisticated, the motivations stay the same: American wartime exceptionalism combined with a xenophobic fear of the “yellow peril” led to certain populations being targeted.

Finally, this event relates more broadly to the means of monitoring and controlling certain populations, such as plantation owners surveilling enslaved people by requiring detailed timecards of labor performed.

What’s new about it?

With old censuses, you had to dig data out of punchcards in a drawer and mail things back and forth; it would take days to make requests through a paper-shuffling bureaucracy. Now, sharing data is almost instantaneous, as easy as a click of a button, with the ability to make powerful queries and filters and discover powerful correlations.

There’s a common big-data refrain about higher volume, velocity, and variety. All three hold more true for information collected by the 2020 census than for past censuses. However, veracity is not really a factor in this story; in fact, the 2020 citizenship question is the only one that was not “scientifically validated” through the same process as the rest of the census questions.

What can we do to intervene (both past and present)?

Socially, it is important to counter the rhetoric of wartime exceptionalism (“we can weaken protections for individual citizens because the state needs more power, because we’re at war”) by understanding that such protections are difficult to reinstate once eroded, as in the example of the Second War Powers Act.

Finally, anyone involved in the chain of data-sharing can offer individual acts of resistance: we won’t build it; we won’t share it. For example, in WWII, various sympathetic officials refused to turn over the addresses of Jews and other targeted populations when asked by the state.

Example deconstruction: Clearview facial recognition

What’s happening?

  • A new startup (Clearview AI) has built a very powerful facial recognition app that threatens to “end your ability to walk down the street anonymously.” 
  • News came out about it just this month (Jan 2020).
  • It works by combining facial recognition tech with a database of images larger than any previously assembled (by either the US government or big tech companies), created by scraping websites. (A minimal sketch of this kind of matching follows this list.)
  • Their tech has already been deployed by federal and state law enforcement officers to solve cases ranging from shoplifting to murder. It is not yet available to the public, but (according to the NYT) Clearview’s investors predict that it eventually will be.
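
Here is a minimal sketch of the pipeline the reporting describes: turn each photo into an embedding vector, then run a nearest-neighbor search over a scraped database. This is our own illustration under that assumption; the random vectors stand in for a real face-embedding model, and the URLs are placeholders.

```python
# Minimal sketch of the reported pipeline: turn each photo into an
# embedding vector, then nearest-neighbor search over a scraped database.
# The random vectors stand in for a real face-embedding model, and the
# URLs are placeholders; nothing here is Clearview's actual code.
import numpy as np

rng = np.random.default_rng(0)
DIM = 128  # a typical face-embedding dimension in published systems

# Pretend database: one unit-length embedding per scraped photo + its URL.
db_vectors = rng.normal(size=(1_000, DIM))
db_vectors /= np.linalg.norm(db_vectors, axis=1, keepdims=True)
db_urls = [f"https://example.com/photo/{i}" for i in range(1_000)]

def top_matches(query: np.ndarray, k: int = 5) -> list[tuple[float, str]]:
    """Return the k most similar database photos by cosine similarity."""
    q = query / np.linalg.norm(query)
    scores = db_vectors @ q  # one matrix-vector product scores every photo
    best = np.argsort(scores)[::-1][:k]
    return [(float(scores[i]), db_urls[i]) for i in best]

# A sidewalk snapshot becomes a query; matches link back to source pages,
# where a name can often be found -- the step the article describes.
snapshot = rng.normal(size=DIM)
for score, url in top_matches(snapshot):
    print(f"{score:+.3f}  {url}")
```

The core operation is a single matrix product, which is why sheer database size, rather than new math, is the story here.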

What’s new about it?

  • Uniform blanket deployment (i.e. in all of public space, not just for special circumstances like police pursuing a suspect).
    • This threatens a much larger group of people than ever before, since the tech can be deployed indiscriminately and pervasively, hence the massive news coverage (the NYT exposé).
  • Erosion of social taboos.
    • NYT: Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” 
  • General lack of regulation of these technologies.
    • They’ve been deployed faster than legislation and public understanding can catch up.
    • …As opposed to, say, guns, which pose a well-known danger in America that the public and lawmakers understand (though they are still ill-regulated).
  • North American big-city norms of privacy.
    • People actually expect to be able to walk down the street anonymously—which is good! But this is not, say, the norm of privacy in a small town.
  • Democratization of information and of information technology.
    • Availability of data — it used to be extremely hard to gather face photos and process them. Now it’s as easy as “surfing the web.”
      • Wired: Ten faces may now seem like a pretty pipsqueak goal, but in 1963 it was breathtakingly ambitious.
      • Wired: In 1973 a Japanese computer scientist named Takeo Kanade made a major leap in facial-recognition technology. Using what was then a very rare commodity—a database of 850 digitized photographs … Kanade developed a program that could extract facial features such as the nose, mouth, and eyes without human input. 
      • NYT: Police departments have had access to facial recognition tools for almost 20 years, but they have historically been limited to searching government-provided images, such as mug shots and driver’s license photos.
    • Advent of personal computing — anyone with an app + internet access can do it. (e.g. One rogue programmer named Hoan Ton-That.)
      •  In recent years, facial recognition algorithms have improved in accuracy, and companies like Amazon offer products that can create a facial recognition program for any database of images.
    • Scale of digital infrastructure — massive compute power is available to developers for cheap; CCTV cameras are everywhere in North American cities.
    • Advances in machine learning in the last 10 years.
      • Wired: Only in the past 10 years or so has facial recognition started to become capable of dealing with real-world imperfection, says Anil K. Jain, a computer scientist at Michigan State University and coeditor of Handbook of Face Recognition. 

What’s not new about it?

  • Capitalism (particularly the North American breed). Hurting impacted communities doesn’t (directly) hurt a company’s bottom line!
    • Capitalism drives the understanding of personal data as a resource to mine, and hoarding personal data as a competitive advantage for companies and governments alike.
    • Capitalism gives companies incentives to build and sell technologies to governments and institutions, regardless of impact on communities.
  • Social biases in who is building, deploying, and funding new technologies (racism, sexism, pipeline problems, etc.).
    • Wired: Many of the biases that we may write off as being relics of Woody’s time—the sample sets skewed almost entirely toward white men; the seemingly blithe trust in government authority; the temptation to use facial recognition to discriminate between races—continue to dog the technology today.
  • Photography/imaging/measuring technologies, especially those involving unique identifiers, are deployed by the state to control vulnerable populations.
    • Examples from the timeline: Bertillonage; lantern laws; mug shots of prisoners; fingerprints; Polaroid photos used to reinforce apartheid. (A minimal sketch of Bertillon-style matching follows this list.)
    • Wired: Over the following year, Woody came to believe that the most promising path to automated facial recognition was one that reduced a face to a set of relationships between its major landmarks: eyes, ears, nose, eyebrows, lips. The system that he imagined was similar to one that Alphonse Bertillon, the French criminologist who invented the modern mug shot, had pioneered in 1879. Bertillon described people on the basis of 11 physical measurements, including the length of the left foot and the length from the elbow to the end of the middle finger. The idea was that, if you took enough measurements, every person was unique. Although the system was labor-intensive, it worked: In 1897, years before fingerprinting became widespread, French gendarmes used it to identify the serial killer Joseph Vacher.
  • Poor decision-making frameworks on the part of the tech people who are building it and the law enforcement people who are using it.
    • NYT: Asked about the implications of bringing such a power into the world, Mr. Ton-That seemed taken aback. “I have to think about that,” he said. “Our belief is that this is the best use of the technology.”
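
The Wired passage above describes the core mechanism: enough measurements make a person unique, so identification reduces to finding the nearest stored record. A minimal sketch of that nearest-record logic, with invented records and measurements (modern face embeddings follow the same pattern with learned features):

```python
# Minimal sketch of Bertillon-style identification (invented records and
# measurements): describe each person as a vector of body measurements,
# then match a new measurement set to the closest stored record. Modern
# face embeddings run on this same nearest-neighbor logic.
import numpy as np

# Hypothetical records: (left foot, forearm, head length) in centimeters.
records = {
    "record 001": np.array([26.1, 45.3, 19.2]),
    "record 002": np.array([24.8, 43.9, 18.7]),
    "record 003": np.array([27.0, 46.5, 19.8]),
}

def identify(measured: np.ndarray) -> tuple[str, float]:
    """Return the record ID with the smallest Euclidean distance."""
    rid, vec = min(records.items(),
                   key=lambda kv: np.linalg.norm(kv[1] - measured))
    return rid, float(np.linalg.norm(vec - measured))

rid, dist = identify(np.array([26.0, 45.1, 19.3]))
print(rid, f"(distance {dist:.2f} cm)")
```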

What can we do to intervene (both past and present)?

  • Work toward abolition.
    • Academics have called facial recognition “the plutonium of AI.”
    • Oakland and San Francisco have passed bans on facial recognition technology.
    • Short of that, pass a strong federal privacy law requiring oversight + procurement processes for surveillance tech.
  • Build safeguards into tech from the start to protect vulnerable communities.
    • A poorly designed Internet, with no safeguards against crawling and no mechanism for data control or provenance, enables information to be freely transferred. (See the robots.txt sketch after this list.)
  • Perform individual acts of resistance.
    • Engineers can say “We won’t build it” and pressure other engineers to do the same; people inside law enforcement agencies can refuse to use these technologies.
  • Support pedagogies, sociocultural norms, and capitalistic motives that make it hard for people like the Clearview founder to advance in life.
    • Conventional computer science education will help you answer the question “how do I crawl all the photos on the open web as fast as possible?” but is largely silent on important matters like “but why shouldn’t I do that?”
    • One instructive example is the Dow Chemical Company, which colluded with the US government to produce napalm for the Vietnam War starting in 1965. Activists worked to ensure Dow’s reputation reflected this “unethical war profiteering,” and the multimillion-dollar napalm contract likely cost Dow billions of dollars.
    • Canadian-trained engineers swear by a code of ethics, symbolized by the Iron Ring. Doctors swear by the Hippocratic Oath. Why don’t information workers and software engineers do the equivalent?
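
On the “no safeguards against crawling” point: the web’s main anti-scraping mechanism is robots.txt, and honoring it is entirely voluntary. The sketch below (our illustration; the domain and crawler name are placeholders) uses Python’s standard-library parser to show that the “safeguard” is just a check a polite crawler chooses to run:

```python
# Minimal sketch (placeholder domain and crawler name): robots.txt is the
# web's main anti-crawling mechanism, and honoring it is voluntary. This
# uses Python's standard-library parser; note it fetches over the network.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# A *polite* crawler runs this check before fetching. Nothing technical
# stops a scraper that simply skips it -- the design gap noted above.
allowed = rp.can_fetch("MyCrawler/1.0", "https://example.com/photos/")
print("Polite crawler may fetch:", allowed)
```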

References for Clearview

Other news articles to deconstruct