
The People’s Field Guide To Spotting Surveillance Infrastructure


A walking tour of surveillance infrastructure in Seattle

Note: this guide is a work in progress and may change at any time! We’ve done our best to cite our sources, but this page has not been professionally fact-checked.

This workshop was first run as part of two pilot workshops with the Tech Equity Coalition, in partnership with the ACLU of Washington, in October 2019. A zine based on this work was included at the CtrlZ.AI zine fair and the HOT MESS digital exhibition in 2020.

Introduction

In this tour of downtown Seattle, we’ll practice spotting some of the layers of the “smart” city that are hidden in plain sight, collecting and storing data about our lives, as well as the kinds of thinking that justify their existence. Each surveillance technology in our field guide includes the following categories to help you “spot” it in the wild: Address, Appearance, What it does, How the tech works, Social importance, Discussion, and finally References.

Tour route

This is the route we will be taking on the walking tour. Click on each stop to pop up its location, and feel free to explore it on Google Maps, e.g. with Street View. The route spans 1.3 miles. Below, we outline each of the surveillance tools/sites listed above.


Surveillance cameras

Address: Practically everywhere, but the above example is at 523 Union St.

Surveillance camera hotspots (red = more likely)

Appearance: Poles, ledges, overhangs, rooftops. They are often spotted watching parking lots, doors, banks, intersections, and government buildings. Indoors, they are typically mounted on ceilings and near cash registers.

Different types of surveillance cameras.

What it does: The camera has a memory. It can record video or other data, adding it to an ever-growing archive. The camera can be controlled remotely: it can pan, tilt, and zoom.

How the tech works: Camera recordings can be analyzed for patterns and shared with other entities, both private (your neighbors) and public (the local police).

It might be connected to a network (via Internet or radio frequency), which lets it send video to anywhere, receive instructions from anywhere, and lets other people, who might be anywhere, watch the video stream.

Social importance: The camera can have different ways of seeing encoded in it, including kinds of gazes that enforce social agreements about which behaviors and people are considered “normal.” These gazes can be propagated all at once to the whole network of enforcement that the camera hooks into.

Discussion

  • What are other ways to question the need to have cameras, or surveillance, at all? What sort of society would we build around this way of life?
  • What are your individual or communal experiences of “light shining more brightly on some than others”?
  • What if each camera were replaced by a person? How would that change how you feel?

References


Amazon Go


Address: 2131 7th Ave

Appearance: Looks like it could be any other convenience store… but it’s not! Inside, you must scan an app to enter, and there are no cashiers.

Notice the gates where you must scan a QR code from your Amazon account to enter.

What it does: Amazon Go tracks your movement using overhead cameras to determine your browsing habits.

Overhead cameras in the ceiling of the store, tracking consumers’ movements in the store.

How the tech works: Amazon can use your purchases to learn more about you through patterns. For example, if you buy Hanukkah decorations, they might infer you’re Jewish. Or certain foods might be correlated with certain health issues. They can combine your in-store purchases with your online Amazon purchases for even more predictive power.
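To make pattern-based inference concrete, here is a minimal Python sketch. The rules and categories below are invented for illustration; a real retailer would learn correlations statistically from millions of shoppers’ histories, not a hand-written lookup table.

```python
# Illustrative only: a hand-written lookup from purchases to inferred traits.
# These rules are invented, not any retailer's actual system.
PATTERN_RULES = {
    "hanukkah decorations": "possibly Jewish",
    "glucose test strips": "possible diabetes",
    "prenatal vitamins": "possible pregnancy",
}

def infer_traits(purchase_history):
    """Return the set of traits suggested by a customer's purchases."""
    traits = set()
    for item in purchase_history:
        trait = PATTERN_RULES.get(item.lower())
        if trait:
            traits.add(trait)
    return traits
```

Calling `infer_traits(["Hanukkah decorations", "milk"])` yields `{"possibly Jewish"}`: one innocuous basket, one sensitive inference.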

Social importance: Patterns can be harmful! They can reinforce stereotypes. For example, Google Photos labeled photos of Black people as “gorillas.”

What is Amazon doing with their knowledge about you? There’s no oversight or transparency. Your data could be sold to third parties without your consent.

Discussion

  • What are the societal effects of targeting ads based on [race, gender]?
  • When you go into Amazon Go, what do you imagine you are consenting to? How does this differ from the reality? How could this be changed?

References


Automated license plate reader

Address: 699 Spring Street

Fig. 1. What we believe is an automated license plate reader, perched over a highway onramp in downtown Seattle: here, over traffic entering the ramp onto the I-5 Express highway that cuts vertically through downtown.

Appearance: An automated license plate reader (ALPR) is a small camera that is either mounted on a pole (stationary) in a high-traffic location or on top of a police car (mobile) (Fig. 2.).

Fig. 2. An ALPR can be mounted (usually high above major roads) or mobile (in/on police cars).

What it does: An ALPR photographs the license plate of every car that passes by and records the time and place of the encounter, as well as the plate number (Fig. 3.), and sends the information to a central storing place (called a database). Based on the information from an ALPR (e.g. “plate number ABC1234 detected at the intersection of Pike and Pine at 1:20 PM”), and the type of ALPR, a particular city agency may take an action.

Fig. 3. An up-close view of a stationary ALPR.

There are three main kinds of ALPRs in Seattle. Stationary ones (type #1), owned by the Dept. of Transportation, are used for traffic purposes, to estimate travel time. Mobile ones, owned by the Seattle Police Dept., are used for parking enforcement (type #2) or law enforcement (type #3), to ping a police officer directly when a “wanted” license plate is spotted. These three kinds of ALPRs have different data retention periods; police ALPR data can be stored for up to 90 days, whereas other ALPR data is (supposedly) deleted immediately. In Seattle, the Seattle Department of Transportation has at least 99 stationary ALPRs deployed, and the Seattle Police Department has 19 vehicles with mounted ALPRs.
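As a concrete illustration of these retention periods, here is a small Python sketch of a retention check. The policy values mirror the ones described above (90 days for police ALPR data, immediate deletion for SDOT traffic data); the function and table are our own framing, not the city’s actual code.

```python
from datetime import date

# Retention rules as described in this guide: police ALPR data may be kept
# up to 90 days; SDOT traffic data is (supposedly) deleted immediately.
RETENTION_DAYS = {"police": 90, "sdot": 0}

def must_delete(record_date, system, today):
    """True once a record has outlived its system's retention period."""
    return (today - record_date).days > RETENTION_DAYS[system]
```

Under this rule, a police scan from four months ago must be deleted, while one from two weeks ago may still be retained.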

How to spot: ALPRs are usually mounted up high near high-traffic areas, like highways, downtown areas, interstates, and bridges. Maps of stationary ones are difficult to find because cities don’t want drivers knowing where they might be issued speeding tickets.

Social importance: Zooming out from an individual ALPR, these systems collect data at a huge volume: the SDOT ALPR system runs 24/7, scanning some 37,000 license plates in 24 hours, or 13.5 million per year. Plates are collected indiscriminately, even from people not suspected of any crime. Nationwide, billions of plates are scanned per year.

Regulations on ALPR use—both the technology and the data collected—are mostly nonexistent nationally, as well as in Seattle. That means that the agency that owns the system can choose whether and how they want to retain data, or track vehicle movements. Check out the map in Fig. 4: though SDOT says it does not track individual drivers’ movements, data from an ALPR system could easily be combined to do so.

Fig. 4. Our guess at ALPR locations in Seattle. (This is not a definitive map of ALPR locations.) As a car moves, its location could be recorded by multiple ALPRs. (Image: WSDOT Traffic GeoPortal)
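To see how easily ALPR sightings could be combined to track a driver, consider this Python sketch. The sightings below are invented; the point is that once each camera writes (plate, location, time) rows into a shared database, reconstructing a vehicle’s movements takes only a few lines of code.

```python
from collections import defaultdict

# Invented ALPR sightings. In a deployed system, each camera would write
# rows like these into a shared database.
sightings = [
    ("ABC1234", "Pike & Pine", "13:20"),
    ("XYZ9876", "Spring & 5th", "13:21"),
    ("ABC1234", "I-5 Express onramp", "13:32"),
    ("ABC1234", "University Bridge", "13:41"),
]

def reconstruct_tracks(sightings):
    """Group sightings by plate and sort by time: a per-vehicle movement history."""
    tracks = defaultdict(list)
    for plate, location, time in sightings:
        tracks[plate].append((time, location))
    return {plate: sorted(observations) for plate, observations in tracks.items()}

tracks = reconstruct_tracks(sightings)
# tracks["ABC1234"] now traces a single car across three cameras
```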

Because of the lack of regulations, sharing of license plate data is rampant nationally. According to EFF, many law enforcement agencies share plate data directly with each other, even across borders. ALPR data also makes it into private databases such as Thomson Reuters’s CLEAR, access to which can be bought by agencies and private corporations. In Seattle, SDOT and SPD say that they do not share data directly from ALPR systems, but it’s unclear what agencies might be able to access data with a request (per the two Seattle Surveillance Ordinance reports on ALPRs).

When it comes to ALPR data, beware of scope creep (Fig. 5): due to pervasive collection and data-sharing, your license plate could leave its original context and purpose and be used in ways you never consented to, such as private investigation or targeted advertising.

Fig. 5. A diagram of scope creep: how your data, for example an ALPR sighting of your license plate, could leave its original context and purpose without your knowledge or consent.

How the tech works: ALPR is one of the older surveillance technologies; it was first invented and tested in the UK in 1984 to detect stolen cars. It uses a technique called optical character recognition (OCR), from a field called computer vision, to automatically make a guess at the letters and numbers in a picture of a license plate. This guess is probabilistic; i.e. it could be wrong. Database technologies allow all the information collected by ALPRs to be stored and queried.
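A toy illustration of why OCR guesses can be wrong: certain characters look alike on a plate. The confusion table below is our own simplification; real OCR engines output per-character confidence scores rather than enumerating confusions like this.

```python
# Our own simplification of OCR ambiguity: characters that commonly look
# alike on license plates.
CONFUSABLE = {"0": "O", "O": "0", "1": "I", "I": "1", "8": "B", "B": "8"}

def candidate_readings(plate):
    """Yield the true plate plus every plausible single-character misread."""
    yield plate
    for i, ch in enumerate(plate):
        if ch in CONFUSABLE:
            yield plate[:i] + CONFUSABLE[ch] + plate[i + 1:]
```

For the plate `AB0123`, the misreads include `ABO123` and `AB0I23`; a single misread character could match an innocent driver’s plate against a “wanted” list.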

Interventions

  • In 2015, California and Minnesota passed strict laws placing limits on ALPR data-sharing. Minnesota also bars law enforcement from photographing a vehicle’s occupants. (Source: STOP)

Discussion

  • Of the three types of ALPRs, which ones do you think should be used in Seattle?
  • Is the convenience of travel time estimates (e.g. WSDOT’s chart) or more efficient law enforcement worth the privacy leaks? What are less-invasive ways that we could achieve the same goals?
  • How might ALPR use, and data collection, impact you and your community, even as an innocent person who is not directly targeted by the state?

Further questions

  • What agencies have access to these systems?
  • Which cities are using this technology, or considering it?
  • What are the WA state rules regarding law enforcement using private sources of ALPR data?
  • Which tech companies are providing these systems and how much info do they keep (and what is it used for)?

(We are leaving answers to these questions out of our introductory writeup, but encourage you to find out the answers for your city. Thank you to Tech Fairness Coalition members for asking these questions!)

References


Acyclica


Address: Corners of Spring & 5th and Spring & 4th

Appearance: Flat black circles on top of traffic signal control boxes, which are large, gray or painted metal boxes, typically found at street corners.

What it does: The Acyclica device casts a fake Wi-Fi network and tracks phones that try to join the network in passing cars. Since each phone has a unique identifier (called a MAC address – like your Social Security Number, but for a device), different Acyclica installations can track your personal location as you pass them in the city.


How the tech works: You know how your phone or laptop auto-connects to Wi-Fi networks? To do this, your device broadcasts a surprising amount of personal information in something called a probe packet. A probe packet contains your device’s MAC address and can include the names of the Wi-Fi networks your device has joined before, which can reveal a lot about you! (See Fig. 1.) Acyclica listens for these probe packets and keeps track of the different places it has heard your MAC address, creating a location history.
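As an illustration of how revealing that list of past networks can be, here is a Python sketch. The network names and the lookup table are invented; in practice, public “wardriving” databases map real network names to street addresses.

```python
# Invented lookup from network names (SSIDs) to places. Public "wardriving"
# databases map real SSIDs to street addresses, so the networks your phone
# probes for can sketch out where you live, work, and gather.
SSID_LOCATIONS = {
    "Cafe-Allegro-Guest": "cafe in the University District",
    "StJames-Parish-WiFi": "church on First Hill",
    "Rainbow-Center-Open": "LGBTQ community center",
}

def profile_from_probes(probed_ssids):
    """Infer places a device's owner frequents from the networks it asks for."""
    return [SSID_LOCATIONS[ssid] for ssid in probed_ssids if ssid in SSID_LOCATIONS]
```

A device probing for `Cafe-Allegro-Guest` and `Rainbow-Center-Open` has, without its owner’s knowledge, disclosed two of the places they frequent.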

Social importance: This technology raises many questions around data privacy and consent. Were members of the public asked for consent before the system was installed? No.

Another big issue is data escaping scope. The Seattle city government may promise certain things about the data, but data that government agencies collect historically has a funny way of being stored for longer than promised and shared with other agencies (like ICE or law enforcement) or quasi-private entities (like Palantir) and used to circumscribe the movements of members of marginalized communities.

Discussion

  • How do people feel about how Acyclica is collecting their data? What could go wrong? What does the process of coercive data collection “feel” like: a mosquito bite? A highway robbery?

References


Washington State Fusion Center

Address: Visit the Washington State Fusion Center (WSFC), in the Abraham Lincoln Building, 1110 3rd Ave, Seattle, WA 98101

Appearance: Seattle’s fusion center seats a team of 15-30, including full-time intelligence officers from the Seattle Police, the County Sheriff, state investigators, and analysts. These employees are linked through the State Intelligence Network to every law enforcement agency in the state. They also have access to the FBI, both through its computer systems and through a security corridor linking them to the FBI’s own Field Intelligence Group office on the floor above, as well as to the Puget Sound Joint Terrorism Task Force.

Fig. 1. Fusion centers are located in every state of the United States. WSFC is just one of roughly 75. (Source: Public Intelligence)

What it does: After 9/11, fusion centers were created by the “Intelligence Reform and Terrorism Prevention Act of 2004” (IRTPA), alongside a host of other “counter-terrorism” intelligence entities such as the Department of Homeland Security. Starting with 18 centers, there are now 78 recognized centers. Fusion centers facilitate a national anti-terrorism strategy of intelligence sharing between local and national agencies, as well as with private companies and the military.

Fig. 2. The U.S. government’s cross-sector information system. (Source: Public Intelligence)

How to spot: This building’s location in downtown Seattle is no accident. Most fusion centers are located in urban centers, close to the many agencies that administer public safety: fire, emergency response, public health providers, and private-sector security agencies.

Social importance:

Multiple incidents of privacy violations and political monitoring illustrate the concerns associated with fusion centers. But as Brendon McQuade argues in “Pacifying the Homeland: Intelligence Fusion and Mass Supervision,” this confusing array of coordinating agencies makes it harder to expose political policing in the way COINTELPRO was exposed in the Panther 21 / Handschu case.

Fusion Centers are mandated to include private sector involvement and their priorities are split between multiple stakeholders at the local, federal, and private level. This puts the role of fusion centers in a fractured light. 

Many fusion centers have played a role in monitoring movements. From the Cato Institute’s summary of ACLU Fusion Center reports, “We’re All Terrorists Now:”

“The North Texas Fusion System labeled Muslim lobbyists as a potential threat; a DHS analyst in Wisconsin thought both pro- and anti-abortion activists were worrisome; a Pennsylvania homeland security contractor watched environmental activists, Tea Party groups, and a Second Amendment rally; the Maryland State Police put anti-death penalty and anti-war activists in a federal terrorism database; a fusion center in Missouri thought that all third-party voters and Ron Paul supporters were a threat; and the Department of Homeland Security described half of the American political spectrum as “right wing extremists.”

However, their role during the Occupy movement showed that many fusion centers claimed official policies of non-involvement, in line with DHS’s official policies at the federal level. In cases with private-sector stakeholder interests, such as in Arizona, however, we see a different story. When Occupy Phoenix targeted the American Legislative Exchange Council (ALEC) for its profit ties to ICE and its role in passing a bill that allowed law enforcement to racially profile Latinx drivers, Arizona’s fusion center assigned an officer to monitor Occupy Phoenix and liaise with ALEC. ACTIC provided ALEC with intelligence, including a “persons of interest” list of protestors at an ALEC conference, who were later targeted with arrests.

How the tech works:

Fusion centers do not store most of the data available to them. Instead, they negotiate agreements that allow remote access to existing databases. They will work around privacy protections and buy access to the private databases (e.g. Vigilant’s ALPR database; see ALPR walking tour stop), which provide a plethora of information on individuals with no criminal record.

Fusion centers have access to the DHS’s Homeland Security Data network, and several FBI data portals. A few databases used by the WSFC include: 

  • Law Enforcement Information Exchange (LINX)
  • FBI Systems
  • WAFUSION Intake Log
  • Regional Information Sharing System Database (RISS)
  • Homeland Security State and Local Intelligence Community
  • Law Enforcement Online (LEO)
  • Washington State Emergency Management Department
  • DHS Infrastructure Protection Protective Security Advisor (LENS, IRIS)

Interventions:

What are tools we have against such a large federal, local, and private conglomerate? The strengths of a fusion center also contain its weaknesses. A fractured chain of command often produces conflicts and confusion among rival agency agendas. Transparency campaigns and privacy advocacy do appear to have some effect after major incidents, and public records requests offer some ability to keep watch on the stakeholder agendas that flow through fusion center information requests.

Perhaps the most promising intervention target is the funding structure. Though the core “hub” of a fusion center is funded by federal grants, its specific programs are funded more individually, through grants focused on domains including education, health, and neighborhoods. Such programs promote a model of community wellness that relies on police enforcement. And finally, pre-empting the creation of such centers in the first place might be the most effective intervention of all.

Fig. 3. If you see something, say something. (Source: Northwest Warning, Alert & Response Network)
 

Discussion

The location of this fusion center represents a focal point of infrastructure and power. What is being melded together at these fusion centers? Fusion centers popped up in the years after 9/11, particularly from 2003-2007, from an infusion of homeland security grants. This marriage between federal agencies including the CIA, FBI, Homeland Security and other federal bureaus brings a level of national scrutiny to the local level, with individual reporting made possible through the Nationwide Suspicious Activity Reporting (SAR) initiative. This resulted in two European businessmen being reported for “looking suspicious” on the Washington State Ferry in 2007.

Seattle’s most famous case is that of anti-war Port protestor Phil Chinn, a student at Evergreen State College who was arrested during an anti-war protest in May 2007. The activists had been infiltrated by an army intelligence officer who disseminated protestor information through the fusion center. Not long after the Chinn incident, the center changed its name from the Washington Joint Analytical Center (WAJAC) to the Washington State Fusion Center and implemented a number of changes that appear to respond to advocates’ privacy and civil liberties concerns.

Fusion centers play a large role in a “human rights compliant” world of reformed policing. As Brendon McQuade argues in “Pacifying the Homeland: Intelligence Fusion and Mass Supervision,” the decentralized combination of interests results in a model of “pacification,” or control of the economically disenfranchised, via a pervasive collection of data to flag and identify locations, communities, and other populations.

We see this in Camden, where a new “reformed” Camden County Police opened a fusion center, the Real Time Tactical Operations Intelligence Center. This center powers a surveillance city, with cameras, ShotSpotter gunshot detectors, automated license plate readers, and a mobile observation tower. Community policing interactions become intelligence-gathering exercises. All of these data streams flow into Camden’s “fusion center,” producing “predictive” analytics to direct police to blocks of interest. Fusion centers exist in a world where surveillance is presented as an alternative model to incarceration and traditional policing. The term used in connection with widespread domestic surveillance – “pacification” – evokes themes of criminalization, gentrification, and displacement.

References


AT&T peering site (NSA wiretap site)

Address: 1122 3rd Avenue

Appearance: Tall, windowless building tucked behind a bus stop, with an AT&T logo and sculpture on its front.

Fig. 1. 1122 3rd Avenue: The AT&T peering site. Business in the front…

What it does: This is a building called a peering site where telecoms (like AT&T) exchange digital information, such as emails, phone calls, and internet chats. Naturally, as a hotspot of information, this is also a good place for intelligence agencies (like the NSA) to wiretap (or eavesdrop on and record) the information.

How to spot: Peering sites are critically important buildings meant for machines, not people. So, they are usually tall, windowless, shored-up buildings near downtown information hotspots, often near law enforcement or intelligence centers.

Fig. 2. …surveillance in the back. (The NSA wiretap site)

Social importance: This is just one of the NSA’s eight wiretap rooms in the FAIRVIEW surveillance program (Fig. 3), which allows it to read non-US citizens’ communications with no problem, as well as store US citizens’ communications, which it may read as the law permits (citation needed). AT&T is well-known for its “extreme willingness to help” the NSA.

Fig. 3. The Seattle AT&T + NSA facility is connected to the rest of the peering hubs that process internet traffic as part of the FAIRVIEW surveillance program. (Source: The Intercept)

How the tech works: Because AT&T has such a huge network, it exchanges network capacity and traffic with other Internet service providers. The way two networks “peer” is that their infrastructure physically meets in a building (as shown in Fig. 4.) to exchange information.

Fig. 4. In a big river of information, the NSA can put a sieve.

The NSA is able to eavesdrop on this exchange (through an unknown means) and search through the collected communications.
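To picture the “sieve” in Fig. 4, here is a deliberately simplified Python sketch of keyword filtering over a message stream. The selector terms and messages are invented; reported collection systems match on far richer criteria (addresses, identifiers, and so on) than bare keywords.

```python
# A toy "sieve": keep only messages containing a selector term.
# Selectors and messages are invented for illustration.
SELECTORS = {"protest", "wiretap"}

def sieve(messages, selectors):
    """Return only the messages whose words overlap the selector set."""
    return [msg for msg in messages if set(msg.lower().split()) & selectors]
```

The unsettling part is the input: to apply even this crude filter, the whole river of messages has to flow through the sieve first.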

Discussion

  • If you worked at AT&T, what would you do if the NSA asked you to install its wiretapping equipment?
  • When it comes to surveillance, should it make a difference whether someone is a US citizen?

References


Discussion

After taking participants through the walking tour, it’s good to bring folks back to a room and have a group discussion of what folks just experienced. Some questions are:

  • How can someone “escape the smart city”?
  • How is surveillance different in different U.S. cities?
  • What are the different layers of surveillance we just discussed? (Help folks distinguish between private, local, state, public, federal, and corporate, and how they interact.)

If you have feedback on this page, or would like to use or adapt the tour in your city, please send us an email.


Mapping data stories in the surveillance ecosystem

Note: this guide is a work in progress and may change at any time! We’ve done our best to cite our sources, but this page has not been professionally fact-checked.

This workshop was first run as part of two pilot workshops with the Tech Equity Coalition in partnership with the ACLU of Washington in October 2019. It was next run at the CtrlZ.AI zine fair in Barcelona in February 2020. Most recently it was run online for the Tech Equity Coalition in July 2020, co-facilitated by a leader of the Council on American-Islamic Relations Washington (CAIR WA).

Facilitators: 1 to 2, plus 1 per breakout group if conducting digitally
Group size: 5-25
Time: 60-90 minutes
Recommended Location: see In-person vs. Digital section for specifics

Introduction

Surveillance is a complicated topic. We are surveilled by a host of private companies, government agencies at levels from municipal to federal, in physical and digital space, and by a staggering range of technologies.

This workshop has been designed to be useful for individuals who feel overwhelmed by this complexity, and might have questions like “how does this affect me?” or “what should I be most worried about?” It can also be a useful framework for understanding what interventions might be effective in protecting you or your community’s privacy.

The following is a facilitator-facing guide for starting that process. It uses stories and mapping as the primary vehicles for uncovering and categorizing entities and for tracing the flow of information. All of our stories are plausible narratives that we wrote based on actual technologies or events reported by reliable technology journalism organizations. (So, while a story may not have literally happened, it very plausibly could have happened, in our opinion.)

In-person vs. Digital

This workshop has been run successfully both digitally and in-person. Below are material and space requirements for both versions, as well as some tips and tricks.

In-person

In-person versions of this workshop are engaging, and generate conversation and great material evidence for participants and facilitators alike.

Material/space requirements

  • Printouts* of map & story, 1 per participant, ideally of the size 11×17″.
  • Lots of colorful markers, pens, pencils, etc. At least 5 different colors.
  • 1 large format B+W print* of map, 24×36″ or larger. FedEx and UPS stores sell prints of this size.
  • Screen/projector for presenting slides. To edit the agenda/structure, you can duplicate them (File > Make a copy > Entire presentation)
  • Tables + chairs. Ideally small tables to allow for easy break-out groups.

*Contact us if you would like PDFs of these materials.

Digital

This workshop can be engaging and unique in a digital format. We recommend you use Zoom for the workshop itself, or any other video conferencing service that allows breakout rooms (if more than 5 participants) and screen sharing. Unfortunately, while still possible, the act of highlighting the stories and drawing connections is somewhat fiddly. See the Mapping! section for specific advice if conducting this workshop digitally.

Google Slides allows for live editing, which can be used in lieu of highlighting and drawing connections. Here is a link to the slides. You can duplicate the slides (File > Make a copy > Entire presentation) to edit the agenda or structure as you see fit.

Agenda

This agenda is flexible upwards (up to 90 minutes) but trying to fit it into less than an hour means that certain sections will be rushed, or the discussion components will be lost.

  • Getting Comfy, Introductions, Acknowledgements: 5-15 minutes (slides 1-2)
  • Context + Goals: 5-10 minutes (slides 3-4)
  • Map Overview: 5-10 minutes (slides 4-12)
  • Mapping Example: 15-20 minutes (slides 13-17)
  • Diving In! 20 minutes (slides 18-40)
  • Discussion/Conclusion: 10+ minutes (slide 41)

Getting Comfy, Introductions, Acknowledgements

5-15 minutes, slides 1-2

This section is to allow for latecomers and to let everyone get settled.

Introduce yourself, let any other facilitators introduce themselves, and thank the organization/space that is hosting you.

If there is time, let the participants introduce themselves with their names, pronouns, and an answer to an icebreaker question, such as “what has brought you here today?” or “what is the best meal you’ve eaten this week?”

Other ways to use this time are an acknowledgement of current events, a land acknowledgment, or small talk and catching up.

Context + Goals

5-10 minutes, Slides 3-4

We’ve started this section with a quote from Shoshana Zuboff, which we believe is a good framing of why surveillance is such an important topic. Feel free to choose other quotes or pieces of content that you think will be particularly compelling for your audience.

Why Mapping? Why Stories?
You can read this statement word-for-word, or go into more detail by referring to the Introduction section of this page. Further discussions or examples of mapping can be found in the work of William Powhida, The Greater Boston Anti-Displacement Toolkit, and Ed Simon, to name just a few. These links are not necessary for facilitating this workshop, but they do provide some helpful context.

Who’s being watched
This is a segue into the next section. It is also an acknowledgement that surveillance affects us all differently. While not an immediate concern for some privileged individuals (“I have nothing to hide”), surveillance is a force that can further the oppression of marginalized groups.

Map Overview

5-10 minutes, slides 4-12

This section is necessary because the map, just like the surveillance ecosystem, represents an overwhelming amount of content when viewed in its entirety. As a result, the slides are designed to create a build, where you can speak to the various sections that make up the surveillance ecosystem, and by the end have a shared understanding with your participants about the categories and structure of the map.

This section is a good example of how two facilitators are advantageous. By switching off, slide by slide, the content can feel more conversational and less like a lecture.

These categories of the map are as follows:

  • Who’s Being Watched (slide 4)
    Surveillance happens to everyone. The data we give up also reveals information about our friends, families, co-workers.
  • Watchers (slide 5)
    These range from government agencies (ICE, NSA) that use surveillance as a key part of their process, to well-known technology corporations (Google, Facebook) that make it part of their business model, to municipal agencies (local police) that benefit from data sharing, and contractors (Taser, Clarifai) that develop key surveillance technologies.
  • Actions (slide 6)
    Shows the wide variety of actions we take in physical and digital space that can be surveilled. Shows both the broad distribution of surveillance tech and which actions are particularly risky.
  • Surveillance Technologies (slide 7)
    Hardware and software that collects information about us. Can exist in physical space (surveillance camera) or digital space (browser cookie). Especially nefarious if it exhibits dragnet behavior, collecting information indiscriminately.
  • Data (slide 8)
    The information that is collected by surveillance technologies. Can individually identify us (such as our name, biometrics like our face, or our address), or might be aggregated and subject to inference from third parties.
  • Databases (slide 9)
    Collections of data held by watchers. Access to these can be sold by data brokers or tech companies, or opened up to government agencies with warrants or subpoenas.
  • Algorithmic Systems (slide 10)
    Uses large amounts of potentially biased data to make…
  • Inferences (slide 11)
    Guesses made by algorithmic systems. May or may not be correct but can have…
  • Impacts (slide 12)
    Results of the surveillance society + economy on individuals.

Many of these boxes are left open, because it is impossible to map all examples of all of these categories. Some are filled in to help the process of mapping, but it is greatly encouraged that participants fill in boxes during the mapping process.

For more information on how to specifically identify or analyze some of these categories, please see the Algorithmic Equity Toolkit.

Mapping Example

Now that the map has been introduced, it is time to give a quick tutorial of the mapping process. Because this is a content-rich workshop, it can be useful to read the room and gauge how much haste and/or participant interaction would be useful at this stage.

Storytelling

Read the story in your best stage voice. Don’t interrupt yourself with commentary, and focus instead on giving it some pacing and intonation that will help the participants simply enjoy the telling. The following story has been annotated with an example of what that might sound like: (italics are emphasis, ellipses “…” show a long pause)

What’s The DiFference?

Click. That’s a good one! Your mom is actually smiling, and your dad didn’t blink. Your sibling looks great, but they have always been photogenic… This has been a great day of good photos, and at the end of it your mom (of course) wants to know where she can see them. Being an amateur professional photographer, you upload all of your best pics to Flickr, but because this is just a hobby you have them accessible on a Creative Commons license…Which is just the license an IBM researcher is building into her web scraping algorithm…With the algorithm, she is able to add a huge amount of images to the Diversity in Faces (DiF) dataset, which is made with the intent to make facial recognition fairer… The picture of your family is annotated, correctly identifying your mom and dad’s age and gender, but enforcing a gender binary onto your sibling… The annotations also identify the skin colors of your family. Using this dataset, which is accessible to public and private research institutions, an applied research firm develops a facial recognition system for the NYPD, with the intent of searching for people of a specific age or skin color…

Highlighting

Now you will show the process of breaking the story into the categories that make up the map. For both Digital and Physical workshops, it will be most visible for you to do this live in Google Slides with the Highlight function:

Choose the color that corresponds to each category. Slide 15 contains our key, but we encourage you to do this process on your own as well. The correct answer is less important than the act of analysis!

If you have enough time, you can explain your rationale for some categorization, or briefly ask participants to volunteer answers for certain sentences:

“There are a couple mappable entries in this sentence. Can I have a volunteer tell me what they think they are?”

Mapping

After the categorization has been done, trace the data flows from component to component through the map. For the components that are missing from the map, fill them into the blank shapes that correspond to that category.

If you are doing this physically, do this with a marker on the large format map. If you are doing this digitally, you can do this on slide 16, using the arrow tool and text box tool in Google Slides.

Move through this section as quickly as possible, as the fun part comes next…

Diving In!

20 minutes, slides 18-40

Based on your group size and format, break the participants into groups of 3-5. Tell them which story they will be working on and when to return to the main group.

If you are conducting this workshop digitally, it is a good idea to have a pre-chosen facilitator for each working group. Decide which facilitator corresponds to which story beforehand. Here is a guide for those facilitators.

The process for breakout rooms is as follows:

  1. Participants introduce themselves briefly.
  2. Ask two participants to volunteer as readers.
    If conducting this digitally, the facilitator should share their screen and navigate to the slide that corresponds to the story that group is working on.
  3. One reader reads the story from start to end in their best stage voice.
  4. Collaboratively, the group decides which components of the stories correspond to which categories on the map. If doing this physically, have them use the different color markers/pens to annotate them. If conducting this digitally, the facilitator should highlight the components in the Google Slides.
  5. Following the continuity of the story, trace arrows, using just one color, from component to component in the story, filling in the ones that are missing.
  6. At the predetermined time, the groups will come together and share out their results.

When back in the main group, go through the stories in order. Put the story that is being read up on the screen (or screenshare).

For digital workshops: Ask the readers for the group to alternate reading the story out loud, showing the highlighted version and map briefly, and sharing 1-3 insights or questions that came up in the process. Examples can include not being sure how to categorize a story component, noticing a pattern in the data flow through the map, or sharing a (short) related anecdote that they may have come across previously.

For physical workshops: Ask the readers to read the story, and come up to the large map and draw data flows and add missing components, using a different color for each story. Ask them to share out 1-3 insights or questions that came up in the process. Examples can include not being sure how to categorize a story component, noticing a pattern in the data flow through the map, or sharing a (short) related anecdote that they may have come across previously.

Discussion/Conclusion

10+ minutes, slide 41

This section tends to emerge organically from the share-out. The following are four broad questions that we have written, but we invite you to write your own to suit your audience or intent.

  1. Did the stories fit neatly into the categories?
  2. What patterns in data movement did you notice?
  3. What stories should we be mapping?
  4. How do we stop a data flow?

Appendix I: Stories & Sources

The following are all the stories we have currently written, along with their source(s). All of the components we recommend you map are bold and italic. See the slide deck key for each slide for specific categorization, but please keep in mind that there is no one “right answer” for categorization.

It’s also worth noting that the technologies and topics surrounding surveillance are rapidly developing and unequally distributed. New stories are sure to emerge, so some of these stories might become out-of-date. For this reason, this workshop has been explicitly designed as a framework for dissecting any story. If you are aware of a more recent or more topical story when deploying the workshop, we absolutely encourage you to try it out using this format—and let us know how it goes!

No. 1: Buzzfeed Quizzes vs. Democracy (slides 18-19)

While thumbing through your Facebook feed, you see your friend is a French fry, according to a Buzzfeed personality quiz. Curious to find out what type of potato you are, you click on the quiz, click through the page that allows Buzzfeed to access your profile, and take the quiz. You are a tater tot, and you let your friends know. Meanwhile, Buzzfeed remembers your answers, and has also gained access to parts of your private profile (which may include your likes, photos, and friends). They aggregate this data with other users’ information and sell it to Acxiom, a third-party data broker, who turns around and sells it to Clarity Campaigns, a data analytics / political strategy company similar to Cambridge Analytica. Using this data, Clarity Campaigns builds a psychometric model of voters who are likely to be swayed by a well-placed Facebook ad. They help the Bernie Sanders campaign build a digital marketing strategy to target just those types of people using Facebook’s Custom Audience system. Your phone pings, and you see that someone liked your tater tot results. Just beneath that, you see an ad for the Bernie Sanders campaign that is oddly compelling. You keep scrolling.

Source: Vice, The Data That Turned the World Upside Down (Jan. 2017)

No. 2: Very Public Health (slides 21-22)

Cough, cough. Uh oh. You’ve been feeling some symptoms of the infamous coronavirus. Luckily, your city has a rigorous test and trace program. You get tested and…yikes! the result is positive 🙁 Your country’s department of health mandates that you hand over your cell phone so they can view your location history for the last few days, and alert any possible contacts.  A professional contact tracer also asks you for your social media accounts, text messages, credit card statements, and public transportation records. The government promises anything they publish will be anonymous, but they broadcast an alert that includes your age, county, workplace… and the beloved LGBT bar that is a central gathering point for your community. Your neighbor texts you, saying that the alert sounds like you. Luckily, you have already come out to her…but if she could deduce your identity, who else could? Your ex? Your parents?

Source: Time, How South Korea’s Nightclub Outbreak Is Shining an Unwelcome Spotlight on the LGBTQ Community (May 2020)

No. 3: A Dragnet the Size of the Internet (slides 25-26)

You attend a Black Lives Matter protest, coordinating with your friends in a group chat on your Google Pixel so you don’t go alone. A manhunt ensues after the protests because a warrant is issued for protestors who broke windows in the downtown retail stores. A detective with the city police department files a “Dragnet” warrant to Google for location data associated with the stores. Google combs through its Sensorvault database, and sends anonymized location traces that match the time and place. Your location trace is an especially good match, and Google reveals your name. Detectives view your Facebook account and see you are a regular organizer in your community. Your name is added to a list of suspects for inciting a riot.

Source: The New York Times, Tracking Phones, Google Is a Dragnet for the Police (April 2019)

No. 4: Target Knows You’re Pregnant (slides 29-30)

You get pregnant. It’s understandably a scary and exciting time. You still live with your Puritan parents, and they would NOT be happy if they found out. As pregnant people tend to do, you start searching for and ordering the usual things: unscented lotion, mineral supplements, and cotton balls. Because you still live with your parents, you are cost-conscious, buying each item on the site where it is cheapest. You buy the mineral supplements at Target because they are having a sale, and because you can do in-store pickup to keep things hidden. But buying something at Target means you are now part of their customer database, and your purchases are associated with your “Guest ID.” And buying something at Target using your credit card means your address is now on file. Target uses powerful predictive algorithms to look at your purchases and predict possible future ones. In an attempt to capture your future purchases, they send a “Congratulations on the baby!” postcard, as well as a coupon for “Goodnight Moon,” to the address on file, which is your parents’ address. D’oh!

Source: Slate, What’s Even Creepier Than Target Guessing That You’re Pregnant? (June 2014)

No. 5: Studded Black Leather Surveillance (slides 33-34)

In your small town, the mall is still a great place to hang out. You are walking with your squad when a pair of studded and shiny boots catches your eye in the window of the Doc Martens store. You pause for a moment. You’ve been looking for a pair of new boots… and a small plastic puck hidden in the door mantle notices, pings your device, “sees” that you paused, and collects your MAC address… but your friend grabs your arm and hurries you along before you can walk in. Luckily (for them), Doc Martens already has your MAC address on file, because you clicked on an Instagram ad once. Doc Martens’ predictive algorithms know to send even more ads to people, like you, who have paused outside their stores. This time, they advertise their latest boots – the ones that made you pause. You recognize them in the ad, and buy them. Doc Martens’ advertising team sees that the beacon helped make that sale, and begins installing beacons at every touchpoint they can get their hands on.

Source: The New York Times, In Stores, Secret Surveillance Tracks Your Every Move (June 2019)

No. 6: Just @ Me Next Time (slides 37-38)

You’re finally getting around to writing your thesis. You email your very patient advisor, asking their opinion about a paper you are considering citing. You had no clue, but the author of the paper, now living in Tunisia, is a whistleblower taking refuge from your government, and they are on the NSA’s target list. Your email, travelling on a Sprint network, passes through an AT&T peering center—which is tapped by an NSA router. The router automatically flags your email because it contains a “selector”—the name of the author—and copies it, along with the rest of your inbox, to the NSA server codenamed Pinecone and onto their Mainway and Marina systems. An NSA agent conducting an investigation on the author uses the XKEYSCORE algorithm to search for them, and your name comes up. That, in combination with less-than-complimentary tweets about the federal government you wrote in your younger and more radical days, tips the scales, and a predictive algorithm puts you on a no-fly list. You are not alerted until you attempt to fly to present your thesis, and are stopped at the gate. There is no appeal or recourse for your status on the no-fly list.

Sources: The Guardian, No-fly list uses ‘predictive assessments’ instead of hard evidence, US admits (Aug. 2015) & The Hill, 8 surprising times our intel community spied on US citizens (Feb. 2019)

Appendix II: Experience Reports

CtrlZ.AI in Barcelona

Overall, this was a very successful workshop. Here are some of the factors that I think contributed the most to that:

  1. ~20 engaged participants: Most of the participants were from the adjacent FAT* conference, and had lots of interesting subject matter knowledge to share. A willingness to volunteer, share insights, and provide feedback made this feel especially productive.
  2. Setup: Except for a space to pin up the large map, the distribution of the space and materials felt very seamless. We had four tables, with five chairs at each. At each seat was a handout that included a map on one side and the stories on the other. On each table was a wide variety of colored markers. At the front was a large screen for sharing slides.

There, of course, were some things that could be improved on in the next iteration:

  1. Time: We only had 45 minutes allotted for this workshop, and some aspects felt rushed. I believe it is important to get through the entirety of the demo mapping so that participants have an understanding of the process, but it did eat up more than half of our time. By trying to fit it into less than an hour, we lost key time for discussion.
  2. Space to pin up: The (admittedly beautiful) brick walls of the space meant pinning up the large map was tricky. I ended up taping the top portion of the map onto the TV screen. This made it hard for groups to share out their mapping process with the broader group, and made the final large map feel a little “shaky” and underwhelming.

Online with the Tech Equity Coalition

With the pandemic raging, this was the coveillance collective’s first try at leading a digital workshop. Personally, I think it went very well, and was a great learning experience. Some of the factors that contributed to this are:

  1. Co-facilitation: We recruited one of our collaborators from the Washington chapter of the Council on American-Islamic Relations to co-lead this workshop. This was especially useful in the Context + Map Overview section. We went back and forth, making it feel more conversational and less like a lecture. We also had awesome members of our collective and the ACLU of Washington to facilitate our break-out rooms, which made for a fluid experience.
  2. Flexible timing: We had an hour blocked out for this workshop, but with this being a digital workshop I predicted (correctly) that certain aspects would run over. We timed the sections so that all necessary modules would fit into the hour (including having brief dialogues in the breakout groups), and invited participants to stay past the hour for a longer discussion and feedback session. This gave us flexibility and breathing room and allowed our participants to choose their level of involvement based on their schedule.

Of course, there are always things that could be improved. Some of them include:

  1. The Act of (Digital) Mapping: No matter how you slice it, Zoom or Google Slides is simply not as intuitive as drawing with a marker. We found that the digital tools were fiddly and time-consuming, and that participants gained a lot more from the categorizing section of the workshop than the mapping. However, the map was still a useful structure and visual element for understanding the categories, so I don’t recommend dropping the map entirely unless you are pressed for time.


Understanding the rise of tech-fueled surveillance

Note: this guide is a work in progress and may change at any time! We’ve done our best to cite our sources, but this page has not been professionally fact-checked.

This workshop was first facilitated for the Tech Equity Coalition in partnership with the ACLU of Washington in August 2020.

What do the recent Black Lives Matter protests have to do with the COVID-19 pandemic? They’ve both triggered a rising tide of surveillance—the use of “smart” technology to track people.

It’s easy to get lost in the news about the latest alarming developments. So we invite you to join the coveillance group for a guided workshop on the deeper principles of how surveillance technology works.

We’ll discuss two key technologies you might have passed by in downtown Seattle without even knowing it: automated license plate readers and fusion centers. Through a series of stories, we’ll show what happens “behind the scenes” to your personal information. Through group discussion, we’ll show how to tell apart what’s new and what’s not new about the way surveillance works today.

This workshop is a pilot session that will run for an hour, so please come prepared to give feedback. Thank you!

Overview and timing

Total: 65 mins

  • Buffer (in this time, ask participants to name what they hope to learn from the workshop) — 5 min
  • Intro, history, smart streetlights, TOC — 10 min
  • ALPRs (brief overview) — 7 min
  • Fusion center (brief overview) — 7 min
  • Break out into two groups; one for ALPR and one for fusion center — 15 min
    • Monologue — 5 min
    • Discussion and exercise — 10 min
  • Share out / closing session + request for feedback — 15 min
    • ALPR group — 2 min
    • Fusion center group — 2 min
    • Quick takeaways — 5 min
    • Big group discussion on interventions — 5 min
  • Feedback — 5 min

Slides

Facilitator script

Please read the transcript below in tandem with the slides. Green italics denote a new section, [brackets] denote a new slide.

New & Old Snakes

[title slide]

Hello and welcome to today’s workshop, covering surveillance, protest and technology. Unfortunately, we couldn’t give an “in-person” tour, so think of this as a virtual one through time and space. We’ll be focusing on ALPRs and Fusion Centers with a chance to dig in at the end with breakout groups. 

Feel free to add a more personal introduction to yourself and/or your organization and fellow facilitators.

[headlines]

With worrisome escalation across the nation against Black Lives Matter protests and increasing pandemic era surveillance, these are just a smattering of headlines that make us question what kind of dystopic hell we’ve been dropped into and how to make sense of it all. 

[not a moment in time]

This quote from the Stop LAPD Spying Coalition rings true: this is not a “moment in time, but a continuation of history.” We want to draw out the familiar and the new of this particular unspooling of history. 

The first stop of our tour today is in San Diego. 

[news headline – police used streetlights to investigate protests]

Let’s give a quick recap of what happened in San Diego.

[GE diagram on their pitch]

In 2016, the city of San Diego installed energy-saving LEDs, cameras, and sensors – supposedly for sustainability purposes.

Well, only the LEDs contribute to sustainability. The cameras and sensors gather pedestrian, vehicle, and weather data – supposedly for city planning.

[Streetlight parking and pedestrian detection]

Computer vision on streetlight cameras gives the city traffic data it never had before, like parking spot detection and pedestrian analysis. However, that means these cameras are recording public space 24/7 in high definition. So even though these smart streetlights were only supposed to grant footage access to the SDPD in cases of serious crimes, that definition has slowly crept to cover minor crimes like vandalism. 

San Diego police were using streetlight footage before the protests, but journalists have revealed that the SDPD has been actively combing the footage to indict protestors for things like throwing objects or vandalizing buildings. 

[“Man charged with pointing laser pointer”]

What we know so far is that the feds have gotten involved, charging two men connected to the protests. One of those men was arrested for shining a laser pointer at a helicopter that was monitoring the protests. Smart streetlight footage was gathered for his case.

[old snakes, new skin quote]

Even though all this sounds sci-fi and futuristic, this is actually a repeat of a lot of stories we’ve seen before. Whenever a “new innovation” in policing or carceral practices emerges, I think of this Frederick Douglass quote: “…you and I and all of us had better wait and see what new form this old monster will assume, in what new skin this old snake will come forth next”

He was speaking after the Civil War, warning abolitionists that slavery would only be reborn in new forms.

[old snakes]

There’s power in naming racism and xenophobia, because they come cloaked in other forms. For example, racism in the name of protecting capital and property, and xenophobia in the name of identifying and casting others as state enemies. These snakes take form in white supremacy and abuses of power at all levels – personal, cultural and institutional. 

[Streetlight map]

The city paid for many of these lights with grants intended to “help local communities overcome poverty,” and the city justified its use in this context by putting the equipment in low- to moderate-income neighborhoods. (source)

This attempt to “help communities overcome poverty” actually concentrated control in the same actors and institutions that reinforce racism and xenophobia, rather than in any transformative models of community power. 

[new skin]

We can take a common “big data” trope – the 4 V’s of Volume, Velocity, Variety and Veracity – and think about them in the context of surveillance. The power of big data comes from its ability to draw from as wide a pool as possible – dragnet info-gathering rather than targeted surveillance. From there, the use of data can give the veneer of objectivity.

[timeline lantern and streetlights]

Well, to understand the context of smart streetlights, we want to revisit a different story about city lighting. 

First let’s go to 1712, the year of the New York City Slave Revolt. The next year, as part of the crackdown in the city, the Common Council of the City of New York passed “A Law for Regulating Negro & Indian Slaves in the Night Time.”

[lantern law]

The law required all Black and indigenous people to illuminate themselves with a lantern after sunset, deputizing private citizens to bring in violators for punishment. 

[Simone Browne’s Dark Matters]

Simone Browne in Dark Matters explains how the candle was an early, rudimentary piece of technology used as a form of control. What were they controlling? Movement through public space. The lantern allowed a “new” scale of encoded white supremacy – our “old” snake.

[lighting venn diagram]

Yes – the volume, or pervasiveness of these streetlights allowed the San Diego police to use camera footage 175 times to target whatever alleged crime – serious or not – they wanted. Which of course, extends to policing protests. 

[timeline]

A more recent example of this old/new context of surveillance is the NYPD’s spying on Muslims. It combines the use of “new” surveillance tools like ALPRs with the “old” practice of domestic surveillance of social and political movements.

[COINTELPRO]

From 1956–1971, J Edgar Hoover ran the FBI program COINTELPRO to undermine social movements. Local police units that collaborated with this political spying were called “Red Squads.”

[Panther 21]

COINTELPRO targeted anti-war Vietnam protestors as well as civil rights movements. In 1969, the NYPD indicted the “Panther 21,” members of the Black Panther Party, on over a hundred unfounded conspiracy charges as a result of COINTELPRO surveillance.

In the corner here is Richard Moore, known as Dhoruba bin-Wahad, who was targeted for his leadership role with the Panthers, explaining what happened next. 

[45 sec clip]

The “Handschu Agreement” established oversight limiting domestic spying by the NYPD. It created a paper trail through record keeping and procedure, prohibited spying on political and religious groups unless there was information tying them to a crime, and limited the sharing of information with other agencies. 

[NYPD stop spying on me image]

These gains did not last long. After 9/11, the NYPD removed those guidelines in the name of counterterrorism, opening up a decade of unjustified surveillance of Muslim communities, including video surveillance, license plate recognition, community mapping, and infiltration. The ACLU, the NYCLU, and the CLEAR project filed suit against the NYPD in 2012, re-opening the Handschu case. This resulted in an update to the Handschu Agreement that established greater oversight of NYPD spying.

[venn diagram on NYC spying]

So what’s not new is this unwarranted political spying. What was new was the scale at which the NYPD was able to surveil so-called “ordinary” citizens – mounting license plate scanners on unmarked cars parked at mosques, for example.

First we’ll dive into the use of ALPR tools as an example of what is new and not new about smart city surveillance. 

ALPRS (Automated License Plate Readers)

[what are ALPRs?]

See also – ALPR stop on our surveillance walking tour

What are automated license plate readers?

[a bunch of little cameras in situ]

See these little cameras mounted on highway overpasses, on poles above intersections and ramps, and on police patrol cars? These little cameras play a huge role in the surveillance ecosystem. Understanding what these things do will help you understand how all these other street-level surveillance technologies work as well as the problems with them.

[diagram of what an ALPR does in a city]

Those cameras are meant to read the license plates of all passing cars, record the encounter, and store the information. When a car passes, it says, “Hey! I just saw a car with license plate ABC123 here, at 18 degrees lat 20 degrees long, from New York, at 11:50 AM on February 24.”
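As a rough illustration of what one of those records contains, here is a minimal sketch in Python. The field names and the `PlateRead` class are our own invention for teaching purposes, not any vendor’s actual schema; the values mirror the hypothetical example above.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlateRead:
    """One hypothetical ALPR record: a plate, a place, and a time."""
    plate: str         # the plate the camera read, e.g. "ABC123"
    state: str         # issuing state, e.g. "NY"
    lat: float         # latitude where the camera saw the car
    lon: float         # longitude where the camera saw the car
    seen_at: datetime  # timestamp of the read

# "Hey! I just saw a car with license plate ABC123 here..."
record = PlateRead("ABC123", "NY", 18.0, 20.0,
                   datetime(2020, 2, 24, 11, 50))
```

The point is that each read is tiny and structured, which is exactly what makes millions of them easy to pool, query, and share.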

[ALPRs in Seattle — city map of ALPRs]

Here’s a map of potential ALPR locations in Seattle.

There’s not just one camera but tens to thousands (depending on the city), often installed in dense areas like downtowns or on highways. They use infrared, so they typically run 24/7. 

As you move around the city, you might be targeted multiple times by different license readers, creating a map of your movements.

Here are the basic facts to know about them.

[next build]

Note that maps of license plate reader locations are often not made public. This is partially due to the state not wanting people to know where they might incur speeding fines! 

[next build]

Existing law in Seattle places no specific limits on the use of ALPR technology or data, 

[next build]

meaning an agency can choose whether and how they want to retain data and track vehicle movements.

[next build]

There’s a huge volume of data. The Seattle Department of Transportation ALPR system collects 37,000 license plates in a 24-hour period—over 13.5 million scans over a full year. 
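That yearly figure follows directly from the daily rate; a quick back-of-the-envelope check:

```python
scans_per_day = 37_000                 # SDOT ALPR scans in a 24-hour period
scans_per_year = scans_per_day * 365
print(f"{scans_per_year:,}")           # 13,505,000 -- over 13.5 million scans a year
```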

[next build]

These drivers are not specifically suspected of any crime, which calls into question the scale and purpose of such data collection.

[Where does your personal information go?]

Where does your personal information (e.g. plate) go? Who sees it?

[Clouds]

When your plate is detected, the camera sends this record, over the network, to the public agency that owns the cameras, like a local department of transportation or police department. The info is probably stored “in the cloud.”

[next build]

Then, depending on the public agency, this one record might be sent to join a really big list of records, called a “database,” that is usually owned by a private company. This private company makes computer tools, or software, that help users analyze the information.

[next build]

Then, this one record, inside the huge pool of information, is often shared with many other public agencies, some outside the state, some with different purposes from the original agency, like law enforcement instead of transit.

[next build]

Then, different agencies can run queries on the database, like “Where else has this car been today?” and “Who else was seen near this car?” These agencies can also get alerts if the license plate appears on a special list called a hotlist, for cars that have been stolen or owners who are wanted for arrest.
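To make those queries concrete, here is a minimal sketch in Python. The record fields, the plate values, and the hotlist are all hypothetical; real systems run these queries against commercial database software, not in-memory lists, but the logic is the same.

```python
# A tiny hypothetical pool of ALPR records.
records = [
    {"plate": "ABC123", "lat": 47.61, "lon": -122.33, "seen_at": "2020-02-24T11:50"},
    {"plate": "XYZ789", "lat": 47.60, "lon": -122.33, "seen_at": "2020-02-24T11:52"},
    {"plate": "ABC123", "lat": 47.62, "lon": -122.35, "seen_at": "2020-02-24T14:10"},
]

hotlist = {"XYZ789"}  # plates flagged as stolen or wanted

# "Where else has this car been today?"
sightings = [r for r in records if r["plate"] == "ABC123"]

# Alert whenever a scanned plate appears on the hotlist.
alerts = [r for r in records if r["plate"] in hotlist]
```

Note how trivially a location history falls out of a plain filter: two sightings are already enough to start sketching a map of one car’s movements.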

[next build]

These agencies can then take actions like sending workers to investigate a traffic jam, or sending officers out to intercept the car.

[smart cities]

Things weren’t always this way—the ALPR was first deployed in 1984 in the UK, and since then, has supported an emerging narrative around “smart cities,” basically a high-density environment where people’s movements are tracked and stored at all times. What are the supposed benefits of “smart cities”?

The message from the city, or the vendors of the cameras, is: surveillance of public space makes everyone’s lives safer and more convenient.

[slide of traffic jam estimated time map]

If you ask a member of a city public affairs department, they’ll tell you those cameras are meant to make your life more convenient, for example, by giving you travel time information by measuring traffic.

[Vigilant slide]

If you ask a police officer, they’ll tell you those cameras are meant to make your life safer by helping officers capture fugitives and stolen cars.

While all credit must be given to the dedicated police officers and detectives whose tireless efforts solved this case, one can’t help but wonder if modern day advances in technology (namely License Plate Recognition or LPR) could have been used to identify potential suspects sooner – limiting the brutality and duration of these types of pattern crimes.  After all, it was a license plate on a parking ticket that eventually led to Berkowitz’s capture.

[Scope creep slide]

But scope creep of data is eroding civil liberties for us all. Here’s a diagram of how that works in general.

[next: original scope]

Your data starts in its original scope, for original purpose, such as helping measure hourly traffic volume in Seattle. Ideally it would stay here.

[next: expanded scope]

But then there are these expanded scopes. Like targeted advertising.

[next: data movement]

Through loose data retention policies, or agency requests, it experiences data movement out of scope, likely without your knowledge or consent. For example, if it becomes used weeks later to improve your targeted advertising based on where you were driving.

[next: seriously expanded scope]

Through several hops, it could move into a seriously expanded scope, just like in the smart streetlights example, where data used for “benign” smart city purposes was then used to target protesters, or if it ended up in the hands of a law enforcement agency in another state, who then used it as collateral evidence to arrest someone.

[next: wide scope background]

Each minor little “hop” might seem justified. Did you consent to any of these hops, or the final consequence? Where does the scope creep end? We don’t think anyone knows yet. But it’s crucial to fight it.

[NYT slide]

ALPRs are just one example of this general pattern. In our breakout group for ALPRs, we’ll examine a story about that, as reported by the New York Times in 2019.

The clearest concentration of scope creep is in fusion centers. I’ll turn that over to the other facilitator.

Fusion Centers

See also: Fusion Center walking tour stop

Who has heard of fusion centers before? 

Note: include instructions for how to “raise hand” if you are conducting this remotely

[map]

After 9/11, fusion centers were born with the Intelligence Reform and Terrorism Prevention Act of 2004 (IRTPA). There were 18 at first; there are now 78 recognized centers. Fusion centers facilitate a national anti-terrorism strategy of intel sharing between local and national agencies, as well as with private companies and the military.

[Washington State Fusion Center street]

Seattle’s Washington State Fusion Center is in the Abraham Lincoln Building, just downtown. Its employees are linked through the State Intelligence Network to every law enforcement agency in the state. A security corridor links them physically to the FBI’s Field Intelligence Group office and the Puget Sound Joint Terrorism Task Force.

[sector diagram]

This marriage of federal agencies, including the CIA, FBI, Homeland Security, and other federal bureaus, brings a level of national scrutiny to the local level, with individual reporting made possible through the Nationwide Suspicious Activity Reporting (SAR) initiative.

Seattle’s fusion center seats a team of 15-30, with full-time intelligence officers from the Seattle Police and County Sheriff, state investigators, and analysts.

Fusion centers have access to the DHS’s Homeland Security Data Network and several FBI data portals.

[data sharing]

It might seem strange to have such a concentration of investigators in downtown Seattle, but fusion centers are typically located in urban centers, placing them among the many agencies that administer public safety needs: fire, emergency response, public health providers, and private sector security agencies. This gives them access to data that might be relevant to infrastructure-related threats.

[funding]

Though the core “hub” of each fusion center comes from similar grants, its specific programs are funded individually, focusing on different domains including education, health, and neighborhoods, all in the name of public safety.

[decentralized COINTELPRO]

Multiple incidents of privacy violations and political monitoring are well-documented concerns associated with fusion centers. But as Brendon McQuade argues in “Pacifying the Homeland: Intelligence Fusion and Mass Supervision,” this confusing array of coordinating agencies makes it harder to expose political policing the way COINTELPRO was exposed in the Panther 21 / Handschu case.

[venn diagram]

The suit against the NYPD over its spying on Muslims succeeded because it was possible to recognize the “old” practices of targeting that violated constitutional rights, the same practices as in the Panther 21 case. Fusion centers represent a new era of local, federal, and private interests converging in ways that are harder to pinpoint. While fusing new tools of control for law enforcement, like ALPR data, they also obscure the intentions and chain of command behind practices that further racial injustice and violate human rights.

ALPR Breakout Group

This section begins a breakout group; this is not shown to the whole group.

[NYT slide]

Welcome to the ALPR breakout group! I’ll share a story, then we’ll discuss some questions about it together. At the end, we’ll report our findings back to the group, so please take notes if you can.

In this story, we’ll see how scope creep of data is eroding civil liberties for us all, and ALPRs are just one example of that. Can I get a volunteer to take notes?

[NYT ICE story photo of background from reporter’s blog]

All right. I’ll start by setting the scene for the story.

Here in Washington State, in 2017, people kept disappearing in Pacific County’s Mexican community.

Residents noticed how their neighbors, good citizens, some of whom they’d known for decades, would just vanish. Mothers of young daughters would be deported to Mexico with little notice, their families torn apart.

They knew that ICE, the Immigration and Customs Enforcement agency, was somehow tracking down these people. But how? Was it a mole at the drivers’ license office? A spy in their community? 

[Seattle Times headline]

Here’s one answer. There was widespread uproar when it came out in the “Seattle Times,” in 2018, that the state’s Department of Licensing (DOL) was feeding information to ICE on request: if an ICE employee emailed to ask about a target, a DOL employee would respond with whatever it had, such as their driver’s license application form. Wasn’t this already bad enough?

[ICE handshake]

But the real answer was much more serious.

[next]

First, ICE already had direct access to the DOL’s database itself, so it could make its own queries without having to ask anyone. ICE agents made tens of thousands of searches in software called DAPS (Driver and Plate Search) before the DOL cut off access in 2018.

[next]

Second, even worse, it turns out that ICE obtained access to two huge piles of information.

The first is Vigilant Solutions’ database in Dec 2017. This is “the world’s largest privately run database of license-plate scans — more than five billion historic images captured continually and automatically, thousands per minute, by infrared devices, that is, ALPRs, attached to lampposts and police cars and repossession-agent vehicles across the United States.” Vigilant’s database lets ICE “see precisely when and where vehicles of interest have been spotted during the previous five years…. and upload 2,500-plate ‘hot lists’ that trigger immediate iPhone alerts whenever a target is scanned by a camera in the network.” 
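Mechanically, a “hot list” is just a membership check: every incoming plate scan is compared against the uploaded list of target plates, and a match fires an alert. Here is a minimal, hypothetical sketch of that lookup (this is illustrative only, not Vigilant’s actual system; all names and values are invented):

```python
# Hypothetical sketch of a hot-list check: each ALPR scan is compared
# against a set of target plates, and a match produces an alert record.
hot_list = {"ABC1234", "XYZ9999"}

def check_scan(plate, location, hot_list):
    """Return an alert record if a scanned plate is on the hot list."""
    if plate in hot_list:
        return {"plate": plate, "location": location, "alert": True}
    return None  # plate not on the list: no alert, but the scan is still stored

alert = check_scan("ABC1234", "5th & Union", hot_list)  # match: alert fires
miss = check_scan("DEF0000", "5th & Union", hot_list)   # no match: None
```

Note that even a “miss” is typically retained in the database, which is what makes the five years of historical location data described above possible.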

[next]

ICE also has access to a private database called CLEAR, created by Thomson Reuters (the news company).

[conspiracy slide]

CLEAR enables a souped-up version of the old cliché: investigators gathering piles of information on suspects and connecting it by hand to find patterns and build a more complete profile of one person.

[next build]

But now… software lets us scale beyond what one person can do, as in this screenshot.

CLEAR gives ICE real-time access to address and name-change data from credit reports and to motor-vehicle registrations in 43 U.S. states, utility records, arrest records, and similar. 

[The danger]

The real danger is that Vigilant and CLEAR’s databases aren’t restricted by protections on what data the government can collect or keep—because they aren’t government-owned.

As ACLU senior policy analyst Jay Stanley put it: “If ICE were to propose a system that would do what Vigilant does, there would be a huge privacy uproar and I don’t think Congress would approve it. But because it’s a private contract, they can sidestep that process.”

[“We’re not just a public-safety agency”]

Not only that, but public agencies sell information to private companies! As this spreadsheet shows, the DOL sells its information to private data brokers, like Experian and LexisNexis, which resell information that ends up in the very Vigilant and CLEAR databases ICE already has access to.

So cutting off ICE’s direct access doesn’t effectively prevent it from reaching the same data indirectly.

[next build]

As the DOL’s Brad Benfield said in May 2018, “We’re not just a public-safety agency. We’re very much a data-sharing agency.” Did you expect this when you registered for your vehicle?

“In 2017, D.O.L. earned about 26 million dollars selling driver and vehicle records to 19 principal data brokers, including Experian, LexisNexis and R.L. Polk — a group of companies that had its own relationships with some 34,500 ‘subrecipient’ brokers, including TransUnion, Acxiom and Thomson Reuters. ‘One of the things that we realized is that we’re not just a public-safety agency,’ D.O.L.’s Brad Benfield told a State Legislature committee in May 2018. ‘We’re very much a data-sharing agency.'”

[Oh, the places you’ll go!]

When it comes to plate data, law enforcement agencies share it directly.

The EFF found that, among 2.5 billion license plate scans from 2016-2017, law enforcement agencies were sharing data directly with ~160 other agencies and adding to a “group pool” of data! All of this can happen across borders.

[Cloud slide]

In sum, as we saw earlier, information flows from you, [next] to local agencies (municipal or law enforcement), [next] to private databases and data brokers, and on to other agencies, which might then make decisions about you, as we’ll see in the following story.

[NYT ICE story photo of Rodriguez]

Here’s the story of how that unfettered access enabled ICE to target Mario Rodríguez, a bilingual teaching aide who lives in Pacific County. As the NYT put it, “he matched none of ICE’s stated enforcement priorities, even under Trump.” There was no reason to target him. He is also gay, and would have difficulty living in Mexico because of that.

[next] Here’s the first sentence.

Could we have volunteers to read out each sentence as it comes up?

1. He appeared on ICE’s radar earlier that summer. The two deportation officers had been “conducting an investigation into a separate individual at his apartment complex.”

[next]

2. They used DAPS [Driver and Plate Search] to run the plates of the vehicles parked in the lot. Three, including a black Jeep, were registered to Rodríguez. 

[next]

3. Armed with his name, the officers then made a “biographic search of immigration and criminal history databases.” This search showed that Rodríguez entered the United States legally on a visa in 2005. A visa overstay is a civil offense, and Rodríguez has no criminal record. 

[next]

4. But here he was, bird in hand, man in database. The ICE officers filed a worksheet, adding him to their list of targets.

[next]

5. Later that summer, the ICE officers happened to see Rodríguez’s car. They ran his plate a few times in DAPS and then arrested him.

Let’s just take a second to reflect on this story… What are people’s immediate reactions?

Begin the ALPR breakout group discussion.

Let’s break down this story based on some more general principles of surveillance. I’ll go over five discussion questions. Can someone volunteer to take notes?

DISCUSSION: Can you identify the personally identifiable information Rodríguez had to hand over that allowed the officers to track him?

Possible answers / discussion points:

  • Vehicle registration -> license plate -> name
  • Name, information, etc. -> get a visa to enter the US
  • The more above-board a citizen you are, the easier you are to track (someone operating less above-board would be harder to track)

DISCUSSION: This is new: Can you identify all the ways that new tech made Rodriguez’s arrest possible?

Possible answers / discussion points:

  • His car and plates would not have appeared in the database without a registration or an ALPR scan
  • He was not targeted; they were investigating someone else, and DAPS lowered the friction to surveillance
  • DAPS search gave them his name info from his plate info, which then enabled further searches
  • Important point: the problem is not the ALPRs, the problem is the massive private databases gathered without oversight that fuse public/private information

DISCUSSION: This is not new: 

In New York City, police officers drove unmarked vehicles equipped with license plate readers near local mosques as part of a massive program of suspicionless surveillance of the Muslim community. In the U.K., law enforcement agents installed over 200 cameras and license plate readers to target predominantly Muslim community suburbs of Birmingham. ALPR data obtained from the Oakland Police Department showed that police disproportionately deployed ALPR-mounted vehicles in low-income communities and communities of color.

Can you identify the ways that these kinds of stories perpetuate old patterns of surveillance/oppression? 

Possible answers / discussion points:

  • Dragnet information collection allows targeting of specific groups that have done nothing wrong, creating a chilling effect on freedom of movement and association
  • Currently, the use of ALPR technology in Seattle chills constitutionally protected activities, because it can be used to target drivers who visit sensitive places such as centers of religious worship, protests, union halls, immigration clinics, or health centers. Whole communities can be targeted based on their religious, ethnic, or associational makeup, which is exactly what has happened in the United States and abroad.

DISCUSSION: How can this ALPR use impact you collectively—say, an innocent person who is never directly targeted by the state and commits no crimes?

Possible answers / discussion points:

  • bug in software (e.g. mis-recognized license plates) results in cops stopping innocent people who are said to be on the hotlist, with possibly violent/traumatizing tactics
  • because so much of your information is collected/tracked without any need in the “dragnet,” it could be sold to predatory advertisers, or compromised by hackers or stalkers 
  • selective/retroactive enforcement: the more info that is collected, the more likely it is something can be considered a crime, or a pretext for one (e.g. empowering a racist officer)

DISCUSSION: Should we allow ALPR use in Seattle? Why or why not? 

(Try to create discussion by questioning assumptions and asking for more detail)

(If all agree to get rid of ALPR: How could that change be made – what are the points of intervention within the system? If disagreement: Get people to talk with each other…)

Possible answers / discussion points:

  • Public/private link
  • Lack of oversight on private side
  • Technology remains largely unregulated; the only limitations are those of the technology itself
  • Scale: one camera -> many scans; scans over Seattle per day, total scans over years
  • Personalization: scanning one person over many cameras
  • Dragnet surveillance
  • What exactly the database contains and how that is connected/fused over space/time for a person, as well as over groups of people
  • How new information is pushed/pulled to other entities (agencies) — and the magnitude of that
  • Potential for scope creep and abuse
  • The difference between this and, say, having an officer at every corner: memory. The system does not forget, and it can store wrong info
  • If some tech is meant to serve a purpose (like making life more convenient by measuring traffic), we can often achieve the same effect without it
  • Possibility of software errors throughout the system, resulting in real (and real-time) consequences

Fusion Center Breakout Group

Who here knows who Phil Chinn is?

[“Phil Chinn”]

One of Seattle’s most famous cases is the arrest of anti-war port protestor Phil Chinn, a student at Evergreen State College, during an anti-war protest in May 2007 organized by a coalition of anti-war activists in the Olympia and Tacoma areas: Students for a Democratic Society (SDS), the Wobblies, and others. The activists had been infiltrated by an army intelligence officer who disseminated protestor information through the fusion center.

Chinn’s civil lawsuit for false arrest was settled with the State Patrol, Grays Harbor County, and the city of Aberdeen, with implications that his First Amendment right to free speech had been violated. Additionally, the direct involvement of the military in local law enforcement, enabled through the fusion center’s intel sharing, violated the Posse Comitatus Act.

How do fusion center operations relate to current protests? Can we mount similar defenses today?

[Occupy protest]

In November 2011, evictions of the national Occupy movement occurred in Denver, Portland, Salt Lake City, Oakland and Manhattan. Were these a coordinated crackdown?

[DHS]

At the federal level, DHS’s official policies and several fusion centers’ responses actually suggested a policy of washing their hands of Occupy involvement.

[AZ]

Then we have Arizona’s response to the Occupy movement. When Occupy Phoenix targeted the American Legislative Exchange Council (ALEC) for its profit ties to ICE and its role in passing a bill that allowed law enforcement to racially profile Latinx drivers, Arizona’s fusion center (ACTIC) assigned an officer to monitor Occupy Phoenix and liaise with ALEC. ACTIC provided ALEC with intelligence, including a “persons of interest” list of people protesting an ALEC conference, who were later targeted with arrests.

What explains the coordinated response? The common factor is the fusion center mandate requiring private sector involvement.

[Occupy 2]

In the November of the crackdowns, two national conference calls were organized, but by private entities: the National Conference of Mayors and the Police Executive Research Forum. Of course, these are private orgs of public officials.

[GTWO]

There’s a similar level of private sector involvement in today’s protests. Here we have a “Get the Word Out” memo distributed in the wake of the George Floyd uprisings, perpetuating the disproven “outside agitators” narrative. This memo was disseminated through the Washington State Fusion Center.

Can anyone guess who the tip came from? The National Drugstore Chain Association.

[Obama in Camden]

In 2015, after BLM protests spread in the wake of Michael Brown’s death, President Obama traveled to Camden to present the results of the President’s Task Force on 21st Century Policing. There, Obama praised Camden’s lowered crime rates as the result of community policing programs and the use of intelligence. Extensive neighborhood monitoring services and surveillance streams all feed back to the Camden “fusion center.”

As Minneapolis attempts to disband its police, mentions of Camden’s model have resurfaced. Camden’s police force disbanded in 2013 after years of hard economic decline. The new “reformed” department, the Camden County Police, opened a fusion center, the Real Time Tactical Operations Intelligence Center. Camden is a true surveillance city, with cameras, ShotSpotter gunshot detectors, automated license plate readers, and a mobile observation tower. Community policing interactions become intelligence gathering exercises. All of these data streams flow into Camden’s “fusion center,” producing “predictive” analytics that direct police to blocks of interest. In the long term, the economic recovery plan for Camden manifested in corporate tax abatements to entice corporate interests, along with police pacification.

[Obama Chicago]

A foil to Camden is what happened in Chicago, when Obama traveled to the International Association of Chiefs of Police (IACP) conference in October 2015. Grassroots organizations shut down the gathering and released the Counter-CAPS report, rejecting this model of policing. These orgs stand in contrast to BLM movement moderates, who took elements from the task force and launched Campaign Zero, which put out the “8 Can’t Wait” platform. Earlier, in March of that year, this same grassroots coalition had won a reparations bill compensating the 110 Black men tortured by Chicago Police, with $5.5M in damages, counseling, services, and public school curriculum.

Surveillance is presented as an alternative model to incarceration and traditional policing. It’s being implemented, purposely or not, in tandem with the goal of improving city infrastructure. “Pacification theory” is a framework that describes our current industrial-security complex as a systematic war strategy intended to reinforce class hierarchy and the accumulation of wealth. The focus on protecting, controlling, and upgrading a city’s infrastructure and property has clearly had the effect of disrupting and incarcerating its marginalized residents.

Q: Fusion Center funding models require private sector involvement. What do you think is the impact of private sector stakeholder interests in fusion center activity?
Q: How do you think smart city tech changes fusion centers? 

Q: About a third of fusion center funding comes from federal homeland security grants; the rest is largely state funded, and the centers are state run. This means if you know one fusion center… you know one fusion center. What stakeholders do you think are involved in Washington?

Discuss: Microsoft funding of the National Fusion Center Association

Discuss: What tools do we have against such a large conglomerate of federal, local, and private interests?

Possible answers: 

  • Do its strengths also contain its weaknesses? Discuss the fractured / rival chain of command
  • Transparency calls and privacy concerns – do they work? Discuss the reformation of the Washington State Fusion Center
  • Camden’s fusion center was created after its new police force. Could an effective intervention be warding against their creation and against disguised forms of new policing?

Conclusion

[Partner shareout]

Let’s do a shareout. For each group, we ask for the key takeaways and interventions from their discussion. 

All right, let’s go over some general takeaways from what we just discussed.

[principles of smart tech]

We can distill what we learned about ALPRs into some general principles of how surveillance, or “smart,” tech works. That is:

  • It gathers personal information 24/7.
  • This information is then associated with some unique identifier like a face or license number.
  • Then this information is pooled with all the other information gathered (across the country, across many years, by different agencies) to build a profile on a person, to understand connections between groups of people, and to predict people’s actions.
  • Based on this information, different private or public agencies make decisions about a person.

[problems with smart tech]

We can also distill our discussion into some general problems with surveillance, or “smart” tech. That is:

  • New technologies are largely unregulated
  • New technologies are gathering far more information than they need to — in terms of time, granularity, different senses
  • There’s very little public transparency or oversight of these technologies; the limitations are usually those of the technology itself.
  • The real danger is in the fusion of data in huge private databases with no oversight
  • The fused data can then be used to track people and their associations very granularly, and is often used by government agencies specifically to target vulnerable groups

[next] Just remember the story of ICE and its reliance on private databases like CLEAR and Vigilant.

[generally effective interventions]

Finally, all hope is not lost. We can come up with some generally effective interventions to scope creep and infringement on civil liberties. For example:

  • Pass laws restricting tech use
  • Pass laws restricting data collecting, selling, sharing, and retention, especially to private companies / data brokers
  • Demand public oversight/audits of proposed systems
  • Refuse the tech
  • Refuse to build it
  • Adopt effective low-tech alternatives
  • Do participatory co-design to find real community needs, rather than top-down imposition (“no one asked for this”)

Solicit feedback from attendees.

Give contact info for your organization.

References

ALPR sources (accessed August 2020)

Fusion center sources

Feedback form

If you’d like to make a separate form for your organization, write to us at team at (our domain) and we can share the Google Form with you.


Counter-surveillance yoga

Note: this guide is a work in progress and may change at any time! We’ve done our best to cite our sources, but this page has not been professionally fact-checked.

This workshop was first facilitated at the CtrlZ.AI zine fair in Barcelona in February 2020.

It’s really stressful to have a body in 2020! Practicing traditional yoga, or other self-care strategies, may not offer specific ways to ease the stress of collectively living under the regime of a “stalker state” (as the Stop LAPD Spying Coalition put it). 

In this workshop, we will guide participants through a series of movement exercises that will expose the principles of the modern American surveillance society—but, you know, in a gentle and relaxing manner. We hope to help participants open their third eyes.

All are welcome; no yoga experience required. If you can, please bring a pillow or blanket.

Script written by Jennifer Lee and coveillance members.

Facilitator information

Goals

  • Create a powerful and empowering series of exercises that point to deeper manifestations of, and resistances to, mass surveillance (and more generally, how order is created, kept, maintained, and destabilized)
  • Encourage people to question existing assumptions about the ways society functions—the way we move throughout society; the ways we see and are seen
  • Energize participants, physically and mentally
  • Ideally this workshop is facilitated by two people, though it can work with one as well.

General principles

  • Give notice of transitions between poses
  • Notice when body touches body
  • Move with the breath (inhale and exhale)

Exercises (45 min)

  • The exercises (from “facing the faceless” through “collective voice”) total 45 minutes.
  • The debrief extends the session by another 15 minutes.
  • A note on the overall flow of energy: there are three ups/downs, which follow a “do and undo” structure. For example, the first ask is to “say your name and line up,” and the undoing of that is to “please choose your name and location.”

Facing the faceless (0 min)

Leader: Facilitator 1

[This is an optional installation that involves custom software that we wrote to erase faces; if you want to try it, email us at team@coveillance.org, otherwise feel free to skip if you don’t want to set up software.]

[There will be a webcam set up with a screen behind it. The software (clmtrackr.js) will erase the faces of everyone who approaches, replacing them with the background image to create an effect of invisibility.]
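[For facilitators curious how the effect works: once a face tracker locates a face, the pixels in that region are swapped for the corresponding pixels of a background image captured while no one was in frame. A toy sketch of just that pixel-swap step follows; this is illustrative Python, not our actual clmtrackr.js code, and the function and variable names are invented.]

```python
def erase_face(frame, background, face_box):
    """Replace the face region of a frame with the pre-captured background,
    producing the installation's "invisibility" effect.

    frame/background: 2D grids of pixel values (lists of lists).
    face_box: (top, left, height, width), assumed to come from a face
    tracker such as clmtrackr's fitted landmarks.
    """
    top, left, h, w = face_box
    out = [row[:] for row in frame]  # copy so the original frame is untouched
    for r in range(top, top + h):
        for c in range(left, left + w):
            out[r][c] = background[r][c]
    return out

# Toy demo: a 4x4 "frame" of bright pixels; the face region
# (rows 1-2, cols 1-2) is overwritten with the dark background.
background = [[0] * 4 for _ in range(4)]
frame = [[255] * 4 for _ in range(4)]
erased = erase_face(frame, background, (1, 1, 2, 2))
```

[In the real installation this runs per video frame, with the tracker updating the face box continuously.]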

Sign at entrance: Welcome to countersurveillance yoga. We invite you to start by experiencing this installation. How does the possibility of visibility and invisibility make you feel? Once you finish interacting with the installation, please enter the room and sit or stand wherever you feel comfortable.

Alternative sign: please leave your faces at the door.

Opening discussion (5 min)

Leader: Facilitator 2

(Once everyone is inside)

Welcome to countersurveillance yoga. My name is Facilitator 2, and this is Facilitator 1. In this workshop, we seek to provide a brief respite from the unique challenges of living in a surveillance society by practicing actions of collective care and resistance in our physical bodies. 

In this workshop, we will move through 7 different flows. In each flow, we will ask you to move your bodies in different ways, but you are always welcome to opt out and do whatever makes you feel most comfortable. Each flow will involve adopting different poses that point to deeper principles of how surveillance societies function. Then, we will embody poses that work to dismantle these types of surveillance thinking.

We’re going to start by taking a minute to start by stretching out our bodies. Feel free to follow along with us, or stretch out in any way that feels good to you. 

Ok, let’s begin!

[1 minute of stretching]

Names (5 min)

Leader: Facilitator 1

Let’s line up in a row, stand up straight, and say our names. 

Then, let’s slowly take that apart…

Let’s start with names. The language used to refer to us. Forget the name you gave us for a second.

We invite you to think about the way you refer to yourself. Hold those names in your head for a second.

We invite you to think of names that your friends use to refer to you.

We invite you also to think of ways that others refer to you. Perhaps authority figures, like teachers, or even the DMV.

And think about how each name makes you feel.

The single static name is an invention meant to make individuals legible to the nation-state. Trackable, measurable–e.g. for receiving services. Institutions understand individuals as a bag of unique identifiers. Name, social security number, phone number, etc.

If you have not already, open yourself to the possibility of going by many names.

If it is available to you, pick a name that you have never used for yourself but that you would like to use, and hold that in your head for the duration of our time together.

Moving and seeing (5 min)

Leader: Facilitator 2

Now that we have explored the feeling of being named and identified, we’re going to exit this row formation, and explore the idea of movement.

This second flow is called moving & seeing. We invite you to move around the room, to exercise freedom of movement without judgment. Please take a moment to find a spot in the room and sit or stand in a position you find comfortable. 

[Facilitator 2 will sit down cross-legged here; Facilitator 1 will be standing or lying down]

Now, we invite you to think about the space and position that you chose. 

For example, some people may want to be more or less visible, and we may see that in the way we’ve chosen to arrange ourselves.

Without needing to say anything to each other, we’ve introduced ourselves. With this exercise, we have each shared our own ways of moving and arranging ourselves relative to one another.

Seeing seeing (5 min)

Leader: Facilitator 1

Please come out of your pose, and to a standing position. Stretch out a part of your body that feels tense.

With your eyes open, we invite you to look at the wall. Imagine connecting the dots of where you’re looking.

Imagine that your eyes are laser beams that can cut through walls. Expand to moving your head and body to trace the laser beams on the walls around the room. (Focus on the energy of your own line of sight.)

Stop. 

Now, move your head and body to look around the room again, but this time, feel the lines of everyone else’s eyes and gazes moving. Move in whatever direction you feel comfortable. Really see each other. 

Understand that our gazes together have power.

Now, we invite everyone to lie on their backs or sit in a chair. 

Close your eyes. 

Switch off with either leg as necessary if your feet get tired.

Wiggle your toes. Then your ankles, bringing your lower body in.

Pretend that your toes are little cameras encircling the room.

What do you see?

Feeling Information (part I) (5 min)

Leader: Facilitator 2

Now that you’re comfortable in your chosen spot, please find a position that your body finds comfortable. You can either sit on the ground or on a chair, or lie down, whichever you prefer.  We invite everyone to close their eyes for the duration of this exercise.

This next flow is called Feeling Information, Part 1. We are constantly surrounded by flows of information from the moment we wake up to when we go to bed. We are increasingly always reachable, always traceable, and always watchable. We can get so accustomed to the constant flows of information that we consume and share that we may ignore the impacts of such flows on our bodies. 

With this exercise, we want to explore those impacts. 

Take a second to listen to your body. What is it telling you? What information is flowing through it? Is your body tense? Open? Tired? Curious? Anxious? 

Take a moment to tune into the sounds around you. 

Now close your eyes and take a few slow, deep breaths. Take a few moments to loosen your body from your head to your toes. 

Drop your shoulders. Relax the muscles in your arms so that they are heavy like sandbags. 

Relax your facial muscles. 

Release the tension from your jaw and feel your mouth and tongue relax. 

Untighten the muscles around your eyes. 

Breathe in. 1, 2, 3. Breathe out. 1, 2, 3. (Repeat) 

Pick one sensation — such as the feeling of your breath going in and out — and devote your attention to it. Just focus on that. 

Notice the thoughts and information your mind keeps returning to, and try to see them like clouds in the sky. Notice them, acknowledge them, but let them float on. 

Now with a clear mind, let’s move on. Please hold this pose.

Feeling Information (part II) (5 min)

Leader: Facilitator 1

Please keep your eyes closed. 

Our next flow is called Feeling Information, Part 2.

People once believed that to take a photograph of someone was to steal their soul.

Now imagine being photographed, and represented as an image. How does that feel in your body? While keeping your eyes closed, can you make a gesture or motion that indicates how that feels?

Please hold the pose, and release the pose.

Please open your eyes and look forward. Please come to a circle and look to the center of the room. Make sure your ears are visible by tucking away your hair. Adopt a neutral expression. This is the pose we’re usually asked to adopt when posing for an ID photo. How does this make you feel? 

Please hold the pose, and release the pose.

Our bodies are being rendered as information in many new ways today. For example, every time we fill out a form, we trade information about ourselves to access services, like social media or government aid.

Now imagine filling out a form, being imaged in a different way. How does that feel in your body? Can you make a motion or a pose that indicates how that feels? 

Please hold the pose, and release the pose.

Now that we’ve felt what it is like to be represented as information, we’re going to move into an open phase. Can you bring your body into poses that feel the opposite of how you felt when you were represented in this way? Perhaps you can make light eye contact with each other as it feels right. We invite you to move through a succession of poses.

[Participants start – pause for 20 seconds]

[Facilitator 2 & Facilitator 1 should do goofy poses to get others to loosen up] 

These poses will look and feel different for everybody, but you can do this in any way that feels real to you. Some of you may move into poses demanding that your full, holistic self be seen, others may make yourselves more invisible. 

(1 min) Please come to a neutral, standing position.

Embrace human illegibility (5 min)

Leader: Facilitator 1

Remember your experience going through airport security, or passing through borders. Think of poses you’ve had to adopt—for example, the “hands up, feet apart” pose required by the American TSA. The purpose of this pose is to make your body maximally legible to humans and machines. 

If it is available to you, please adopt this pose, and hold it.

Notice the symmetry in this pose. Symmetry is deeply related to conventional notions of “beauty”—maximal symmetry is coded as easy to read, control, and track in its homogeneity. The most average face is the most conventionally beautiful face.

The length of the pose is the amount of time it takes for the X-ray to finish scanning you. Imagine being inspected by machines, or by an airport security agent. What is the quality of that gaze?

Now let’s shake that out! On the other end, we invite you to take on attitudes of ugliness—poses, facial expressions, and voices that may be coded as ugly, to embrace illegibility to humans and machines.

As Mia Mingus writes, “[Consider] the magnificence of a body that shakes, takes up space, spills out, needs help, moseys, slinks… [consider] the magnificence of a body that has been coded, not just undesirable, but inhuman. See [the ugly] for what it is: our greatest strength.”

If it’s available to you, gently rest your gaze on each other, not staring at any one person, and feel your body respond. What is the quality of your gaze, of our looks?

Facilitator 2 will lead us through our final flow.

Collective voice (5 min)

Leader: Facilitator 2

This is a group of people that might not see each other again, but we have been united by our time together.

We would like to lead everyone in a collective voicing, to simulate a coming storm. This represents the ways that we can use our voices and bodies to create collective power. We have the power to make our voices as loud or as quiet as we wish. Please stand or take a seat in a chair and follow along. 

  • Rubbing palms together
  • Snapping fingers
  • Drumming thighs (crescendo)
  • Pounding feet (crescendo)
  • Drumming thighs (softer) 
  • Snapping fingers
  • Rubbing palms together

Thank you for joining us in practicing counter-surveillance yoga.

Debrief (15 min)

Leader: Facilitator 2

We would like to invite those who want to share the feelings and thoughts they experienced during these exercises to remain for a few extra minutes. 

Let’s sit in a circle. Please find a spot and a position in a chair or on the floor so that we’re facing each other. 

We want everyone to have space to talk about how their bodies and minds felt during the workshop. We’d like to ask folks a couple of specific questions, then we’ll open up the conversation to be more freeform.

  • Let’s do a quick feelings check-in: Can you describe how you feel right now in one sentence?
  • Now that we all had a chance to check in with each other: How did your body feel throughout the workshop?
  • What did you find most surprising during this session?
  • Do you have any questions or suggestions for the facilitators?

 If you left your face at the door, please remember to pick it up on the way out.

References


This is new / this is not new

Overview

  • Show the exercise graphic (for participants)
  • Show the census and Clearview articles (for participants)
  • Show the timeline graphic (for participants)
  • Talk about facilitator slides
  • Talk about facilitator handout
  • Talk about answer key for news articles (facilitator first, then participants)

Exercise handout and news articles (for participants)

Jan. 9, 2020, NBC News — “Many Latinos believe a citizenship question will be asked on the 2020 census and are less likely to participate, a national Latino leader told Congress on Thursday. […] Last June, the Supreme Court ruled the administration could not add the question about citizenship to the census. The court found the administration’s justification for adding the question was ‘contrived.’ […] ‘The 2020 census could leave communities across the country undercounted, underrepresented and underfunded,’ Maloney said.”

Jan. 21, 2020, CNET.com — “What if a stranger could snap your picture on the sidewalk, then use an app to quickly discover your name, address and other details? A startup called Clearview AI has made that possible, and its app is being used by hundreds of law enforcement agencies in the US, including the FBI, according to a Saturday report in The New York Times. […] The app, says the Times, works by comparing a photo to a database of more than 3 billion pictures that Clearview says it’s scraped off Facebook, Venmo, YouTube and other sites. It then serves up matches, along with links to the sites where those database photos originally appeared. A name might easily be unearthed, and from there other info could be dug up online. The size of the Clearview database dwarfs others in use by law enforcement. The FBI’s own database, which taps passport and driver’s license photos, is one of the largest, with over 641 million images of US citizens.”

Printable version here


Timeline graphic (for participants)


Printable version here


Slides (for facilitator)

Current slides here (slides 1-10)

TODO


Exercise handout (for facilitator)

Exercise: This is new / This is not new

In what new skin will the old snake come forth?
—Frederick Douglass (Black abolitionist and reformer)

History is not the past. It is the stories we tell about the past. How we tell these stories—triumphantly or self-critically, metaphysically or dialectically—has a lot to do with whether we cut short or advance our evolution as human beings.

—Grace Lee Boggs (Chinese-American activist and revolutionary)

Overview

Imagine you see an alarming headline about surveillance in the New York Times or on CNN. How do you tell apart what’s really new about the news from what’s just old news?

This exercise helps participants work together to recognize the “new skin” (as Frederick Douglass put it) of old surveillance narratives that reinforce age-old unbalanced power dynamics between different groups of people.

There are no distinct events that mark the exact “moment of being watched,” so we will refer to a number of snapshots in the American history of slavery, racism, indigenous erasure and xenophobia that play into the history of surveillance, criminalization, and disinvestment. 

Goals

  1. Equip participants with the mental tools and background to break down new surveillance stories by understanding their historical context
  2. Develop a narrative understanding of how power dynamics and values of oppression persist in surveillance thinking 
  3. Identify the impacts of surveillance and points of intervention

Supplies

  • Printouts of news events
  • Printout of timeline
  • Instructions to the participants

Instructions

  1. Participants split into two groups.
  2. Each group is given a news article, along with a framework for understanding what is not new (a timeline of surveillance events) and what is new about our circumstances today (e.g. information technology).
  3. Each group breaks down the news article, identifying resonances with historical events, aspects that are unprecedented, and intervention points.
  4. The two groups reconvene and discuss their findings, with an eye toward analyzing news articles in the wild.

Discussion Questions

This is Not New:

This is New:

  1. On your linked narratives, what has changed in method—especially with data availability and data sharing?
  2. What interventions can you identify?

Themes and Language to Hold

Communities that have experienced long-held injustices and barriers to resources have long been loudly calling attention to issues of access, criminalization, and the logics of oppression beyond surveillance. We defer to those who have been fighting in this realm and highlight the language used in their counter-narrative work.

“Power Not Paranoia”
(A framework for collective strength against the surveillance state)
Our Data Bodies, Stop LAPD Spying Coalition 

“Detroiters Want to Be Seen, Not Watched”
(A special surveillance issue inspired by Our Data Bodies against Detroit surveillance)
Tawana “Honeycomb” Petty, Detroit Community Technology Project

  1. 2020 Census Citizenship Question: Latinos, Asian Americans still fear 2020 census over citizenship question, witnesses tell Congress (NBC News, Jan 2020)
  2. Clearview app lets strangers find your name, info with snap of a photo, report says (CNET.com, Jan 2020)

(Old) Motives: What narratives are we fighting?

Racism: Slavery, Lantern Law, Plantation Management, Jim Crow, Redlining, Welfare Queen, War on Drugs, Broken Windows, racial profiling: Stop and Frisk

“Threats to Capital”: Red Scare: Dismantling of unions, surveillance of workers, Welfare Queen myth, criminalization of poverty: Broken Windows, War on Drugs

Xenophobia: Anti-immigration border enforcement, racial profiling (AZ SB 1070), Japanese Internment / Yellow Peril, War on Terror: NYPD Surveillance of Muslim Communities, War on Drugs, ICE workplace raids

“State Enemies”: Japanese Internment / Yellow Peril, Red Scare (anti-labor), War on Terror: NYPD Surveillance of Muslim Communities

(New) Methods: What are the big data implications of surveillance?

Volume: Sheer volume of data is what enables modern-day facial recognition and autonomous vehicles, and the storage and linkage of large databases provide new modes of identifying and tracking individuals, whether for an ad or a police investigation.

Velocity: The speed at which data gives access to actionable and identifiable information is at a different scale than in the past.

Variety: New private and public data sources, especially in the form of social media, digital trails, and mobile phone data are increasingly tied to our identities.

Veracity: Use of data gives an appearance of objectivity, even if the processing is flawed or the data itself is biased.

Detailed timeline (for facilitator)

Timeline events

  • 1712: New York City Slave Revolt: With New York’s dense population, enslaved people were able to occupy public space together. A group of twenty-three enslaved people incited a revolt to inspire further rebellion, killing nine white colonists and injuring six. 
  • 1713: The Lantern Law: As part of a crackdown in the city, the Common Council of the City of New York passed “A Law for Regulating Negro & Indian Slaves in the Night Time,” requiring all slaves older than 14 to illuminate themselves with a lantern after sunset and deputizing private citizens to bring violators to the gaols (jails) for punishment.
  • 1831: Tice Davids Escape: First documented escape from slavery via the “Underground Railroad,” a network of abolitionists who aided enslaved people escaping to the North.
  • 1850: Fugitive Slave Act: Passed as part of the Compromise of 1850 to appease southern slave owners, the act marked escaped slaves for re-capture and mandated the federal government’s role in re-capturing enslaved people. 
  • 1890: The US Census puts out a call for new, faster methods to conduct the next census. Hollerith automates the US Census with punch cards, inspired by train conductors who punched fare cards to record the physical characteristics of riders as part of fare enforcement. Hollerith’s machine and company were predecessors of IBM.
  • 1933: German Census
    • Hollerith’s successor, IBM, helps the Nazi party conduct a national census targeting Jews 
    • Hollerith Machine
  • 1942: Japanese Internment 
  • 1969: Red Squads, COINTELPRO
    • As part of widespread domestic surveillance of social and political movements by the NYPD in the form of “Red Squads,” the NYPD indicts the “Panther 21,” members of the Black Panther Party, on over a hundred unfounded conspiracy charges
    • The NYPD worked in concert with the FBI’s COINTELPRO operations, which were projects that ran 1956–1971 to infiltrate social movements
    • The trial revealed the extent of pervasive infiltration by police and FBI and sparked a subsequent lawsuit in 1971
    • Joy James, Warfare in the American Homeland: Policing and Prison in a Penal Democracy (Duke University Press, 2007), p. 74. ISBN 978-0822339236.
  • 1971 Handschu Case
    • After the “Panther 21” are acquitted, their lawyers sue & end the practice of social and political surveillance normalized by the NYPD Red Squads. The “Handschu Agreement” establishes oversight limiting domestic spying by NYPD
    • Matt Apuzzo and Adam Goldman, Enemies Within: Inside the NYPD’s Secret Spying Unit and bin Laden’s Final Plot Against America (Simon & Schuster, 2013), p. 44. ISBN 978-1476727936.
  • 1985: San Francisco passes the “City of Refuge” resolution
  • 2001: Congress passed the PATRIOT Act
  • 2008: Secure Communities
    • The joint systems were first called for under the USA PATRIOT Act of 2001 and the Enhanced Border Security and Visa Entry Reform Act of 2002 
    • Under Secure Communities, local and state corrections agencies gain access to a joint database linking the FBI’s Integrated Automated Fingerprint Identification System (IAFIS), which contains 66 million criminal records, and the Department of Homeland Security’s (DHS) Automated Biometric Identification System (IDENT), which holds over 90 million immigration records.
    • Secure Communities was created in 2008, when the technology for the joint database became feasible. ICE received funding from Congress to prioritize and deport foreign nationals who had been “convicted of a crime, sentenced to imprisonment, and who may be deportable.” 
    • Secure Communities justified the data sharing and access that continued into the Trump administration’s deportation policies, which increased the number of raids, the length of detention, and the number of immigrant detainees without criminal records.
    • Unanswered Questions Surround ICE’s Secure Communities Program 
  • 2013: NYPD Spying on Muslim Communities

Sample answer keys (for facilitator and participants)

Example deconstruction: 2020 census

2020 Census Citizenship Question: Latinos, Asian Americans still fear 2020 census over citizenship question, witnesses tell Congress

What’s happening?

The US federal government (particularly President Donald Trump) wanted to ask a citizenship question on the 2020 census: “Is this person a citizen of the United States?” Latino and Asian American communities fear that this creates a hostile environment toward immigrants. This could lead to under-representation and under-resourcing of those communities. The 2020 census is kicking off this month (Jan 2020).

What’s not new about it?

Census data collection has always been tricky because of the tension between the different ways a community may be seen. The census has been the basis of American representation and taxation since the 1787 Constitutional Convention that created the United States Constitution and, with it, the “three-fifths compromise” of counting enslaved people as three-fifths of a person, as less than fully human. 

While you want your community to be counted for representation or resources, you don’t want a “knock on the door” for, say, targeted persecution.

One dangerous precedent was the Census Bureau’s actions during World War II in the 1940s. As Scientific American writes: 

Despite decades of denials, government records confirm that the U.S. Census Bureau provided the U.S. Secret Service with names and addresses of Japanese-Americans during World War II.

The Census Bureau surveys the population every decade with detailed questionnaires but is barred by law from revealing data that could be linked to specific individuals. The Second War Powers Act of 1942 temporarily repealed that protection to assist in the roundup of Japanese-Americans for imprisonment in internment camps in California and six other states during the war. 

In general, we can see this “old snake” of targeted oppression surface in earlier census events going back more than a century. These include the 1890 automated census with Hollerith machines and the 1933 German census. 

Moreover, even as the means of measurement grow more sophisticated, the motivations stay the same: American wartime exceptionalism combined with a xenophobic fear of the “yellow peril” led to certain populations being targeted.

Finally, this event can relate more broadly to the means of monitoring/control of certain populations, such as plantation owners surveilling enslaved peoples by requiring detailed timecards of labor performed. 

What’s new about it?

With old censuses, you had to dig data out of punchcards in a drawer and mail things back and forth; it would take days to make requests through a paper-shuffling bureaucracy. Now, sharing data is almost instantaneous, as easy as a click of a button, with the ability to make powerful queries and filters and discover powerful correlations.

There’s a common big data hype about higher volume, velocity, and variety. All of this holds true for the information collected by the 2020 census compared to past censuses. However, veracity is not really a factor in this story—in fact, the 2020 citizenship question is the only one that was not “scientifically validated” through the same process as the rest of the census questions.

What can we do to intervene (both past and present)?

Socially, it is important to counter the rhetoric of wartime exceptionalism (“we can weaken protections for individual citizens because the state needs more power, because we’re at war”) by understanding that such protections are difficult to reinstate once eroded, as in the example of the War Powers Act.

Finally, anyone involved in the chain of data-sharing can offer individual acts of resistance: we won’t build it; we won’t share it. For example, in WWII, various sympathetic officers would refuse to turn over addresses of Jews or other targeted populations when asked by the state. 

Example deconstruction: Clearview facial recognition

What’s happening?

  • A new startup (Clearview AI) has built a very powerful facial recognition app that threatens to “end your ability to walk down the street anonymously.” 
  • News came out about it just this month (Jan 2020).
  • It works by combining facial recognition tech with a database of images, created by scraping websites, that is larger than any entity (including both the US government and big tech companies) has assembled before. 
  • Their tech has already been deployed by federal and state law enforcement officers to solve cases ranging from shoplifting to murder. It is not yet available to the public, but (according to the NYT) Clearview’s investors predict that it eventually will be.

What’s new about it?

  • Uniform blanket deployment (i.e. in all of public space, not just for special circumstances like police pursuing a suspect).
    • This threatens a much larger group of people than ever before, since the tech can be deployed indiscriminately and pervasively, hence the massive news coverage (NYT expose).
  • Erosion of social taboos.
    • NYT: Until now, technology that readily identifies everyone based on his or her face has been taboo because of its radical erosion of privacy. Tech companies capable of releasing such a tool have refrained from doing so; in 2011, Google’s chairman at the time said it was the one technology the company had held back because it could be used “in a very bad way.” 
  • General lack of regulation of these technologies.
    • They’ve been deployed faster than legislation and public understanding can catch up.
    • …As opposed to, say, guns, which pose a well-known danger in America that the public and lawmakers understand (though are still ill-regulated).
  • North American big-city norms of privacy.
    • People actually expect to be able to walk down the street anonymously—which is good! But this is not, say, the norm of privacy in a small town.
  • Democratization of information and of information technology.
    • Availability of data — it used to be extremely hard to gather face photos and process them. Now it’s as easy as “surfing the web.”
      • Wired: Ten faces may now seem like a pretty pipsqueak goal, but in 1963 it was breathtakingly ambitious.
      • Wired: In 1973 a Japanese computer scientist named Takeo Kanade made a major leap in facial-recognition technology. Using what was then a very rare commodity—a database of 850 digitized photographs … Kanade developed a program that could extract facial features such as the nose, mouth, and eyes without human input. 
      • NYT: Police departments have had access to facial recognition tools for almost 20 years, but they have historically been limited to searching government-provided images, such as mug shots and driver’s license photos.
    • Advent of personal computing — anyone with an app + internet access can do it. (e.g. One rogue programmer named Hoan Ton-That.)
      •  In recent years, facial recognition algorithms have improved in accuracy, and companies like Amazon offer products that can create a facial recognition program for any database of images.
    • Scale of digital infrastructure — massive compute power is available to developers for cheap; CCTV cameras are everywhere in North American cities.
    • Advances in machine learning in the last 10 years.
      • Wired: Only in the past 10 years or so has facial recognition started to become capable of dealing with real-world imperfection, says Anil K. Jain, a computer scientist at Michigan State University and coeditor of Handbook of Face Recognition. 

What’s not new about it?

  • Capitalism (particularly the North American breed). Hurting impacted communities doesn’t (directly) hurt a company’s bottom line!
    • Capitalism drives the understanding of personal data as a resource to mine, and hoarding personal data as a competitive advantage for companies and governments alike.
    • Capitalism gives companies incentives to build and sell technologies to governments and institutions, regardless of impact on communities.
  • Social biases in who is building, deploying, and funding new technologies (racism, sexism, pipeline problems, etc.).
    • Wired: Many of the biases that we may write off as being relics of Woody’s time—the sample sets skewed almost entirely toward white men; the seemingly blithe trust in government authority; the temptation to use facial recognition to discriminate between races—continue to dog the technology today.
  • Photography/imaging/measuring technologies, especially those involving unique identifiers, are deployed by the state to control vulnerable populations.
    • Examples from the timeline: Bertillonage; Lantern laws; mugshots of prisoners, fingerprints; Polaroid photos to reinforce apartheid.
    • Wired: Over the following year, Woody came to believe that the most promising path to automated facial recognition was one that reduced a face to a set of relationships between its major landmarks: eyes, ears, nose, eyebrows, lips. The system that he imagined was similar to one that Alphonse Bertillon, the French criminologist who invented the modern mug shot, had pioneered in 1879. Bertillon described people on the basis of 11 physical measurements, including the length of the left foot and the length from the elbow to the end of the middle finger. The idea was that, if you took enough measurements, every person was unique. Although the system was labor-intensive, it worked: In 1897, years before fingerprinting became widespread, French gendarmes used it to identify the serial killer Joseph Vacher.
  • Poor decision-making frameworks on the part of the tech people who are building it and the law enforcement people who are using it.
    • NYT: Asked about the implications of bringing such a power into the world, Mr. Ton-That seemed taken aback. “I have to think about that,” he said. “Our belief is that this is the best use of the technology.”

What can we do to intervene (both past and present)?

  • Work toward abolition.
    • Academics have called facial recognition “the plutonium of AI.”
    • Oakland and San Francisco have passed bans on facial recognition technology.
    • Short of that, pass a strong federal privacy law requiring oversight + procurement processes for surveillance tech.
  • Build safeguards into tech from the start to protect vulnerable communities.
    • A poorly-designed Internet, with no safeguards against crawling or mechanism for data control/provenance, enables information to be freely transferred.
  • Perform individual acts of resistance.
    • Engineers can say “We won’t build it” and pressure other engineers to do the same; people inside law enforcement agencies can refuse to use these technologies.
  • Support pedagogies, sociocultural norms, and capitalistic motives that make it hard for people like the Clearview founder to advance in life.
    • Conventional computer science education will help you answer the question “how do I crawl all the photos on the open web as fast as possible?” but is largely silent on important matters like “but why shouldn’t I do that?”
    • One instructive example is the Dow Chemical Company, which colluded with the US government to produce napalm for the Vietnam War in 1965. Activists worked to ensure Dow’s reputation reflected this “unethical war profiteering,” and the relatively small napalm contract likely cost Dow billions of dollars.
    • Canadian-trained structural engineers swear by a code of ethics, symbolized by the Iron Ring. Doctors swear by the Hippocratic Oath. Why don’t information workers and software engineers do the equivalent?

References for Clearview

Other news articles to deconstruct