
Predict and surveil: Data, discretion, and the future of policing, by Sarah Brayne: A review by Karolina La Fors

Predict and Surveil: Data, Discretion, and the Future of Policing. Sarah Brayne (2021). Oxford: Oxford University Press.

Sarah Brayne’s work, Predict and Surveil, offers a convincing critique of the risks involved in crime prediction algorithms. Is it still possible, and will it remain possible, for a person to change how law enforcement professionals perceive his or her future, and to oppose an algorithmic prediction? Put differently, can crime-predicting algorithms anticipate individuals falling into criminality, or do they actually perpetuate that fall once a person is ‘marked’? Brayne offers a highly compelling, lucid, and densely detailed sociological account of how data-intensive, artificial intelligence (AI)-based predictive policing practices render it increasingly impossible for surveilled and apprehended citizens to contest police suspicions.

The empirical work is the product of five years of dedicated ethnographic fieldwork, during which the author followed police professionals working with artificial intelligence within the third largest local law enforcement agency in the United States, the Los Angeles Police Department (LAPD). One of the main achievements of the book lies in the richness of her ethnographic description, in which the author never loses sight of her own position as a researcher. In this sense, her analysis shows us the daily routines of the “social side” (p. 4) of big data analytics. She demonstrates how AI technologies reinforce and perpetuate criminal suspicions by aggregating and correlating vast amounts of multi-contextualized data, and how any encounter with the LAPD increases the chances of becoming a suspect, with subsequent discriminatory and unequal treatment.

Brayne’s book guides the reader through seven chapters, beginning with how the objectives for improving crime mapping practices historically shaped and constructed the contemporary big data-intensive policing practices of the LAPD. She describes how these practices became the “dye that illuminates the cracks in the [criminal justice] system” (p. 27). In chapter two the author shows how the police’s historical eagerness for datafication as an optimization goal became operationalized: first in response to the need for better allocation of resources through “pin maps” (p. 13); then in response to demands for enhanced police accountability and transparency (triggered by high-profile cases such as Rodney King); and finally in reaction to the need for legislative improvements (“offender management”, p. 31). Brayne shows how each of these purposes gradually turned policing practices towards a scientific approach. Planning and allocating resources in a fashion free from political influence optimizes crime prevention, and leads to the main objective and driving force for acquiring and analyzing more data: “precision policing” (p. 23). She demonstrates how police forces, big data companies, and public procurement practices aimed at acquiring analytical solutions from the military blur the boundaries between the private and public sectors. Consequently, ‘incriminating’ data originates from several sources, as the “police is not only using data collected by private companies”, but also employs companies such as PredPol, HunchLab, IBM i2 and Palantir to “store, share and analyze” data (p. 25).

Chapter three describes how the detrimental effects of AI on those under surveillance emerge, in part by depicting how police officers can be uncertain about “what data were in databases they were accessing” (p. 41). Accessing databases under such uncertainty only increases the likelihood of harmful consequences for those unjustifiably stopped, searched, and interrogated. The author also demonstrates how these harms can be reinforced by the so-called “dragnet”, which allows private and public data to be combined for risk calculations. Her meticulous, eye-opening work becomes particularly powerful in this chapter, for instance through her account and analysis of how the database LexisNexis allows data from a stunning variety of daily-life contexts to be correlated. This aligns with the claim by Andrejevic and Gates that in big data “function is the creep” (p. 24).

In the fourth chapter, Brayne effectively expands her analytical narrative by employing police jargon to emphasize the incriminating nature of data analytics. She convincingly recounts how the premise that “a small percentage of people are disproportionately responsible for most violent crime” (p. 57) became internalized in the daily mindset of police officers. Brayne’s work shows how officers increasingly become sensitized and triggered by “pre-warrant intelligence”, “likely offenders”, “risk-based deployment” or “heatmaps” as “force multipliers”, which further reinforces such mindsets in daily professional routines. Her descriptions of, for instance, the Los Angeles Strategic Extraction and Restoration (LASER) program and the Chronic Offender Bulletin are powerful because her observations and quotes about the daily operations of these systems provide us with rich stories. For instance, a police officer’s answer to Brayne’s question about whether they “had grounds to stop people” based upon LASER: “Yeah, not the probable cause but the awareness…the awareness of what the individual is about.” (p. 62). Assertions like these underline how AI systems in practice nurture a feedback loop.

Chapter five takes the reader into how police forces themselves perceive being surveilled, or “when the watcher becomes the watched” (p. 75), and the paradoxes through which LAPD police officers resist their own surveillance. Brayne uncovers not only particular forms of employee resistance, such as “union opposition”, but also how police officers come to fully appreciate the harmful consequences of AI surveillance practices. The analysis also covers how police officers’ “new emphasis on proficiency of data” (p. 99) comes about and how data fosters their privileged position. Here, the study would have benefitted from some insights from the philosophy of technology. By responding to these systems, police officers become what Foucault calls “docile bodies” (Foucault, 1977). This could have demonstrated how police officers’ relationship to surveillance systems gradually shifts from being masters of the system towards becoming its servants.

One of the salient points in this part is that, by subjecting themselves to predictive data analytics-mediated proficiency, police officers are more likely to perceive that “catching the bad guys” (p. 41) is (almost) only achievable by improving the efficiency of data analytics. This perception leads police officers to treat AI-mediated correlations as if they were actionable, real reasons for future crime (O’Neil, 2016). Whilst Brayne’s account of the motives is lucid (“the practice of resistance shapes future data”, p. 99), and the models appear intended to “serve to lower the value of sworn officers’ existing competencies” (p. 83), the data only indicate that such a deskilling process is a byproduct of predictive algorithms. The extent to which this transformation in power relations among police staff results from such systems would require additional research.

In chapter six, the analytical focus is directed towards the extent to which predictive policing algorithms have “ambivalent implications for social inequalities” and whether police surveillance disproportionately exacerbates inequalities. The chapter highlights institutional entities such as the Center for Policing Equity, which established the National Justice Database tasked with increasing transparency about police use of predictive algorithms. But Brayne emphasizes that none of her interviewees had any recollection of a Palantir-conducted audit of the police’s data analytics, and without such an audit, as she explains, “LAPD’s predictive models have created a behavioral loop: they not only predict events […], they also actually contribute to those events’ future occurrence”, because “social dynamics result in systemic bias that becomes training data fed into predictive policing algorithms” (p. 107). By showing how AI technology creates more inequality, she summarizes her analysis by emphasizing that “data extends the ‘mark’ of a criminal record” (p. 107). For this chapter, two pieces of literature could have offered additional inspiration: first, Cathy O’Neil’s work on the inequality-perpetuating effects of mathematical models when built into predictive policing algorithms (O’Neil, 2016); and second, Virginia Eubanks’ work on automating inequality within different US government-citizen relationships, including policing contexts (Eubanks, 2018).

In chapter seven Brayne offers an insightful critique of: a) how big data “shifted the locus of discretion about what constitutes admissible evidence from judges to police” (p. 135) and b) the shortcomings of existing legal frameworks, which she finds “anachronistic” (p. 133), too narrow, too theoretical and too engaged in doctrinal debates. With respect to the first point, she compellingly shows how the patchwork of US privacy regulations outside the Fourth Amendment must answer questions around whether the fairness of legal frameworks “holds in data-intensive contexts” (p. 135). She offers brilliant arguments as to, for instance, how “on grounds of third-party doctrine” (p. 120) data voluntarily shared by civilians can be incorporated without a warrant into law enforcement subpoenas. With respect to legal frameworks being too focused on doctrinal debates, her observation rests on historical foundations that have moved these debates to the center of the discourse. However, she misses some of the legal scholarship that has informed the legal discipline on the basis of empirical insights into policing practices (Van Brakel & De Hert, 2011; La Fors, 2017; Weston et al., 2019). By way of example, my own contribution aimed to inform legal frameworks and distil recommendations for change by empirically following the ethical and social implications of digitalization goals and of the design and use of such systems. In another recent critique of the role of AI in law enforcement, I argue for incorporating values such as forgiveness, beyond the principles of non-discrimination and fairness, within the children’s rights framework in order to better accommodate the harmful implications of algorithmically mediated risk profiling. Translating forgiveness into the design, implementation and use of AI would be essential for children in general, but in particular for those “unforgiven” ones who are not responsible for any of the crimes they have become algorithmically correlated with (La Fors, 2020, p. 3).

Brayne offers six important provocations in her concluding chapter, all of which are valuable beyond the US context. She encapsulates these by asking, from the perspective of big data-mediated law enforcement practices, the highly relevant question: “what does successful policing look like for the [structurally disadvantaged] communities?” (p. 145). She partly answers this question by highlighting the invaluable importance of third-party auditing and awareness-raising within law enforcement organizations. Such research could include what it means for the subjects of surveillance, in different contexts of their lives, that such a vast amount of data becomes available about them for risk profiling purposes. Brayne concludes her book by identifying some pathways for future research. In accordance with her provocations, it would be interesting to see how she might track possible mechanisms of resistance, as attempts to reconfigure fairness, by learning how the diversity of dragnet-like algorithms becomes organized and used once they come within reach of the LAPD. For instance, constructivist technology studies and philosophy of technology could also provide inspiration for assessing how values become mediated in the human-technology interactions of the LAPD.

Such research could also inform what legal changes are necessary and how to enforce them effectively. The latter seems essential because, even though new laws such as the EU’s recently proposed Artificial Intelligence Act exemplify a unique step towards curbing the detrimental effects of AI, it remains to be seen how this will affect the life chances of data subjects, as the final version of the act appears to be weighed against multiple interests, including those of big data companies whose business model depends upon marketing, for instance, facial recognition technologies.

Overall, Predict and Surveil is a very valuable addition to existing empirical studies of the actual impacts of AI-driven policing algorithms in the United States, and offers a thorough and lively account of the dynamics at play for anyone seeking to understand how to use such AI responsibly.

Dr. Karolina La Fors

Post-doc Researcher Responsible Design

DesignLab

University of Twente

E-mail: k.lafors@utwente.nl

References

[1] Brayne, S. (2021). Predict and Surveil: Data, Discretion, and the Future of Policing. Oxford University Press.

[2] Eubanks, V. (2018). Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor. St. Martin’s Press.

[3] Foucault, M. (1977). Discipline and Punish: The Birth of the Prison. Pantheon Books.

[4] La Fors, K. (2017). Profiling ‘anomalies’ and the anomalies of profiling: Digitalized risk assessments of Dutch youth and the new European data protection regime. In N. Purtova, S. Adams, & R. Leenes (Eds.), Under Observation: The Synergies, Benefits and Trade-offs of eHealth. Springer.

[5] La Fors, K. (2020). Legal remedies for a forgiving society: Children’s rights, data protection rights and the value of forgiveness in AI-mediated risk profiling of children by Dutch authorities. Computer Law & Security Review, 38.

[6] O’Neil, C. (2016). Weapons of Math Destruction. Penguin Books.

[7] Van Brakel, R., & De Hert, P. (2011). Policing, surveillance, and law in a pre-crime society: Understanding the consequences of technology-based strategies. Journal of Police Studies, 20, 163-192.

[8] Weston, C., Bennett Moses, L., & Sanders, C. (2019). The changing role of the law enforcement analyst: Clarifying core competencies for analysts and supervisors through empirical research. Policing & Society, 30(5), 1-16. UNSW Law Research Paper No. 20-77.