How can we get around the surveillance devices?

Ruses and tactics of resistance in digital environments

Faced with the spectacular emergence of increasingly intrusive technologies capable of collecting, analyzing and processing gigantic quantities of data for profiling and surveillance purposes, citizens find themselves increasingly powerless. How can one escape permanent tracking in a context where technical devices facilitate the exploitation of data without people's knowledge? Is it realistic to expect citizens to "manage" their data and control their profiles when they very often lack the most basic familiarity with the technologies they use?

The contemporary debates around privacy and so-called "personal" data pit against each other, at their extremes, two totally contradictory conceptions of the power of the individual in the era of the new "algorithmic governmentality"(1). They consecrate, on the one hand, the fantasy of an absolute power of the autonomous and responsible individual over his data; on the other, the nightmare of the total heteronomy of a passive and impotent individual. Two sides of the same coin, these conceptions fail to take into account the gradual emergence of a variety of everyday forms of resistance enacted through modest, tiny and fragmentary actions(2). Among the subtle forms of control situated in this grey zone between autonomy and total heteronomy, it is indeed possible to identify a series of subversive practices proceeding from what is commonly called the ruse. Aware of the inherent limits of regulatory or self-regulatory processes in responding to the risks raised by the proliferation of profiling mechanisms in digital environments, various actors have organized themselves in a more or less formal manner and have developed, in recent years, a series of projects aimed at thwarting these mechanisms, turning the enemy's weapon against itself and "tracking the trackers".

These projects aim to design and disseminate technical tools on the Internet that both inform users and give them the opportunity to resist profiling and surveillance mechanisms through various subterfuges. From a technical point of view, these projects are generally very simple. They rarely, if ever, rely on the development of heavy and complex technical architectures such as cryptographic processes. Moreover, they provide users with software or applications that are not only easy to use but also do not interfere with the proper functioning of their machine. The tools these projects make available also differ in their effects from other well-known tactics for ensuring secrecy or anonymity, such as the use of anonymity networks like Tor(3). For the protagonists of the ruse, disappearance, secrecy, anonymity or total refusal are not really options. They prefer practical intelligence and the art of deception.

Cunning is a notion that generally refers to the ingenuity, inventiveness and creativity deployed in everyday uses. As such, it is strongly linked to the skills, gestures, routines and know-how required to develop and manipulate technical objects and machines. In the literature on the uses of media and of information and communication technologies, this "art of the trick", or DIY, has been widely discussed to describe, for example, the technical virtuosity of free software developers or computer hackers. The developers involved in the projects discussed here certainly demonstrate the same qualities. The practical intelligence that characterizes cunning is expressed here through a first tactical movement: becoming familiar enough with the mechanisms of tracing and profiling to "find the trick" that will allow their loopholes to be exploited(4). The priority is to open the algorithmic "black boxes", through reverse engineering, in order to understand their workings. Such a familiar relationship with objects is precisely what is very often lacking in "ordinary users", whose capacities and skills seem more than limited when it comes to manipulating their machines and protecting their data. In their daily dealings with digital environments and devices, most ordinary people engage in awkward, even contradictory routines that offer them no control over their data and can be dangerous. This is what the protagonists of these projects try to compensate for by putting both their virtuosity and their tricks at the service of lay users.

The practical intelligence proper to the ruse is then deployed through a second movement taking the form of a "pedagogy of the ruse". Once the black boxes of the tracking and profiling mechanisms have been unpacked, their operation is revealed to ordinary users. These projects aim to provide users with information and tools (such as cookie databases, trace maps, company rating systems, etc.) to better understand how their data is used by advertising networks, behavioral data providers, website publishers and other companies interested in their online activities. The point is to promote a Do It Yourself ethic by revealing to ordinary users the tips and tricks that make their experience of digital environments more meaningful.

Beyond their pedagogical virtues, let us not forget that these different tactics of resistance serve above all to deceive. Such is the nature of cunning. But if the objective of cunning is to play a (bad) trick, it must be borne in mind that both the effects sought and the means employed can vary: several artifices/artifacts can be used for different types of mystification.

For example, some of these projects use what Brunton and Nissenbaum call "obfuscation", which can be defined as the production and reporting of misleading, ambiguous or false data in order to create confusion and make data collection less reliable, and therefore less valuable, to data aggregators(5). The TrackMeNot (TMN) project, in particular, proposes a browser extension that aims to prevent, or at least limit, profiling through search engines. Instead of using cryptographic tools to cover traces, TrackMeNot masks user requests by paradoxically relying on the opposite strategy: noise and obfuscation. With TMN, real user queries are hidden amid system-generated ghost searches on engines the user chooses. In other words, TMN hides users' searches in a nebula of fake searches to make user profiling difficult and inefficient. In the same vein, the aptly named AdNauseam project automatically clicks on every previously blocked ad and, in doing so, records a visit for that ad in the ad networks' databases. This omnivorous and uninterrupted flow of clicks follows no discernible logic, making the data collected unusable for profiling, targeting or surveillance purposes. By simulating a user's behavior without disguising his identity and without making his data unreadable, these programs aim to blur his profile by "hiding him in the crowd", drowning him in the mass.
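The obfuscation principle described above can be sketched in a few lines of Python. This is a minimal illustration, not TrackMeNot's actual code: the decoy pool and the function are hypothetical, and TMN itself seeds its ghost queries from sources such as popular search terms rather than a fixed list.

```python
import random

# Hypothetical pool of plausible decoy queries (TrackMeNot builds its
# own lists dynamically; this fixed list is for illustration only).
DECOY_POOL = [
    "weather tomorrow", "pasta recipes", "local news",
    "python tutorial", "bicycle repair", "film reviews",
    "train timetable", "garden plants", "history of jazz",
]

def obfuscate(real_query, n_decoys=5, rng=None):
    """Hide one real query inside a batch of machine-generated decoys.

    Returns the queries in the order they would be issued; the real
    query sits at a random position, so an observer logging the stream
    cannot distinguish it from the surrounding noise.
    """
    rng = rng or random.Random()
    decoys = rng.sample(DECOY_POOL, n_decoys)   # distinct fake searches
    batch = decoys[:]
    batch.insert(rng.randrange(n_decoys + 1), real_query)
    return batch
```

The point of the sketch is that nothing is hidden or encrypted: the real query is transmitted in the clear, but its signal is drowned in noise, which is exactly the inversion of the cryptographic strategy described in the text.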

Other projects are developing tools based on a different model of trickery, aiming in particular to disguise users' identities on social networks. The Undefined project proposes a tool that automatically alters users' identities on social networks such as Facebook, Foursquare or Twitter(6). By using this tool, the user agrees to let Undefined post content on social networks and interact with other people on their behalf. These actions can be pre-selected by the user from a list of different tactics, which are supposed to alter the digital identities that fall prey to surveillance algorithms. Other projects, such as Vortex(7), allow users to observe how profiling algorithms react to varying inputs, including by playing with cookies. Still at the prototype stage, Vortex is a browser extension designed as a data management game: it lets users manage their digital identities by inviting them to exchange cookies and observe in real time how their browser behaves depending on the cookies used. On this basis, it becomes possible to blur one's tracks and make profiling less easy.
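The "pre-selected tactics" mechanism attributed to Undefined above can also be sketched schematically. The tactic names and decoy actions below are entirely hypothetical, invented for illustration; only the general idea (the user opts into categories of decoy behavior, and the tool randomizes actions within them) comes from the text.

```python
import random

# Hypothetical tactic catalogue: each tactic the user may opt into
# maps to a set of decoy actions the tool could perform on their behalf.
TACTICS = {
    "fake_checkins": ["check in at museum", "check in at stadium"],
    "random_likes": ["like cooking page", "like hiking page"],
    "decoy_posts": ["post quote about weather", "post link to news"],
}

def plan_decoy_actions(selected_tactics, n_actions, rng=None):
    """Draw a schedule of decoy actions from the user's chosen tactics.

    Only tactics the user has explicitly opted into are used, mirroring
    the consent step described in the text; the mix is randomized so the
    resulting activity stream blurs the profiled identity.
    """
    rng = rng or random.Random()
    allowed = [a for t in selected_tactics for a in TACTICS[t]]
    return [rng.choice(allowed) for _ in range(n_actions)]
```

As with obfuscation, the identity is not hidden but polluted: the genuine activity stream is interleaved with plausible but meaningless actions drawn only from the categories the user has consented to.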

Obfuscation, simulation, diversion, blocking, camouflage… a real diversity of tricks and tactics is progressively being developed to deceive the tracking algorithms. The semantic register used by the protagonists of these projects, for its part, is not misleading: it comes from the art of war or combat(8). Cunning, through the stratagems it implements, is a weapon used to thwart the enemy's plans. It is a practice of resistance situated within a relation of force, one that tries to make good use of circumstances. The ruse thus feeds on conflict and on rivalry with a rationality that claims to impose itself without discussion, be it political, economic or techno-scientific. In this perspective, the tactical and militant commitment of the actors developing these projects leads them to conceive "counter-artifacts"(9) intended to compensate for the situations of asymmetry or structural imbalance that users of digital devices (including themselves) face in terms of data collection and processing. The cunning practices they develop are forms of resistance aimed at fighting the "tyranny of data", which generates situations of weakness and vulnerability that must be compensated for as much as possible.

The crafty practices developed within these projects are therefore eminently political. By revealing the workings of the technical devices of profiling, these projects highlight the specific forms of subordination that pass through things and that today particularly disarm critique. In particular, they highlight the fact that, in digital environments, citizens have no real purchase that would allow them to exercise any control over their data. Even though the subject is expected to (re)take control of his data, the environment in which such control is supposed to take place is not at all shaped to that end. The only holds it offers the individual turn out, in the end, to be the best means of ensuring its hold on him. This is particularly true of the famous user-friendliness of the devices and interfaces that make up Web 2.0, supposed to facilitate participation, sharing, interactivity and autonomy. The promises of ease and interactivity are inevitably offset by the willing or unwilling transfer of detailed information to ever more powerful data collection and analysis systems(10).

Given the situations of deep imbalance in which users are caught, cunning practices, through their "tactical creativity", aim primarily at "working" things in order to appropriate them and make them livable. The reflections of M. de Certeau are valuable in this respect(11). For this author, tactics, understood as the cunning of the subaltern, is an original way of dealing with power and accessing resources. It refers to a way of moving in a space one does not own. The "arts of making" we have examined thus resemble attempts to better "make do with": temporary and provisional arrangements that take advantage of the faults within a space striated by indeterminate and disproportionate forces. In such a space, cunning does not protect against uncertainty, nor does it guarantee revolution. At most, it offers a variety of options for navigating that space, for dealing with it through means that allow some form of control to be re-established…

A "weapon of the weak"(12) aiming to accommodate as best it can the social order and the violence of things, the ruse approaches the question of privacy in an agonistic mode and, in doing so, helps to relativize contemporary fantasies about individual control over data. In a world where it is becoming increasingly difficult to erase one's tracks, is cunning the only solution left? The weapon of last resort? Accepting such an outcome seems dangerous to us, for it would mean reducing man too quickly to his animality, making of him nothing but, as Deleuze said in his famous Abécédaire, a "being on the lookout"…

Christophe Lazaro

LIST OF PROJECTS 

  • TrackMeNot http://cs.nyu.edu/trackmenot/fr/
  • AdNauseam http://dhowe.github.io/AdNauseam/
  • Privacy Badger https://www.eff.org/privacybadger
  • Undefined http://vincentdubois.fr/undefined.php
  • Are we private yet? http://www.areweprivateyet.com/
  • AdChoices http://www.youronlinechoices.com/ie/your-ad-choices
  • FaceCloak https://crysp.uwaterloo.ca/software/facecloak/
  • Disconnect https://disconnect.me/
  • Vortex http://www.milkred.net/vortex
  • Cryptagram http://cryptogram.prglab.org/
  • Terms of Service; Didn’t Read https://tosdr.org/downloads.html

Some projects are more artistic in nature. Resistance tactics are deployed, for example, through the development of prosthetic masks or make-up-as-camouflage processes aimed at defeating facial recognition systems (the URME Surveillance project(13), CV Dazzle(14), or the Facial Weaponization Suite(15)). In these projects, faces are disfigured, reconfigured, even erased; the subversive virtues of the mask are rehabilitated in a carnivalesque rejection of surveillance and identification. Other initiatives, more dedicated to digital environments, display a more techno-militant character.

Notes and references
  1. A. Rouvroy & T. Berns, «Gouvernementalité algorithmique et perspectives d’émancipation. Le disparate comme condition d’individuation par la relation ? », Réseaux, 2013/1 (n° 177), p. 163–196.
  2. G.T. Marx, « A Tack in the Shoe: Neutralizing and Resisting the New Surveillance », Journal of Social Issues, Vol. 59, No. 2, 2003, pp. 369–390.
  3. https://www.torproject.org.
  4. J. Pasteur, « La faille et l’exploit : l’activisme informatique », Cités, n° 17, 2004, pp. 55–72
  5. F. Brunton & H. Nissenbaum, Obfuscation: A User's Guide for Privacy and Protest, MIT Press, 2015. See also their open-access article: "Vernacular resistance to data collection and analysis: A political theory of obfuscation", First Monday, Vol. 16, No. 5, 2 May 2011, http://firstmonday.org/ojs/index.php/fm/rt/printerFriendly/3493/2955.
  6. http://vincentdubois.fr/undefined.php
  7. http://www.milkred.net/vortex
  8. See also G. Deleuze, Pourparlers 1971–1990, Les Éditions de Minuit, (1999) 2003, pp. 229–239. When he coined the concept of the "society of control", G. Deleuze evoked the need to "look for new weapons"…
  9. B. Pfaffenberger, "Technological Drama", Science, Technology, & Human Values, Vol. 17, No. 3, 1992, pp. 282–312.
  10. M. Andrejevic, « Privacy, exploitation, and the digital enclosure », Amsterdam Law Forum, Vol 1, No 4, 2009, p. 6, http://amsterdamlawforum.org/article/view/94/168.
  11. M. de Certeau, L’invention du quotidien, tome 1: Arts de faire, Gallimard, Paris, 1990.
  12. J. C. Scott, Weapons of the weak: Everyday forms of peasant resistance, Yale University Press, New Haven, CT, 1985, p. 29.
  13. http://www.urmesurveillance.com.
  14. https://cvdazzle.com.
  15. http://interventionsjournal.net/2014/03/13/artist-project-facial-weaponization-suite.
