University of Illinois at Urbana-Champaign
Citation: Grosser, Ben. “Go Rando.” Hyperrhiz: New Media Cultures, no. 21, 2019. doi:10.20415/hyp/021.bt02
Abstract: Go Rando (2017) is a web browser extension that obfuscates how you feel on Facebook, randomizing your reactions to make your mood inscrutable to the site.
Keywords: Facebook, emojis, surveillance, machine learning, identity, social media, big data, emotional profiling.
Project URL: https://bengrosser.com/projects/go-rando/
Promo Video: https://vimeo.com/202612867
Facebook’s “reactions” let you express how you feel about a link, photo, or status. While such data might be helpful for your friends, these recorded feelings also enable increased surveillance, government profiling, more targeted advertising, and emotional manipulation. Go Rando is a web browser extension that obfuscates your feelings on Facebook. Every time you click “Like,” Go Rando randomly chooses one of the six “reactions” for you. Over time, you appear to Facebook’s algorithms as someone whose feelings are emotionally “balanced”—as someone who feels Angry as much as Haha or Sad as much as Love. You can still choose a specific reaction if you want to, but even that choice will be obscured by an emotion profile increasingly filled with noise. In other words, Facebook won’t know if your reaction was genuine or not. Want to see what Facebook feels like when your emotions are obscured? Then Go Rando!
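The randomization described above can be illustrated with a minimal sketch. This is a hypothetical illustration, not the project's actual source code; the `REACTIONS` list and `pickRandomReaction` function are names invented here for clarity, and the six reaction labels reflect Facebook's 2017 reaction set.

```javascript
// Hypothetical sketch of Go Rando's core idea (not the extension's real source):
// when the user clicks "Like," substitute a randomly chosen reaction.
const REACTIONS = ["Like", "Love", "Haha", "Wow", "Sad", "Angry"];

function pickRandomReaction() {
  // A uniform choice makes the long-run emotion profile appear "balanced":
  // roughly as much Angry as Haha, as much Sad as Love.
  return REACTIONS[Math.floor(Math.random() * REACTIONS.length)];
}

// In a browser extension, a content script would trigger this on the
// Like button's click event, e.g.:
//   likeButton.addEventListener("click", () => {
//     applyReaction(pickRandomReaction()); // hypothetical helper
//   });
```

Because the choice is uniform and independent of the post's content, each click adds noise rather than signal to the recorded emotion profile.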
Go Rando adopts the strategy of obfuscation to disrupt Facebook’s increasingly fine-grained data collection practices. While unlikely, if everyone started using Go Rando tomorrow, it could have broad collective effects against state and corporate emotion profiling. But regardless, for any one user it provides individual benefits by disrupting Facebook’s News Feed algorithm (and thus blunting the potential of targeted disinformation campaigns), by resisting the site’s attempts at emotional manipulation, and by confusing corporate and governmental surveillance. Further, Go Rando provokes questions about the uses of Facebook’s “reactions.” Who benefits when you mark yourself as “Angry” or “Sad” in response to a particular post? Which groups have the most to lose? And how might the uses of this data further change the nature of privacy and democracy over the coming months or years? Finally, every time one of your friends is surprised by your unexpected or “inappropriate” reaction, this momentary blip in the social media flow forces a previously invisible aspect of the Facebook interface into the foreground: that we’re all reporting how we feel all the time. Most broadly, I intend these moments to create discourse about the effects of emotional surveillance online.