Monday, December 03, 2018

A Lack of (Contextual) Integrity

I must have the people over at Google thoroughly confused. They now think that I am an impoverished-but-wealthy black, gay, Jewish female who is into cooking and who races Formula 1 cars in France on the weekends. I'll get to why they think this in a moment. (This is assuming that there are people at Google, and that these days it's not simply a pulsating, gelatinous glob of algorithmically driven hive-mind protoplasm. Although that would be very cool too, and would make an awesome B movie. It's too bad that Tab Hunter is dead.)

You see, like Facebook, Twitter, and the rest, Google has always been about collecting, manipulating, and mining the data we happily supply it. The Goog then takes the data and sells it to people who use it to sell stuff to us, sometimes further manipulating and mining the data along the way. In this fashion, marketers can build up surprisingly accurate—and often chillingly complete—dossiers on us. These are used to present to us items for sale in which we may be interested. (This is only a little unsettling, and could even be helpful.) Sometimes the marketers use the information they've purchased from The Goog in order to sell us things related to things they know we like. For instance, if I've purchased vinyl records, it's probably a good guess that A) I'm interested in turntables and B) I may have a man-bun. If I've purchased (or even simply viewed) a baseball glove from an online vendor, it's a decent bet that I could be interested in, say, sports memorabilia, season tickets to a sporting event, or perhaps a particular brand of sportswear. (That's getting just a bit creepy.) But what if I have not viewed or commented on or reviewed an item, let's say a guitar, but I have friends who have done those things? For a marketer or data salesperson, it's perfectly reasonable to assume that, since people with like interests tend to hang out together, I may be interested in guitars too. So I may begin seeing guitar-related ads on my various social media feeds or in sidebar ads on websites that I happen to frequent, not because I've looked at such items online, but because my friends have done so. (Okay, now we're getting seriously creepy.)

What we've encountered here is what some social scientists have called "dataveillance." We're being surveilled digitally, based on the data trail we leave when we traverse the Web. Now, that's not always a terrible thing: sometimes these ads are helpful, just as Facebook's "people you may know" list is occasionally useful and surprisingly accurate.

But we have very little control (read: almost no control) over how that data is used. The real issue with dataveillance, as Cornell University's Helen Nissenbaum has noted, is that it often constitutes a violation of what she calls "contextual integrity." We give someone certain information with a particular understanding of the context in which that information is to be used. I don't mind giving my doctor very private information about myself and my (growing number of) physical ailments. But I would mind very much if she were to share that information with a drug rep or insurance salesman. I explicitly give The Goog data about my travels on the Web, but I did not (knowingly and willingly) give The Goog permission to mine that data, manipulate it, compare it to my friends' data, and then sell it to people who will further refine it and who may then turn around and resell it or combine it with other datasets, of whose existence I am unaware. (If you're dying to read more about Dr. Nissenbaum's work, I interviewed her for my book, which—not at all coincidentally—is available here.)

Of course, The Goog pretends that this is all harmless and that its data collection is benign, incidental, and in fact helpful.

Except that they're not even pretending anymore. You may have encountered a Google program called Google Opinion Rewards. If you sign up, The Goog will pay you to fill out "opinion surveys." For each brief survey, Google will add from 10 to 30 cents or so to your Google Play account; you can then turn around and use that money to buy books, music, apps, etc. on the Google Play store.

But these surveys rarely actually ask your opinion about something. By and large, The Goog doesn't want to know what you think; it wants to know what you are. How much money you make. Whether you rent or own. What sort of car you drive, and if you're likely to be in the market for a new one soon. Here are some sample survey questions:

  • What is the likelihood that you will get a flu shot this year?
  • Did anyone in your household get food stamps . . . in 2017?
  • What is the combined income of all members of your family in 2017?
  • Are you covered by any kind of insurance or health plan . . . ?
  • What medical condition or concern are you most embarrassed to ask your doctor about?
  • Which [of the following categories] best describes your political views?

These are sent to you with the disclaimer that they will be used "to show you more relevant advertising" or to "improve Google products." (Which is more than a little ironic, given that, in the end, you are the product.)

But I've gamed the system: I simply give wildly inaccurate (and often contradictory) answers to the survey questions. Thus, The Goog is now completely confused about who I am, which is only fair, given that I am also confused about who I am. (I mean, in an existential sense, aren't we all confused about who we are, about our place in the world? My personal existential crisis began in 1973 with an attempt to understand the lyrics to songs by the Steve Miller Band.) I think it's only fair that Google's algorithms should be just as confused as the rest of us. Perhaps the algorithm in charge of all of Google's other data-mining algorithms has called an 8 a.m. meeting to discuss what went wrong and to argue about which of the junior algorithms was supposed to bring doughnuts to the meeting.
