IV: Openness vs. Control
Interview: Part 4 of 6
DE: You bring up the politics of that friction. I wanted to ask you about the politics of data. Because it's a big part of your work. I was reading that Alexander Galloway (2010) has written that networked technologies had this utopian promise because of interactivity, which is something that writing didn't have, mass media forms didn't have, but that these same technologies have now become "instruments of control and organization" (p. 291).
DE: And it made me think about your work at the New York Times Research and Development Lab working on OpenPaths, which is described as a personal data locker. It enables users to retain control over their own data. It's described as an experiment in personal data sovereignty. I thought that was really interesting partly because it's so different from the way that we usually think about data. I was wondering if you could speak to that idea of openness and control, and that idea of rethinking data, or the politics of data.
BH: One of the goals of OpenPaths, for me, was to rethink the politics around data collection as it relates to organizations that have that power. There's some sense with—we'll pick on Google—that there's a trade-off. That you use Google services and you extract some value from that. It adds something to your life. And they're free. But they're gathering data based on that and advertising to you or selling that data or doing whatever else. That data belongs to them. And they have a responsibility to keep it private, to not leak your data everywhere. We've seen that in all these data breaches that happen. What was the one that just happened with the dating site?
DE: Yeah, Ashley Madison.
BH: Yeah, it's like this big controversy because you're like, well, they need to be responsible with the data that they have. Which is all well and good, and they should be. But there is this underlying sense that to use these services you're surrendering your data and then they have a right to that. And that they need that right in order to provide you with that service. But that's not entirely true. There could be other models of data sharing that don't function in that same way. With OpenPaths, it's responding to the fact that anyone who is carrying a mobile device is sharing their location with a couple different corporations. If you have an iPhone—I have an iPhone; I use AT&T—AT&T has a record of everywhere I go. They do need that record to locate my phone, to provide cell service to me. Apple also collects all that data, so Apple knows how I'm moving around. But there's a default in that I don't know where I've been. I mean, I experienced it, but I don't have access to that data. I can't go to Apple and get my location history in a raw form that I could then do something with, whatever it is that I might want to do with it. Maybe I make an art piece, maybe I share it with epidemiological researchers, maybe I—who knows. But there's no default apparatus for that. And so, first, OpenPaths was to respond to that—the fact that there's all these other people tracking my data. Well, I'll track my location and generate my own data set for my own purposes. And that—it's a subtle distinction. There's all kinds of apps that track location. OpenPaths is doing the same thing that they're doing. But then it's just offering it up to you. But then the second thing was this idea that—who's privileged with that data? So, OpenPaths—all the data is encrypted on the server side. If I run OpenPaths servers, I can't see anyone's data because it's encrypted and only they have the keys. That's very different.
If you said, "Google, you can collect all this information but you can't look at it," you'd be like, wait, what? That's not the deal. But it can be functional. With OpenPaths, you can then share your data on your terms, willingly, with any kind of research project that requests access to it. So, there's a mobility study in a university in Europe. And they want to look at people that have been to particular cities. They send a request and then you can share that information with them. That's a brokerage between the individual and the third party that wants the data. It doesn't imply a privilege on the part of the actual apparatus that's generating the data, if that's clear. So, there's a fundamentally different assumption about where the power lies, who owns what, with this focus on the individual, and this focus on personal sovereignty over your information. This model is not going to be adopted. But the gesture of that project, I think, was to show that other ways exist. We could come up with something that does not always say the corporation has an absolute right to the information that it supposedly needs for the services.
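The model described above—a server that stores only ciphertext, a key held only by the user, and sharing with researchers only on the user's consent—can be sketched in a few lines of Python. This is an illustration of the general idea, not the actual OpenPaths implementation; the XOR keystream below is a toy stand-in for real encryption and should never be used as actual cryptography, and the location fix and function names are invented for the example.

```python
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from a key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream. Illustration only, NOT secure."""
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# The user's device encrypts a location fix before upload.
user_key = os.urandom(32)              # held only by the user, never the server
fix = b"40.7413,-73.9897,2015-09-01"   # lat,lon,timestamp (made-up sample)
ciphertext = encrypt(user_key, fix)

# The server operator stores ciphertext it cannot read.
assert ciphertext != fix

def share_with_researcher(consent: bool):
    """A research request is honored only if the user consents:
    the user decrypts locally and shares on their own terms."""
    return decrypt(user_key, ciphertext) if consent else None
```

The design point matches the interview: the apparatus that collects the data has no privileged access to it, and the brokerage happens between the individual and the third party, not between corporations.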
DE: But making those options visible is something that's been talked about—that things go invisible, right?
BH: The problem is that any app you use, you get that terms and conditions.
BH: And you say, "Accept." And it's a completely ineffective means of establishing a relationship or agency. You click that and you have no idea. It's legalese; it's a totally dysfunctional relationship between the technology provider and the individual. Can we make that relationship actually meaningful, so you're not just surrendering, like, "Oh, I don't know what they're looking at, I don't know what they're doing with this"? You're actually involved, and have some stake in where the information is connected to you, how it's being used. And there's a larger politics there. [Jacques] Rancière talks about the political in terms of not necessarily the debate between two political parties or something, but what's recognized as speech to begin with, to have a voice, to have something that is heard. That's the site of the political. And acts of resistance are putting yourself forward as "This is speech. I'm saying something" and changing the terms of the conversation in some way to recognize something that was not heard before.
Galloway, Alexander R. (2010). Networks. In W. J. T. Mitchell & Mark B. N. Hansen (Eds.), Critical terms for media studies (pp. 280–298). Chicago, IL: University of Chicago Press.