Protecting Privacy

Danielle Citron
May 10, 2024

Professor Danielle Citron discusses themes in her most recent book, “The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age.” Citron spoke at the Law School Foundation’s Alumni Board and Council luncheon.

Transcript

DANIELLE CITRON: Before I begin, I just wanted to thank Dean Goluboff, not only for recruiting me, so bless you, so many of us in this room can say that, but for your incredible leadership, your warmth, and the way you've mentored me as a scholar. You'll see in my talk how your work on the lost promise of civil rights has been so instrumental to how I think about civil rights. So thank you, and thank you for creating this feeling of love and warmth. I love being here. Thank you so much.

[APPLAUSE]

And I'm also a parent of a graduating 3L who is boo-hoo crying because she is graduating, as you once did, and had the luck of taking your seminar with Rich. So it's just a joy to be here and to talk about my work on intimate privacy. What I'm going to do is give you a roadmap of the book's central mission, and then I'll dive in.

So intimate privacy is access to and information about our bodies, our health, and our innermost thoughts, which, of course, we share every second of the day as we browse, as we search, as we text, as we email. It's our gender, our sexual orientation, and our sexual activities.

And it's our close relationships. And when it's denied, crucial opportunities are on the line: our ability to develop and figure out who we are, our ability to form close relationships, to work, to get and keep our jobs and find new ones, to be physically safe, and now even our freedom.

And for women and minorities, the risks are more profound: invasions of intimate privacy are more likely, and the costs are even greater. So let's just talk about our expectations, because this part, I think, will be intuitive to everyone. All day long, we go about our activities, and we assume that intimate privacy is ours.

We go to a public restroom, we take off our clothes, we're patients at a doctor's office, we are in our bedrooms. And all the while, we are assuming that no one is recording us there and uploading it to the cloud. We send texts and emails to our confidants, and we assume that our confidences are kept and that the companies are securing those communications.

And to all those apps on our phones, to our Fitbits and fitness bands, to our personal devices, to Siri, we disclose our health conditions, our prescriptions, our sexual activities, our dates. So much of our intimate life is shared with and tracked by these companies. And they tell us, they promise, that it's all meant to make our lives better.

It's good for you, it's so much fun, it's so seductive. But all the while, they're not telling us that they're gathering and exploiting our data, and that they're selling it. And they're selling it to parties we have no relationship with, including data brokers whose biggest clients are, in turn, government agencies and law enforcement.

So let's talk a little bit about the values at stake. Why intimate privacy? Why is it special? Why is it foundational? Let's just start with self-development. Our first relationship, of course, as we come to know ourselves, is with our bodies. It's how we come to know and understand ourselves over time. And privacy is often a we, not a me: we draw invisible boundaries around our bodies either alone or with others.

But when we can draw those boundaries, we can figure out who we are. It's an ongoing process. None of us fully knows. But when we can draw those boundaries, we can become our authentic selves over time. Janet Mock, a writer and activist, explained that when she was young, she would go into her mom's closet with her best friend when her mom and brother were out and would try on clothing.

And it was over the course of many, many years of doing that that she could share her gender identity with her mom and brother, because she had that space with her best friend, not alone. And privacy is central to human dignity, which I'm going to describe as your personal integrity.

Your self-esteem and your social esteem. So first, self-esteem. When we decide who has access to our intimate lives, we feel like we're in charge. We have a sense of self-esteem. And then there's the piece of social esteem: that we are regarded as a whole person rather than just a fragment or body parts. And you'll have to indulge me, because my mom is a French literature professor, so I'm going to invoke Jean-Paul Sartre and his Being and Nothingness.

Everyone's like, where did you get that from? I'm like, growing up, eighth grade was Camus or Sartre or maybe Stendhal; those were the choices. But in Being and Nothingness, Sartre has a riff on two concepts that I think really helps capture the dignitarian insights about intimate privacy: the concept of being ashamed and the concept of pure shame.

And he invokes the peeping Tom. He says, imagine the peeping Tom peering into a keyhole, and this will be very mid-20th century, watching a woman undress without her permission. And he hears a rustle on the stairs, and he gets red in the face.

And in that moment, he feels ashamed, embarrassed that he's doing the wrong thing, that he's violating the woman's intimate privacy, or what Sartre just calls privacy. So he's ashamed. By contrast, Sartre says, pure shame is what the woman inside the room is feeling.

Because in that moment, when she figures out she's being watched without her permission, she sees herself through his eyes as an object, not as a subject. It's just body parts. She's just a fragment. And you'll see that as I talk about invasions of intimate privacy, that theme echoes so profoundly when you talk to people whose intimate privacy has been invaded. They see themselves through other people's eyes as an object.

And then people always say, law professors, what are you doing talking about love? Well, intimate privacy has everything to do with our close relationships. How do we form them? Social psychologists tell us that it's through a process, over time, of mutual self-disclosure and mutual vulnerability.

And we're not going to take that leap of faith unless we trust the other person to keep our confidences, to treat our information as we hope rather than as we fear. And I can't say it any better than the late and great Charles Fried, who in a wonderful book called An Anatomy of Values says that privacy is the oxygen for love. I swoon every time I read that. So forgive me.

So when intimate privacy is denied, it's devastating. For my book project, I interviewed 60 people from around the globe: from the United States, the UK, India, Iceland, South Korea, and Israel. And ever since, I've been working with lawmakers in South Korea, Singapore, and Australia on their issues involving intimate privacy. And the story of the United States is the story of Singapore. It's the story of South Korea.

Regrettably, the pattern of harms and wrongs is the same. And you really can't understand the significance of an invasion of intimate privacy unless you've seen it through the victim's eyes. So I'm going to share some stories today, mostly from the United States. The first is about a woman I'm going to call Joan, the name she chose for the book.

Joan had just graduated from law school. She was top of her class at a law school as fancy as ours. It's not possible, I know. But there are other really fancy, fabulous law schools. She had started working at her firm, and she went to take a deposition with a partner and a senior associate. So they flew out, and she stayed in a hotel for two nights.

And when she came back, she got an email from someone calling himself Not a Bad Guy 2. The email said, "Send me more nude photos, or I distribute this widely." And attached was a video of her in the hotel undressing, urinating, and showering. She calls her mentor at law school, who is one of my closest friends.

He calls me, I call Joan. I say, Joan, you're not sending more. What has happened to you is two things, two types of invasions of intimate privacy: video voyeurism and an attempt at sextortion, using nude images to extort more nude photos. So we wait the 24 hours, and he makes good on his promise.

What happens first is that he has clearly scraped all of her contacts on LinkedIn, and he sends an email that looks like it comes from Joan to all of them. So it includes all the people she's working with, her law school classmates, professors, and he CCs Joan, so she sees what this email is. And the email says, as if it's from Joan, "Watch this video. See me in my natural state. I'd love to know what you think."

Then, at the same time, he puts the video up on adult sites, Pornhub among them, Xtube and others, and on sites devoted to non-consensual intimate imagery, with names like Private Voyeur and Hidden Cam. These are the most unlovely and uncreative people. There are countless sites that I'll talk to you about.

And adult finder sites, so like dating websites. In the video, he embeds her full name, so when you watch it, you can figure out who it is. On the adult finder sites, he impersonates her: he includes the video and writes, I'm interested in anonymous sex, my name is, with her real name and her home address.

So she does everything in her power. This is within the first four to eight hours, because the video is up on hundreds of sites. She does everything she can to contact the sites and see if they'll take down the video. But as I'll explain, they have no legal responsibility to take it down. They enjoy immunity from responsibility. So no one writes her back.

Pornhub actually helps her twice, because he keeps putting the video back up. But the third time, they stop responding to her emails. So it remains up. And the only person to write her back, from a website we figured out is hosted in Russia, says, "If you send me nudes in a private showing, I'll take it down."

So at this point, she's terrified, right? The idea of even going to the gym seems a frightening proposition. She talks to her partner at the law firm, who is such a lovely person, but he had seen the video. And as Joan said to me, he was so supportive, but I can't help but see how he sees me.

It's like he sees the video. I'm not me anymore. And it changed the arc of her life. For Joan, there's a before and an after. At least in the first two years, and this happened in 2018, she felt like she was on a knife's edge. She's got a great job; she prays she's going to keep it. She had wanted to clerk for a circuit judge, and she was top of her class and on law review.

Letters were already written, but she felt like, how can I possibly apply for a federal clerkship at the circuit level with this stuff online? And she fundamentally changed. She wanted to make herself unrecognizable, and that echoes with every victim I talked to, more than 60 people.

People usually either gain a lot of weight or lose a lot of weight. We met twice in person, and she was like a sliver of herself. And she also got, and this resonates with so many victims too, a tattoo. She explained that it was her late grandmother's name. And she said, this is my body, and she will protect me.

Joan's story is the story of so many people I talk to, a recurring one, and it's more often women and minorities. Abusers pick on people and their vulnerabilities. Governments, especially authoritarian governments, follow this playbook and target dissenters and journalists. Rana Ayyub is an incredibly well-known investigative journalist in India.

And she went a bridge too far, in the regime's eyes, in her criticism of the Modi government's torment of Muslims in India. She criticized him, and the next day she gets a text from a source in the Modi regime that says, watch your back. Within five minutes, her phone starts blowing up. There's a video where her face is swapped into porn. Early deep fakery; this is April 2018.

And I have to say, having sat with her and looked at it, it was her, with the big beautiful brown eyes and the hair. It absolutely looked like her. And she said that when she first saw it, she threw up. It was like a punch in the face. Within 48 hours, it was on half of the phones in India. Her Twitter feed was filled with death and rape threats. She was doxed, so her home address was everywhere.

Along with the suggestion that she was interested in sex, that she was a prostitute. So what's the Modi regime's goal? To silence her. And they do. She stays in her parents' house for about six months. She's in her early 30s, she's unmarried, and she's terrified to go outside and have anyone take her picture for a very long time.

She said, oh my goodness, there will be another image that they can use for fakery. She has not written for an Indian outlet ever since. So she's basically been blackballed. You will find her brilliant words in The Washington Post's World Opinions column. Now, there are sites that not only enable these kinds of intimate privacy violations, they solicit them. It's their business model.

When I first started tracking these sites, the ones devoted to Private Voyeur, hidden cam, hidden camera, there were about 40 in 2014. In 2019, there were 1,900, according to researchers, and as I was writing my book, there were 9,500 sites devoted to this. This is not the dark web. These are sites devoted to inviting users to post images, largely of women and sexual minorities, for fun. That is the theme of these sites.

It's a theme of degradation and shame. And one can imagine the horror of looking at these things, but there's an incredible amount of participation, unfortunately. Each site will have 100,000 hits a day. OK, but you'd say, those are just bad actors, Danielle. You can get up in arms about them. But more broadly, our information capitalism, our surveillance economy, is built on the idea that our intimate information can be collected, shared, and sold to third parties we have no relationship with.

And it's a tremendous money maker. So I'm going to tell you a story about someone I did not interview, because this happened after my book was published. Jeffrey Burrill was an administrator for the Catholic Church, for the bishops' association. A reporter disclosed that he had visited gay bars, and then he lost his job.

And you might think, OK, gumshoe investigation, the reporter must have followed Jeffrey Burrill. But no. The Washington Post reported about six months later that a group in Colorado had been subscribing to location brokers, and there are about 40 of them, spending millions of dollars to out gay priests.

And this is all through inference, right? You have people's mobile device IDs. If you have a cell phone, a smartphone, it has a singular identity, the mobile advertising ID in your phone, and that phone typically stays in your home at night. And if you are being tracked and traced and you're going to gay bars, then you are outed.
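
To make that inference concrete, here is a minimal, hypothetical sketch in Python, not from the talk and with made-up data, of how location pings keyed to a persistent mobile advertising ID can reveal a likely home and a visit to a sensitive venue:

# Hypothetical illustration only: invented data, coarse grid cells instead of a real
# clustering method. The point is that no name is needed; a persistent device ID plus
# timestamps and coordinates do the identifying.
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Ping:
    ad_id: str   # mobile advertising ID: one per device, persistent by default
    lat: float
    lon: float
    when: datetime

def cell(lat: float, lon: float) -> tuple:
    """Bucket coordinates into a coarse grid cell (~100 m) to group nearby pings."""
    return (round(lat, 3), round(lon, 3))

def infer_home(pings: list) -> tuple:
    """Guess the device's home: the cell where it most often sits overnight."""
    overnight = [cell(p.lat, p.lon) for p in pings if p.when.hour >= 22 or p.when.hour < 6]
    return Counter(overnight).most_common(1)[0][0] if overnight else None

def visited(pings: list, venue: tuple) -> bool:
    """Did this device ever ping from the cell containing a 'sensitive' venue?"""
    return any(cell(p.lat, p.lon) == cell(*venue) for p in pings)

# Toy example: one device, a handful of pings.
device = [
    Ping("ad-123", 38.0293, -78.4767, datetime(2021, 6, 1, 23, 15)),  # overnight at "home"
    Ping("ad-123", 38.0293, -78.4767, datetime(2021, 6, 2, 2, 40)),
    Ping("ad-123", 38.0336, -78.4850, datetime(2021, 6, 4, 22, 55)),  # evening at a bar
]
bar = (38.0336, -78.4850)
print("inferred home cell:", infer_home(device))
print("visited the venue:", visited(device, bar))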

So Jeffrey Burrill was just one of many people. And you might say, OK, where's law? We're lawyers, we're law professors. Law has to have an answer here. Especially in the United States, though I have to say it's not much better in other countries. Let's not think that Europe is so great with the GDPR. It isn't, right?

But in the United States, law just isn't doing enough. So let's take Joan's case. The content platforms are in the best position to minimize the damage. You can't control what a rogue hotel employee is doing, but you can minimize the damage. And those site operators are, under Section 230 of the Communications Decency Act, shielded from liability, even if they solicit, encourage, and monetize non-consensual intimate imagery.

So they're off the hook. That's why they totally ignored Joan and she never heard from them. Now, Joan went to law enforcement. And the very first reaction was, did you make this yourself? Come on, you're just trying to seek attention. Complete doubt of what she had experienced. Then she brings me in, and we go to the FBI and the Secret Service. And even then, even with the help of people working on these issues, the answer was, we can't find him.

Even with all of Joan's super-sleuth computer skills, we can't find him. So that does not help Joan. She sues the hotel, because I got her great counsel. And even then, they fought her tooth and nail, and she's still in discovery, because they say, the hotel employee, you can't find him, and it's no one who worked for us. They're disclaiming vicarious liability.

Now, what about Jeffrey Burrill? In the United States, we treat data protection as a consumer protection problem. What that means is that so long as you don't lie to and don't deceive individual consumers, you can collect, share, use, store, and sell intimate information hand over fist, including to third parties with which we have no relationship: data brokers.

And in fact, those data brokers are selling it onward, not only to advertisers and marketers, but to other third parties, including governments outside our country. So law is failing us. And this is the pitch of my book: that we should think of intimate privacy as a foundational right, a human right. That's very much the language of Europe and other countries. I feel like, Miller, you're with me here.

But what resonates with us in the United States is the language of civil rights. I want us to understand intimate privacy as a civil right: a right owed to everyone, essential for human flourishing, with built-in extra protections against invidious discrimination. And in that regard, I build on Dean Goluboff's brilliant book, The Lost Promise of Civil Rights, which shows that in the 1940s the Department of Justice thought of its civil rights mission as protecting a right to work.

That is, a right owed to everyone, including engagement in labor unions. And that conception is reinforced by scholars like Robin West, who argues, as a matter of natural law, that we can think of civil rights not just as rights against discrimination, though that is included and essential, but also as rights owed to everyone, much the way we think about human rights.

So when we call something a right, and here I'm invoking Fred Schauer's work, we're saying it's so important that it can't be traded away without a really good reason. Profits and efficiency, or boys will be boys, which is so often law enforcement's response to victims, that's not good enough.

And modern civil rights law gives us, both practically and expressively, a really important path forward. Under our modern conception of civil rights law, when you say something is a civil right and an entity has power over it, that entity becomes the guardian of the right. They're the stewards.

So if we shift our thinking so that site operators and companies have to see themselves as stewards of that right, it means they have special responsibilities. And so in my book, I argue, and we just saw Maryland pass a data protection law that very much speaks to the agenda I lay out, I was like, yes, that companies have those responsibilities. Let's first take content platforms.

We need to change Section 230 to ensure that sites have a duty of care, a responsibility to take reasonable steps to tackle intimate privacy violations. And companies, as data guardians, shouldn't collect intimate data unless they strictly need it to provide a service or product.

And then as soon as that need evaporates, you've got to get rid of it, and you definitely cannot sell it. What this kind of agenda does, I think, is send really important signals to companies: that we design our products and services with intimate privacy in the foreground. When you build intimate privacy into a product, it's not stapled on at the end; you design your products and services with intimate privacy in mind.

You can prioritize it. You can prevent all sorts of problems. We have to shed this beta-test mindset that technology companies have, which, as Kashmir Hill describes it, is like technological sweetness. Just because you can create something doesn't mean you should. We have to ask those questions.
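
As a rough illustration of that design posture, and purely as a sketch with invented purpose and field names, here is what collecting only what a feature strictly needs, expiring it when the purpose evaporates, and refusing to sell it might look like:

# Hypothetical sketch only: a toy "data guardian" store that enforces purpose limitation,
# retention limits, and a no-sale rule. Nothing here comes from the talk or any real product.
from datetime import datetime, timedelta

ALLOWED_FIELDS = {"cycle_tracking": {"period_start"}, "account": {"email"}}  # purpose -> fields

class IntimateDataStore:
    def __init__(self):
        self._records = []  # (purpose, field, value, expires_at)

    def collect(self, purpose: str, field: str, value, ttl_days: int = 30):
        """Refuse anything not strictly needed to provide the stated service."""
        if field not in ALLOWED_FIELDS.get(purpose, set()):
            raise ValueError(f"{field!r} is not needed for {purpose!r}; don't collect it")
        self._records.append((purpose, field, value, datetime.utcnow() + timedelta(days=ttl_days)))

    def purge_expired(self):
        """Once the purpose and its retention window evaporate, get rid of the data."""
        now = datetime.utcnow()
        self._records = [r for r in self._records if r[3] > now]

    def sell(self, *args, **kwargs):
        raise PermissionError("intimate data is never sold to third parties")

store = IntimateDataStore()
store.collect("cycle_tracking", "period_start", "2024-05-01", ttl_days=90)
# store.collect("cycle_tracking", "location", "...") would raise: not needed for the service
store.purge_expired()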

So I feel like my time is running out. But through the work that we've been doing at the Cyber Civil Rights Initiative, where I'm the vice president, along with Dr. Mary Anne Franks, we've seen some progress. And I feel like everybody now knows what a deepfake is, because we've seen Taylor Swift's face being morphed into porn.

And middle schoolers: there are apps like Nudify, which allow you to create a nude image of someone. You just have to have a picture of their face. There are middle schoolers in Westfield, New Jersey, who were swapped into porn, and all of the content was shared with their classmates. The same is true in Beverly Hills, California, where 16 female high school seniors had their faces morphed into porn using Nudify. It's easy to do. It's unfortunately so user-friendly to make deepfake sex videos.

So I think it helps the moral story we're telling, or at least the argument we're making at the Cyber Civil Rights Initiative, to get parents involved, which is, I think, really important. And as depressing as I know this is, everyone looks so sad, and I hate that. I walk into a room and I'm like, I know, it's going to suck. Work with me here. It's going to be OK. But there is some good news.

When I first started doing this work, there were two laws on the books to combat intimate privacy violations with civil and criminal penalties. There are now laws in 48 states, DC, Guam, and Puerto Rico. We've got a federal bill, the SHIELD bill, which we've now reintroduced six times. So we're getting exhausted, but it's going up again this time.

And the really cool news is, and I think we've got to focus on the content platforms, because they're, what do we call it, the efficient deterrents. They're in the best position to minimize the harm that's being externalized onto individuals. So Representative [INAUDIBLE], in about a couple of weeks, is going to drop a bill that would condition the current shield from liability on a duty of care, which we outline in five steps.

I don't know if Ken Abraham is here, but he helped me with the article, How to Reform Section 230. And when it came out, Representative [INAUDIBLE], and this never happens, the principal himself called me and said, I love this article. Let's write a bill. I said, let's do it. So it's coming out in a couple of weeks.

So there is some hope. I know, you're saying, Danielle, nothing will happen in Congress. But we're also working with state lawmakers on comprehensive data protection laws, and there's good stuff happening there too. So with that, and with some optimism, I say thank you to all of you, and thank you for being my colleagues and dearest friends. I'm so proud to be at this institution and to have you all, our alumni, here. So thank you.