Mad About Media

Alice Marwick learned how to code when she was 11 and began working in the tech industry at 19. After the dot-com bubble burst, she realized she could combine her passion for technology with her love for social science in a graduate program. Now, the UNC communications professor researches disinformation and privacy, two of the most pressing issues in the world of media ethics.

An illustration of Alice Marwick with a bunch of old computers behind her (illustration by Corina Cudebec)
November 18th, 2020

Within 24 hours of Election Day, six tweets by President Donald Trump were flagged by Twitter for “violating its rules because they included unsupported claims of widespread election fraud and premature declarations of victory in key battleground states,” according to The New York Times.

President Trump had claimed he won the race in Pennsylvania — which was unfounded. It took five days for media networks to declare his opponent, former Vice President Joe Biden, the winner based on already tallied votes. And the state won’t even declare official counts until 20 days after the election.


During the 2020 presidential election, Twitter flagged a series of President Trump’s tweets as potentially spreading misleading information about the election. (screenshot from Twitter)

This isn’t the first time the president has attempted to use disinformation to his advantage.

“In the weeks before the election, President Trump purposefully spread information about voter fraud and inaccurate voter counts in what seems to be a strategic attempt to undermine faith in the democratic process,” says Alice Marwick, a UNC communications professor.

Misinformation is incorrect or misleading information. Disinformation is false information that is purposefully spread in an attempt to sway people’s beliefs. To what degree does this work? That’s what Marwick hopes to uncover.

While media theory researchers have not found an immediate link between viewing media and changing political beliefs, there is some evidence suggesting that, over a long period of time, our worldview is shaped, in part, by the media we consume.

“There’s this idea that people get radicalized by things they see online,” Marwick says. “But the idea that you’re just going about your business, you see something, and then suddenly you’re a racist — that is not empirically supported by any media theory.”

In her most recent project, Marwick questions narratives of online radicalization and strives to discover how and why people come to believe fringe, false, or extremist viewpoints that they encounter on social media platforms.

This research is just one part of a larger goal: to understand how digital media affects collective audiences and individuals — work that stems from a passion for technology.

Beginning with BASIC

The summer before sixth grade, Marwick and her dad — who worked for IBM — wrote a rudimentary version of a computer program using BASIC, an old-school programming language. Called ELIZA, the program was invented in the 1960s and is supposed to emulate a therapist, mimicking modern-day chatbots in the simplest way. The user would type something like, “I’m having a difficult day,” and ELIZA would respond: “Tell me about your day.”
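ELIZA’s therapist illusion rests on simple keyword matching: scan the user’s input for a known pattern and reflect it back as a canned prompt. A minimal Python sketch of that idea follows; the rules and replies are illustrative inventions, not the BASIC program Marwick and her father wrote.

```python
import re

# Illustrative ELIZA-style rules: a regex to match in the user's input,
# paired with a canned, therapist-style response template. "{0}" is filled
# with the last word the pattern captured.
RULES = [
    (re.compile(r"\bI'?m having (a|an) (\w+) day\b", re.IGNORECASE),
     "Tell me about your day."),
    (re.compile(r"\bI feel (\w+)\b", re.IGNORECASE),
     "Why do you feel {0}?"),
    (re.compile(r"\bmy (\w+)\b", re.IGNORECASE),
     "Tell me more about your {0}."),
]

def eliza_reply(text: str) -> str:
    """Return the first matching canned response, or a generic fallback."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match and match.groups():
            # Reflect the last captured word back into the template
            # (templates without a "{0}" placeholder pass through unchanged).
            return template.format(match.groups()[-1])
    return "Please, go on."
```

For example, `eliza_reply("I'm having a difficult day")` returns the scripted “Tell me about your day.” — the program never understands the input, it only pattern-matches on it, which is why the mimicry is “the simplest way” to emulate a therapist.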

“It was just kind of something I did for fun,” she says, chuckling.

Marwick has never known a life without computers. While access to technology is more typical today, it was certainly not the norm for someone who grew up in the ’80s.

“At that time, if you were interested in computers, you just learned to do a lot of things yourself,” she explains. “You learned how to put a computer together yourself. You learned how to code yourself. You learned how to network. You kind of just had to. So I’ve always had this hobbyist interest in computers.”

Marwick is a social media pioneer of sorts. In the ’90s, she had a homepage — a personal website that highlighted whatever the user wanted. She was also really into LiveJournal, one of the first social networking services, created in 1999, and was an avid user of an early chat protocol called Internet Relay Chat.


Alice Marwick (photo by Jon Gardiner)

“I’m a very social person,” she admits. “So when the internet came around, I combined my hobbyist interest in computers with talking to other people.”

She interned with Microsoft in college, and after graduating in 1998, began working at a tech startup in Seattle, where she conducted research on how people use technologies and developed products for specific audiences, including a cell phone for teenagers.

Then, in the early 2000s, the dot-com bubble burst. Marwick was fired.

To use her newfound free time to her advantage, she enrolled in a graduate program in the communication department at the University of Washington, where she studied online identity.

“I ended up just deeply falling in love with the ability to fuse the critical theory I had learned in my undergraduate degree in women’s studies with the joy and entertainment that I found in social technology,” Marwick shares.

Unpacking privacy

Before joining UNC in 2017, Marwick spent much of her time studying online privacy and surveillance. Her current book, called “The Private is Political,” discusses networked privacy — the idea that we don’t have individual control over the information we share.

“Mainstream media often focuses on privacy as an individual right or something we have control over. So if you post something on social media, and it leaks, it’s your fault,” she says. “The view I take is that it’s inevitable, that these technologies are sort of designed to leak. And instead we should be thinking of structural solutions to privacy rather than individual solutions.”

The book also unpacks the distribution of privacy, which, according to Marwick, is incredibly unequal. Different populations are granted different levels of privacy. Socioeconomic status, sexual orientation, race, and gender all play a role in how much or how little privacy people feel they have.

In a 2017 study, Marwick interviewed 28 young adults who identified as having low socioeconomic status about online privacy and information sharing. She and her collaborators were “struck by how mindful [their] participants were of what they put online.”

One participant, a 17-year-old African American man named Malik, knew employers and college admission officers might look at his social media, so he was careful about what he posted. More specifically, he explained that he didn’t need to “put a mask over everything” on his profiles, unlike some of his peers, because he had already censored them.

Batuk, an 18-year-old Indian man, agreed that limiting information online is smart, though more from the perspective that people aren’t entitled to share every view they have about society. Like Malik, he felt a sense of personal responsibility for how his information is spread.

Marwick also asked her study participants about their involvement with and perception of police surveillance. Many believe that physical surveillance by police is unavoidable, even if they try to stay out of trouble. But they are much less worried about online spaces.

“I think that in the physical world it’s more intrusive and violent to be surveilled,” says Beth, a 21-year-old African American woman who participated in Marwick’s study. “Like you can see and feel the camera bearing down with its red dot of death. And you can feel the stare of the police officer watching you intently across the street while you’re not doing anything, just because you’re Black. Whereas online that type of surveillance can be happening, maybe even more magnified, but you don’t detect it at all because it’s happening behind a screen.”

In truth, both police and online surveillance are inevitable. Marwick was surprised that participants didn’t take responsibility for their privacy when it was violated by police, but in online spaces, they blamed themselves if their information leaked, even though it too is structural and unavoidable.

“We thought [surveillance by law enforcement] might be a good example that educators could use to show people that privacy violations are structural and to push back against individual responsibility,” Marwick says.

Detecting disinformation

While disinformation seems like a modern-day problem birthed from social media, it is an age-old concept, according to Marwick.

One Monday morning in May 1921, a young Black man named Dick Rowland enters an elevator. Inside is Sarah Page, a white elevator operator. What happens next is unclear. Page screams. Rowland flees the scene. Forty-eight hours later, 35 city blocks are firebombed and more than 300 people are killed.

Today, this event is known as the 1921 Tulsa Race Massacre. It occurred in the Greenwood District of Tulsa, a predominantly Black middle-class community often called “Black Wall Street.” How did one moment spark one of the worst massacres in 20th-century U.S. history?

Disinformation certainly played a role.

Whatever happened between Rowland and Page, rumors circulated throughout the community, with local news outlets reporting a variety of scenarios. When whispers of a large-scale revolt among Black Tulsans began spreading, a group of white rioters — some of whom were deputized and given weapons by city officials — began looting homes, burning buildings, and committing numerous acts of violence.

Even today, the Tulsa Race Massacre is seldom openly discussed in U.S. history classes. That in and of itself, Marwick says, could be considered a disinformation campaign.

“This is a significant event that had huge impacts on the Black community in Oklahoma — and probably Black communities around the country — and it has been suppressed strategically in order to cover up what happened,” she says.

Disinformation permeates history. In 1475, a preacher in Italy blamed the disappearance of a 2-year-old child on the Jewish community, resulting in the death of 15 Jews. In 1782, Benjamin Franklin circulated false stories claiming that American Indians were murdering innocent women and children at the behest of King George. In 1917, British newspapers reported that Germans were addressing their fat shortage by boiling the corpses of their own soldiers for fats, bone meal, and pig food. In the 1980s, Russian media outlets claimed the U.S. created AIDS.

Social media and the political polarization of the nation have upped the pace of this kind of rumor-spreading.

Most recently, Marwick received a Carnegie Fellowship to continue studying far-right groups and the spread of disinformation. For this project, called “Redpills and Radicalization: Understanding Disinformation’s Impact,” Marwick will explore redpilling — how these groups use the internet to recruit others — by collecting first-person accounts from people within extremist communities. She will sift through online spaces like Endchan, a far-right online discussion board, and explore the intricacies of conspiracies like QAnon, a right-wing theory alleging that a gang of Satan-worshipping pedophiles is running a global child sex-trafficking ring.

“You’re trying to understand a community on their own terms,” Marwick says. “You don’t have to agree with what the people of the community believe. You don’t have to love the members of the community. But you want to understand how they think about themselves, how they make meaning of the world, what values they ascribe to different parts of their life.”

Marwick uses ethnographic observation to understand how these groups operate. She does this by watching community interactions and consuming their media. But in her newest project, she hopes to interact directly with the people who participate in far-right spaces.

Engaging in this kind of research is not easy on the mind, Marwick stresses. Self-care is crucial when working in such hateful environments.

“I have to read a lot of racist stuff. I have to read a lot of sexist stuff. I have to interact with points of view that are anti-democratic and, in many ways, I think reprehensible,” she says. “And that can take a toll just because it can be very hard, day after day, to interact with those viewpoints and maintain any kind of positive attitude or outlook on humanity.”

Forging friendships  

Marwick is not alone when it comes to processing the difficult information she encounters through her research. One line of support comes from the very place she studies — online spaces — where she relies on an informal group of fellow disinformation researchers for empathy and idea generation.

Another she finds here at UNC at the new Center for Information, Technology, and Public Life (CITAP). Co-founded in 2019 by Marwick, journalism professors Deen Freelon and Daniel Kreiss, and information and library science researcher Zeynep Tufekci, the center is dedicated to researching the growing impact of the internet, social media, and other digital information-sharing technologies.

“We started collaborating together pretty much immediately upon arriving at UNC,” Marwick says. “We’re committed to translating our work to the public. All of us write editorials, we blog, we tweet, we talk to the press. We’re all interested in creating knowledge based on empirical research that has real-world impact.”

While the work of Marwick and her fellow CITAP researchers is, at times, pretty heavy, she stresses the value the internet brings to people’s lives, like maintaining social connections over long distances, allowing average people to broadcast their thoughts and feelings to the world with little friction, and bringing attention to social issues like never before.

“I don’t want to throw the baby out with the bathwater,” she says. “I think the internet is a wonderful thing for the world, and I think social media has a lot of positive benefit. I think we need to think about how we can take advantage of those positive benefits while rejecting the parts that are problematic or are causing more trouble than they’re worth.”

Alice Marwick is an associate professor in the Department of Communication within the UNC College of Arts & Sciences. She is also a co-founder of and principal researcher at the UNC Center for Information, Technology, and Public Life.