Maci Jacobson was watching the world fall apart. With a public health crisis, intense political friction and racial violence, it seemed like everything had “hit the fan” in 2020, and the movie she had just watched hadn’t helped.
The credits were rolling on “The Social Dilemma,” a docu-drama featuring interviews with former executives from Facebook, Twitter and Google. Jacobson was no stranger to social media — its good and bad sides — but she was startled by the haunting question the executives asked:
“What have we done?”
Everything was starting to make sense. Jacobson had a wide circle of connections with different views, and she tried to be empathetic and open, but lately it was getting harder and harder.
As a Ph.D. candidate in neuroscience, she said her study of the brain has helped her be more understanding.
“Environment plus genetics creates kind of who we are,” she explained.
Even so, Jacobson noticed a disconnect between herself and some of her friends. Nobody in the country could agree on anything, but it seemed strange that the divide was creeping into her personal relationships.
And she didn’t know why.
One idea from the movie stuck with Jacobson. She said she kept thinking about how social media supposedly separates people into different realities. At first, Jacobson didn’t believe it was true. As the disagreements and confusion continued, she decided to put the theory to the test.
Jacobson created three different Instagram accounts: @macitheliberal, @macitheconservative, and @macithehuman to dive into the different sides of social media. On the liberal account, her posts featured feminism, Black Lives Matter, and disparaging comments about Donald Trump. Her conservative account posted positive things about Trump along with other buzzwords like “Sleepy Joe” and gun rights.
Within a week, Jacobson was blown away by the results. Instagram would suggest posts that completely contradicted each other. On any one account the message was consistent, but between accounts there was almost no common ground.
One topic was particularly disheartening. As she read posts about the Breonna Taylor case, she was shocked at how many pieces seemed to be missing. Jacobson said she realized the conservative side offered logic at the expense of humanity, while the liberal side offered emotion without the logic.
“On one hand, yeah, you might have the facts of the case, but you’re ignoring the fact that someone’s sister and daughter died. And then, on the other hand, you’re ignoring the facts of the case,” she said. “I was like, this is two different stories.”
As she jumped between accounts, Jacobson understood why it was so difficult to get on the same page as her friends — they were living in two psychologically different worlds.
Our own social dilemma
Advertising professional Ryan Smith teaches social media management at BYU, showing communications students how to use social media strategically for a brand. Smith teaches them how to navigate each platform’s unique algorithm to build a successful account.
Smith talks about “the algorithm” using air quotes and the same ominous tone one would expect from a campfire ghost story. He treats it like a mystery, but he understands it better than the average Tweeter or Instagrammer.
The algorithm is a complex program that collects data on users to give them a personalized experience on social media. Using this data, the algorithm finds and shows the posts, profiles and advertisements that will be most enticing to each user.
Interactions in the digital world are meant to be an expansion of the physical world. The World Wide Web means a wider circle of people and a broader scope of perspectives.
Smith, however, said most users are forgetting about social media’s main — and monetary — goal.
“They think the world is their audience,” he said. “They think they’re seeing world views from everybody but they don’t realize that the idea of social media is to keep you on it.”
The latest numbers say it’s working. A 2019 study by Common Sense Media found that on average, American teenagers spend 1 hour and 21 minutes on social media every day.
While his classes are focused on professional strategy, Smith does not gloss over personal use and responsibility. He encourages budding professionals to audit their social media accounts often, watch what they post and beware of echo chambers.
Living in a bubble
“Echo chamber” is a term social media marketers coined to describe the digital environments nearly all users find themselves in.
The imagery of an echo chamber illustrates how users are mostly hearing echoes of their own ideas on social media. If a post or a profile isn’t important or interesting to a user, they won’t interact with it — and platforms will do anything needed to keep their customers engaged.
“You are what you like,” Smith said. “You’re almost empowered by your own thoughts because nothing else is in front of you and you just really think what you’re saying is right.”
Echoes in a pandemic
To illustrate the power of social media in forming one’s world view, Nataliya Roman and John H. Parmelee published a 2020 article titled “Insta-echoes: Selective exposure and selective avoidance on Instagram” in Telematics and Informatics, an Elsevier journal. They found that Instagram users had a strong tendency to avoid disagreeable information, especially about politics.
“Only 15.3% of users follow leaders whom they disagree with politically,” Roman and Parmelee said in their study.
The researchers found that strong liberals and strong conservatives were most likely to avoid things that challenge their views. Jacobson’s experience supports that.
In a democracy, this is not an encouraging trend.
Avoiding ideas and people that challenge personal beliefs, especially for the most ideological, creates a risk of polarization, extremism, intolerance and “a dangerously narrow perspective on the possible solutions for key political issues,” Roman and Parmelee said.
As the world has moved from physical to digital spaces, the extra time spent online has deepened polarization.
“The web has features that can exacerbate selective exposure, such as the large number of highly partisan sites and algorithmic filtering on search engines that steer users toward like-minded content,” Roman and Parmelee explained.
Carl Hanson, a professor of public health at BYU, is researching Twitter and the COVID-19 vaccines. As online conversations about the pandemic have become the norm, Hanson said, the resulting “infodemic” has made reliable answers harder to find.
“Too much information can drown out reliable information about a problem,” he said.
Social media messages about COVID-19 and its vaccines are being posted, shared and liked by users on every stretch of the political spectrum, and often within echo chambers.
Along with spreading misinformation and division, Smith said online conversations about the pandemic are fueling widespread confusion.
“It’s really hard when the people we’re used to relying on for our info aren’t agreeing,” he said. “You’re not getting opinions that are differing, you’re getting information that feels like it should be, you know, ‘FDA approved’.”
Misinformation? Or disinformation?
Most social media users have seen at least one quick but fiery Twitter debate over what should or shouldn’t be taken down. These exchanges capture the supercharged emotions surrounding a common question about social media: Who has the right solutions? As people try to find information online about the pandemic, ethical problems with censoring content, or choosing not to, are popping up more often.
In their 2021 article for the American Journal of Public Health titled “First Do No Harm: Effective Communication about COVID-19 Vaccines,” David Broniatowski, John Ayers and Mark Dredze said strategies from social media platforms and communication professionals are doing more harm than good.
The “Streisand effect” is a phenomenon in which censoring or hiding content increases awareness of and interest in the censored information. When large social media platforms like Facebook, Instagram and Twitter hide information about the vaccines — true or otherwise — vaccine opponents and those with questions are likely to give more weight to the censored information.
“Social media companies lack formal training and clear accountability mechanisms for differentiating between blatantly false content and legitimate scientific uncertainty,” said Broniatowski, Ayers and Dredze. This only furthers confusion, frustration and mistrust of public health establishments.
Censored content tends to show up on alternative social media platforms, and any misinformation — or disinformation — spreads further.
Building the walls
After former U.S. President Donald Trump was banned from Twitter in January 2021, conservatives left the platform in droves for Parler, an overwhelmingly right-wing social media platform. Parler was a mega-echo chamber, and Twitter, which Pew Research Center said was already 60% left-leaning, was in danger of becoming the same on the other end of the political spectrum.
Jeanine Guidry, a behavioral scientist and assistant professor at Virginia Commonwealth University, said echo chambers are so tempting because they eliminate the stress of challenging personal beliefs.
A single Google search can return several million hits, which Guidry said is difficult to process, especially in an infodemic.
“So we go to what we can trust,” she explained. “We go to what we find accessible, and a lot of times that can be in an echo chamber.”
Guidry said the pandemic’s shift of everyday life online has heightened the prevalence and power of echo chambers.
“We spend a lot of time at home where the high-speed internet lives,” she said. “We spend less time in person with people and then we’re trying to find solutions and then we are ending up with information that we think may sound right.”
The other problem with misinformation, Guidry said, is that often “the first thing we believe about something is the thing we’re going to stick with,” which makes correcting false information difficult. Once an idea has entered an echo chamber, it is difficult to get it out.
While ideological biases incubate in echo chambers, the group in the middle gets left behind. Guidry, whose research focuses on the use of social media in health communications, said social fear has played a significant role. Political moderates, or in the case of the pandemic, the vaccine-hesitant, are wary of joining the conversation.
“If I mention that, I’m going to get preached at or I’m going to get unfriended on platform X-Y-Z,” Guidry said, putting herself in the shoes of the vaccine-hesitant.
The vaccine-hesitant are frequently grouped with vaccine opponents, even though they haven’t taken a side; they’re just looking for answers. Broniatowski, Ayers and Dredze said that by not engaging with the vaccine-hesitant on social media, public health professionals are creating a bigger problem for those with questions.
These researchers called the resulting divide a “void” that is filled by agitated users who are “not trained in effective communication and who mischaracterize the vaccine hesitant as stupid, science deniers or conspiracy theorists.”
When the vaccine-hesitant get on social media, they see that caricature from one side and messages demeaning the vaccine from the other. Broniatowski, Ayers and Dredze said this divide may be “ineffective and possibly harmful, making everyone more vulnerable.”
While the word “unprecedented” can describe just about everything with the pandemic, Guidry said the conversation around the COVID-19 vaccines is anything but.
In her study of vaccines and social media, she’s found that vaccine hesitancy and refusal are nothing new. With measles, flu and even polio vaccines, people have worried about side effects, how the vaccines were developed and the motives behind that development.
Social media broadened the reach of public health messages in 2020, but the benefit may be lost if the messages bounce off echo chamber walls. Guidry said the consequences of the resulting divide could be further-reaching than just COVID-19.
“One of the big concerning things is we’ve seen actually the level of vaccinations go down for other vaccines,” a trend that could take years to correct, Guidry said.
Since her experiment making different Instagram accounts, Jacobson has become wary of social media’s influence. As she scrolled through political content, Jacobson said she realized it was making her feel isolated instead of informed. She deleted all but one of her profiles.
“It still would make me angrier than I enjoyed being,” she said. “At least I recognize that my confirmation bias is, well, strong and alive.”
After the 2020 election, Jacobson made a resolution: She would never talk about politics on social media. Talking to people and not a profile leaves more room for empathy, she said.
“There’s a lot of social neuroscience behind that,” Jacobson said, explaining her preference for face-to-face discussions of politics. “We’re more willing to acknowledge, all right, you’re right on that one or I see where you’re coming from with that kind of thing.”
Echo chambers reflect the real emotional response to disagreements and discomfort: hiding.
It isn’t easy to question personal opinions, and recognizing one’s own biases is harder still. Consciously or not, ignoring the discomfort is a common choice.
The solution may not be clear, but “we can’t afford not to address it,” Guidry said.
Smith said self-awareness and action are the key to change. The algorithm isn’t necessarily biased against “one lane,” but it does direct traffic to keep users in a certain place.
“If you say my lane is two-way, I want things coming from different directions, it’s going to do that for you,” he said.
Smith suggested being proactive in looking at posts and profiles that challenge long-held opinions. Searching for differing ideas and engaging in new conversations is a simple way to reprogram the algorithm.
From a human behavior perspective, Guidry said validating one another’s experiences is important in bridging the gap. Guidry was optimistic about the potential for improvement. Breaking down the walls of an echo chamber could be as simple as keeping an open mind.
“Don’t close your doors,” she said. “Especially in this polarized time.”