Fake news, propaganda spread quickly on social media


From left: Spencer Christensen, Adam Durfee and K.C. Miller monitor the Utah County Fire Relief website. The website was designed as an information hub for those affected by the Bald Mountain and Pole Creek fires. (Ty Mullen)

See also: 'Responsibility lies with users to fight hacking, online misinformation'

As the Bald Mountain and Pole Creek wildfires raged across Utah Valley in September, BYU YDigital Lab Managing Director Adam Durfee saw another destructive force raging across local news and social media: misinformation.

'A very popular, trustworthy Utah news outlet published a story about the fires that was blatantly inaccurate,' Durfee said. 'And then a second story misrepresented the amount of fire containment, which gave people a very scary amount of security they shouldn't have had.'

Durfee said most of the misinformation originated with poorly informed sources. To help combat the problem, a communications team based out of a temporary fire headquarters in Salem, Utah, asked him to help create a centralized information hub.

Working with public information officers (PIOs) and BYU students Spencer Christensen and K.C. Miller, Durfee established the Utah County Fire Relief website, where PIOs and public safety departments could post fire updates and press releases directly.

'Somebody could read accurate information from the proper sources vetted by a crisis communications director in one spot,' Durfee said.

Within 24 hours, the site had gained about 25,000 views, Durfee said. However, after establishing a popular, trusted source of information, Durfee and his team found themselves in a unique ethical position: anyone with access to the site could have posted misinformation, feeding back into the cycle they were trying to fight.

'It was important to recognize the power the site held once it was announced by Lt. Gov. Spencer Cox and the sheriff's office,' Durfee said. 'We had a bunch of 22-year-olds manning the site and any one of them could have spread misinformation. It would have been an abuse of power.'

https://soundcloud.com/samholden95/fake-news-and-social-media-1

'Fake News'

Durfee said fake news, a phenomenon now common online and on social media, spreads in an information vacuum. Pizzagate, a social media misinformation campaign encouraged by conspiracy theorist Alex Jones, claimed then-presidential candidate Hillary Clinton ran a trafficking ring out of a pizza place in Washington, D.C. The false story led to an active shooter situation, according to Rolling Stone.

This phenomenon isn't necessarily new. From the late 1980s through the early 2000s, artist Joseph Matheny seeded the internet with false information about the Ong's Hat urban legend as a type of game. The legend claimed a cult had discovered a means of inter-dimensional travel, according to Slate's Decoder Ring podcast.

While Matheny's game, dubbed 'The Incunabula Papers,' was initially harmless, it eventually led to him being followed by obsessed and angry fans. According to Slate, Matheny pulled the plug on the project in 2001.

According to Durfee, social media misinformation usually spreads due to a user's status online. If users are trusted by a group of people, their posts are more likely to be accepted and shared as truth.

Utah Valley University assistant professor David Morin is the chair of the Department of Communications and manages the university's NUVI Social Media Command Center. He said fake news is often cultivated by extreme factions on the internet. From there, it only takes one 'pseudo-legitimate' website to give a false idea the credibility it needs to spread on social media.

Morin and Durfee also said bots, automated social media accounts, can be set up easily and in large numbers to post information and ideas at high frequency. While businesses can use the technique for marketing posts, internet trolls can also harness it to spread fake news through velocity, volume and confirmation bias.

'You can buy bots for anything, and it's relatively easy for someone with basic programming understanding to create a simple bot that can tweet and retweet messages thousands of times,' Morin said.
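
A bot of the kind Morin describes can indeed fit in a few lines. The sketch below is a minimal illustration, assuming the open-source tweepy library (4.x naming) and access to Twitter's v1.1 API; the credentials, message and hashtag are all placeholders, not working values.

```python
# Minimal sketch of a tweet-and-retweet bot, assuming the open-source
# tweepy library and Twitter's v1.1 API. All credentials and the search
# query below are placeholders.
import time
import tweepy

# Authenticate with placeholder developer credentials.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

MESSAGE = "Example message pushed on a schedule."

while True:
    # Post the same message over and over.
    api.update_status(MESSAGE)

    # Retweet recent posts matching a hashtag, amplifying their reach.
    for tweet in api.search_tweets(q="#example", count=10):
        try:
            api.retweet(tweet.id)
        except tweepy.TweepyException:
            pass  # already retweeted or otherwise rejected

    time.sleep(60)  # repeat every minute
```

Run in parallel across hundreds of purchased accounts, a loop this simple can generate the velocity and volume Morin describes, which is also why platform rules against automated amplification let companies identify and ban such accounts in bulk.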

Leading up to the 2016 presidential election, Russian hackers used social media bot accounts to push misinformation that likely contributed to President Donald Trump's victory, according to a study by Ohio State University. According to the study, U.S. citizens who voted for Barack Obama in 2012 were most affected: 10 percent cast ballots for Trump, 4 percent changed political affiliation and 8 percent did not vote due to fake news statements.

Bots and trolls have also continued spreading fake news to influence politics, leading social media platforms to take action. In October, Twitter banned about 1,500 accounts created by right-wing internet trolls for spreading misinformation claiming the midterm elections, scheduled for Nov. 6, would instead be held on Nov. 7, according to The New York Times.


The anonymity provided by the internet often causes people to act out in ways they might not in person, according to J. Reuben Clark Law School associate professor Stephanie Bair. This infographic provides information on some malicious social media accounts and tactics used to bully and mislead others. (Sam Bigelow)

The accounts — dubbed NPCs after 'non-playable characters' in video games — originated from alt-right Reddit and 4chan message boards. The fictional personas, according to the Times, were identified by gray cartoon faces and their penchant for imitating and mocking liberal rhetoric.

Leading up to the Nov. 6 midterm elections, Facebook blocked more than 100 Facebook and Instagram troll accounts allegedly connected to the Russia-based Internet Research Agency, according to The New York Times. The agency was indicted in February for attempting to illegally influence the 2016 presidential election through social media campaigns.

Social media manipulation also includes trolling and cyberbullying. According to a study from the University of Southern California, Russia's social media campaign leading up to the 2016 election relied heavily on online bullying through bot accounts, trolls and sockpuppet accounts, anonymous accounts used to hide a user's true identity.

According to the study, these tactics were primarily used to amplify extreme right-wing ideologies and politics; 50.9 percent of the accounts imitating negative fan reactions to 'Star Wars Episode VIII: The Last Jedi' did so to help push politically motivated views.

J. Reuben Clark Law School associate professor Stephanie Bair said the anonymity granted by the internet is a factor in emboldening people to act in ways they normally wouldn't.

'While someone might not feel OK bullying another in person, they may feel relatively safe doing so from behind a pseudonym on a computer,' she said. 'This may be because they have less to fear by way of social sanctions from others, or it could be that they don't fully realize the harm they are causing because the victim is not in front of them.'

Bair also said each state takes a different approach to cyberbullying. Utah requires schools to enact anti-bullying policies, but victims of online abuse generally must rely on civil suits, where emotional and psychological damages can be hard to prove.

Beyond social manipulation, the effects of online abuse can include depression, social isolation and even suicide, Bair said.

Hackers and their influence

While social media plays an important role in spreading fake news, sometimes the truth itself can be stolen and used to influence public opinion. According to CNN, the FBI discovered in September 2015 that Russian hackers had compromised computers used by the Democratic National Committee. It was later discovered that confidential DNC documents and Clinton campaign chairman John Podesta's emails had been stolen and turned over to the whistleblowing platform WikiLeaks.

According to CNN, the stolen documents were released throughout the run-up to the 2016 election with the intention of influencing voter decisions. In response to the hacks, then-President Barack Obama issued sanctions against Russia and confirmed the nation's role in the breach.

'Russia's cyber activities were intended to influence the election, erode faith in US democratic institutions, sow doubt about the integrity of our electoral process, and undermine confidence in the institutions of the US government,' a White House statement said.

J. Reuben Clark Law School professor Eric Jensen said the U.S. regularly combats foreign hackers working for rival nations. Jensen, who worked as an attorney for the Department of Defense for 20 years and specializes in cyber warfare, compared hacking to invading a medieval castle.

'To protect your city or your castle, the walls had to be thick and effective all the way around,' he said. 'All a foreign invader had to do was find one weak spot in the wall.'

Similarly, Jensen said defending cyber assets is much more difficult than attacking them, giving hackers an advantage. 'It's always harder to defend your stuff than it is to attack it,' he said. 'It's not a matter of if, it's just a matter of when, since every cyber wall has a weak point.'

Jensen also said different nations seek different information, from intellectual property to federal policy documents. Some hackers, however, attempt to steal personal information from ordinary people, which can erode the nation's sense of security.

Voter manipulation

In elections, hacking and social media campaigns can be used to manipulate voters, according to election law attorney and BYU political science alumna Audrey Perry Martin. This was particularly visible during the 2016 general election, which she said was unique because 'foreign nationals were indicted for illegally trying to disrupt the U.S. political process via social media.'

Martin said American political campaigns often use social media to attempt to influence elections in their favor, a tactic that is 'widely accepted.'

'These groups spend a lot of money on digital ads and social media strategies to get their message to voters,' she said. 'They use all the social media platforms to organize events, find volunteers, remind people to vote and run advertisements.'

Political campaigns will also buy email addresses, send text messages and track social media posts to target major issues, according to Martin. However, political campaigns and groups are required to disclose these actions. According to Martin, the same tactics can be co-opted by adversarial groups.

According to The New York Times, Russian nationals purchased and ran a variety of political ads leading up to the 2016 election. These posts included positive images of then-candidate Donald Trump, an image of Jesus arm-wrestling a Hillary Clinton-supporting Satan, and attacks on Clinton and Sen. Bernie Sanders, I-Vermont.


This meme depicting Satan arm-wrestling Jesus in support of 2016 presidential candidate Donald Trump was run as an ad on Facebook by an account associated with Russia leading up to the general election. Ads and memes were used by Russian nationals to influence public opinion regarding the 2016 election. (Screenshot)

Twitter also said in a statement that the Russian state-owned news site Russia Today spent $274,100 on ads targeting the U.S. market in 2016. The company also noted that during the election it deleted tweets circulating false information, such as claims that Clinton supporters could vote via text.

Although Facebook and Twitter officials have made efforts to clamp down on misinformation, Business Insider found it was able to purchase and run two fake ads on Facebook. The posts were credited as 'paid for by Cambridge Analytica,' a political consultancy currently banned from Facebook over data concerns.

According to Martin, Russian nationals used targeted ads run from fake social media accounts to mislead U.S. voters, and many of the ads simply focused on propaganda. She also noted the Internet Research Agency, a Russian agency specializing in social and political manipulation, spent $1.25 million a month interfering with the election and reached 11.4 million people with its advertisements.

Despite Russian nationals' reach on social media, Martin said she doesn't think they had any more influence than a regular campaign or super PAC would. The primary difference, she said, is the transparency behind the messaging.

'Just like any ad, for or against a campaign, these ads would help or hurt a campaign if they had an effective message,' she said. 'However, unlike regular campaign ads, it was impossible to tell what entity was paying for the advertisements.'