Russian disinformation on Facebook targeted Ukraine well before the 2016 U.S. election

In May 2015, Ukrainian President Petro Poroshenko submitted a question to a Facebook town hall asking chief executive Mark Zuckerberg whether the company would open an office in Ukraine to confront Russian manipulation of the platform.

“You know, over time it’s something that we might consider,” Zuckerberg responded. “So thank you for — the Ukrainian president — for writing in. I don’t think we’ve gotten that one before.”

In the three years since then, Ukrainian officials say, the company has failed to address most of their concerns about Russian online interference that predated similar interference in the 2016 U.S. presidential election. The tactics officials identified, such as coordinated activity to overwhelm Facebook’s reporting system and the use of impostor accounts, are the same as in the 2016 contest — and continue to challenge Facebook ahead of next month’s midterm elections.

“I was explicitly saying that there are troll factories, that their posts and reposts promoted posts and news that are fake,” Dmytro Shymkiv, then deputy minister of the presidential administration, said he told Facebook executives in June 2015. “They are promoted on your platform. By very often fake accounts. Have a look.”

Facebook has launched major reforms to its platform and processes since the 2016 U.S. presidential election made the company — and American users of Facebook — aware of how Russian actors were abusing it to influence politics far beyond their borders. But Ukraine’s warnings two years earlier show how the social media giant has been blind to the misuse of Facebook, in particular in places where it is hugely popular but has no on-the-ground presence. There is still no Facebook office in Ukraine.

Facebook officials defend their response to Ukrainian officials. They said Shymkiv did not raise the issue of Russian misinformation and other tactics in the meeting but that he talked instead about the company’s standards for removing content. They also said what they were alerted to in Ukraine was not a preview of what happened in the United States during the 2016 election.

Activists, officials and journalists from countries including Ukraine, the Philippines and Myanmar who reported abuses say Facebook took little or no action, according to an investigation for the documentary “The Facebook Dilemma,” airing Monday and Tuesday as part of the PBS series “Frontline.” It was not until after evidence that fake accounts from Russia were used to influence the 2016 U.S. election that the company acted, some said. This article is based on reporting done for the film.

“That was the moment when suddenly I got a lot of calls and questions,” said Shymkiv, who left the government recently to return to private industry. “Because we were one of the first ones who actually told them that this is happening.”

In interviews, company executives said they were slow to act on other evidence that Facebook was causing what they called “real-world harm.”

“Mark has said this, that we have been slow to really understand the ways in which Facebook might be used for bad things. We’ve been really focused on the good things,” said Naomi Gleit, one of Facebook’s longest-serving employees and now vice president for social good. “It’s possible that we could have done more sooner, and we haven’t been as fast as we needed to be, but we’re really focused on it now.”

A team set up to safeguard the upcoming U.S. midterm elections will be reviewing and removing inappropriate posts in real time. Facebook in August removed 652 fake accounts and pages with ties to Russia and Iran aimed at influencing political debates — and an additional 82 Iran-backed accounts on Friday. False narratives about the Central American migrant caravan and mailed pipe bombs were rampant on the network this week.

Complaints about abuses and harm done overseas, where 90 percent of Facebook’s 2.2 billion users live, were not company priorities, experts say, a blind spot that may have led to missed signals before the 2016 U.S. election.

“Facebook’s tactic is to say, ‘Oh, we were blindsided,’ when in fact people had been warning them — pleading, begging — for years,” said Zeynep Tufekci, associate professor at the University of North Carolina at Chapel Hill, who began urging Facebook to remove false rumors during the 2011 Arab Spring revolutions. “The public record here is that they are a combination of unable and unwilling to grasp and deal with this complexity.”

Some former Facebook employees say that they were aware early on of Russian online interference in Ukraine, but either did not have a full picture of the interference or were unable to move the warnings high enough up the chain of command.

Alex Stamos, Facebook’s recently departed chief security officer, said the company had acted in Ukraine against Russia’s traditional cyber unit, the military intelligence agency GRU, which later stole emails from the Democratic National Committee. “We knew that they were active during the Ukraine crisis” in 2014, he said in an interview, referring to the pro-democratic Maidan Revolution and subsequent Russian invasion. “We had taken action against a number of their accounts and shut down their activity.”

But, he said, “we had not picked up on the kind of the completely independent disinformation actors” behind phony accounts circulating false news and posts, the sort of activity Shymkiv and other officials were flagging.

Elizabeth Linder, who until 2016 was Facebook’s London-based government and policy specialist for Europe, the Middle East and Africa, said disinformation was “absolutely hugely worrisome to countries, especially in Eastern Europe” before the U.S. elections.

But “in a company that’s built off numbers and metrics and measurements, anecdotes sometimes got lost along the way,” she said. “And that was always a real challenge and always bothered me.”

As Facebook pushed into new markets around the world, in some places becoming in effect the Internet by serving as the primary source of information online, it took few measures to ensure that its product would be properly used, critics said.

“They built the city but then they didn’t put any traffic lights in, so the cars kept crashing into each other,” said Maria Ressa, editor of Rappler, a prominent journalism website in the Philippines, which Facebook last month contracted to identify fake news and hate speech in the country.

In an August 2016 meeting with Facebook in Singapore, Ressa showed three Facebook employees how supporters close to President Rodrigo Duterte were using the platform to circulate disinformation and call for violence against critics. Facebook had taught Duterte’s campaign how to use its platform to communicate with voters — training it offered campaigns in other countries, too.

Facebook’s failure to heed the pleas of civil society groups on the ground in Myanmar, also known as Burma, as far back as 2015 has had an even more devastating result.

That was the year Australian tech entrepreneur David Madden, who was living in Myanmar, traveled to Facebook’s headquarters in Menlo Park, Calif., and gave a seminar for employees describing how the platform had become a megaphone for Buddhist leaders calling for killing and expelling the Muslim Rohingya minority. Facebook removed the particular posts Madden complained about at the time but “what we had not done until more recently is proactively investigate coordinated abuse and networks of bad actors and bad content on the platform,” the company said last week.

In March, the United Nations declared that Facebook had a “determining role” in the genocide. “Facebook has now turned into a beast, and not what it originally intended,” U.N. investigator Yanghee Lee said.

“I think we were too slow to build the right technical tools that can help us find some of this content and also work with organizations on the ground in a real-time fashion,” said Monika Bickert, head of global policy management.

As in many countries, Facebook had no employees or partnerships on the ground. It says this is changing but still refuses to disclose how many people it deploys country by country — something of great concern to Ukraine, Myanmar and other nations that suspect its content moderators are biased, inadequately trained or lack the necessary language and cultural fluency.

“We are working here in Menlo Park,” said Gleit, Facebook’s vice president for social good. “To the extent that some of these issues and problems manifest in other countries around the world, we didn’t have sufficient information and a pulse on what was happening.” Hiring more people overseas “can give us that insight that we may not get from being here.”

But, she said, “It’s not that we were like, wow, we could do so much more here and decided not to. I think we . . . were just a bit idealistic.”

In Ukraine, Russian information warfare was in full swing on Facebook and a Russian social media network during the revolution in 2014, government officials say. There was a daily flood of fake news condemning the revolution and trying to legitimize the invasion by claiming Ukraine was an Islamic State safe haven, a hotbed of Chechen terrorists and a country led by Nazis.

 “We tried to monitor everything, but it was a tsunami,” recalled Dmytro Zolotukhin, then working for the new Ukrainian government’s Information Analysis Center of the National Security and Defense Council, which investigated online disinformation. “Thousands of reports of fake news on fake pages came in.” With the help of hackers and other cyber experts, he says he traced some of these accounts back to the Kremlin, which was also amplifying the false claims on dozens of fake online publications.

After the revolution in 2014, and again in 2017, Facebook suddenly banned dozens of accounts owned by pro-democracy leaders. Zolotukhin and others concluded that Russian bots were probably combing past comments and posts for banned terms and sending the names and account URLs of the owners to Facebook with complaints.

Another problem was that someone — Ukrainian officials believe it to be Russia — created impostor Facebook accounts of real government ministries and politicians, including Poroshenko. The impostor accounts posted incorrect and inflammatory information meant to make the government look bad, said Zolotukhin, now the deputy minister of information policy. He and others begged Facebook through its public portal to add verification marks next to the real accounts and remove the fakes. But usually no action was taken.

“I asked for six months for my verification,” said Artem Bidenko, state secretary of the Information Ministry, who said someone had created a fake account using his name.

All this overwhelmed the new Ukrainian government, which was dealing with corruption in its ranks, a Russian invasion and the continuing onslaught of Russian propaganda. Shymkiv and others met to figure out how to get Facebook’s attention when they learned of the May 2015 town hall meeting with the Facebook CEO.

One town hall question — with a record 45,000 likes — asked whether the Ukrainian accounts were the victims of “mass fake abuse reports.” Zuckerberg replied that he personally had looked into it. “There were a few posts that tripped our rule against hate speech,” he said. He did not say whether Facebook had checked the authenticity or origin of the ban requests.

A month later, Facebook sent Gabriella Cseh, its head of public policy for Central and Eastern Europe based in Prague, to meet with Shymkiv, Bidenko and others in Kiev.

Shymkiv said he told Cseh that the government believed Russia was using Facebook accounts with fake names to post fictitious, inflammatory news reports and to engage in online discussions to stir up political divisions.

Facebook needed to send a team to investigate, he said. Ukraine’s stability as a new democracy was at stake.

Bidenko said Cseh agreed he could email her the names of civic leaders who believed their accounts had been wrongfully banned.

“People would come in here with tears in their eyes,” said Bidenko, seated in his crumbling Soviet-era office. “They would say, ‘I wrote nothing bad and they banned me.’ I would write to Gabriella.”

At the end of the meeting, according to Shymkiv, Cseh promised to review the cases, which Facebook says it did. Then she handed him a copy of Facebook’s Community Standards policy, which is available online.

This appeals process worked well for about two years, Bidenko said.

But Cseh has been silent, Bidenko said, since an email she sent him on April 13, 2018, two days after Zuckerberg testified on Capitol Hill and public scrutiny of Facebook intensified. He figures she and the company became too busy with other problems to respond. But to his astonishment, she also unfriended him.

“I was like, what!? Why is Gabriella unfriending me?” he said. “Maybe I became a nuisance.”

Facebook declined to make Cseh available for an interview and didn’t respond to a question about why she unfriended Bidenko. In a statement, the company said: “Gabi has previously made it clear to Mr. Bidenko that she might not respond to every single one of his messages, but that doesn’t mean she isn’t escalating the issues he flags to the appropriate internal teams.”

In August, Zolotukhin met with Facebook officials and said he reiterated the same concerns. He sent them a list of pages that still needed verification marks and they complied soon thereafter.

Bidenko, Zolotukhin, hackers and journalists are eager to open their laptops and scroll through what they say is fabricated news, some of it featuring gruesome videos. “Phosphorus burns everything: Ukrainian militia is using illegal weapons,” said a repost of a YouTube video from 2017. “Executioners were harvesting internal organs for sale,” read a post from a Russian website.

More than 2,000 Ukrainians have been killed and the war remains active, making Russia’s continuing clandestine attacks via Facebook an urgent national security matter.

Facebook recently posted a job for a public policy manager for Ukraine — based in Warsaw.

“Facebook is trying to stay on the sidelines” of the war between Ukraine and Russia, Zolotukhin said. “But now it is not about saying you’re for democracy. It’s about fighting for democracy.”



Dana Priest, a reporter at The Washington Post for 30 years, covers national security issues and is the Knight Chair in Public Affairs Journalism at the University of Maryland’s Merrill College of Journalism. She has investigated Russian disinformation operations, censorship around the world, the national security state, CIA operations and veterans issues. Jacoby and Bourg are producers for the PBS investigative series “Frontline.” “The Facebook Dilemma,” a two-night “Frontline” documentary, premieres Monday, Oct. 29, at 9/8c and Tuesday, Oct. 30, at 10/9c on PBS (check local listings) and online.
