“We need your support — in 2022, modern technology is perhaps the best answer to the tanks, multiple rocket launchers … and missiles,” he wrote.
He also tweeted early Saturday that he had contacted Facebook parent company Meta, Google and Netflix, asking them to suspend services in Russia. He called on YouTube to block “propagandist” Russian channels.
Sen. Mark R. Warner, chair of the Senate Intelligence Committee, called on Twitter and Facebook’s parent, Meta, to “assume a heightened posture” against information operations linked to Russia. The Virginia Democrat warned that as the invasion advances, “we can expect to see an escalation in Russia’s use of both overt and covert means to sow confusion about the conflict and promote disinformation narratives that weaken the global response to these illegal acts.”
And on Twitter, users called on their followers to report a YouTube channel with more than 22,000 subscribers that has been sharing videos appearing to reveal the movements of Ukrainian troops. YouTube did not respond to a request for comment about the channel or its videos.
Tech companies have long positioned themselves as beacons of free expression and democratic standards. But the war in Ukraine is testing those values in new ways. From the halls of Congress to the Twitter feeds of pro-Ukrainian activists, the companies face a growing clamor for a tougher line on Russia, a country notorious for using popular technology to influence geopolitics, most infamously in the 2016 U.S. presidential election.
“There is a growing sense they have a moral obligation to ensure their sites are not exploited at a time of crisis,” said Karen Kornbluh, director of the Digital Innovation and Democracy Initiative at the German Marshall Fund, a think tank. “The Russian playbook is clear — and the companies are under pressure not to wait to act against fake accounts or malign influence activity until after they are used to interfere with humanitarian assistance or inflame the conflict.”
When President Biden announced sanctions against Russia affecting high-tech imports on Thursday, he said they would “impair” Russia’s “ability to compete in a high-tech, 21st century economy.” But the sanctions were largely focused on semiconductors and other high-tech tools that benefit Russia’s defense sector. According to a Commerce Department statement, consumer communication devices are largely exempt.
But policymakers, journalists, technologists and human rights advocates are now pressing the tech companies to act more aggressively.
Social media platforms in particular have come under scrutiny for their role in promoting Russian state media.
In a letter to Sundar Pichai, the CEO of Google parent Alphabet, which also owns YouTube, Warner accused the platforms of profiting from “disinformation.” He wrote that his staff had discovered YouTube running ads on videos about the Ukrainian conflict from RT, Sputnik and Tass, all Russian state media organizations. He also wrote that Google’s ad network was supporting Russian state media outlets by feeding ads to Sputnik and Tass, and that Google was running ads from “unwitting” brands such as Best Buy, Allbirds and Progressive on those outlets’ webpages. Those companies did not immediately respond to requests for comment.
Others have called for RT and people affiliated with it to be banned from major social media sites, and they questioned why RT’s editor in chief was permitted to spread falsehoods on Twitter. Twitter labels the accounts of state-run media organizations and their senior staff members, and it does not allow state media to pay to promote tweets.
“It’s appropriate for American companies to pick sides in geopolitical conflicts, and this should be an easy call,” tweeted Alex Stamos, Facebook’s former chief security officer and now director of the Stanford Internet Observatory.
Amid the increased scrutiny, Twitter tweeted on Friday that it was “actively monitoring” for risks associated with Ukraine and that it had temporarily paused advertisements in Russia and Ukraine to ensure ads did not detract from key information about safety.
Cameron Njaa, a spokesperson for Reddit, which also was singled out by Warner in his call for heightened awareness of Russian propaganda, said the company was “extending resources” to moderators in “affected areas” and working closely with governments and other platforms to “stay on top of any malicious or inauthentic activity.”
Late Friday, Meta announced that it would prohibit Russian state media from running ads or monetizing on its platform anywhere in the world, and said it would continue applying fact-checking labels to posts from Russian state media. Earlier the same day, Nick Clegg, Meta’s head of global affairs, tweeted that Russian authorities had restricted the use of the company’s services after Facebook had labeled and fact-checked posts from four state-owned media organizations. Clegg said the Russian authorities had ordered the company to stop the fact-checking and labeling, but it had refused.
Alphabet, TikTok and Telegram did not respond to requests for comment.
Tech companies previously have bowed to pressure from Russia’s Internet censor. In September, Apple and Google removed an opposition voting app from their app stores as balloting began in the country’s parliamentary election, after the Russian censorship agency accused the firms of interfering in the country’s political affairs. The agency threatened fines and possible criminal prosecutions.
Amid the mounting pressure, Internet freedom advocates warned that tech platforms are a critical source of independent information for people in Russia, and that limiting access to them could leave Russians with only the state propaganda inciting the war against Ukraine.
“Major tech companies have a responsibility to their Ukrainian and Russian users to respect their rights to freedom of expression and access to information, especially in the time of war and political crisis,” said Natalia Krapiva, tech legal counsel at Access Now, a nonprofit that advocates for Internet freedom.
But she said tech companies still need to take precautions to ensure that their platforms aren’t abused.
“They do, however, also have responsibility to keep their users safe and identify and respond to any campaigns of disinformation that may result in violence and abuse,” she said.