The attorneys general say the companies have not cracked down hard enough on prominent anti-vaccine accounts that repeatedly violate the companies' terms of service. They also say that falsehoods about the safety of coronavirus vaccines from a small pool of individuals have reached over 59 million followers on Facebook, YouTube, Instagram and Twitter, citing data from the Center for Countering Digital Hate, which studies online misinformation and disinformation.
They sent the letter the day before Zuckerberg, Dorsey, and Alphabet and Google CEO Sundar Pichai are expected to testify in front of the House Energy and Commerce Committee. The hearing is broadly focused on disinformation, and lawmakers and their staff have been in communication with leaders of Anti-Vax Watch, a collection of people and organizations concerned about vaccine disinformation.
Facebook and Twitter did not immediately respond to requests for comment.
Tong argues that lives depend on the companies' ability to properly enforce their rules. He said online falsehoods are undermining public confidence in vaccinations, and he raised concerns about anti-vaccine activists targeting Black Americans and other minority communities.
"Coronavirus vaccines only work if people actually get them. Pseudoscience coronavirus conspiracy theories peddled by a small number of uninformed anti-vaxxers have reached tens of millions of social media followers," Tong said in a statement. "These posts are in flagrant violation of Facebook and Twitter policies. Facebook and Twitter must fully and immediately enforce their own policies, or risk prolonging this pandemic."
The attorneys general of Delaware, Iowa, Massachusetts, Michigan, Minnesota, North Carolina, New York, Oregon, Pennsylvania, Rhode Island and Virginia also signed the letter.
Vaccine misinformation has plagued social media companies for years, as they struggle to maintain what they see as a balance between allowing free speech and cracking down on harmful material on the sites. In 2019, several social media companies, including Facebook, took steps to try to slow the spread of vaccine misinformation. Facebook created policies to reject ads with false claims and to stop recommending groups that were spreading misinformation.
In December, Facebook banned false and misleading statements about coronavirus vaccines.
But vaccine hesitancy, or delaying or refusing a vaccine, can be a tricky area to police because much of the online content could be people expressing concern as opposed to purposefully spreading false information.
"Vaccine conversations are nuanced, so content can't always be clearly divided into helpful and harmful," wrote Kang-Xing Jin, Facebook's head of health, in an op-ed in the San Francisco Chronicle this month. "It's hard to draw the line on posts that contain people's personal experiences with vaccines."
Facebook is conducting its own thorough study into U.S. users' vaccine doubts, The Washington Post reported this month. Early results show that a lot of content that does not break the company's rules could still be causing harm in some communities, where the information bounces around in an echo chamber.
Twitter said in December it would remove some tweets that included false claims about adverse effects of vaccines, or that vaccines are unnecessary because covid-19 is not serious. It expanded that policy earlier this month, saying it would label tweets that included misleading information about vaccines and lock people out of their accounts for escalating periods of time on a strike system.
Despite the social media companies' efforts, vaccine misinformation is still readily found online.
Some Evangelical Christians and Christian ministries have spread false information about vaccines online, baselessly claiming that the vaccines contain microchips or fetal tissue.
And vaccine misinformation has caught the attention of supporters of the QAnon conspiracy theory on Telegram and other smaller social media sites, where supporters flocked after the mainstream social media sites cracked down on them.