WH’s Psaki: “We’re Flagging Problematic Posts For Facebook That Spread Disinformation”

White House press secretary Jen Psaki revealed at a press briefing on Thursday that the White House is working in coordination with Facebook to flag “problematic” posts that spread “disinformation” about the COVID-19 vaccine.

“As Dr. Murthy conveyed, this is a big issue of misinformation, specifically on the pandemic,” Psaki said. “In terms of actions that we have taken, or we’re working to take, I should say, from the federal government: we’ve increased disinformation research and tracking. Within the Surgeon General’s Office, we’re flagging posts for Facebook that spread disinformation.”

“It’s important to take faster action against harmful posts. As you all know, information travels quite quickly on social media platforms. Sometimes it’s not accurate, and Facebook needs to move more quickly to remove harmful violative posts. Posts that would be within their policies for removal often remain up for days. That’s too long. The information spreads too quickly.”

“Finally, we have proposed they promote quality information sources in their feed algorithm,” the White House press secretary said. “Facebook has repeatedly shown that they have the leverage to promote quality information. We’ve seen them effectively do this in their algorithm over low-quality information, and they’ve chosen not to use it in this case.”

QUESTION: Thanks, Jen. Can you talk a little bit more about this request for tech companies to be more aggressive in policing misinformation? Has the administration been in touch with any of these companies? And are there any actions that the federal government can take to ensure their cooperation? Because we’ve seen from the start, there’s not a lot of action on some of these platforms.

PSAKI: Sure. Well, first, we are in regular touch with the social media platforms, and those engagements typically happen through members of our senior staff, but also members of our COVID-19 Team.

As Dr. Murthy conveyed, this is a big issue of misinformation, specifically on the pandemic. In terms of actions, Alex, that we have taken, or we’re working to take, I should say, from the federal government: we’ve increased disinformation research and tracking. Within the Surgeon General’s Office, we’re flagging posts for Facebook that spread disinformation.

We’re working with doctors and medical professionals to connect medical experts who are popular with their audiences with accurate information and boost trusted content. So, we’re helping get trusted content out there.

We also created the COVID-19 Community Corps to get factual information into the hands of local messengers. And we’re also investing, as you’ll have seen, in the president’s, the vice president’s, and Dr. Fauci’s time in meeting with influencers who have large reaches to a lot of these target audiences and who can spread and share accurate information.

You saw an example of that yesterday. I believe that video will be out tomorrow. I think that was your question, Steve, yesterday, so that’s the full follow-up there. There are also proposed changes that we have made to social media platforms, including Facebook, and those specifically are four key steps.

One, that they measure and publicly share the impact of misinformation on their platform. Facebook should provide, publicly and transparently, data on the reach of COVID-19 vaccine misinformation.

Not just engagement, but the reach of the misinformation and the audience that it’s reaching. That will help us ensure we’re getting accurate information to people. This should be provided not just to researchers, but to the public, so that the public knows and understands what is accurate and what is inaccurate.

Second, we have recommended, proposed, that they create a robust enforcement strategy that bridges their properties and provides transparency about the rules. So, I think this was a question asked before: there are about 12 people who are producing 65 percent of anti-vaccine misinformation on social media platforms. All of them remain active on Facebook, despite some even being banned on other platforms, including ones that Facebook owns.

Third, it’s important to take faster action against harmful posts. As you all know, information travels quite quickly on social media platforms. Sometimes it’s not accurate, and Facebook needs to move more quickly to remove harmful violative posts. Posts that would be within their policies for removal often remain up for days. That’s too long. The information spreads too quickly.

Finally, we have proposed they promote quality information sources in their feed algorithm. Facebook has repeatedly shown that they have the leverage to promote quality information. We’ve seen them effectively do this in their algorithm over low-quality information, and they’ve chosen not to use it in this case.

That’s certainly an area that would have an impact. So, those are the proposals. We engage with them regularly, and they certainly understand what our asks are.

Originally found on Real Clear Politics