I want you to take a minute and look at the faces in the picture above.
Perhaps one of them reminds you of a family member, a co-worker, or a friend.
Now, what if I told you that none of those individuals actually exist?
Each one is a deepfake created by a site called This Person Does Not Exist.
Nothing more than pixels generated and arranged by artificial intelligence (AI).
Each time you click the refresh button, a new image of an AI-generated person appears.
Welcome to the new reality of the internet. A reality where you cannot trust what you see, read, or hear.
Welcome to the age of Mis- and Disinformation.
Misinformation vs. Disinformation
As government communicators, we arguably deal with misinformation and disinformation more than any other industry.
Considering many of our information systems are corroding, we must be prepared for the inevitable. It’s not a matter of whether misinformation or disinformation will affect our organizations. It’s a matter of when.
Though occasionally used interchangeably, mis- and disinformation have slightly different meanings.
“Misinformation is false or inaccurate information—getting the facts wrong.
Disinformation is false information that is deliberately intended to mislead—intentionally misstating the facts.” (American Psychological Association)
Social Media Amplifies the Spread
Social media has amplified the spread of false information, primarily because sharing it takes so little effort.
Most social media platforms offer essentially three ways to engage with a piece of content:
1. Like or React
2. Comment
3. Share (publicly or privately)
Each of these actions helps the content reach a wider audience than the last.
In addition, since algorithms began deciding what content we are most likely to engage with, we have unknowingly polarized ourselves and our viewpoints even further.
Examples of misinformation spreading quickly and uncontrollably on social media include false claims about public health during the COVID-19 pandemic and false information circulated during election periods.
Government’s Role
For hundreds, if not thousands of years, governments have existed to create and maintain order.
However, in recent history, for better or worse, people have become increasingly skeptical of their governments.
According to the 2024 Edelman Trust Barometer, government is now seen as trustworthy by only 40% of U.S. citizens.
How do we begin to regain that trust?
I’m going to suggest that one way we can begin to make headway with this issue is by combatting mis- and disinformation gracefully and tactfully.
Strategies for Combatting Misinformation
Proactive Communication
Much of the work that governments do, and the communications that accompany it, is reactive. But in today’s fast-paced world, being reactive is not enough. Our organizations need to be proactive.
Brainstorm ways to display important, timely, and accurate information in easy-to-find places on your website and social media profiles. Some options include a frequently updated FAQ page, pinned posts at the top of your social media profiles, and Instagram highlights that link to official news sources.
Establishing official channels that consistently provide verified facts helps build a foundation of trust and credibility with the public.
Monitoring and Responding
If it’s in your organization’s budget, look into social listening tools such as Sprout Social, Brandwatch, or Meltwater.
Social media listening tools let you monitor and track social media conversations related to a specific brand or topic. Having access to such a tool could mean the difference between catching misinformation before it spreads and facing a PR nightmare.
If a third-party social listening tool is not in the cards, you can practice what some call “organic social listening”: running keyword searches in each platform’s native search feature. Though time-consuming, it can prove worthwhile.
Now, when it comes to responding, I would suggest that you focus on two things: accuracy and speed.
Accuracy should be self-explanatory. Make sure the information you put out on your official government pages and accounts is accurate and peer-reviewed. The last thing we want while combatting misinformation is to add to it.
Speed is something government organizations need to improve across the board: our internal review processes, our response times to inquiries, and more. Research shows the modern consumer interprets speed as caring. I would take that a step further and say the modern consumer interprets speed as trust. In a misinformation crisis, every second counts. Make those seconds work in your favor.
Collaboration with Platforms and Other Stakeholders
Ideally, government organizations big and small would work in tandem with leading social media, AI, and tech companies. But let’s be honest: we can’t even get Meta to respond to the contact form we submitted six months ago.
We’ll leave any major partnering to the federal government.
Initiatives such as Meta’s Third-Party Fact-Checking Program are helpful, but they will by no means put an end to mis- and disinformation.
Again, we can’t wait for these partnerships to form, nor can we rely solely on social media platforms to help us. We need to be proactive.
How about starting small?
We could partner and collaborate with non-governmental organizations (NGOs), universities, and businesses.
Returning to the 2024 Edelman Trust Barometer, 60% of people responded, “If business partners with government, I would trust it more…”
By collaborating and partnering with subject matter experts in the private sector, the public sector, and higher education, we add social proof to our communications.
Educational Campaigns
With the rapid development and improvement of technology, government organizations must educate their citizenry about the potential threats and dangers of AI-generated synthetic content.
Public Service Announcements (PSAs) on where to find your organization’s official communications and social media accounts are a must.
Educate those you serve about the very real possibility of your organization being the target of a mis- or disinformation campaign, especially if you work closely with politicians or elected officials.
Best Practices and Recommendations
As if government social media and communications weren’t hard enough, the speed at which AI-generated synthetic content is being created and adopted is making the job much harder.
You, your organization, and those you serve must be prepared.
Make sure that you take the time now to establish clear communication guidelines, invest in staff training, and regularly review and update your strategies.
Every single government organization needs an up-to-date crisis plan. One that covers all the basics, but that now also includes a section on handling mis- and disinformation on social media, AI-generated synthetic content, and deepfakes.
Keeping up with AI advancements should be a top priority for someone in your organization. Whether that person sits in communications, information technology (IT), or somewhere else entirely doesn’t matter. Someone needs to be on top of this.
Final Thoughts
The role of government social media managers and communicators in maintaining public trust and shaping discourse is more critical than ever.
By adopting a proactive, collaborative, and adaptive approach, we can effectively combat misinformation, ensure a well-informed public, maintain the integrity of our information systems, and begin to reestablish trust in our government organizations.
The battle against misinformation is ongoing, but with the right strategies, government communicators can safeguard the integrity of public discourse.
ZACK’S FACTS
Subscribe to “The Zack’s Facts” newsletter for my monthly take on the latest industry topics, a government account spotlight, and resources to help you become a better government communicator.