Nancy Watzman, Media Strategist
When it comes to battling online mis- and disinformation, “there’s no magical solution, but there are lots of things to try,” says Nancy Watzman. As a media strategist, that’s been her approach: getting involved with innovative projects that try lots of different things.
During her workshop, Watzman shared two of those projects. While both are informed by technical knowledge of how disinformation spreads through social media, they address the problem on different fronts.
The Human Front
In 2020, Watzman worked with First Draft on a fellowship program that trained journalists to better track and combat misinformation at the local level. She explained that while there are reporters at national outlets (BuzzFeed, Politico, etc.) trained in the specialized data journalism techniques needed to identify and track sources of misinformation, local news outlets often don’t have the resources or expertise to reproduce those efforts in their communities. The program trained the fellows on the nuances of mis- and disinformation, as well as research-based approaches to fact-checking and intervening successfully. In addition to producing original reporting on local issues, the fellows also held trainings for other journalists and for the public.
Further opportunities: Because research shows people trust local news outlets, Watzman believes local news is an area ripe for research and intervention. Colorado is also a hotbed of local media innovation and potential partners, including organizations like the Colorado Media Project and the Colorado News Collaborative (COLab). This gives our center an opportunity to lend technical expertise to local media projects, and to tap into a wealth of Colorado media experts for technology projects.
Sandra Fish, the First Draft fellow from Colorado, also mentioned that she experimented with setting up a Google Form to allow the public to submit questions about elections and received a ton of inquiries. She said it showed her that people are hungry for accurate information about both elections and democracy in general. While that void is easily filled by bad actors, there could also be an opportunity to experiment with different ways of providing trustworthy information to people who are actively looking for it.
The Technology Front
This year, Watzman is working with NYU’s Cybersecurity for Democracy project, which takes a cybersecurity approach to disinformation. The team works on the platform side, identifying vulnerabilities and mining data that they can then make available to journalists and the public. Several of their projects, including Ad Observatory and Ad Observer, are focused on making Facebook and YouTube political advertising data more transparent and accessible to researchers, journalists and the public. Watzman encouraged workshop attendees to install the Ad Observer plug-in to contribute to the project.
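To make the citizen-science mechanism concrete: an extension like Ad Observer runs a content script in volunteers’ browsers that spots ads in the feed and forwards them, without any user identity, to a research server. The sketch below is not Ad Observer’s actual code; the CSS selector, the payload shape and the research endpoint are all hypothetical placeholders.

```typescript
// Hypothetical sketch of a citizen-science ad-collection extension, in
// the spirit of Ad Observer. The selector, endpoint and payload shape
// are illustrative placeholders, not the real plug-in's code.

// Assumed marker for sponsored posts; real feeds obfuscate these labels.
const SPONSORED_SELECTOR = '[data-ad="sponsored"]';

// Hypothetical collection endpoint run by a research team.
const RESEARCH_ENDPOINT = 'https://research.example.org/api/ads';

interface AdReport {
  advertiser: string | null; // who paid for the ad, if visible
  adText: string;            // the ad's visible text
  collectedAt: string;       // ISO timestamp of observation
}

// Scrape ad content only -- no user identity -- from the current page.
function collectVisibleAds(): AdReport[] {
  const nodes = document.querySelectorAll<HTMLElement>(SPONSORED_SELECTOR);
  return Array.from(nodes).map((node) => ({
    advertiser: node.getAttribute('data-advertiser'),
    adText: node.innerText.slice(0, 500),
    collectedAt: new Date().toISOString(),
  }));
}

// Forward observed ads to the researchers.
async function reportAds(): Promise<void> {
  const ads = collectVisibleAds();
  if (ads.length === 0) return;
  await fetch(RESEARCH_ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(ads),
  });
}

reportAds();
```

The design point is that the data flows outward from volunteers’ own feeds, capturing the ads people actually see, which platform transparency tools generally do not expose.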
Further opportunities: NYU’s Cybersecurity for Democracy project presents an interesting model for public engagement in research efforts. Citizen science tools like their Ad Observer plug-in are valuable both for gathering data and for increasing public interest in, and buy-in to, research efforts. In addition, making research data easily accessible and usable to journalists can help ensure that research results are presented to the public more accurately and engagingly.
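For the journalist-facing side, one hedged illustration: Facebook’s own Ad Library API (the ads_archive Graph API endpoint) already lets reporters pull political-ad records programmatically, and tools like NYU’s Ad Observatory aim to make such data browsable without writing code. In the sketch below, the API version, field list and token handling are assumptions to check against Meta’s current documentation.

```typescript
// Sketch: pulling political-ad records from Meta's Ad Library API (the
// ads_archive Graph API endpoint). The API version, field names and
// token handling are assumptions to verify against current Meta docs.

const ACCESS_TOKEN = process.env.FB_ACCESS_TOKEN ?? ''; // requires a verified developer account

async function searchPoliticalAds(searchTerms: string): Promise<unknown> {
  const params = new URLSearchParams({
    search_terms: searchTerms,
    ad_type: 'POLITICAL_AND_ISSUE_ADS',
    ad_reached_countries: "['US']",
    fields: 'page_name,ad_creative_bodies,ad_delivery_start_time,spend',
    access_token: ACCESS_TOKEN,
  });
  const res = await fetch(`https://graph.facebook.com/v19.0/ads_archive?${params}`);
  if (!res.ok) throw new Error(`Ad Library API error: ${res.status}`);
  return res.json();
}

// Example: list recent election-related political ads.
searchPoliticalAds('election').then((ads) => console.log(ads));
```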
First Draft Information Disorder Framework
- Disinformation - False or misleading information created intentionally for financial gain, political influence or generally to cause harm.
- Misinformation - Sharing disinformation without realizing it’s false or misleading, often with good intentions.
- Malinformation - Sharing genuine information with an intent to cause harm.
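Read together, the three categories vary along just two axes: whether the content is false or misleading, and whether the sharer intends harm. A minimal sketch of that two-axis classification follows; the type names come from the framework, while the function and field names are illustrative.

```typescript
// The framework separates content along two axes: is it false or
// misleading, and does the sharer intend harm? Field and function
// names here are illustrative, not part of the framework itself.

type InformationDisorder =
  | 'disinformation'
  | 'misinformation'
  | 'malinformation'
  | 'none';

interface SharedContent {
  isFalseOrMisleading: boolean; // Is the content itself false or misleading?
  sharerIntendsHarm: boolean;   // Does this sharer intend harm or gain?
}

function classify(content: SharedContent): InformationDisorder {
  if (content.isFalseOrMisleading) {
    // False content: disinformation when spread deliberately,
    // misinformation when shared unknowingly.
    return content.sharerIntendsHarm ? 'disinformation' : 'misinformation';
  }
  // Genuine content weaponized to harm (e.g., leaked private records).
  return content.sharerIntendsHarm ? 'malinformation' : 'none';
}
```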
Information Disorder Types
- Satire or parody
- False connection
- Misleading content
- False context
- Imposter content
- Manipulated content
- Fabricated content