By Pamela Bartar and Gabor Szudi, ZSI

The European Commission has called fighting misinformation and disinformation one of the grand challenges of the 21st century. This has become even more obvious since the beginning of the COVID-19 pandemic: a new wave of misinformation, fake news and hoaxes about the pandemic has generated a dangerous infodemic. But there are solutions available to us – and our research in the TRESCA project, discussed recently at the SCIENCE & YOU conference in Metz, France, underscores the importance of developing long-term strategies for handling the problem.

Crises like COVID-19 can be an opportunity to reframe conversations around politics, research funding and governance. Promoting robust and transparent scientific and transdisciplinary methods, supported by an independent research environment, can prove a valuable strategy, so that consensual, sustainable policies informed by science can bring public value.

A few clear conclusions stand out:

- In recent months, the need for strong strategies to increase people’s awareness of the appropriate use of media sources and fact-checking has become visible.
- Public attitudes to science and technology are complex; it is therefore necessary to open spaces for listening and dialogue.
- When talking about digital science communication, we need to create new environments for new strategies.
- One potential moment to overcome these challenges is the encounter between scientists and policy makers.

Here, we elaborate on these points, which were aired at the SCIENCE & YOU conference. There, one of the panel discussions featured TRESCA project partners from Consejo Superior de Investigaciones Científicas (Sara Degli Esposti), Observa Science in Society (Giuseppe Pellegrini), Erasmus University Rotterdam (Marina Tulin) and the Centre for Social Innovation in Vienna (Gabor Szudi, Pamela Bartar). At the end of this blog, we ask you to contribute to our upcoming policy brief.
Please read on to see how you can share your perspective on what can and needs to be done.

1. Digital science communication: New environment calls for new strategies

Continuing digitalisation has made information on any topic widely accessible. In this new environment, the responsibility of choosing what or whom to trust lies increasingly in the hands of the audience. On the dark side, the large amount of user-generated content makes it difficult to filter out errors or lies. While traditional mediators of information, such as journalists, are reaffirming their roles as gatekeepers, it is clear that the digital sphere presents challenges that are difficult to overcome. Even after a source of misinformation has been debunked or removed, the content continues to spread via re-posts and other types of online engagement.

On the bright side of this development, the participatory web has provided platforms for numerous excellent science communication practices. A case in point is the animation studio “Kurzgesagt – In a Nutshell”, which has established itself as one of the biggest science channels on YouTube, with more than 15 million subscribers all over the world. But not everybody can make a good YouTube channel: the production quality of successful YouTube videos tends to be cutting-edge and constantly rising, requiring collaborations across technical and academic disciplines.

2. Listening to citizens on public communication of science and technology

Public attitudes to science are complex. One recent survey of Americans found that 44% expressed “great confidence” in science and another 47% had “some” confidence – both figures little changed since the 1990s. The EU-funded CONCISE project on public attitudes to science showed that, in the case of health issues, the information channels preferred by many citizens are traditional news media and television, as well as word of mouth.
This means that public communication strategies must be carefully evaluated, and the role of the various actors, such as decision makers, experts and communicators, must be balanced to avoid confusing people. It is therefore necessary to open spaces for listening and dialogue. The CONCISE project offered this opportunity by involving 500 European citizens in a public consultation. This made it possible to identify some elements of trust and attitudes towards information channels that were also analysed during the TRESCA project.

As part of TRESCA, three workshops were organised to involve citizens in an evaluation of videos in the context of COVID-19. By comparing the videos with and without audio, three groups of citizens from different cultural and social backgrounds were able to express their opinions on the quality of the materials, on the messages transmitted and on the methods of communication. This experiment made it possible to identify factors that influence public perception and can fuel trust in science communication. One factor that emerged was the importance of emotions in assessing the type of images used, the character of the protagonists and the relationships shown in the videos. Participants stressed that adequate science communication should not depict extreme emotions or overly strong contrasts, and should also avoid excessive spectacle. Another conclusion: experts and scientists, if easily recognisable, are indeed credible witnesses, at least when providing data and tools to understand phenomena, as opposed to trying to impose scientific truths.

3. Potential strategies to increase people’s awareness

The pandemic and the associated infodemic have made evident the dangerous impact that digital mass-media manipulation of scientific facts can have on individual and collective behaviour and, thus, on public health.
Digital platforms such as Facebook had to rush to adopt solutions to patrol political micro-targeting, hate speech, disinformation spreaders and fake accounts.

Among the factors influencing people’s ability to distinguish accurate from inaccurate information is a person’s worldview. We know that individuals are more likely to accept or reject misinformation based on whether it is consistent with their pre-existing partisan and ideological beliefs. Previous research, such as a 2010 study on Facebook of US voter attitudes, found that showing people familiar faces in online posts could dramatically improve the effectiveness of political micro-targeting. If people saw on Facebook that close friends had voted, they were four times more likely to get others to vote – indeed, that social factor was more important than the voting message itself. The study found that Facebook social messaging had increased voter turnout directly by about 60,000 voters and indirectly, through social contagion, by another 280,000 voters, for 340,000 additional votes. That represents about 0.14% of the US voting-age population of about 236 million in 2010.

Thus, familiarity has an effect on people’s views and political decisions. The TRESCA project examines that, plus the role that fact-checking websites can play in debunking misinformation. It also looks at potential strategies to increase people’s awareness of their ideological biases and their degree of vulnerability to misinformation.

4. Policymakers and scientists

Appropriate, tailor-made scientific advice is gaining importance in the design and implementation of sound public policy. However, the uptake of scientific evidence is undermined by a communications gap between scientists and policy makers.
Using the specific characteristics of policy makers as a starting point, TRESCA researchers carried out primary and secondary research to better understand how policy makers consume science communication and how their values and interests influence their decisions. The research found that an increased institutionalisation of scientific advice in legislative and regulatory decision-making processes is well in progress at EU and national level. This involves stronger engagement between experts and decision-makers through various national and supra-national institutional mechanisms fostering two-way dialogue.

This new participatory model requires more open, accessible and reliable science communication, which should contribute to trust-building between scientists and policy makers. A deeper understanding of how the other side in the science-policy nexus operates is essential for this trust-building. Trust is further enhanced by strengthening open science and open access initiatives, new innovative platforms of science-policy collaboration, the use of more digital and visual solutions, and the set-up or upgrading of fact-checking websites.

Contribute to the TRESCA policy brief!

Clearly, we need to get more people fluent in the language of both science and policy. To accomplish that, as part of our TRESCA POLICY BRIEF, we are soliciting your views on what needs to be done. The policy brief aims to provide EU policy units with concrete and practical advice on how to better engage with experts and leverage scientific findings in their decision-making. The document is based on the results of TRESCA’s work package ‘Science Communication in Context’, in particular a study on the science communication behaviour of policymakers and an overview analysis of the (dis)incentives for scientists to engage in science communication.
Based on TRESCA’s findings so far, the policy brief recommends the following actions:

- Prepare short but comprehensive science communication guidelines
- Create training opportunities and tailor-made learning resources for scientists and policymakers
- Elaborate financial incentives for early-stage researchers to participate in science communication with policymakers
- Strengthen the EC’s Open Science Policy
- Leverage the use of digital media to create interactive two-way dialogue options between scientists and policymakers
- Promote the use of fact-checking websites and tools

Stakeholders from all related disciplines, policy makers and policy influencers are invited to share their opinion. The consultation process ends on 31 January 2022. Please follow the link – the survey will take only a few minutes of your time: https://survey.zsi.at/index.php/289265?lang=en
By Elisabeth Steib and Marc Zwiechowski, KURZ
Kurzgesagt is an animation studio and YouTube channel that specializes in explaining complex scientific topics in illustrated and animated videos. For the TRESCA project, we created a video about science communication and the challenges experts and science communicators face.
Initially, we wanted to talk about a completely different topic. But after working with the TRESCA team, who ran experiments on one of our videos to analyze which aspects influence how trustworthy it appears, we realized that the overall subject of the TRESCA project should be our video topic instead:
Why do we need science communication in the first place? What can it do, and what can it not do? And what do we, as science communicators, struggle with when creating our content? How do we condense complex topics down to the right level of detail – not so much that it is overwhelming, not so little that we are oversimplifying things? How do we deal with balancing the opinions of experts who disagree on certain aspects?
The video we created gives some insight into the evolution of our research and our motivation: to inspire people to get excited about science and to dig deeper by themselves. See for yourself and check out the video below! https://www.youtube.com/watch?v=XFqn3uy238E
Facebook testimony highlighted the mental health dangers, but research suggests it matters plenty how you present scientific data on a contentious issue like this.

By Sara Degli Esposti, CSIC

Are social media bad for your mental health? That’s a question that dominated news headlines this week following explosive Congressional testimony by a former Facebook manager that the world’s biggest social media company commissioned and covered up research showing its products are harmful – especially to impressionable teenagers. Facebook tried to discredit the testimony; but as a group studying online misinformation, we know that this is a long-running, and much-disputed, area of social science.

For instance, in his 2018 book ‘Ten Arguments for Deleting Your Social Media Accounts Right Now’, Jaron Lanier, an American tech visionary best known for his early work developing virtual reality, argues that social media are making people angrier, more fearful, less empathetic, more isolated and more tribal. He thus concludes that deleting social media is highly advisable.

Is it true? Are social media, which are supposed to bring us together, tearing us apart? Well before the dramatic Washington testimony, several published studies showed a small negative relationship between the use of social media and young people’s mental wellbeing – but this effect has been debated. For instance, in Nature in February 2020, Jonathan Haidt of New York University cited evidence of a correlation between rising social media usage and rising mental health problems among teens. But another researcher in the same publication argued that there could also be positive effects. And separately, Cambridge University’s Amy Orben has argued that a lot of the research in this field is “low quality” and that a more rigorous scientific approach is needed. Thus, scientists are (as usual) divided.

So what do most people – citizens, rather than scientists – think about it?
To find out, last year we asked 7,120 people across seven European Union countries: France, Germany, Italy, Spain, Poland, Hungary and the Netherlands. The result was unambiguous. Most people agree social media are harmful. But the story is more complicated than it might at first seem.

We presented the respondents – diverse in age, gender and socioeconomic status – with a statement that “several studies show” a correlation between screen time and poor mental health. In all, 83% of respondents said they agreed there is a problem, and 57% said they would consider deleting some of their social media accounts. But then we offered them the chance to check the assertion for themselves – in short, to form a more informed opinion. Surprisingly, given how cynical politicians can be about citizen engagement, 77% said they would indeed like to check the claim further. Among those who did so (we made it fast and easy for them), 86% said they’d like to know even more about the effects of social media. And after checking, the share of people who wanted to delete their social media accounts actually rose by 5 percentage points, to 62%. In short, with more information, the social media problem now appeared to them to be even worse than they had first thought.

Figure 1: Survey results from WP3 activities of the TRESCA project (see also the work plan for WP3 for more information)

So it matters how you present this kind of contentious social science data to people. But more importantly, it shows many people are willing, if given the chance, to look at the facts for themselves. Interestingly, there were some pretty clear national differences in this show-me attitude. Italians were the most likely to want to see the data for themselves. Germans were the least interested, but also the ones who trusted institutions the most. Why the difference?
History, culture, media, politics – there are all kinds of factors, differing from one country to the next, that affect how people view contentious data. Our research into social media and misinformation is continuing, and we will be reporting more findings from our surveys and other activities. The work is part of the TRESCA Horizon 2020 project coordinated by Erasmus University.

In fact, we are developing an app, Ms.W (the Misinformation Widget), to help people check for themselves the veracity of claims about science that they see online. It responds to a well-documented problem of “cloaked science”: false claims dressed up to look like science. In the pandemic, we have all seen how damaging this can be if unchecked: thousands of people have died unnecessarily because they believed false claims about vaccines, treatments, masks or social distancing.

A widget like this won’t solve the problem of social media and teenage mental health. But it can slowly contribute to getting more people to realise they shouldn’t believe everything they read about science on social media.
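A small aside on the survey numbers reported above: the rise from 57% to 62% is a change of 5 percentage points, which is not the same thing as a 5% relative change. A quick check (the figures are as reported in the survey; the script itself is ours):

```python
# Shares of respondents willing to delete some social media accounts,
# before and after being offered the chance to check the evidence
# (survey figures as reported in the post).
before, after = 0.57, 0.62

point_change = (after - before) * 100              # change in percentage points
relative_change = (after - before) / before * 100  # relative change in percent

print(f"{point_change:.0f} percentage points")
print(f"{relative_change:.1f}% relative increase")
```

Run as-is, this prints "5 percentage points" and "8.8% relative increase": a five-point rise here is, relatively speaking, nearly a nine percent jump in the group wanting to delete accounts.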
By Adolfo Antón Bravo, David Arroyo and Sara Degli Esposti, CSIC

The COVID-19 pandemic has made clear the risks of online misinformation. Are the vaccines safe? Should we keep wearing masks? On such questions, social media has become a minefield of wrong, misleading or false information – with real-world consequences. Surely there’s a way that we, when online, can more easily check the trustworthiness of a post?

That’s the aim of Ms.W, the Misinformation Widget being developed by the EU-funded TRESCA project. It is a piece of software that can run as a phone app, to help judge whether what you are reading is reliable or not. In short, it’s an online vaccine against today’s online “infodemic”. An infodemic, according to the World Health Organisation, is “an overabundance of information —some accurate and some not— that makes it hard for people to find trustworthy sources and reliable guidance when they need it”.

An infodemic can include misinformation: postings that are wrong, perhaps by accident, for many reasons. In posts about science, the technical vocabulary can be misunderstood or misinterpreted. The source’s expertise, reputation or reliability aren’t always easy to check. And misinformation frequently overlaps with disinformation – deliberately false or misleading information posted for political or economic gain. There are plenty of examples. In early 2020, an online hoax about the benefits of bleach-based alcohol for use against COVID-19 led to the hospitalisation of hundreds.

The challenge of separating fact from “faction” gets even harder when dealing not just with text but also with “memes”. Though the term originated in evolutionary biology, today a meme is most commonly an image, video or piece of text that is copied and spread rapidly by Internet users, often with slight variations. They are often funny; they catch your attention, and you forward them to your friends.
A classic example: memes about next-generation 5G mobile phone systems – some harmless, some harmful. The latter (see Figure 1 for an example) helped spread unfounded fears that 5G transmission towers were somehow connected with COVID-19. (Spoiler alert: they aren’t.)

Figure 1: A typical online meme – in this case, making fun of false claims that 5G mobile phone towers somehow spread COVID-19. (Surprise: they don’t.) Source: https://www.reddit.com/r/meirl/comments/fxszo6/meirl/

Another problem: images or text can easily be taken out of context online. According to Newtrals.com, a fact-checking news site, a WhatsApp text circulated widely claiming that in India only people who were vaccinated were getting COVID. It was based on an interview with a doctor in New Delhi – but taken entirely out of context. Why? To spread panic? To attract attention? Or was it part of something more nefarious?

How to deal with this? The answer isn’t simple. Detecting, fact-checking and taking down dodgy content is harder than it might seem. There are already many online services that offer fact-checking help, in one way or another. But there are not always clear-cut lines separating good-faith information, opinions, fake news, disinformation, and incomplete or misleading information. Thus, there is a need for trustworthy oversight mechanisms that incorporate principles of algorithmic fairness and accountability.

The Misinformation Widget, or Ms.W, is a science-communication tool being developed by TRESCA, a Horizon 2020 project on scientific disinformation. Ms.W is both a methodology and a toolkit. It is a methodology in the sense that it indicates how to use a variety of online resources. It integrates a number of existing services that help users perform different tasks associated with content verification: news gathering, fact checking, blacklisting of fake sites, reverse image search, and more.
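To make the idea of bundling such verification tasks concrete, here is a deliberately toy sketch of a content-verification pipeline. Everything in it is our illustration under stated assumptions: the domain blacklist, the claim list, the function names and the scoring rule are all hypothetical, not Ms.W’s actual design or API.

```python
# Toy illustration of a content-verification pipeline in the spirit of the
# tasks listed above (source blacklisting, fact-checking lookups).
# All names, rules and data here are hypothetical, NOT the real Ms.W.
from dataclasses import dataclass, field

@dataclass
class Verdict:
    checks: dict = field(default_factory=dict)

    @property
    def score(self):
        """Naive trust score: the share of checks that passed."""
        return sum(self.checks.values()) / len(self.checks)

# Hypothetical blacklist of known fake-news domains.
FAKE_DOMAINS = {"totally-real-news.example"}

# Hypothetical list of claims already debunked by fact-checkers.
DEBUNKED_CLAIMS = {"5G spreads COVID"}

def check_source(domain):
    """Pass if the domain is not on the blacklist."""
    return domain not in FAKE_DOMAINS

def check_claims(text):
    """Pass if the text contains no already-debunked claim (stand-in
    for a real fact-checking service lookup)."""
    return not any(claim in text for claim in DEBUNKED_CLAIMS)

def verify(domain, text):
    """Run all checks and collect the results in a Verdict."""
    verdict = Verdict()
    verdict.checks["source"] = check_source(domain)
    verdict.checks["claims"] = check_claims(text)
    return verdict

v = verify("totally-real-news.example", "Experts say 5G spreads COVID")
print(v.checks, v.score)  # both checks fail, so the trust score is 0.0
```

A real system would, of course, replace these lookups with live services (news gathering, reverse image search, curated blacklists) and a far more careful scoring model; the sketch only shows how independent checks can be aggregated into one verdict.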
It will also include new algorithms drawn from current research on applying machine learning to guide information curation and combat disinformation. It will help users track online news items and their authors, and will help them evaluate the accuracy, credibility and trustworthiness of written and visual content. Ms.W can be used as a web service or a mobile app.

Ms.W will be available in early 2022, as part of TRESCA’s free online course about scientific disinformation. Now in production, the TRESCA course, to be hosted on Coursera, will be a seven-week set of lectures and supporting documents, aimed at researchers, communicators and journalists, and exploring how to communicate trustworthy knowledge in the digital world. Stay tuned to join us there online – and stay away from COVID towers.