Algorithms in digital media and their influence on opinion formation
- Project team: Britta Oertel (Project Manager); Steffen Albrecht, Jakob Kluge, Diego Dametto
- Thematic area:
- Topic initiative: Committee on Cultural and Media Affairs
- Analytical approach: TA project
- Start date: 2017
- End date: 2022
- What significance do the algorithmic systems of the major online platforms have for opinion formation? (News from 22.12.2022)
- heise.de (23.12.2022) Robot journalism: Researchers call for labels for automatically generated texts. (in German)
- Plenary debate on the TAB report in the Bundestag on 10.02.2023 (in German)
With increasing importance for opinion formation, algorithms in online media are becoming the focus of research and regulation
In recent years, the use of digital media for news purposes, and thus its significance for the formation of individual and public opinion, has increased continuously. In these formats, which are accessed via the Internet and distributed via information intermediaries (major online platforms), algorithms determine the selection and structuring of the content displayed to users. How these algorithms arrive at their results is usually neither transparent nor comprehensible, even for experts, with regard to the objectives and procedural models of the platform operators and the data used. But what influence do online platforms have on opinion formation? What characterizes the legal framework for media, and what legislative course can be set to strengthen free individual and public opinion formation as a pillar of democratic societies?
The increasing use of information intermediaries is fundamentally changing how news is mediated
In recent decades, the mass media - television, radio and the press - have played an important gatekeeper role in forming public opinion: they determined what news the public got to see or hear. Broadcasting and the press follow journalistic principles, ensure that the public is informed and exercise functions of (political) control in a democracy. In principle, journalistic-editorial news is accessible to all members of society. It thus differs from messages that private individuals or groups nowadays post on social media, often only in closed groups.
Online platforms have not only increasingly taken over the role of information mediation, they have also changed it in the process. In particular, the large, internationally active information intermediaries such as Google, YouTube, Facebook, Twitter or Instagram mix public and personal news mediation. Algorithms decide which messages are displayed to which people and in which order. The selection criteria are not necessarily based on journalistic values such as care and diversity, but on the interests and attention of users and the profit motive of the information intermediaries.
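To illustrate the kind of selection logic described above, the following minimal Python sketch ranks posts purely by engagement signals. The scoring weights, field names and example posts are hypothetical; real platform ranking systems are far more complex and are not publicly documented.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    shares: int            # how often the post was forwarded
    likes: int             # explicit positive reactions
    topic_affinity: float  # 0..1, how closely the topic matches the user's interests

def engagement_score(post: Post) -> float:
    """Hypothetical score: attention signals decide the ranking,
    not journalistic care or diversity."""
    return 2.0 * post.shares + 1.0 * post.likes + 50.0 * post.topic_affinity

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed so that the most engaging posts appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Sober policy analysis", shares=3, likes=40, topic_affinity=0.4),
    Post("Sensational claim", shares=120, likes=200, topic_affinity=0.9),
])
print([p.text for p in feed])  # the sensational post is shown first
```

The point of the sketch is only that a ranking driven by attention signals systematically favours whatever generates the most engagement, regardless of journalistic quality.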
Effects of algorithmic systems on individual opinion formation can be assumed; further research is required
Algorithmic systems are thought to have an influence on opinion formation, because platforms that use such systems are used frequently and on a daily basis by many people to obtain information, for example via smartphones. Algorithmic selection can potentially influence the importance of topics in public debate (topic setting), the communication of factual knowledge (knowledge acquisition) and the spectrum of opinions on a topic (opinion communication). However, there is at present little robust scientific evidence on how algorithmically personalised information offerings affect individual opinion formation. Among other things, the role of fake news, echo chambers and filter bubbles in public communication is discussed.
Fake news can influence public opinion and political decisions thanks to digital media
Disinforming content is usually referred to as fake news in the public debate. The term covers manipulative, misleading or (demonstrably) false texts or images that attract high attention among readers due to their often sensational content and "news value". Such items are frequently forwarded and, as a result, prioritized by algorithmic processes. False news therefore spreads quickly via online platforms, often before it can be fact-checked and corrected.
In large-scale disinformation and manipulation campaigns, false news is also spread deliberately, in some cases targeted at specific groups of users. The aim is to influence opinions and ultimately also political decisions.
A precise quantification of the extent of such false news among information intermediaries in Germany is not possible due to a lack of sufficient studies. Experts estimate that distorted representations, assertions without a factual basis and suggestive interpretations are particularly widespread.
Filter bubbles and echo chambers play a role in public debate, but there is no scientific evidence of their significance for opinion formation
Filter bubbles and echo chambers are used in the mass media and in politics as catchy buzzwords for information spaces in which the displayed content constantly confirms the opinions of the actors there, while differing views are not questioned, taken up or discussed. According to theory, filter bubbles form when users prefer to communicate within a group, whereas echo chambers emerge when users with the same opinions only follow each other and exclude other communication partners – thus also excluding differing views.
However, the significance of filter bubbles has not been scientifically proven, at least with regard to the display of search engine results. Regarding social media, assessments are inconsistent but predominantly skeptical, especially in publications relating to Europe. The vast majority of social media users regularly come into contact with opposing opinions.
Robot journalism is likely to gain in importance in the future; what is needed is a label for automatically generated texts
Robot journalism is the creation of journalistic-editorial content using automated processes: algorithmic procedures transform structured data into narrative news texts. Only the initial programming is done by humans; the texts themselves are generated continuously from the current underlying data (e.g. on sporting events). More and more editorial offices use automatically generated news texts, for example to produce weather and financial reports, and can thus publish updates at short intervals. Because of these frequent updates, the resulting texts or videos are rated as more relevant by search engines and displayed at the top of the results lists.
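As an illustration of the principle, the following minimal sketch turns structured match data into a short news text using a fixed template. The data fields, template wording and rules are invented for this example; production robot-journalism systems work with far richer data and language generation techniques.

```python
# Minimal sketch: turn structured match data into a short news text.
# The data fields and template wording are invented for illustration only.
match_data = {
    "home_team": "FC Example",
    "away_team": "SV Sample",
    "home_goals": 2,
    "away_goals": 1,
    "venue": "Example Arena",
    "date": "2022-05-14",
}

HOME_WIN = ("{home_team} beat {away_team} {home_goals}:{away_goals} "
            "at the {venue} on {date}.")
AWAY_WIN = ("{away_team} won {away_goals}:{home_goals} away against "
            "{home_team} at the {venue} on {date}.")
DRAW = ("{home_team} and {away_team} drew {home_goals}:{away_goals} "
        "at the {venue} on {date}.")

def generate_report(data: dict) -> str:
    """Pick a narrative template based on the result and fill it
    with the structured data."""
    if data["home_goals"] == data["away_goals"]:
        return DRAW.format(**data)
    if data["home_goals"] < data["away_goals"]:
        return AWAY_WIN.format(**data)
    return HOME_WIN.format(**data)

print(generate_report(match_data))
```

Each time the underlying data changes, the text can be regenerated immediately, which is what allows the short update intervals described above.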
Algorithmic systems are also used in journalism to test which headlines users are most likely to click on. News magazines and newspapers hope that such optimisations will generate higher advertising revenues from online ads. Given the technical progress in automatic speech, text and video generation, an increase in automated journalism is expected in the coming years. Readers cannot easily distinguish automatically generated texts from manually created news items.
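The headline optimisation described above can be pictured as a simple A/B test. The sketch below compares the observed click-through rates of two headline variants; the click probabilities and visitor numbers are invented, and real newsroom systems typically use more sophisticated (e.g. bandit-based) approaches.

```python
import random

# Hypothetical click probabilities; in practice these are unknown
# and have to be estimated from real user behaviour.
TRUE_CTR = {"Headline A": 0.04, "Headline B": 0.07}

def simulate_click(headline: str) -> bool:
    """Simulate whether a randomly chosen visitor clicks the headline."""
    return random.random() < TRUE_CTR[headline]

def ab_test(n_visitors_per_variant: int = 10_000) -> str:
    """Show each variant to the same number of visitors and keep the one
    with the higher observed click-through rate."""
    observed = {}
    for headline in TRUE_CTR:
        clicks = sum(simulate_click(headline) for _ in range(n_visitors_per_variant))
        observed[headline] = clicks / n_visitors_per_variant
    return max(observed, key=observed.get)

print("Selected headline:", ab_test())
```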
First steps toward regulating information intermediaries have been taken in Germany and the EU
With the State Media Treaty, the German Länder, which are responsible for media regulation, included media intermediaries in the scope of application for the first time in 2020. The transparency obligations formulated therein represent a first step of legislative control. So far, however, algorithmic intermediaries have fallen neither under the broadcasting-centred model of concentration control nor under the platform regulation of the State Media Treaty, both of which shape the media order in Germany.
The European Parliament and the Council of the European Union adopted the Digital Services Act and the Digital Markets Act in 2022. The aim is to create more security in the digital space in the EU, protect users' fundamental rights and promote a level playing field for businesses. The laws will apply directly in Germany from 2 May 2023 (Digital Markets Act) and 17 February 2024 (Digital Services Act).
In particular, the Digital Services Act aims to counteract risks and dangers that arise for individuals and society as a whole from the use of, but also the dependence on, large online platforms. The transparency measures also concern the algorithmic systems of the major online platforms, in order to show how algorithmic decisions are made and what effects these decisions have on society.
Downloads
TAB-Arbeitsbericht Nr. 204 (only in German): Algorithmen in digitalen Medien und ihr Einfluss auf die Meinungsbildung. Endbericht zum TA-Projekt (PDF)
TAB-Fokus no. 42: Algorithms in digital media and their influence on opinion formation (PDF)
Further publication
Kluge, J.; Oertel, B.; Evers-Wölk, M. (2018, August). Büro für Technikfolgen-Abschätzung beim Deutschen Bundestag (TAB)
In the Bundestag
- Procedure record (report, expert opinion, programme) in the Dokumentations- und Informationssystem für Parlamentsmaterialien (DIP)
- Consultation on the TAB report in the plenum of the Bundestag on 10.02.2023 (article and recording of the 40-minute debate)