| Method combination | Qualitative | Quantitative | Mixed | Matches | % | Sample |
| --- | --- | --- | --- | --- | --- | --- |
| … | | | | 1 | 0.2 | Kim et al. 2018 |
| Content analysis + interview | 4 | 0 | 4 | 8 | 1.8 | Silver and Matthews 2017 |
| Content analysis + interview + ethnography | | | | 3 | 0.7 | Ritonga and Syahputra 2019 |
| Content analysis + interview + focus group | | | | 1 | 0.2 | Mwesige 2009 |
| Content analysis + network analysis | | | | 4 | 0.9 | Khaldarova and Pantti 2016 |
| Content analysis + survey | | | | 4 | 0.9 | Jerit and Barabas 2006 |
| Content analysis + survey + interview | | | | 1 | 0.2 | Palomo and Sedano 2018 |
| Ethnography + literature review | | | | 1 | 0.2 | Aparici et al. 2019 |
| Interview + case study | | | | 2 | 0.5 | Gilboa 2003 |
| Interview + focus group | | | | 2 | 0.5 | Meyen et al. 2016 |
| Interview + ethnography | | | | 5 | 1.2 | Srinivasan 2014 |
| Survey + interview | 2 | 1 | 4 | 7 | 1.6 | Blanco-Herrero and Arcila-Calderon 2019 |
| TOTAL | 241 | 139 | 54 | 434 | 100 | |

      The most widely used data-gathering instruments are the survey (29.4%) and content analysis (21.7%). Most surveys are conducted through online panels, drawing on companies that specialize in market research to secure high participation. Among the platforms most used by researchers is Amazon Mechanical Turk (MTurk) (Furman and Tunç 2019; Edgerly and Vraga 2020), but others also appear, such as Qualtrics (Garrett and Poulsen 2019), GfK, YouGov, and Nielsen IBOPE. Only Weeks and Garrett (2014) turned to telephone surveys, which they used to analyze rumors during the 2008 presidential campaign in the United States.
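
      To illustrate how such an online panel can be fielded programmatically, the sketch below publishes a survey link as an MTurk task through the boto3 requester client. It is a minimal sketch only: the sandbox endpoint, survey URL, reward, and quotas are placeholder values and do not come from any of the studies cited above.

```python
# Hedged sketch: posting a survey as an MTurk HIT with boto3.
# The endpoint, survey URL, reward, and quotas are illustrative
# placeholders, not details taken from the studies in the sample.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# An ExternalQuestion wraps a survey hosted elsewhere (e.g. a Qualtrics form).
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/disinformation-survey</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

response = mturk.create_hit(
    Title="Short survey on news consumption",
    Description="Answer a 5-minute questionnaire about how you get your news.",
    Keywords="survey, news, research",
    Reward="0.50",
    MaxAssignments=100,
    LifetimeInSeconds=7 * 24 * 3600,
    AssignmentDurationInSeconds=900,
    Question=question_xml,
)
print(response["HIT"]["HITId"])
```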

      Technological advances have facilitated tools for gathering data at scale, which speeds up content analysis. Several authors use Media Cloud, an open-source platform for media analysis. Monitoring news coverage is complemented by identifying keywords, building word clouds, counting words, and even geocoding stories so that results can be displayed on a map. To retrieve historical archives published on the web, researchers turn to Archive.org, although some studies use the database of the GDELT Project, which extracts content from Google News (Guo and Vargo 2020).
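
      The keyword identification and word counting described here can be reproduced with a few lines of code. The sketch below is a minimal, hypothetical example: the story texts and keyword list are invented stand-ins for material that, in the studies cited, would be exported from platforms such as Media Cloud or the GDELT Project.

```python
# Minimal keyword-frequency sketch for a batch of news stories.
# The `stories` list and KEYWORDS set are invented placeholders for
# texts exported from a platform such as Media Cloud or GDELT.
import re
from collections import Counter

stories = [
    "Officials deny the viral claim about the vaccine rollout.",
    "Fact-checkers flag a fabricated quote circulating on social media.",
]

KEYWORDS = {"fake", "hoax", "viral", "fabricated", "rumor", "disinformation"}

def tokenize(text):
    """Lowercase a story and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

# Overall word counts (the raw material for word clouds).
word_counts = Counter()
for story in stories:
    word_counts.update(tokenize(story))

# Keyword identification: how often each tracked term appears.
keyword_counts = {kw: word_counts.get(kw, 0) for kw in KEYWORDS}

print(word_counts.most_common(10))
print(keyword_counts)
```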

      Resources used for social media monitoring include Brandwatch and Netlytic, a cloud-based analyzer that uses public APIs to collect posts from Twitter and YouTube. Botometer makes it possible to detect messages that may have been posted by bots. For analyzing social networks, researchers also use UCINET, Gephi, and NodeXL.
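
      A minimal sketch of this network-analysis step follows, using the networkx library as a stand-in for UCINET, Gephi, or NodeXL; the retweet edges are invented for illustration, and the resulting graph is exported in GEXF format for visual exploration in Gephi.

```python
# Sketch of a small social network analysis of the kind typically done
# with UCINET, Gephi, or NodeXL; networkx stands in for those tools and
# the retweet edges are invented for illustration.
import networkx as nx

# Directed edges: (retweeting account, retweeted account)
retweets = [
    ("user_a", "news_outlet"),
    ("user_b", "news_outlet"),
    ("user_c", "user_a"),
    ("user_b", "user_a"),
]

G = nx.DiGraph()
G.add_edges_from(retweets)

# In-degree centrality highlights accounts whose posts are amplified most.
centrality = nx.in_degree_centrality(G)
for account, score in sorted(centrality.items(), key=lambda x: -x[1]):
    print(account, round(score, 2))

# Export the graph for visual exploration in Gephi.
nx.write_gexf(G, "retweet_network.gexf")
```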

      Regarding software for storing, transcribing, and coding data, the sample mentions well-known packages such as ATLAS.ti, NVivo, and MAXQDA. R is also employed for computational text analysis, and most of the statistical analyses rely on SPSS.
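
      As a final illustration, the sketch below runs the kind of bivariate test commonly produced with SPSS in these studies, here using pandas and SciPy instead; the coded survey responses are invented for the example.

```python
# Illustrative statistical check of the kind usually run in SPSS:
# a chi-square test of independence between two coded survey variables.
# The responses below are invented for the example.
import pandas as pd
from scipy.stats import chi2_contingency

data = pd.DataFrame({
    "shared_false_story": ["yes", "no", "yes", "no", "yes", "no", "no", "yes"],
    "main_news_source":   ["social", "tv", "social", "press", "social", "tv", "press", "social"],
})

# Contingency table of sharing behaviour by main news source.
table = pd.crosstab(data["shared_false_story"], data["main_news_source"])
chi2, p_value, dof, expected = chi2_contingency(table)

print(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}, dof = {dof}")
```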

      New Opportunities for Research

      Acknowledgements

      This