Published on 07.01.21 in Vol 23, No 1 (2021): January

Preprints (earlier versions) of this paper are available at http://preprints.jmir.org/preprint/23805, first published Dec 11, 2020.

    Original Paper

    COVID-19 Misinformation Trends in Australia: Prospective Longitudinal National Survey

    Corresponding Author:

    Kristen Pickles, PhD

    The University of Sydney

    Rm 127A Edward Ford Building

    Sydney, 2006

    Australia

    Phone: 61 93512064

    Email: kristen.pickles@sydney.edu.au


    ABSTRACT

    Background: Misinformation about COVID-19 is common and has been spreading rapidly across the globe through social media platforms and other information systems. Understanding what the public knows about COVID-19 and identifying beliefs based on misinformation can help shape effective public health communications to ensure efforts to reduce viral transmission are not undermined.

    Objective: This study aimed to investigate the prevalence of COVID-19 misinformation beliefs in Australia, the factors associated with these beliefs, and their changes over time.

    Methods: This prospective, longitudinal national survey was completed by adults (aged 18 years and above) in April (n=4362), May (n=1882), and June (n=1369) 2020.

    Results: Stronger agreement with misinformation was associated with younger age, male gender, lower education level, and language other than English spoken at home (P<.01 for all). After controlling for these variables, misinformation beliefs were significantly associated (P<.001) with lower levels of digital health literacy, perceived threat of COVID-19, confidence in government, and trust in scientific institutions. Analyses of specific government-identified misinformation revealed 3 clusters: prevention (associated with male gender and younger age), causation (associated with lower education level and greater social disadvantage), and cure (associated with younger age). Lower institutional trust and greater rejection of official government accounts were associated with stronger agreement with COVID-19 misinformation.

    Conclusions: The findings of this study highlight important gaps in communication effectiveness, which must be addressed to ensure effective COVID-19 prevention.

    J Med Internet Res 2021;23(1):e23805

    doi:10.2196/23805



    Introduction

    False, misleading, or inaccurate health information can pose a serious risk to public health and public action [1]. Misinformation about COVID-19 is common and has spread rapidly across the globe through social media platforms and other information systems [2-5]. In February 2020, the World Health Organization’s Director-General declared the global “overabundance” of COVID-19 information an “infodemic” [6]. The term “misinfodemic” has since been coined to capture the corresponding increase in misinformation surrounding the virus [7].

    Misinformation, which is typically compelling, persuasive, and emotive, spreads on social media platforms significantly farther, faster, deeper, and more broadly than factual information [8]. This is particularly true within tight-knit communities, as has been observed with the spread of vaccine misinformation among some communities in the United States, Sweden, and the Netherlands [9-12]. Common COVID-19 misinformation beliefs circulating in mainstream media frame the pandemic as a leaked bioweapon, a consequence of 5G wireless technology, a political hoax, or a fabrication by governments to control people. Others detail ineffective measures that individuals can take to prevent or treat the disease, such as exposing themselves to sunlight or taking vitamin C [13].

    Misinformation can undermine public health efforts by shaping beliefs and attitudes, particularly if encountered within a social network, and reinforcing pre-existing values and positions [14]. Importantly, lower perceived risk or perceived efficacy of prevention behaviors and altered perception of social norms might influence individuals’ willingness to follow recommendations such as voluntary testing, isolation and, potentially, vaccination [15].

    Understanding what the public knows about COVID-19 and identifying beliefs based on misinformation can help shape effective public health communication and inform efforts, such as debunking, to reduce the impact of misinformation [16].

    This paper uses data from a longitudinal cohort study of the Australian public. Our aims were to: (1) investigate the prevalence of COVID-19 misinformation beliefs in the study sample; (2) examine whether any demographic, psychosocial, and cognitive factors are associated with these beliefs; and (3) investigate how these misinformation beliefs change over time.


    Methods

    Data Collection

    The data used in this study are from a prospective, longitudinal, national survey in Australia that aimed to evaluate variation in the public’s understanding, attitudes, and implementation of COVID-19 health advice during the first lockdown period in 2020 [17,18]. A total of 4362 participants were recruited between April 17 and 24, 2020; these participants completed the baseline survey (Round 1). This survey was administered 1 month after the first measures of prevention (physical distancing and quarantine) were introduced in Australia, when an increasing number of COVID-19 cases were being reported. A subset (n=3214) of this sample was invited for a longitudinal follow-up to assess changes in attitudes, beliefs, and behavior over the course of the pandemic. Of the 3214 participants, 1882 (58.5%) completed the Round 2 survey, which was administered from May 8 to 15, 2020 (ie, 3 weeks after the baseline or Round 1). The Round 3 survey was completed by 1369 of the 3214 participants (42.6%) from June 5 to 12, 2020 (ie, approximately 6 weeks after the baseline survey), when restrictions in Australia showed signs of easing, and the number of new COVID-19 cases and reported community transmission had drastically reduced. The Round 3 survey was administered prior to the resurgence of COVID-19 cases in some areas of Australia.

    Recruitment

    Participants were recruited via advertisements on social media (ie, Facebook and Instagram) and by a market research company (Dynata). We used 2 different methods for recruitment with the aim of achieving a more diverse sample. Only those participants who were recruited via social media were invited for the longitudinal follow-up.

    Dynata is a market research company with access to a database of 600,000 members in Australia who are willing to be involved in online research studies. Dynata invites members to participate in a given research study only when they meet the study’s eligibility criteria. For this study, participants were eligible if they were adults (ie, aged 18 years or older), currently living in Australia, and able to read and understand English.

    Participants recruited via Dynata received points for completing the survey; these points could then be redeemed against gift vouchers, donations to charities, or cash. Participants recruited via social media were given the opportunity to enter into a prize draw for the chance to win one of ten $20 gift cards upon completion of each survey round.

    Ethics Approval

    Ethics approval was obtained from the Human Research Ethics Committee of The University of Sydney (2020/212). Participants were informed about the purpose of the study, confidentiality, and risks and benefits of participation at the beginning of the survey. Completion and submission of the online questionnaire were considered as evidence of consent.

    Measures

    The survey was built and administered using Qualtrics (SAP SE), an online survey platform, and it was piloted within the health literacy lab. Survey items included in each round were modified from the national longitudinal study [17] to reflect psychological, behavioral, and knowledge factors considered most relevant at that stage of restrictions. Relevant measures for this study are detailed in Table 1. Age, gender, education, language other than English (LOTE) spoken at home, and socioeconomic status were assessed in Round 1, as detailed in our previous study [17].

    Table 1. Measures evaluated in this study.

    Statistical Analysis

    Analyses were conducted using Stata/IC (v16.1; StataCorp LLC). The threshold for statistical significance was set at P<.05. Descriptive statistics (means and SD for continuous variables, and frequency and relative frequency for categorical variables) were calculated for participant characteristics and study outcomes. To reduce the number of outcomes for analysis, misinformation beliefs at baseline were combined into a single measure using principal component analysis (PCA). Associations between the extracted misinformation component and possible explanatory variables were explored using truncated linear regression (with lower-bound truncation based on the minimum numerically possible value of the extracted misinformation component that would result from responding “strongly disagree” to all question items included in the PCA) controlling for sociodemographic factors previously shown to be associated with misinformation beliefs [17].
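    As an illustration of the dimension-reduction step described above, the following sketch applies correlation-matrix PCA with the Kaiser criterion (eigenvalue > 1) to simulated stand-in data. This is not the study's actual code or data (the analysis was run in Stata on the survey responses); the simulated items and all variable names are hypothetical.

```python
import numpy as np

# Simulated stand-in for four correlated 7-point misinformation items
# (the survey data are not public): one latent attitude plus item-level
# noise, rounded and clipped to the 1-7 Likert scale.
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))
items = np.clip(np.round(4 + latent + rng.normal(size=(500, 4))), 1, 7)

# PCA on the correlation matrix (equivalent to PCA of standardized items)
corr = np.corrcoef(items, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(corr)
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

# Kaiser criterion: retain components with eigenvalue > 1
n_retained = int(np.sum(eigenvalues > 1))

# Component scores: project standardized items onto the first eigenvector
z = (items - items.mean(axis=0)) / items.std(axis=0)
misinfo_score = z @ eigenvectors[:, 0]
```

    With a single dominant latent attitude, only the first component exceeds an eigenvalue of 1, yielding the kind of single extracted misinformation component used as the regression outcome.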

    Changes in misinformation beliefs across study rounds were examined using linear mixed models with random intercepts by the participant and robust standard errors. These items were analyzed individually owing to changes in the items included in each round.
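    The random-intercept structure described above can be sketched in Python with statsmodels on simulated stand-in data. This is not the study's code (the analysis was run in Stata), the robust standard errors used in the paper are omitted here, and all variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated long-format data: one agreement rating per participant per
# round, with a participant-level random intercept (illustrative only).
rng = np.random.default_rng(1)
n = 300
participant = np.repeat(np.arange(n), 3)
study_round = np.tile([1, 2, 3], n)
intercepts = rng.normal(scale=0.8, size=n)
agreement = (3.0 + 0.1 * (study_round == 2)
             + intercepts[participant] + rng.normal(size=3 * n))
df = pd.DataFrame({"pid": participant, "round": study_round,
                   "agreement": agreement})

# Linear mixed model: study round as a categorical fixed effect,
# random intercepts by participant.
model = smf.mixedlm("agreement ~ C(round)", df, groups=df["pid"])
result = model.fit()

# Estimated means by round from the fixed portion of the model
# (treatment coding: intercept is Round 1, remaining terms are offsets)
fe = result.fe_params.to_numpy()
means = np.array([fe[0], fe[0] + fe[1], fe[0] + fe[2]])
```

    The estimated means by round recover the simulated pattern (a small Round 2 bump against a stable baseline), mirroring the per-item comparisons reported in Table 4.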

    Dimension reduction using PCA was applied to the 10 specific COVID-19 myth items (included in Round 3 of the study). Multivariable truncated regression models (with lower-bound truncation as described above) were used to examine associations with the extracted components, using the same explanatory variables as for the analysis of misinformation beliefs from Round 1. Where survey items were repeated in Round 3 (ie, perceived threat of COVID-19, confidence in government, and use of social media as a “top-3” information source), this version of the variable was included; otherwise, the response at baseline was carried forward (ie, digital health literacy, institutional trust, and sociodemographic variables). An additional explanatory variable added in Round 3 (ie, rejection of official accounts) was also included in these models.
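    Truncated linear regression of the kind used here is not built into the common Python statistics libraries, but the lower-bound truncated-normal likelihood is straightforward to maximize directly. The following is a minimal sketch on simulated data, not the study's implementation; the function name and tolerances are hypothetical.

```python
import numpy as np
from scipy import optimize, stats

def truncated_linreg(y, X, lower):
    """MLE for a linear model when y is only observed above `lower`
    (lower-bound truncation, as described in the Methods)."""
    X1 = np.column_stack([np.ones(len(y)), X])

    def negloglik(params):
        beta, log_sigma = params[:-1], params[-1]
        sigma = np.exp(log_sigma)
        mu = X1 @ beta
        # Truncated-normal log-likelihood: normal log-density minus the
        # log-probability of exceeding the truncation bound.
        ll = stats.norm.logpdf(y, mu, sigma) - stats.norm.logsf(lower, mu, sigma)
        return -ll.sum()

    # Start from OLS estimates (biased under truncation, but close enough)
    start = np.append(np.linalg.lstsq(X1, y, rcond=None)[0], 0.0)
    return optimize.minimize(negloglik, start, method="BFGS").x

# Simulated example: true intercept 1.0, slope 0.5, noise SD 1.0,
# with observations truncated below 0.
rng = np.random.default_rng(3)
x = rng.normal(size=2000)
y_full = 1.0 + 0.5 * x + rng.normal(size=2000)
observed = y_full > 0.0
est = truncated_linreg(y_full[observed], x[observed][:, None], lower=0.0)
```

    Unlike OLS on the truncated sample, the MLE recovers the true coefficients because the likelihood conditions on each observation exceeding the bound.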


    Results

    Sample Characteristics (Cross-Sectional and Longitudinal)

    Sample characteristics by each month are summarized in Table 2. When compared to national data, our sample was slightly older, included more females, had higher educational attainment, and was less likely to speak a LOTE at home.

    Table 2. Sample characteristics by study round (1-3).

    Misinformation Beliefs and Associations with Sociodemographic, Cognitive, and Psychosocial Variables (Cross-Sectional Sample in April)

    One month into lockdown in Australia, of the 4362 participants, 753 (17.3%) agreed that data about the effectiveness of vaccines is often made up (this survey question referred to vaccines in general, not a COVID-19 vaccine); 652 (15%) agreed that herd immunity would be beneficial for COVID-19 but that this is being covered up; 603 (13.8%) agreed that the threat of COVID-19 is greatly exaggerated; and 595 (13.6%) agreed that the Australian government restrictions are stronger than required. Responses on these items were moderately correlated (pairwise r was between 0.36 and 0.63), with good internal consistency (Cronbach α=.78) and sufficient sampling adequacy (Kaiser-Meyer-Olkin or KMO=0.76). PCA of these items resulted in the extraction of a single component with an eigenvalue greater than 1, accounting for 60.7% of the variance (component loadings are provided in Table S1 in Multimedia Appendix 1). Estimated marginal mean values from the multivariable regression model of misinformation beliefs at baseline are provided in Table 3. Stronger agreement with misinformation beliefs was significantly associated with younger age, male gender, lower education, and primarily speaking a LOTE at home (P<.001 for all). After controlling for these variables, misinformation beliefs were found to be significantly associated (P<.001 for all) with lower levels of digital health literacy, perceived threat of COVID-19, confidence in the government, and trust in scientific institutions.
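    The reliability statistics reported above have simple closed forms: Cronbach α compares summed item variances with the variance of the total score, and the KMO statistic compares raw correlations with anti-image (partial) correlations. A minimal numpy sketch on simulated stand-in items (not the study data) follows.

```python
import numpy as np

def cronbach_alpha(items):
    # Cronbach's alpha: internal consistency of a k-item scale,
    # items is an (n_respondents, k) array.
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def kmo(items):
    # Kaiser-Meyer-Olkin measure of sampling adequacy: squared raw
    # correlations relative to squared partial correlations, which are
    # obtained from the inverse of the correlation matrix.
    corr = np.corrcoef(items, rowvar=False)
    inv = np.linalg.inv(corr)
    partial = -inv / np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    off = ~np.eye(corr.shape[0], dtype=bool)
    r2 = (corr[off] ** 2).sum()
    p2 = (partial[off] ** 2).sum()
    return r2 / (r2 + p2)

# Simulated stand-in for four correlated Likert-style items
rng = np.random.default_rng(2)
latent = rng.normal(size=(1000, 1))
items = 4 + latent + rng.normal(size=(1000, 4))
```

    On data with a single shared latent factor, both statistics land in the "acceptable" range (α around .8, KMO around 0.8), comparable to the values reported for the four baseline items.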

    Table 3. Multivariable truncated linear regression of strength of agreement with misinformation beliefs at Round 1a. Higher values of the outcome indicate greater support for misinformation.

    Changes in Misinformation Beliefs Over Time (Longitudinal Sample in April-June)

    The prevalence of agreement with misinformation beliefs across the study period is shown in Figure 1 and was generally consistent over time. Estimated mean values from the fixed portion of the linear mixed models are presented in Table 4. A significant effect of time (P=.006) was identified for the misinformation belief that the threat of COVID-19 is greatly exaggerated, with pairwise contrasts showing an increase in this belief between April and May; however, this difference was not maintained in June. There was a decrease in the belief that herd immunity is beneficial for COVID-19 but is covered up between April and May (P<.001). No difference was observed across the study period in the belief that government restrictions are stronger than required (P=.41).

    Figure 1. Prevalence of agreement (ie, responding somewhat agree (5) to strongly agree (7) on the 1 to 7 Likert scale) with misinformation beliefs by study month. Error bars indicate 95% confidence intervals.
    Table 4. Estimated means (95% CIs) of fixed effects from linear mixed models analyses (with random intercepts by the participant) of agreement with misinformation beliefs by study month and estimated mean differences (95% CIs) for pairwise comparisons to Round 1 (April).

    Specific Misinformation Beliefs and Associations With Sociodemographic, Cognitive, and Psychosocial Variables (Longitudinal Sample In June)

    The level of agreement across the 10 COVID-19 misinformation items from the Australian Government website had moderate internal consistency (Cronbach α=.693) and sufficient sampling adequacy (KMO=0.761). Application of PCA (with varimax rotation) identified a 3-component solution with eigenvalues greater than 1, which cumulatively accounted for 51.15% of the variance (see Table S2 in Multimedia Appendix 1 for component loadings and proportion agreement with each item). Examination of the contributing items to each component resulted in the following 3 labels:

    1. Symptom management and prevention misinformation: principal component (PC)1 (explaining 18.9% of the total variance)
    2. Causes and transmission misinformation: PC2 (explaining 16.7% of the total variance)
    3. Immunity and cure misinformation: PC3 (explaining 15.6% of the total variance)
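    Varimax rotation, used above to obtain the 3-component solution, orthogonally rotates the retained loadings toward "simple structure" so that each item loads mainly on one component. A minimal numpy sketch of Kaiser's iterative algorithm on toy loadings (illustrative only, not the study's code) follows.

```python
import numpy as np

def varimax(loadings, tol=1e-8, max_iter=200):
    """Kaiser's varimax rotation of a (p items x k components) loadings
    matrix; returns the orthogonally rotated loadings."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # SVD step: update the rotation from the gradient of the
        # varimax criterion (sum of loading variances per component).
        grad = loadings.T @ (rotated ** 3
                             - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(grad)
        rotation = u @ vt
        if s.sum() < criterion * (1 + tol):
            break
        criterion = s.sum()
    return loadings @ rotation

# Toy loadings: 9 items forming 3 clean components, blurred by noise
rng = np.random.default_rng(4)
base = np.kron(np.eye(3), np.ones((3, 1))) * 0.8
noisy = base + rng.normal(scale=0.05, size=base.shape)
rotated = varimax(noisy)
```

    Because the rotation is orthogonal, each item's communality (row sum of squared loadings) is preserved; only the distribution of loading across components changes, which is what makes the rotated components easier to label.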

    Regarding specific misinformation concerning symptom management and prevention, of the 1369 participants in Round 3, 301 (22%) agreed that hot temperatures kill the virus, 295 (21.5%) agreed that UV rays kill the virus, and 179 (13.1%) agreed that ibuprofen exacerbates COVID-19 (see Table S2 in Multimedia Appendix 1). Greater support for symptom management and prevention misinformation (PC1) was significantly associated with younger age and male gender, as well as with lower institutional trust and greater rejection of official accounts after controlling for demographics (age, gender, education, and LOTE; see Table 5).

    For misinformation regarding causes and transmission, of the 1369 participants, 167 (12.2%) agreed that the virus causing COVID-19 was engineered and released from a Chinese laboratory in Wuhan, 57 (4.2%) agreed that parcels from China could spread the virus, and only 8 (0.6%) agreed that 5G networks are responsible for the spread of the virus. Causes and transmission misinformation (PC2) was significantly associated with lower education and greater social disadvantage. Greater belief in these statements was also associated with lower digital health literacy, reduced perceived public threat, reduced institutional trust, and greater rejection of official accounts after controlling for sociodemographic variables (see Table 5).

    Regarding misinformation about immunity and cure, of the 1369 participants, 62 (4.5%) agreed that vitamin C is an effective treatment, 55 (4%) agreed that there is a cure or vaccine for COVID-19, 32 (2.3%) agreed that hydroxychloroquine is an effective treatment, and 15 (1.1%) agreed that the flu shot provides immunity. Greater support for immunity and cure misinformation (PC3) was significantly associated with younger age. After controlling for sociodemographic factors, lower digital health literacy, reduced perceived public threat, reduced institutional trust, and greater rejection of official accounts were associated with greater belief in these statements (see Table 5).

    Table 5. Multivariable truncated linear regression of the misinformation beliefs in June (Round 3)a. Higher values of the outcome indicate greater support for these beliefs. Data are presented as estimated marginal mean differences (95% confidence intervals) and P values.

    Discussion

    Principal Findings

    Our analysis showed that lower institutional trust, lower digital health literacy, and greater rejection of official accounts were associated with stronger agreement with COVID-19 misinformation beliefs. Misinformation was also more common among participants who primarily spoke a LOTE at home, in younger age groups, and in males. The most commonly held misinformation beliefs concerned symptom management and prevention. We found small changes between April and May in two of the misinformation items: an increase in agreement with “COVID-19 is greatly exaggerated” and a decrease in agreement with “herd immunity is beneficial for COVID-19 but is covered up.” Despite these differences being statistically significant, they likely have little to no practical importance (ie, only a 0.08- and 0.12-unit change, respectively, on a 7-point scale). Notably, the proportion of participants agreeing with each item remained generally consistent over time during and after lockdown restrictions.

    Rates of agreement with COVID-19 misinformation beliefs in our sample were lower than those reported in other countries [26,27], but we note that our study was not sampled to be representative of the Australian population. An Australian poll conducted in May 2020 found relatively high support (12%-77%) for misinformation beliefs relating to the creation, spread, and prevention of the virus [28]. Interestingly, compared with the results of this poll, we found a much lower prevalence of people agreeing that 5G networks are spreading the virus. The poll found demographic patterns similar to our findings, wherein male and younger participants agreed with a range of COVID-19 misinformation beliefs more than other groups. Studies have shown that in the United States and the United Kingdom, younger people are more likely to hold conspiracy beliefs about COVID-19 [29,30]. Moreover, other studies have found that American men are more likely to agree with COVID-19 conspiracy theories than women [31].

    The association between misinformation beliefs and lower education, LOTE, younger age, and male gender points toward important gaps in public health messaging to these specific groups. Our recent study highlights similar disparities in knowledge and behavior [17], as well as issues with the complexity of government health information about COVID-19. People with less education and LOTE had a poorer understanding of COVID-19 symptoms and were less frequently able to identify behaviors to prevent infection. Recently, attention has been focused on the importance of reaching people who do not speak English as their first language [32]. Our study further highlights the need for health information to be written to meet diverse health literacy requirements and targeted to specific groups. For instance, young people and representatives of culturally and linguistically diverse groups should be involved in the design of COVID-19 messages to ensure appropriate tonality and delivery of the message. This can be achieved by testing communications with these groups, running consumer focus groups before releasing messages to the public, and ensuring representation on public health communication teams [33]. Ideally, a coproduction approach should be used to ensure targeted community messages about COVID-19 prevention are relevant and effective.

    The provision of quality information online is unlikely to be a sufficient strategy to counter the influence of misinformation if digital health literacy is not accounted for. Messaging and debunking must be delivered on multiple trusted channels [34], consistent in content and style, and conveyed in local languages to ensure engagement with all communities [35]. Emerging evidence supports the idea that psychological inoculation—pre-emptively exposing people to small doses of misinformation techniques—can build resistance to false information across cultures [36]. It will be important to invest in programs teaching digital health literacy and healthy skepticism of health news, including interventions nudging people to consider the accuracy of COVID-19–related news content before sharing it further [37]. Finally, partnerships between public health authorities and trusted organizations to deliver information and correct misinformation should be utilized where possible [38]. Corrective messages are most successful when they offer a coherent explanation for how and why a belief based on misinformation is incorrect [39]. Research shows that corrective information can counter misperceptions and improve belief accuracy after an individual has been exposed to misinformation [40].

    Timely, accurate, and transparent messaging is vital to gaining public trust in communication from authorities ahead of other, less credible sources [41]. Although there now is intense global interest aimed at limiting the spread of misinformation in the first place [2,36,42], this will require “a sustained and coordinated effort by independent fact-checkers, independent news media, platform companies, and public authorities to help the public understand and navigate the pandemic” [43].

    Around the world and in Australia, antilockdown protests have taken place in capital cities, with protesters voicing opposition to vaccination and telecommunication towers and claiming that COVID-19 is a hoax. Researchers have recently investigated the degree to which misinformation about COVID-19 is associated with people’s willingness to adhere to public health recommendations and government-enforced measures; they found that willingness decreases significantly as the strength of misbeliefs increases [44,45]; this also includes decreased intentions to receive a COVID-19 vaccination [46]. In some cases, misinformation has led to serious harm, such as the Iranian methanol poisoning episode [47]. The spread of misinformation is an ongoing area of concern as Australia and other countries worldwide continue to live with the fluctuating realities of a global pandemic. Correcting misinformation should be viewed as a vitally important science and health policy activity [48]. Importantly, the more extreme conspiracy beliefs were rare; for example, fewer than 1% of the participants in our study sample endorsed the 5G conspiracy. However, other beliefs were held by over 20% of the participants in certain demographics, indicating widespread confusion or simply outdated information spread among people, such as that regarding the use of ibuprofen.

    Strengths and Limitations

    The study was large and diverse but not representative of the national population. Given this, caution is needed in generalizing from these prevalence findings. The sample was recruited via an online panel and social media. The majority of participants were well educated, and a low proportion were from culturally and linguistically diverse groups. Therefore, this sample may not represent the demographics of all people affected by COVID-19 and vulnerable to misinformation, including older adults. Participants recruited via Dynata were not included in the follow-up (ie, Rounds 2 and 3) due to funding constraints. Moreover, details of the specific social media platform(s) used by the participants (eg, YouTube, Twitter, and Facebook) were not captured in our survey, but it is important to note that both good- and poor-quality information may be obtained through these channels. (Mis)information can come from various sources such as family and friends, television, radio, print media, or misinformed health care providers (including primary, allied, alternative, and complementary health sectors). The use of social media as a “top-3” information source was comparable across education categories (ie, 45% for all 3 categories); however, given the abovementioned limitation, it is unclear which platform is being used by whom.

    The longitudinal design of this study enabled us to evaluate whether misinformation beliefs changed over the course of the pandemic. By design, the survey items changed across time; however, this prevented us from being able to determine longitudinal changes in the PC derived at the baseline. Finally, some of the misinformation items are likely contextual and subjective (eg, “the government restrictions are stronger than is needed”), which may have influenced the interpretation and responses of some participants.

    Incorrect information about COVID-19—whether labeled as misinformation, myth, conspiracy theory, or rumor—circulates every day, and our knowledge regarding the value of various preventive interventions has progressed during the course of the pandemic. While we acknowledge that some of the misinformation items included in this survey were subject to legitimate inquiry (eg, advice recommending against the use of ibuprofen was issued by the World Health Organization early in the pandemic but then retracted), they have since been demonstrated to be scientifically incorrect, classified as misinformation, and included on myth-busting lists of leading public health institutions. The broader implication is that the groups identified in this study as more likely to agree with misinformation (younger people, males, and those with lower education, lower health literacy, and LOTE spoken at home) may not be receiving up-to-date, evidence-based advice.

    Conclusions

    Misinformation can undermine public health efforts. The findings of this survey-based study highlight important gaps in communication effectiveness in the context of the COVID-19 pandemic. In efforts to prebunk and debunk misinformation, public health authorities must urgently build new partnerships with trusted, influential stakeholders and social media companies to reach the groups identified in this study. Communicators must pay close attention to ensuring that all communities can access, understand, and act on reliable COVID-19 advice.

    Acknowledgments

    We would like to thank all participants of this longitudinal COVID-19 survey for their ongoing participation in this research.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Principal component loadings from principal component analyses for COVID-19 misinformation beliefs at (1) Round 1 and (2) Round 3 with percentage agreement and descriptive statistics presented.

    DOCX File, 16 KB

    References

    1. Tan ASL, Lee C, Chae J. Exposure to health (mis)information: lagged effects on young adults' health behaviors and potential pathways. J Commun 2015 Jul 06;65(4):674-698. [CrossRef]
    2. Zarocostas J. How to fight an infodemic. The Lancet 2020 Feb;395(10225):676. [CrossRef]
    3. Depoux A, Martin S, Karafillakis E, Preet R, Wilder-Smith A, Larson H. The pandemic of social media panic travels faster than the COVID-19 outbreak. J Travel Med 2020 May 18;27(3) [FREE Full text] [CrossRef] [Medline]
    4. Kouzy R, Abi Jaoude J, Kraitem A, El Alam MB, Karam B, Adib E, et al. Coronavirus goes viral: quantifying the COVID-19 misinformation epidemic on Twitter. Cureus 2020 Mar 13;12(3):e7255 [FREE Full text] [CrossRef] [Medline]
    5. Mian A, Khan S. Coronavirus: the spread of misinformation. BMC Med 2020 Mar 18;18(1):89 [FREE Full text] [CrossRef] [Medline]
    6. Novel Coronavirus (2019-nCoV) Situation Report - 13. World Health Organization. 2020 Feb 2.   URL: https:/​/www.​who.int/​docs/​default-source/​coronaviruse/​situation-reports/​20200202-sitrep-13-ncov-v3.​pdf [accessed 2020-12-16]
    7. Krause NM, Freiling I, Beets B, Brossard D. Fact-checking as risk communication: the multi-layered risk of misinformation in times of COVID-19. Journal of Risk Research 2020 Apr 22;23(7-8):1052-1059. [CrossRef]
    8. Vosoughi S, Roy D, Aral S. The spread of true and false news online. Science 2018 Mar 09;359(6380):1146-1151. [CrossRef] [Medline]
    9. Feemster KA, Szipszky C. Resurgence of measles in the United States: how did we get here? Curr Opin Pediatr 2020 Feb;32(1):139-144. [CrossRef] [Medline]
    10. Hall V, Banerjee E, Kenyon C, Strain A, Griffith J, Como-Sabetti K, et al. Measles outbreak - Minnesota April-May 2017. MMWR Morb Mortal Wkly Rep 2017 Jul 14;66(27):713-717 [FREE Full text] [CrossRef] [Medline]
    11. Ruijs WL, Hautvast JL, van der Velden K, de Vos S, Knippenberg H, Hulscher ME. Religious subgroups influencing vaccination coverage in the Dutch Bible belt: an ecological study. BMC Public Health 2011 Feb 14;11:102 [FREE Full text] [CrossRef] [Medline]
    12. Jama A, Ali M, Lindstrand A, Butler R, Kulane A. Perspectives on the measles, mumps and rubella vaccination among Somali mothers in Stockholm. Int J Environ Res Public Health 2018 Nov 01;15(11):2428 [FREE Full text] [CrossRef] [Medline]
    13. COVID-19 Mythbusting. Australian Government. 2020.   URL: https://www.australia.gov.au/covid-19-mythbusting [accessed 2020-12-16]
    14. Scheufele DA, Krause NM. Science audiences, misinformation, and fake news. Proc Natl Acad Sci U S A 2019 Apr 16;116(16):7662-7669 [FREE Full text] [CrossRef] [Medline]
    15. Sheeran P, Maki A, Montanaro E, Avishai-Yitshak A, Bryan A, Klein WMP, et al. The impact of changing attitudes, norms, and self-efficacy on health-related intentions and behavior: A meta-analysis. Health Psychol 2016 Nov;35(11):1178-1188. [CrossRef] [Medline]
    16. Lewandowsky S, Cook J, Ecker UKH. Under the Hood of The Debunking Handbook 2020: A consensus-based handbook of recommendations for correcting or preventing misinformation. Climate Social Science Network. 2020 Oct.   URL: https://www.climatechangecommunication.org/wp-content/uploads/2020/10/DB2020paper.pdf
    17. McCaffery K, Dodd RH, Cvejic E, Ayrek J, Batcup C, Isautier JM, et al. Health literacy and disparities in COVID-19-related knowledge, attitudes, beliefs and behaviours in Australia. Public Health Res Pract 2020 Dec 09;30(4):1-9 [FREE Full text] [CrossRef] [Medline]
    18. Dodd RH, Cvejic E, Bonner C, Pickles K, McCaffery KJ, Sydney Health Literacy Lab COVID-19 group. Willingness to vaccinate against COVID-19 in Australia. Lancet Infect Dis 2020 Jun 30:S1473-3099(20)30559-4 [FREE Full text] [CrossRef] [Medline]
    19. Shapiro GK, Holding A, Perez S, Amsel R, Rosberger Z. Validation of the vaccine conspiracy beliefs scale. Papillomavirus Res 2016 Dec;2:167-172 [FREE Full text] [CrossRef] [Medline]
    20. Norman CD, Skinner HA. eHEALS: The eHealth Literacy Scale. J Med Internet Res 2006 Nov 14;8(4):e27 [FREE Full text] [CrossRef] [Medline]
    21. Wolf M, Serper M, Opsasnick L, O'Conor RM, Curtis L, Benavente JY, et al. Awareness, attitudes, and actions related to COVID-19 among adults with chronic conditions at the onset of the U.S. outbreak: a cross-sectional survey. Ann Intern Med 2020 Jul 21;173(2):100-109 [FREE Full text] [CrossRef] [Medline]
    22. Chow MY, Danchin M, Willaby HW, Pemberton S, Leask J. Parental attitudes, beliefs, behaviours and concerns towards childhood vaccinations in Australia: A national online survey. Aust Fam Physician 2017 Mar;46(3):145-151 [FREE Full text] [Medline]
    23. Frew PM, Murden R, Mehta CC, Chamberlain AT, Hinman AR, Nowak G, et al. Development of a US trust measure to assess and monitor parental confidence in the vaccine system. Vaccine 2019 Jan 07;37(2):325-332 [FREE Full text] [CrossRef] [Medline]
    24. Uscinski JE, Enders AM, Klofstad C, Seelig M, Funchion J, Everett C, et al. Why do people believe COVID-19 conspiracy theories? HKS Misinfo Review 2020 Apr 28;1. [CrossRef]
    25. AQF qualifications. Australian Qualifications Framework.   URL: https://www.aqf.edu.au/aqf-qualifications [accessed 2020-12-22]
    26. Douglas KM, Uscinski JE, Sutton RM, Cichocka A, Nefes T, Ang CS, et al. Understanding conspiracy theories. Political Psychology 2019 Mar 20;40(S1):3-35. [CrossRef]
    27. Geldsetzer P. Knowledge and perceptions of COVID-19 among the general public in the United States and the United Kingdom: a cross-sectional online survey. Ann Intern Med 2020 Jul 21;173(2):157-160 [FREE Full text] [CrossRef] [Medline]
    28. Belief in Conspiracy Theories. Essential Research. 2020 May 19.   URL: https://essentialvision.com.au/belief-in-conspiracy-theories [accessed 2020-12-16]
    29. Allington D, Duffy B, Wessely S, Dhavan N, Rubin J. Health-protective behaviour, social media usage and conspiracy belief during the COVID-19 public health emergency. Psychol Med 2020 Jun 09:1-7. [CrossRef]
    30. Schaeffer K. Nearly three-in-ten Americans believe COVID-19 was made in a lab. Pew Research Center. 2020 Apr 08.   URL: https://www.pewresearch.org/fact-tank/2020/04/08/nearly-three-in-ten-americans-believe-covid-19-was-made-in-a-lab/ [accessed 2020-12-18]
    31. Cassese EC, Farhart CE, Miller JM. Gender differences in COVID-19 conspiracy theory beliefs. Politics & Gender 2020 Jul 09:1-10. [CrossRef]
    32. Grey A. Australia's multilingual communities are missing out on vital coronavirus information. ABC News: The Conversation. 2020 Jun 29.   URL: https://www.abc.net.au/news/2020-06-29/coronavirus-multilingual-australia-missing-out-covid-19-info/12403510 [accessed 2020-12-18]
    33. The best research is produced when researchers and communities work together. Nature 2018 Oct;562(7725):7. [CrossRef] [Medline]
    34. Fridman I, Lucas N, Henke D, Zigler CK. Association between public knowledge about COVID-19, trust in information sources, and adherence to social distancing: cross-sectional survey. JMIR Public Health Surveill 2020 Sep 15;6(3):e22060 [FREE Full text] [CrossRef] [Medline]
    35. Ghio D, Lawes-Wickwar S, Tang MY, Epton T, Howlett N, Jenkinson E, et al. What influences people's responses to public health messages for managing risks and preventing disease during public health crises? A rapid review of the evidence and recommendations. PsyArXiv Preprint posted online October 5, 2020. [CrossRef]
    36. Roozenbeek J, van der Linden S, Nygren T. Prebunking interventions based on the psychological theory of “inoculation” can reduce susceptibility to misinformation across cultures. HKS Misinfo Review 2020 Feb 3. [CrossRef]
    37. Pennycook G, McPhetres J, Zhang Y, Lu JG, Rand DG. Fighting COVID-19 misinformation on social media: experimental evidence for a scalable accuracy-nudge intervention. Psychol Sci 2020 Jul;31(7):770-780 [FREE Full text] [CrossRef] [Medline]
    38. Bavel JJV, Baicker K, Boggio PS, Capraro V, Cichocka A, Cikara M, et al. Using social and behavioural science to support COVID-19 pandemic response. Nat Hum Behav 2020 May;4(5):460-471. [CrossRef] [Medline]
    39. Walter N, Murphy ST. How to unring the bell: A meta-analytic approach to correction of misinformation. Communication Monographs 2018 May 15;85(3):423-441. [CrossRef]
    40. van der Meer TGLA, Jin Y. Seeking formula for misinformation treatment in public health crises: The effects of corrective information type and source. Health Commun 2020 May;35(5):560-575. [CrossRef] [Medline]
    41. Legido-Quigley H, Asgari N, Teo YY, Leung GM, Oshitani H, Fukuda K, et al. Are high-performing health systems resilient against the COVID-19 epidemic? Lancet 2020 Mar;395(10227):848-850. [CrossRef]
    42. Stephenson J. United Nations seeks to counter COVID-19 misinformation with digital first responders. JAMA Health Forum 2020 Jun 02;1(6):e200700. [CrossRef]
    43. Brennen JS, Simon F, Howard PN, Nielsen RK. Types, sources, and claims of COVID-19 misinformation. Reuters Institute. 2020 Apr 7.   URL: https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation [accessed 2020-12-18]
    44. Imhoff R, Bruder M. Speaking (un-)truth to power: conspiracy mentality as a generalised political attitude. Eur J Pers 2014 Jan;28(1):25-43. [CrossRef]
    45. Oliver JE, Wood T. Medical conspiracy theories and health behaviors in the United States. JAMA Intern Med 2014 May;174(5):817-818. [CrossRef] [Medline]
    46. Teovanović P, Lukić P, Zupan Z, Lazić A, Ninković M, Žeželj I. Irrational beliefs differentially predict adherence to guidelines and pseudoscientific practices during the COVID‐19 pandemic. Appl Cognit Psychol 2020 Dec 07. [CrossRef]
    47. Delirrad M, Mohammadi AB. New methanol poisoning outbreaks in Iran following COVID-19 pandemic. Alcohol Alcohol 2020 Jun 25;55(4):347-348 [FREE Full text] [CrossRef] [Medline]
    48. Caulfield T. Does debunking work? Correcting COVID-19 misinformation on social media. OSF Preprint posted online May 25, 2020. [CrossRef]


    Abbreviations

    KMO: Kaiser-Meyer-Olkin
    LOTE: language other than English
    PC: principal component
    PCA: principal component analysis


    Edited by G Fagherazzi; submitted 24.08.20; peer-reviewed by F Sartor, P Phiri; comments to author 26.09.20; revised version received 22.10.20; accepted 09.12.20; published 07.01.21

    ©Kristen Pickles, Erin Cvejic, Brooke Nickel, Tessa Copp, Carissa Bonner, Julie Leask, Julie Ayre, Carys Batcup, Samuel Cornell, Thomas Dakin, Rachael H Dodd, Jennifer M J Isautier, Kirsten J McCaffery. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 07.01.2021.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.