

Published on 19.02.15 in Vol 17, No 2 (2015): February



    Medical Wikis Dedicated to Clinical Practice: A Systematic Review

    1Département de médecine générale, Faculté de Médecine Lyon Est, Université Claude Bernard Lyon 1, Lyon CEDEX 08, France

    2Département de rhumatologie, Centre Hospitalier Lyon Sud, Pierre-Bénite, France

    3Equipe d’Accueil 4129 « Santé Individu Société », Faculté de Médecine Laënnec, Université de Lyon, Lyon, France

    Corresponding Author:

    Alexandre Brulet, MD

    Département de médecine générale

    Faculté de Médecine Lyon Est

    Université Claude Bernard Lyon 1

    8 avenue Rockefeller

    Lyon CEDEX 08, 69373


    Phone: 33 686411687

    Fax: 33 778777288



    Background: Wikis may give clinician communities the opportunity to build knowledge relevant to their practice. The only previous study reviewing a set of health-related wikis, which included them without specification of purpose or audience, showed generally poor reliability.

    Objective: Our aim was to review medical wiki websites dedicated to clinical practices.

    Methods: We used Google in ten languages, PubMed, Embase, Lilacs, and Web of Science to identify websites. The review included wiki sites, accessible and operating, having a topic relevant for clinical medicine, targeting physicians or medical students. Wikis were described according to their purposes, platform, management, information framework, contributions, content, and activity. Purposes were classified as “encyclopedic” or “non-encyclopedic”. The information framework quality was assessed based on the Health On the Net (HONcode) principles for collaborative websites, with additional criteria related to users’ transparency and editorial policy. From a sample of five articles per wiki, we assessed the readability using the Flesch test and compared articles according to the wikis’ main purpose. Annual editorial activities were estimated using the Google engine.

    Results: Among 25 wikis included, 11 aimed at building an encyclopedia, five a textbook, three lessons, two oncology protocols, one a single article, and three at reporting clinical cases. Sixteen wikis were specialized in specific themes or disciplines. Fifteen wikis used MediaWiki software as-is, three were hosted by online wiki farms, and seven were purpose-built. Except for one MediaWiki-based site, only purpose-built platforms managed detailed user disclosures. The owners were ten organizations, six individuals, four private companies, two universities, two scientific societies, and one unknown. Among 21 open communities, 10 required users’ credentials to grant editing rights. The median information framework quality score was 6 out of 16 (range 0-15). Beyond this score, only one wiki had standardized peer reviews. Physicians contributed to 22 wikis, medical learners to nine, and lay persons to four. Among 116 sampled articles, those from encyclopedic wikis had more videos, pictures, and external resources, whereas the others had more posology details and better readability. The median creation year was 2007 (1997-2011), the median number of content pages was 620.5 (3-98,039), the median number of revisions per article was 17.7 (3.6-180.5), and the median number of talk pages per article was 0.015 (0-0.42). Five wikis were particularly active, whereas six were declining. Two wikis were discontinued after the completion of the study.

    Conclusions: The 25 medical wikis we studied present various limitations in their format, management, and collaborative features. Professional medical wikis may be improved by using clinical cases, developing more detailed transparency and editorial policies, and involving postgraduate and continuing medical education learners.





    Access to information is a daily concern for clinicians, especially in general practice where the expertise field is particularly wide. Clinicians have to apply evidence-based knowledge as far as possible to manage varied and complex medical issues [1]. The medical information they use for practice must be accurate, readable, reliable, and up to date. As the use of primary sources requires documentary research methods and is time-consuming, clinicians usually refer to available syntheses such as practice guidelines, educational journals, or medical textbooks. However, these resources are often limited by language barriers [2], missing evidence [3], low acceptability [4], and conflicts of interest [5].

    Wikis are websites characterized by collaborative editing between users. A “wiki” is a type of content management system that differs from others in that the content is created without any defined owner [6]. Wikis belong to Web 2.0, which includes other interactive Web tools such as blogs (where users edit their own content), forums (where users discuss), and social networks (where users post comments) [7]. Since the wiki principle was initiated in 1995 on WikiWikiWeb, a site dedicated to programmers, hundreds of types of software have been developed to operate it [8]. Among them, MediaWiki is a worldwide reference that supports the 285 languages of the general encyclopedia Wikipedia. Subsequently, various medical wikis have emerged, including orphan disease resources, terminology databases, care decision supports, and medical teaching resources [9-12]. Wikis may help remedy the limitations of other medical resources by giving clinician communities the opportunity to build knowledge relevant to their practice [13].

    The recent review of the literature about wikis and collaborative writing applications in health care by Archambault et al broadly explored use patterns, quality of information, and knowledge translation interests, and brought out a need for primary research on these applications [14]. Among the 25 articles in this review assessing the quality of the information, all but one targeted Wikipedia [15], whose medical content is controversial [16-18]. In the study published in 2009 by Dobrogowska-Schlebusch [15], 52 health-related wikis were included without specification of purpose or audience and assessed using the online Health Summit Working Group Information Quality tool (HSWG IQ tool) [19]. It showed generally poor quality scores, except for a few wikis having implemented expert moderation or peer reviews. The “quality of information” in a website actually refers either to its framework, including transparency and policy considerations such as in the HSWG IQ tool, or to its content, especially its scientific value. Assessing the content of wikis is problematic as it is only a snapshot of a long-lasting interaction [20-22].

    Our study aimed at systematically reviewing medical wikis dedicated to clinical practices according to their purposes, platform, management, information framework, contributions, content, and activity.


    Screening Strategy

    In October 2011, we performed Google queries searching for the phrase “list of medical wikis” translated into the 10 most spoken languages on the Internet (English, Chinese, Spanish, Japanese, French, Portuguese, German, Arabic, Russian, and Korean), using the Google translation tool when necessary [23]. The phrase was expanded as far as possible within the limit of 500 resulting pages. The English query was filtered in order to remove an extensively cited page, which was kept once for data extraction [24]. Every resulting page was browsed in order to extract Internet addresses (uniform resource locators [URLs]) linking to potentially relevant sites (Multimedia Appendix 1).

    Second, we searched PubMed and Web of Science (using “wiki” AND [“medic*” OR “clinic*”]) and Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS) (using “wiki”) in full texts for articles published until September 2012. Every open-access abstract and article was read, coupled with Web searches when necessary, in order to identify any potentially relevant URL (Multimedia Appendix 2).

    Finally, we included any other potentially relevant URL retrieved through additional Web browsing or expert advice, until September 2012. One author (AB) performed all data extractions for the screening.

    Sites’ Inclusion and Exclusion

    Websites were included if they were (1) accessible from a public Internet protocol address; (2) operating a wiki tool, defining a “wiki” as “a type of content management system (CMS) used for collaborative editing, where the content is created without any defined owner” [6], excluding wiki-based platforms used as non-collaborative CMSs, like Wikinu [25], and websites where collaborative editing was allowed on owned content, like Google Knols [26]; (3) aimed at building knowledge relevant for clinical practice, defining “clinical” as “of or relating to the bedside of a patient, the course of a disease, or the observation and treatment of patients directly” [27], excluding medical topics not directly linked to the care of patients (medical research, medical informatics, biomedical sciences, medical curricula, pharmacology, public health), and topics of no specific interest to physicians (other health care disciplines, patient information, first aid); and (4) explicitly targeting physicians or medical students in their audiences. Wikis oriented toward the general public, like Wikipedia, were excluded [28]. In addition, websites were excluded if they were dysfunctional, explicitly discontinued, or only aimed at displaying external resources. Some clinically oriented wikis, like Medical Matters Wiki, were excluded as bibliographic resources [29].

    Inclusion and exclusion decisions were made by 2 authors (AB and LL), and disagreements were resolved by discussion.

    Sites’ Description and Assessment


    All data collection from the included sites was performed in October and November 2012. The main language interface of each wiki, that is, the one with the most content, was used as the reference for data collection. No direct contact with sites’ administrators was undertaken. Data retrieval was done by 1 author (AB), and assessments were performed by 2 authors (AB and LL). Disagreements were resolved by discussion.


    Wikis’ main purposes were described on the basis of sites’ disclosures. Defining the term “encyclopedic” as a comprehensive reference work within a knowledge field [30], wikis were classified as “encyclopedic” or “non-encyclopedic” according to their statement of main purpose. Target audiences were described on the basis of sites’ disclosures, considering only physicians, medical students, and lay persons.


    Platforms were described according to software, user data, ergonomics, and clinically relevant utilities, by systematically browsing sites and using their functionalities.


    Management was described on the basis of sites’ disclosures and technical characteristics. Editing access was systematically tested anonymously and, whenever registration was possible, after login. A user community was defined as “closed” when the accreditation of editing rights was not open to the public. The registration process was defined as “automated” when filling out a form triggered login access, and “on credentials” when some personal information had to be checked first. Where a hierarchy existed between registered users, those having special rights were consistently named “super-users”, and their nomination procedure and specific roles were described. We named “administrators” those super-users having enlarged rights such as deleting or massively editing content, assigning or removing users’ rights, blocking pages, blocking users, etc.

    Information Framework

    The Health On the Net ethical code of conduct (HONcode), as adapted for collaborative websites, was used as a reference to perform the information framework quality assessment [31]. However, its adapted principle on the authoritativeness of information makes mandatory only the disclosure of the credentials of “moderators”. The wiki context makes every editing user responsible for edited content, and in a professional context, more author details than just credentials should be disclosed. We therefore built a set of 16 criteria for assessing the information framework quality, including 11 derived from the HONcode and 5 fitted to medical wikis. An operational definition was assigned to each of these criteria, including four definitions validated by Bernstam et al (Table 1) [32]. The assessment of these criteria was performed by 2 authors (AB and LL). Their agreement was measured by calculating an r correlation coefficient [33].

    Table 1. The 16 information framework quality criteria.

    Physicians were considered as contributors by default, except when they were not targeted in the audience. The contributions of medical learners (students or physicians) were described based on educational objectives, or when mentioned in super-users’ credentials. Lay persons’ contributions were described according to the registration requirements. The presence of clinical case reports was systematically searched for by querying sites with the keyword “case”. Any content reporting clinical material issued from users’ practice was considered.


    This part of the study aimed at describing the characteristics of the contents and assessing their readability. However, the scientific value of contents in itself was not assessed. From each wiki, we selected a sample of the 5 most revised articles. Articles were included if they had a clinically relevant topic and were written in the main language of the wiki. In sites where the numbers of revisions were not available, we subjectively selected the most finalized articles. We described characteristics related to content (presence of pictures, videos, diagrams, posology details, evidence levels and external resources, and numbers of words and references per article) and data related to edition (numbers of revisions and authors per article, and related talks). The sampled articles were assessed with Flesch’s reading ease test adapted to each language and performed with automated hyphenation [34]. Characteristics of articles were compared between encyclopedic and non-encyclopedic groups by using Fisher’s exact test for qualitative data and the Wilcoxon rank test for quantitative data.
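    For illustration, the English-language Flesch reading ease formula used above can be sketched in a few lines of Python. The study itself used the R koRpus package with language-specific adjustment parameters and automated hyphenation; the naive vowel-group syllable counter below is our simplification, not the authors’ method:

    ```python
    import re

    def count_syllables(word):
        """Rough syllable estimate: number of consecutive-vowel groups (at least 1)."""
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        """Flesch reading ease for English text; higher scores mean easier reading."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z]+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (206.835
                - 1.015 * len(words) / len(sentences)
                - 84.6 * syllables / len(words))
    ```

    On this scale, lower scores indicate harder text; scores in the 0-50 band correspond roughly to the college-level-and-above readability reported for the sampled articles.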


    Wikis’ global activities were described on the basis of available data from sites (absolute numbers of content pages, revisions, and talk pages). Displayed numbers of users were considered globally inaccurate since we suspected tens of false user registrations across several sites, presumably due to vandalism attacks. In order to estimate annual activity, content pages were counted according to their last edit date by performing empty queries on Google, filtered on each URL, for each year since the wiki’s creation. A recent editorial rate was estimated by dividing the number of pages last edited in the previous 365 days by the number edited since creation. Rates higher than 50% were considered “very high”, and rates lower than 10% “very low”. A recent editorial trend was estimated by dividing the number of pages last edited in the previous 365 days by the number last edited in the 365 days before that. Trends higher than 300% were considered “sharply increasing” and trends lower than 33% “sharply decreasing”.
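    The rate and trend estimates above reduce to two ratios plus threshold checks, as in the following minimal sketch (the function names are ours; the yearly page counts would come from the date-filtered Google queries described in the text):

    ```python
    def editorial_rate(pages_last_365, pages_since_creation):
        """Share of all content pages whose last edit falls in the previous 365 days."""
        return pages_last_365 / pages_since_creation

    def editorial_trend(pages_last_365, pages_previous_365):
        """Ratio of pages last edited in the previous 365 days to those last
        edited in the 365 days before that."""
        return pages_last_365 / pages_previous_365

    def classify_rate(rate):
        if rate > 0.50:
            return "very high"
        if rate < 0.10:
            return "very low"
        return "intermediate"

    def classify_trend(trend):
        if trend > 3.0:
            return "sharply increasing"
        if trend < 1 / 3:
            return "sharply decreasing"
        return "stable"
    ```

    For example, a wiki with 600 of its 1,000 pages edited in the past year would be classified as “very high”, whereas one with 50 of 1,000 would be “very low”.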


    Sites’ Screening

    The Google search yielded 341 pages, including 27 linking to some potentially relevant URLs. After extraction and removing duplicates, 141 URLs were collected (Multimedia Appendix 1). The literature search yielded 133 articles, 104 after removing duplicates. After identification of potentially relevant URLs and removing duplicates, 38 URLs were collected (Multimedia Appendix 2). Four additional potentially relevant URLs were retrieved from other sources. Merging all results and removing duplicates, 176 potentially relevant URLs were finally collected (Figure 1, Multimedia Appendix 3).

    Figure 1. Site screening, exclusion, and inclusion flow diagram.

    Sites’ Exclusion and Inclusion

    Of the 176 collected URLs, 31 met the inclusion criteria. Six of them became inoperative during the study. Finally, 25 wikis were retained for analysis [35-59] (Figure 1; Multimedia Appendix 3).

    Sites’ Description and Assessment


    The main languages were English (19 wikis), German (3), French (2), and Chinese (1), and four wikis had a second language interface. The purpose was encyclopedic for 11 wikis, including one also aiming at reporting clinical cases. Among the 14 wikis having a non-encyclopedic purpose, five aimed at editing a textbook, three medical lessons, two oncology protocols, one a single focused article, and three at reporting clinical cases, including one also displaying a textbook-like wiki area. Whereas 16 wikis were specialized in specific themes or disciplines, nine were not. Physicians were explicitly targeted by 22 wikis, medical learners by 18, and lay persons by five (Table 2).

    Table 2. Wikis’ purposes.

    MediaWiki in its native form supported 15 sites. Three sites were hosted by online “wiki farms”, which are ready-to-use multifunctional platforms [60-62]. The remaining seven sites had purpose-built platforms, including two developed upon MediaWiki. In contrast to the purpose-built platforms, only one site using native MediaWiki systematically managed users’ real names and credentials. Wiki farms and purpose-built platforms included various forms of forums and social networks. Editing on MediaWiki required using a specific mark-up language, whereas all other software had a “What You See Is What You Get” editing interface. Three wikis had automated links to the PubMed or Cochrane Library external databases. Two wikis operated semantic management for synonyms or keywords. Two wikis provided medical imaging facilities (Table 3).

    Table 3. Wikis’ platform.

    Sites’ owners were non-profit organizations (n=10), individuals (n=6), private companies (n=4), scientific societies (n=2) or universities (n=2), and one could not be identified. Six wikis restricted access to their talk pages and users’ profile areas, and one wiki restricted access to its articles. Two wikis allowed any visitor to edit without registering. Registration was automated in 11 wikis, based upon credentials in 10, and limited to a closed community in four. A hierarchy between registered users existed in 14 wikis, among which three restricted editing (or the validation of editing proposals) to super-users only. Super-users could be organized into “editorial boards” (n=9), responsible for the whole content, “lead authors” (n=4), responsible for particular articles, or “moderators” (n=2), responding on call. Super-users were nominated without any explicit procedure in 10 wikis, subjectively in consideration of users’ credentials or activity in two wikis, and following a systematic procedure based on a score or a vote in two wikis. Super-users were divided into more than two types of roles in four wikis (Table 4).

    Table 4. Wikis’ management.
    Information Framework

    The owner’s identity was displayed on 19 wikis, its contact details on 21, its funding sources on 14, and its potential conflicts of interest on seven. A medical advisory statement was displayed on 17 wikis, a policy for users’ privacy on 17, and a policy about advertising on 10. A review policy was displayed on 10 wikis, a rule for the protection of patients’ data on 11, a rule for referencing information on nine, a rule for delivering true information on 11, and a rule for organizing content on five. The editing users’ identity was systematically displayed on nine wikis, their credentials on seven, their potential conflicts of interest on two, and the administrators’ identity was systematically displayed on three wikis, which were all made by students [51,52,59]. The total information framework quality score ranged from zero to 15 out of 16, with a median score of 6 (Table 5). The correlation between raters was fair (R2=.68). Beyond these criteria, only one wiki organized standardized peer-reviews [39].

    Table 5. Wikis’ information framework quality assessment.

    Physicians were considered as contributors by default in all wikis except the three made by and for students [51,52,59]. Medical learners contributed according to a formal educational goal on four wikis, and as super-users on five wikis. Lay persons contributed to four wikis. Clinical cases were reported on nine wikis (Table 6).

    Table 6. Wikis’ contributions.

    As only one wiki displayed a single article and another did not allow access to its relevant content, 116 articles were sampled, including 58 most revised and 58 most finalized. Numbers of authors were not available for five encyclopedic articles. Numbers of revisions and of authors were not available for five non-encyclopedic articles. Pictures, videos, and external resources were more frequent in articles from encyclopedic wikis. Posology details were more frequent in articles from non-encyclopedic wikis (P<.01). The Flesch reading ease scores were lower in encyclopedic wikis (Table 7).

    Table 7. Features of content, of edition, and readability of articles according to wiki purpose (N=116 articles).

    Wikis had been created between 1997 and 2011 (median year: 2007). Content pages per wiki varied from 3 to 98,039 (median 620.5), revisions per content page from 3.6 to 180.5 (median 17.7), and talk pages per content page from 0 to 0.42 (median 0.015). Among five particularly active wikis, three had a high previous-year editorial rate and three a sharply increasing editorial trend. All six almost-unused wikis had a low previous-year editorial rate, and three of them a sharply decreasing editorial trend. The activity of one wiki showing a sharply increasing trend upon a very low previous editorial rate was not interpreted (Table 8). Two wikis included in this review were discontinued after the completion of the study [35,47].

    Table 8. Wikis’ activities.


    Principal Findings

    From this international review, we identified 25 medical wikis dedicated to clinical practices. The majority were in English and four were bilingual. They had various purposes, dominated by encyclopedic perspectives (44%), and most were specialized (64%). The MediaWiki software was commonly used (68%), often in its native form (60%). Site owners were mostly non-profit organizations (40%) and individuals (24%); only two were universities. While practicing physicians were major contributors (88%), medical learners (36%) and lay persons (16%) sometimes contributed.

    Cross-reading our results, the relevancy of the medical wikis for clinicians can be discussed according to four information properties: accuracy, readability, reliability, and currency. Accuracy may be impaired in wikis not displaying a review policy (60%) and in those not delivering rules for organizing content (80%) [63,64]. The articles from encyclopedic wikis presented characteristics less relevant for professional use than the others, including more pictures, videos, and external resources but fewer posology details. The Flesch reading ease scores were globally low, especially for encyclopedic articles. In regard to reliability, 64% of wikis fulfilled less than half of the information framework quality criteria. In addition, articles were poorly referenced, and indications of evidence level were rare. Finally, 88% of the wikis had fewer than 50% of articles revised in the last year, and 24% of the sites were almost unused.

    Strengths and Limitations

    Our review may not have been exhaustive as the Google search was restricted to lists of medical wikis and several sites reported in the health literature were not accessible. Furthermore, the Web 2.0 field is rapidly changing, and some new medical wikis may have emerged since October 2012. Re-browsing the lists of medical wikis used in this study, we found only one relevant wiki after the inclusion period: the Australian Cancer Guidelines Wiki [65]. Among the 25 included sites, Medpedia and WardWiki have been discontinued [35,47], and a few changes occurred in the structure of the others: Open Anesthesia has been reorganized [49], WikiEcho and MedRevise changed their “skin” [42,51], and Oncologik added a missing link to its owner [54].

    Among the tools available for assessing the quality of health information on websites, none is currently validated and none is fitted either to wikis or to a professional audience [66,67]. The HSWG IQ tool does not take into account collaborative features, as acknowledged by Dobrogowska-Schlebusch [15], and it has been removed from the Web [19]; the DISCERN tool targets health consumers and is restricted to information on treatments [68]; and the Bomba and Land Index has also been designed for health consumers [69]. Numerous items are common between these questionnaires and major guidelines such as the eHealth code of ethics [70], the American Medical Association guideline [71], or the eEurope 2002 quality criteria [72]. The HONcode ethical code of conduct is unique in providing specifications for collaborative websites [31,73]. For example, the item “is the information referenced?” is transposed for collaborative websites as “is there a statement asking platform users to give references to the information they provide?”. Such specifications do not apply to the content directly, but indirectly through the editorial framework. However, the actual influence of the framework on the content deserves to be investigated in future research projects.

    The relevancy of low readability scores, corresponding to college level and higher, is arguable since medical doctors de facto have a high reading level. However, it has long been demonstrated that readability impacts both understanding and cross-reading ability, even for highly educated readers [74], and the need for simplicity is expressed by clinicians themselves for practice guidelines [4]. The relevancy of the Flesch reading ease test for medical writing is also debatable, but more specific tools are not yet validated [75]. Although the test includes adjustment parameters adapted to several languages [34], a linguistic bias cannot be excluded in this study since multilingual comparisons have not been documented.

    To check the validity of the estimation of annual editorial activities using Google, we measured the agreement between the number of content pages declared on the site and the corresponding estimate from the Google search engine, for 20 wikis. Although there was a strong agreement (Spearman correlation coefficient=.88, P<.001), automated page creation and vandalism may bias both figures.
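    A rank correlation of this kind can be computed in a few lines. In practice one would use `scipy.stats.spearmanr`; the self-contained sketch below (with average ranks for ties, a standard convention) is only illustrative of the agreement check, not the authors’ code:

    ```python
    def rank(values):
        """Rank values from 1, assigning tied values their average rank."""
        order = sorted(range(len(values)), key=lambda i: values[i])
        ranks = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average of 1-based positions i..j
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        return ranks

    def spearman(x, y):
        """Spearman correlation: Pearson correlation of the two rank vectors."""
        rx, ry = rank(x), rank(y)
        mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        sx = sum((a - mx) ** 2 for a in rx) ** 0.5
        sy = sum((b - my) ** 2 for b in ry) ** 0.5
        return cov / (sx * sy)
    ```

    Fed with the 20 pairs of declared and Google-estimated page counts, such a computation would yield the coefficient of .88 reported above.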

    Unmet Clinical Needs

    Our results suggest that no medical wiki meets all four information properties needed by clinicians. The encyclopedic format seems poorly suited in terms of both accuracy and readability. However, whatever the wikis’ purposes, the organization of content is often unclear, apart from very focused purposes such as oncology protocols, where the knowledge granularity is adapted to a particular audience [54]. The Medical Subject Headings (MeSH) indexing system is sometimes integrated, but it requires specific training for contributors, which is challenging in a multi-authoring context [76]. Whereas some semantic utilities can help manage indexation constraints [10,77], add-ons aimed at improving either medical knowledge management or ergonomics are rarely implemented in medical wikis. If such gaps impact both accuracy and readability, they may also hamper the involvement of users. In contrast to pure knowledge content, the frequent clinical case reports in medical wikis, supporting the emergence of concrete questions of practice, are likely to meet strong clinical interest.

    Reliability is widely, and sometimes critically, impaired by lack of management. Although authoring transparency requires both technical and policy supports [5], our framework assessment particularly shows gaps in users’ disclosures and editorial policies. Since purpose-built platforms are almost the only ones able to manage detailed user data, technical issues are important. Among open communities, only 48% of medical wikis ask for credentials to register, with two requiring some proof [35,36]. As an alternative, users’ medical skills can be assessed during an automated registration including medical tests [58,78]. Interestingly, Wikipedia’s fully open articles are commonly consulted by clinicians and medical students [79], while their relevancy has been recurrently questioned [7,14,16-18,21]. However, Wikipedia, including its Wikiproject Medicine, cannot respond to specific clinical needs as it does not target any specific audience [28]. As an encyclopedic medium, it is also likely to meet the limitations highlighted in this study.

    In most wikis, weak and poorly collaborative activity jeopardizes content updates. The talk pages, when available, are rarely used, and the discussion threads included in forums or social networks are not directly connected to content pages [80]. As a consequence, adversarial debates are lacking, although they are a foundation for building evidence [3].

    The Open Community Challenge

    Users’ regulation in wikis is complex since the lower the control over editors, the higher the growth [81]. For example, Wikipedia’s English article on atrial fibrillation has been revised approximately 1345 times and discussed 150 times [82], and the article on the recent drug dabigatran 555 times and 35 times, respectively [83]. Apart from the severe reliability issues due to anonymity in Wikipedia [84], it has been shown that its development, based only on volunteering, leads articles to be unevenly readable, complete, and reliable [17,20,85]. In our study, we paradoxically observed the highest page revision and discussion levels in a small wiki reserved for a closed community [54]. This finding suggests that a strong user commitment can overcome the limitations of volunteering.

    Although multi-authoring requires a thorough organization [86], the communities attached to medical wikis are often poorly structured. Super-user nominations are usually opaque, and only one wiki provides a standardized peer-review process [39]. As implemented in two wikis, the extent of users’ rights can depend on their participation level [35,58], which represents a reward for authors [87]. However, in order to open scientific debates, the organization of bottom-up relations between users should be further considered [88]. Along these lines, the consensus-based public expertise promoted by Wikipedia relies on a complex and democratic moderation system, detailed editorial rules, and standardized peer reviews [21].

    While the HONcode principle about the authoritativeness of the information protects the moderators’ privacy by allowing their anonymity [31], it cannot guarantee the trustworthiness of what they have written [84,89]. The professional scope of our review highlights a lack of audience specifications in health information quality initiatives, in particular for collaborative applications where readers and writers are mixed altogether. The extensive review of social media by Grajales et al provides a useful tutorial for health care professional end users, which may be a first step to building more detailed guidelines for professional health information on the Internet [7]. Indeed, some professional knowledge may generate adverse outcomes, as information on drugs with potential for misuse is commonly sought on the Internet [90]. Therefore, as included in the wikiproject Medicine of Wikipedia [21], a policy specifying the nature and the limits of publicly accessible content is critical, and a model for displaying health information is needed [67,73].

    Educational Value Added

    Of the eight medical wikis that include learners’ contributions, five host spontaneous contributions from undergraduate or postgraduate students. The other three have a formal educational goal, targeting postgraduate students or practicing physicians in continuing medical education [49,58,59]. Educational goals may represent an alternative to mere volunteering, since learners’ contributions can be part of their curricula. Because training assignments frequently use clinical cases as starting points for gathering scientific evidence [91,92], the wiki principle seems particularly well suited to archiving, sharing, discussing, and gradually improving the related materials [93]. From a theoretical point of view, the wiki medium, as an asynchronous communication tool, embodies learning principles based on constructivism and cooperation [94]. Nevertheless, even though Internet-based educational programs can be an alternative to live interactive workshops [95], the effectiveness of collaborative writing applications in medical education requires further research [12,14].


    Conclusions

    The 25 medical wikis reviewed present various limitations in their format, management, and collaborative features. Encyclopedic wikis have less accurate and less readable content. Reliability is widely impaired by a lack of transparency. Currency is commonly jeopardized by low editorial activity. Professional medical wikis may be improved by using clinical cases, developing more detailed transparency and editorial policies, and involving postgraduate and continuing medical education learners.


    Acknowledgments

    We would like to thank Ewa Dobrogowska-Schlebusch for sharing her original work; Meik Michalke for improving the koRpus R package; Martin Wedderburn for providing English editorial help; and Patrick M Archambault and Tom H Van de Belt for their valuable external review. We are grateful to the College of teaching general practitioners of Lyon (CLGE) for having supported this work.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Google search.

    PDF File (Adobe PDF File), 245KB

    Multimedia Appendix 2

    Literature search.

    PDF File (Adobe PDF File), 81KB

    Multimedia Appendix 3

    Site exclusions and inclusions.

    PDF File (Adobe PDF File), 186KB


    1. Shuval K, Linn S, Brezis M, Shadmi E, Green ML, Reis S. Association between primary care physicians' evidence-based medicine knowledge and quality of care. Int J Qual Health Care 2010 Feb;22(1):16-23 [FREE Full text] [CrossRef] [Medline]
    2. Fung IC. Open access for the non-English-speaking world: overcoming the language barrier. Emerg Themes Epidemiol 2008;5:1 [FREE Full text] [CrossRef] [Medline]
    3. Hay MC, Weisner TS, Subramanian S, Duan N, Niedzinski EJ, Kravitz RL. Harnessing experience: exploring the gap between evidence-based medicine and clinical practice. J Eval Clin Pract 2008 Oct;14(5):707-713. [CrossRef] [Medline]
    4. Carlsen B, Glenton C, Pope C. Thou shalt versus thou shalt not: a meta-synthesis of GPs' attitudes to clinical practice guidelines. Br J Gen Pract 2007 Dec;57(545):971-978 [FREE Full text] [CrossRef] [Medline]
    5. Manchikanti L, Benyamin RM, Falco FJ, Caraway DL, Datta S, Hirsch JA. Guidelines warfare over interventional techniques: is there a lack of discourse or straw man? Pain Physician 2012;15(1):E1-E26 [FREE Full text] [Medline]
    6. Wikipedia contributors. Wikipedia. 2014 Apr 26. Wiki   URL: [accessed 2014-04-27] [WebCite Cache]
    7. Grajales FJ, Sheps S, Ho K, Novak-Lauscher H, Eysenbach G. Social media: a review and tutorial of applications in medicine and health care. J Med Internet Res 2014;16(2):e13 [FREE Full text] [CrossRef] [Medline]
    8. WikiWikiWeb contributors. WikiWikiWeb. 2014 Apr 13. Wiki Engines   URL: [accessed 2014-04-27] [WebCite Cache]
    9. Moen A, Smørdal O. RareICT: a web-based resource to augment self-care and independence with a rare medical condition. Work 2012;41(3):329-337. [CrossRef] [Medline]
    10. He S, Nachimuthu SK, Shakib SC, Lau LM. Collaborative authoring of biomedical terminologies using a semantic Wiki. AMIA Annu Symp Proc 2009;2009:234-238 [FREE Full text] [Medline]
    11. Naik AD, Singh H. Electronic health records to coordinate decision making for complex patients: what can we learn from wiki? Med Decis Making 2010;30(6):722-731. [CrossRef] [Medline]
    12. Rasmussen A, Lewis M, White J. The application of wiki technology in medical education. Med Teach 2013;35(2):109-114. [CrossRef] [Medline]
    13. Guest DG. Four futures for scientific and medical publishing. It's a wiki wiki world. BMJ 2003 Apr 26;326(7395):932 [FREE Full text] [CrossRef] [Medline]
    14. Archambault PM, van de Belt TH, Grajales FJ, Faber MJ, Kuziemsky CE, Gagnon S, et al. Wikis and collaborative writing applications in health care: a scoping review. J Med Internet Res 2013;15(10):e210 [FREE Full text] [CrossRef] [Medline]
    15. Dobrogowska-Schlebusch E. Evaluation of the quality of 52 health-related websites, created with wiki technology. Zesz Nauk Ochr Zdrowia Zdr Publiczne Zarządzanie 2009;7(1):102-109.
    16. Rajagopalan MS, Khanna VK, Leiter Y, Stott M, Showalter TN, Dicker AP, et al. Patient-oriented cancer information on the internet: a comparison of wikipedia and a professionally maintained database. J Oncol Pract 2011 Sep;7(5):319-323 [FREE Full text] [CrossRef] [Medline]
    17. Clauson KA, Polen HH, Boulos MN, Dzenowagis JH. Scope, completeness, and accuracy of drug information in Wikipedia. Ann Pharmacother 2008 Dec;42(12):1814-1821. [CrossRef] [Medline]
    18. Mühlhauser I, Oser F. [Does WIKIPEDIA provide evidence-based health care information? A content analysis]. Z Evid Fortbild Qual Gesundhwes 2008;102(7):441-448. [Medline]
    19. Ambre J, Guard R, Perveiler FM, Renner J, Rippen H. Health Summit Working Group. 1997. Criteria for Assessing the Quality of Health Information on the Internet   URL: http:/​/www.​​contenidos_socios/​Informatica/​Guias_y_recomendaciones/​Criteria_Quality_Health_Inform_19971014.​pdf [WebCite Cache]
    20. Halavais A, Lackaff D. An Analysis of Topical Coverage of Wikipedia. J Comp Mediated Comm 2008 Jan;13(2):429-440. [CrossRef]
    21. Heilman JM, Kemmann E, Bonert M, Chatterjee A, Ragar B, Beards GM, et al. Wikipedia: a key tool for global public health promotion. J Med Internet Res 2011;13(1):e14 [FREE Full text] [CrossRef] [Medline]
    22. Endrizzi L. Wikipédia : de la co-rédaction au co-développement de la communauté. France: HAL; 2006 Sep Presented at: Document numérique et société; 2006; France p. 185-198   URL:
    23. World Internet Statistics. 2010. Top Ten Internet Languages   URL: [accessed 2014-04-27] [WebCite Cache]
    24. Rothman D. List of Medical Wikis   URL: [accessed 2014-04-27] [WebCite Cache]
    25. Université Paris 5   URL: [accessed 2015-02-09] [WebCite Cache]
    26. Google Knol.   URL: [accessed 2015-02-11] [WebCite Cache]
    27. Collins English Dictionary. 2015. Definition of “clinical”   URL: [accessed 2015-02-11] [WebCite Cache]
    28. Wikipedia contributors. Wikipedia. 2014 Nov 29. WikiProject Medicine   URL: [accessed 2015-02-11] [WebCite Cache]
    29. Biomedical Library - University of South Alabama. 2013. Medical Matters Wiki   URL: [accessed 2015-02-11] [WebCite Cache]
    30. Collins English Dictionary. 2015. Definition of “encyclopedic”   URL: [accessed 2015-02-11] [WebCite Cache]
    31. Health on the Net. 2012. Principles - Quality and trustworthy health information   URL: [accessed 2014-04-27] [WebCite Cache]
    32. Bernstam EV, Sagaram S, Walji M, Johnson CW, Meric-Bernstam F. Usability of quality measures for online health information: Can commonly used technical quality criteria be reliably assessed? Int J Med Inform 2005 Aug;74(7-8):675-683. [CrossRef] [Medline]
    33. Armitage P, Berry GD, Matthews J. Statistical methods in medical research. Malden, MA: Blackwell Science; 2002.
    34. Michalke M. The Comprehensive R Archive Network. 2012. koRpus: An R Package for Text Analysis - release 0.04-38   URL: [accessed 2014-04-27] [WebCite Cache]
    35. Web archive of 2015.   URL:*/ [accessed 2015-02-12] [WebCite Cache]
    36.   URL: [accessed 2014-04-27] [WebCite Cache]
    37. Physician Medical Wiki   URL: [accessed 2014-04-27] [WebCite Cache]
    38. DocCheck Flexikon. Medizinwissen, KnowHow teilen - suchen   URL: [accessed 2014-04-27] [WebCite Cache]
    39.   URL: [accessed 2014-04-27] [WebCite Cache]
    40. EyeWiki.   URL: [accessed 2014-04-27] [WebCite Cache]
    41. The wiki-based collaborative Radiology resource   URL: [accessed 2014-04-27] [WebCite Cache]
    42.   URL: [accessed 2014-04-27] [WebCite Cache]
    43. wikiRadiography.   URL: [accessed 2014-04-27] [WebCite Cache]
    44. Pathowiki.   URL: [accessed 2014-04-27] [WebCite Cache]
    45. Pathology Wikibook. Online open-access pathology books   URL: [accessed 2014-04-27] [WebCite Cache]
    46. Wikidoc.   URL: [accessed 2014-04-27] [WebCite Cache]
    47. Web archive of 2015.   URL:*/ [accessed 2015-02-12] [WebCite Cache]
    48. WikEM.   URL: [accessed 2015-02-12] [WebCite Cache]
    49.   URL: [accessed 2014-04-27] [WebCite Cache]
    50. ECGpedia.   URL: [accessed 2014-04-27] [WebCite Cache]
    51. Medrevise. free medical revision notes!   URL: [accessed 2014-04-27] [WebCite Cache]
    52. Le Wiki des ECN médecine   URL: [accessed 2014-04-27] [WebCite Cache]
    53. Wikia. Biomedizinisches Wiki   URL: [accessed 2014-04-27] [WebCite Cache]
    54. OncologiK.   URL: [accessed 2014-04-27] [WebCite Cache]
    55. OncoWiki.   URL: [accessed 2014-04-27] [WebCite Cache]
    56. Live Wiki.   URL: [accessed 2014-04-27] [WebCite Cache]
    57. Dermpedia.   URL: [accessed 2015-02-11] [WebCite Cache]
    58. 中华骨科网 [Orthochina].   URL: [accessed 2014-04-27] [WebCite Cache]
    59. Pediatric Imaging.   URL: [accessed 2014-04-27] [WebCite Cache]
    60. Wikia. Wiki communities for everyone!   URL: [accessed 2015-02-12] [WebCite Cache]
    61. WikiFoundry Central.   URL: [accessed 2015-02-12] [WebCite Cache]
    62. Wikispaces.   URL: [accessed 2015-02-12] [WebCite Cache]
    63. Grudin J, Poole ES. Wikis at work: success factors and challenges for sustainability of enterprise Wikis. New York, NY: ACM; 2010 Jul 07 Presented at: WikiSym '10 Proceedings of the 6th International Symposium on Wikis and Open Collaboration; July 7, 2010; New York, NY p. 5   URL: [CrossRef]
    64. Kim MacGregor S, Lou Y. Web-Based Learning. Journal of Research on Technology in Education 2004 Dec;37(2):161-175. [CrossRef]
    65. Cancer Guidelines Wiki.   URL: [accessed 2015-01-16] [WebCite Cache]
    66. Risk A, Dzenowagis J. Review of internet health information quality initiatives. J Med Internet Res 2001;3(4):E28 [FREE Full text] [CrossRef] [Medline]
    67. Fahy E, Hardikar R, Fox A, Mackay S. Quality of patient health information on the Internet: reviewing a complex and evolving landscape. Australas Med J 2014;7(1):24-28 [FREE Full text] [CrossRef] [Medline]
    68. Discern tool.   URL: [accessed 2015-01-16] [WebCite Cache]
    69. Bomba D. Evaluating the quality of health web sites: developing a validation method and rating instrument. In: HICSS'05.: IEEE; 2005 Presented at: System Sciences, 2005. Proceedings of the 38th Annual Hawaii International Conference on; Jan 3-6, 2005; Hawaii p. 139a   URL: [CrossRef]
    70. e-Health Ethics Initiative. e-Health Code of Ethics (May 24). J Med Internet Res 2000;2(2):E9 [FREE Full text] [CrossRef] [Medline]
    71. Winker MA, Flanagin A, Chi-Lum B, White J, Andrews K, Kennett RL, et al. Guidelines for Medical and Health Information Sites on the Internet. JAMA 2000 Mar 22;283(12):1600. [CrossRef]
    72. Commission of the European Communities‚ Brussels. eEurope 2002: Quality Criteria for Health Related Websites. J Med Internet Res 2002 Dec;4(3):E15 [FREE Full text] [CrossRef] [Medline]
    73. WHO. 2011. Global Observatory for eHealth series - Volume 4   URL: [accessed 2014-04-27] [WebCite Cache]
    74. Shaw D. Readability of documentation for end user searchers. Online Review 1989 Jan;13(1):3-8. [CrossRef]
    75. Kim H, Goryachev S, Rosemblat G, Browne A, Keselman A, Zeng-Treitler Q. Beyond surface characteristics: a new health text-specific readability measurement. AMIA Annu Symp Proc 2007:418-422 [FREE Full text] [Medline]
    76. Coletti MH, Bleich HL. Medical subject headings used to search the biomedical literature. J Am Med Inform Assoc 2001;8(4):317-323 [FREE Full text] [Medline]
    77. Meilender T, Lieber J, Palomares F, Herengt G, Jay N. A semantic wiki for editing and sharing decision guidelines in oncology. Stud Health Technol Inform 2012;180:411-415. [Medline]
    78. Ma ZS, Zhang HJ, Yu T, Ren G, Du GS, Wang YH. A case-based orthopaedic Wiki project in China. Clin Orthop Relat Res 2008 Oct;466(10):2428-2437 [FREE Full text] [CrossRef] [Medline]
    79. von Muhlen M, Ohno-Machado L. Reviewing social media use by clinicians. J Am Med Inform Assoc 2012;19(5):777-781 [FREE Full text] [CrossRef] [Medline]
    80. Wagner C. Wiki: A technology for conversational knowledge management and group collaboration. Communications of the Association for Information Systems 2004 Mar;13(19):265-289 [FREE Full text]
    81. Roth C, Taraborelli D, Gilbert N. Measuring wiki viability: An empirical assessment of the social dynamics of a large sample of wikis. 2008 Jun 09 Presented at: The 4th International Symposium on Wikis, Proceedings; 2008; University of Porto, Portugal   URL: [CrossRef]
    82. Wikipedia contributors. Wikipedia. 2013 May 30. Atrial fibrillation   URL: [WebCite Cache]
    83. Wikipedia contributors. Wikipedia. 2013 Apr 27. Dabigatran   URL: [WebCite Cache]
    84. Santana A, Wood D. Transparency and social responsibility issues for Wikipedia. Ethics Inf Technol 2009 May 6;11(2):133-144. [CrossRef]
    85. McInnes N, Haglund BJ. Readability of online health information: implications for health literacy. Inform Health Soc Care 2011 Dec;36(4):173-189. [CrossRef] [Medline]
    86. Michlmayr M. Community management in open source projects. The European Journal for the Informatics Professional 2009;10(3):22-26 [FREE Full text]
    87. Hoffmann R. A wiki for the life sciences where authorship matters. Nat Genet 2008 Sep;40(9):1047-1051. [CrossRef] [Medline]
    88. Chen SL. Wikipedia: A Republic of Science Democratized. Albany Law Journal of Science and Technology 2010 Apr 30;20(2):247-325 [FREE Full text]
    89. ScienceRoll. 2007. Blogterview with Dr Flea   URL: [accessed 2014-06-01] [WebCite Cache]
    90. Law MR, Mintzes B, Morgan SG. The sources and popularity of online drug information: an analysis of top search engine results and web page views. Ann Pharmacother 2011 Mar;45(3):350-356. [CrossRef] [Medline]
    91. Beyer M, Gerlach FM, Flies U, Grol R, Król Z, Munck A, et al. The development of quality circles/peer review groups as a method of quality improvement in Europe. Results of a survey in 26 European countries. Fam Pract 2003 Aug;20(4):443-451 [FREE Full text] [Medline]
    92. Thistlethwaite JE, Davies D, Ekeocha S, Kidd JM, MacDougall C, Matthews P, et al. The effectiveness of case-based learning in health professional education. A BEME systematic review: BEME Guide No. 23. Med Teach 2012;34(6):e421-e444. [CrossRef] [Medline]
    93. Cavanaugh SK, Calabretta N. Meeting the challenge of evidence-based medicine in the family medicine clerkship: closing the loop from academics to office. Med Ref Serv Q 2013;32(2):172-178. [CrossRef] [Medline]
    94. Parker KR, Chao JT. Wiki as a Teaching Tool. Learning 2007;3(3):57-72 [FREE Full text]
    95. Fordis M, King JE, Ballantyne CM, Jones PH, Schneider KH, Spann SJ, et al. Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA 2005 Sep 7;294(9):1043-1051. [CrossRef] [Medline]


    Abbreviations

    CMS: Content Management System
    HONcode: Health On the Net code of ethics
    HSWG IQ tool: Health Summit Working Group Information Quality tool
    MeSH: Medical Subject Headings
    URL: Uniform Resource Locator

    Edited by G Eysenbach; submitted 02.06.14; peer-reviewed by P Archambault, T van de Belt; comments to author 10.08.14; revised version received 08.12.14; accepted 16.01.15; published 19.02.15

    ©Alexandre Brulet, Guy Llorca, Laurent Letrilliart. Originally published in the Journal of Medical Internet Research (, 19.02.2015.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on, as well as this copyright and license information must be included.