JMIR Publications

Journal of Medical Internet Research


Published on 17.07.17 in Vol 19, No 7 (2017): July


    Original Paper

    Web-Based Therapist Training in Interpersonal Psychotherapy for Depression: Pilot Study

    1Center for Telepsychology, Madison, WI, United States

    2Department of Psychology, Ben Gurion University of the Negev, Beer Sheva, Israel

    3Department of Psychiatry, New York State Psychiatric Institute and Columbia University College of Physicians & Surgeons, New York, NY, United States

    4Weill Cornell Medicine, Department of Psychiatry, New York, NY, United States

    Corresponding Author:

    Kenneth A Kobak, PhD

    Center for Telepsychology

    22 North Harwood Circle

    Madison, WI, 53717

    United States

    Phone: 1 6084062621

    Fax: 1 6084062621

    Email:


    ABSTRACT

    Background: Training mental health professionals to deliver evidence-based therapy (EBT) is now required by most academic accreditation bodies, and evaluating the effectiveness of such training is imperative. However, shortages of time, money, and trained EBT clinician teachers make meeting these requirements daunting. New technologies may help. The authors have developed the first empirically evaluated comprehensive Internet-based therapist training program for interpersonal psychotherapy (IPT).

    Objective: The aim of this study was to examine whether (1) the training protocol would increase clinicians’ knowledge of IPT concepts and skills and (2) clinicians would deem the training feasible as measured by satisfaction and utility ratings.

    Methods: A total of 26 clinicians enrolled in the training, consisting of (1) a Web-based tutorial on IPT concepts and techniques; (2) live remote training via videoconference, with trainees practicing IPT techniques in a role-play using a case vignette; and (3) a Web-based portal for therapists' posttraining use to help facilitate implementation of IPT and maintain adherence over time.

    Results: Trainees’ knowledge of IPT concepts and skills improved significantly (P<.001). The standardized effect size for the change was large: d=2.53, 95% CI 2.23-2.92. Users found the technical features easy to use and the content useful for helping them treat depressed clients, and felt that the applied training component enhanced their professional expertise. Mean rating of applied learning was 3.9 (scale range from 1=very little to 5=a great deal). Overall satisfaction rating was 3.5 (range from 1=very dissatisfied to 4=very satisfied).

    Conclusions: Results support the efficacy and feasibility of this technology in training clinicians in EBTs and warrant further empirical evaluation.

    J Med Internet Res 2017;19(7):e257

    doi:10.2196/jmir.7966


    Introduction

    The importance of preparing mental health professionals to deliver evidence-based therapy (EBT) is now well established [1]. Accreditation bodies for academic programs in psychiatry [2], psychology [3], and social work [4] in both the United States and Canada [5] require demonstration of competence in EBTs [6]. This requirement helps to address the critical shortage of clinicians trained in EBTs, which has been identified as a major public health concern [7,8]. It also accords with the needs both of patients, who generally prefer talk therapy to medication [9], and trainees, who report wanting to spend more time delivering EBTs and to receive more EBT training than they now do [10].

    Interpersonal psychotherapy (IPT) is one of the oldest and best-studied EBTs [11,12] and one of the psychotherapies formally evaluated for efficacy by the National Institute for Mental Health (NIMH) [13]. Numerous professional and international guidelines recommend IPT for the treatment of major depression, including the American Psychiatric Association [14] and the Guidelines for Primary Care Physicians [15-17]. IPT focuses on understanding the interpersonal and social context in which the patient’s symptoms arose. This brief, time-limited approach elicits feelings, validates them in the context of social situations, and helps the patient to understand and verbalize those feelings in order to change interpersonal encounters. A recent meta-analysis of 90 randomized controlled trials involving over 11,434 patients found large effect sizes for IPT in alleviating major depression. IPT was also effective in preventing relapse and had a preventive effect on subthreshold episodes [18]. Effect sizes were comparable with those found with cognitive behavior therapy (CBT) [19]. IPT has also demonstrated efficacy for other diagnoses including eating and anxiety disorders and for patients ranging from adolescents to the elderly [20]. Despite the substantial empirical literature supporting the efficacy of IPT, it has been far less disseminated than CBT.

    With the increasing emphasis on EBTs comes an increasing focus on how to implement and evaluate psychotherapy training. Accreditation bodies in psychiatry (Accreditation Council for Graduate Medical Education) and psychology (American Psychological Association) require not only training in EBTs but an evidence-based approach to evaluating the effectiveness of such training, that is, whether clinicians demonstrate competence in administering the treatment [3,6]. Thus, psychotherapy training involves the dual challenges of expanding the current curriculum to include EBTs and ensuring such training is effective. Shortages of time, money, and trained EBT clinician teachers make these challenges daunting. The development of new methods in psychotherapy training has been suggested as one solution to these challenges [21].

    New technologies may help [22]. Internet-based training (e-learning) is cost-effective, scalable, and available on demand. It overcomes limitations of trainer availability, especially in remote locations. Clinicians can work at their own pace, repeating and reviewing as desired. Training quality is enhanced by standardized instruction (ensuring inclusion of key empirically proven components) and by the multi-modal learning techniques the technology affords, which have been shown to increase knowledge uptake and retention [23]. Research has found Internet-based training more effective than paper-based manuals alone, resulting in greater long-term knowledge retention [24,25], and in one study, Internet-based training was superior to face-to-face instruction [26]. Once the learner has mastered the didactic content, new technologies can teach and assess clinical proficiency in administering the newly learned skills. Both computer simulations using virtual patients [27,28] and remote live training via videoconference have demonstrated effectiveness in teaching applied clinical skills [29,30]. Posttraining, these same technologies can be used to help ensure proper implementation in clinical practice [22,31]. Research has found posttraining consultation to be an essential ingredient for successful implementation of EBT skills [32] and a predictor of clinician adherence and competence following EBT training [32,33]. By improving access to training, new technologies can help facilitate dissemination of a variety of EBTs, including IPT, and provide patients with a wider choice of treatment options.

    In this pilot study, we developed the first comprehensive Internet training program for IPT to be empirically evaluated. This three-part, interactive therapist training protocol focuses on IPT for major depression and consists of (1) a Web-based tutorial on IPT concepts and techniques; (2) live remote training via videoconference, with trainees practicing IPT techniques in a role-play using a case vignette; and (3) a Web-based portal for therapists' posttraining use to help facilitate integration of IPT into their clinical practice and maintain adherence and quality over time. The goal of the study was to examine the following hypotheses: (1) the training protocol would increase clinicians’ knowledge of IPT concepts and skills from baseline and (2) clinicians would deem the training feasible as measured by satisfaction and utility ratings.


    Methods

    Procedure

    Before training, trainees took a 38-item pretest on their knowledge of IPT concepts and principles. Following the pretest, they received a username and password to access the Web-based tutorial and completed it at their own pace. Trainees could email the instructors with questions about the material. After completing the Web-based tutorial, trainees took a posttest of IPT knowledge and a user satisfaction questionnaire. Trainees then received a 45-60 min live applied training session conducted via videoconference with an experienced IPT trainer (JDL, JCM, or KLB). During this session, the trainer portrayed a standardized depressed patient, while the trainee role-played the therapist (see below). After completing the video session, trainees completed a satisfaction questionnaire and received a link to the IPT posttraining website. The posttraining website was designed to facilitate implementation and adherence following training and to guide the clinician in structuring sessions with their first IPT patients.

    Description of Training Components

    Web-Based Tutorial

    The training components paralleled components of IPT. In IPT, the patient and IPT therapist together define a central interpersonal problem focusing on one of four categories: grief, role transition, role dispute, or interpersonal deficits [20]. IPT has three phases: the initial phase (evaluation, case formulation, and treatment plan), the middle phase (addressing and resolving the focal problem), and the termination phase (consolidating gains and transitioning from treatment). The tutorial contains seven modules covering these and other topics on the theory and practice of IPT for depression. A description of the content and learning goals for each of the modules is presented in Table 1.

    Table 1. Learning objectives by module: interpersonal psychotherapy (IPT) tutorial.

    Because multi-modal learning and high levels of interactivity enhance learning [23,34], content was presented in varied formats including animations, graphical illustrations, and clinical vignettes with audio and interactive exercises. New material was often presented using “challenge questions,” as learning is enhanced when a user makes mistakes applying recently acquired information and gets specific feedback explaining the rationale for the correct answer [35]. Presentation of material was guided by principles of instructional design, such as “chunking” material based on limits to working memory to enhance retention [36]. It was estimated the tutorial would take 3 to 4 hours to complete. Interactive demos of the tutorial content can be found in Multimedia Appendices 1 and 2.

    Applied Training

    After completing the Web-based tutorial and Web-based posttest, trainees completed a supervised clinical training session using videoconferencing. During the applied session, the trainer portrayed a standardized depressed patient, while the trainee role-played the IPT therapist. The goal was to offer trainees practice in applying the skills learned during the tutorial. The trainer provided feedback and suggestions in real time as appropriate. Trainees also had the opportunity to ask questions about the IPT approach. The role play was designed to portray the patient’s second or third session, providing the trainee an opportunity to practice developing an interpersonal formulation with the patient and reaching agreement with the patient on the focal IPT problem area. We did not role play initial sessions because these typically focus on history gathering, which we finessed by providing the trainee with summary information on the patient’s history up front. Starting a session with “How have things been since we last met?” is a pattern that begins with session 2. A crucial point in IPT treatment is the therapist’s formulating a focal problem area in session 2 or 3 and getting the patient’s agreement on it; this then organizes the remainder of the treatment [37]. Hence the choice of sessions 2-3 from a time-limited, 12- to 16-session acute treatment framework. Although not sufficient for advanced clinical training in IPT, the applied training allowed us to evaluate the feasibility of and user satisfaction with this approach. Applied training in which trainees actually implement the skills learned has been found to be a critical, and often lacking, component of training [38-41].

    Posttraining Case Tracker Website

    A limitation of many professional training programs is lack of carryover to practice [32]. To address this, after completing the applied training session, trainees received access to the interactive IPT website portal (IPT Case Tracker). The IPT Case Tracker was designed to facilitate the transition from classroom and initial applied training to clinical practice and to help maintain adherence to IPT in initial and ongoing clinical use. The portal contained (1) interactive tools (eg, reminders) to help clinicians structure their IPT sessions, assist with case conceptualization, and provide an overall framework for conducting IPT with specific clients; (2) checklists for presession preparation and postsession tracking of client issues, client progress, and utilization of IPT skills; (3) a printable depression measure (Hamilton Depression Rating Scale [42]) to monitor treatment progress; and (4) case consultation via email, to facilitate uptake and troubleshoot implementation questions and issues. We assessed user satisfaction with the Case Tracker 1 to 3 months after completion of the applied training session. See Multimedia Appendices 3 and 4 for samples of pre- and postsession checklists.

    Measures

    Efficacy

    To assess trainees’ gains in knowledge of the concepts, principles, and techniques of IPT, the authors developed a 38-item pre- and posttest covering the core concepts in the tutorial. The test contained a combination of multiple-choice and true-false questions in proportions mandated by continuing education guidelines from the American Psychological Association and the National Association of Social Work. Testing served dual functions of assessing and reinforcing learning. Trainees received rationales for the correct answers after completing the posttest to reinforce learning. Posttests were given after completion of each module. The test had good internal consistency reliability (coefficient alpha=.79).

    Feasibility

    We evaluated user satisfaction from two perspectives: technical implementation and clinical content. User satisfaction with technical aspects of the training tutorial was assessed using the System Usability Scale (SUS) [43,44], a reliable, well-validated 10-item scale designed to evaluate user satisfaction with the technical aspects of Web-based applications and other technologies. It provides a score on a 0-100 scale reflecting the effectiveness, efficiency, and satisfaction users experience using the system. A mean SUS score of 50.9 represents a rating of “okay,” 71.4 represents a rating of “good,” 85.5 represents a rating of “excellent,” and 90.9 represents a rating of “best imaginable.”
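    As an illustration, the standard SUS scoring scheme (Brooke's original formula, not code from this study) maps each odd-numbered item to (response − 1) and each even-numbered item to (5 − response), then multiplies the sum by 2.5 to yield the 0-100 score. A minimal Python sketch:

```python
def sus_score(responses):
    """Standard System Usability Scale scoring (Brooke, 1996).

    `responses` is a list of ten Likert ratings (1-5). Odd-numbered
    items are positively worded and contribute (response - 1); the
    negatively worded even-numbered items contribute (5 - response).
    The sum is scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses in the range 1-5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based
                for i, r in enumerate(responses))
    return total * 2.5
```

    For example, a uniformly neutral response set (all 3s) scores 50, close to the “okay” anchor of 50.9 noted above.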

    Descriptive statistics assessed trainee satisfaction with the clinical content of the training components. Trainees rated each training component along six dimensions using a 4-point scale (strongly agree, agree, disagree, and strongly disagree). Trainees also rated global satisfaction and had the opportunity for open-ended feedback. Scale items were developed in prior studies on user satisfaction with Web-based training [29,45] and have shown good internal consistency reliability (Cronbach alpha=.92). Ratings of the extent to which trainees felt the learning goals were met and global ratings of how much was learned were also obtained for the tutorial as another measure of feasibility.

    Statistical Analyses

    A two-tailed paired t test was computed to examine the mean change in knowledge assessment scores from pretest to posttest on the Web-based tutorial. The standardized mean effect size for the t test was calculated using methods described by Cohen [46]. Effect sizes were considered large at 0.80, medium at 0.50, and small at 0.20 [46]. Descriptive statistics were used for the feasibility measure results.
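    The analysis described above can be sketched as follows; this is an illustrative reconstruction, not the authors' code. Cohen's d is computed here from the pooled pretest/posttest SD, which reproduces the reported effect size from the summary statistics to within rounding:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Two-tailed paired t statistic and degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

def cohens_d_pooled(m_pre, sd_pre, m_post, sd_post):
    """Cohen's d from summary statistics, pooling the two SDs."""
    sd_pooled = math.sqrt((sd_pre ** 2 + sd_post ** 2) / 2)
    return (m_post - m_pre) / sd_pooled

# Sanity check against the reported means and SDs
# (pretest 16.5, SD 4.6; posttest 27.5, SD 4.0): yields ~2.55,
# matching the reported d=2.53 to within rounding.
d = cohens_d_pooled(16.5, 4.6, 27.5, 4.0)
```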

    Compliance With Ethical Standards

    This study has been approved by the Allendale Institutional Review Board. All procedures performed in this study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments. Informed consent was obtained from all individual participants included in the study. This article does not contain any studies with animals performed by any of the authors.


    Results

    Participants

    Clinicians were recruited through advertisements in professional journals from the National Association of Social Work and the American Psychological Association and through an announcement on the International Society for Interpersonal Psychotherapy (ISIPT) listserv. A total of 35 clinicians inquired about the study and were offered free participation. Of these, 26 (74%, 26/35) enrolled in the study and started the tutorial, and 22 (63%, 22/35) completed it. Furthermore, 18 (51%, 18/35) trainees participated in the live applied training session after completing the tutorial. The Allendale Institutional Review Board approved the study protocol, and all participants signed informed consent statements.

    Of the 26 community clinicians starting the Web-based tutorial, 23 (88%) were female, 22 (85%) white, 1 (4%) African American, 4 (15%) Hispanic, and 3 (12%) other or mixed racial categories. Participants came from 15 US states, with one each from Mexico, Brazil, Canada, and the United Kingdom. Mean age was 41.6 years (standard deviation [SD]=11.5, range: 26-63 years), and mean years of clinical experience was 10.5 (SD=7.6, range: 1-26 years). Eleven (42%, 11/26) were social workers, 11 (42%, 11/26) psychologists, 2 (8%, 2/26) marriage and family therapists, 1 (4%, 1/26) a psychiatrist, and 1 (4%, 1/26) a psychiatric nurse. Additionally, 24 (92%, 24/26) were actively conducting psychotherapy with clients. Only 3 (12%, 3/26) reported having received any prior formal training in IPT: one through a continuing education workshop, one as part of undergraduate coursework, and one in graduate coursework. Three participants (12%, 3/26) reported having used some IPT techniques in their practice before participating in the study.

    Web-Based Tutorial

    Efficacy: Improvement in Knowledge of Interpersonal Psychotherapy (IPT) Concepts

    The mean number of correct answers on the 38-item IPT concepts and skills quiz improved significantly from 16.5 (SD 4.6) on pretest to 27.5 (SD 4.0) on posttest, t21=15.7, P<.001. The standardized effect size for the change was large: d=2.53, 95% CI 2.23-2.92.

    User Satisfaction: Technical Features

    The SUS evaluated user satisfaction with the technical features of the Web-based tutorial. Mean SUS score was 90.6 (SD 11.4). This corresponds to a mean rating between “excellent” and “best imaginable.” Figure 1 shows ratings on individual SUS items. Users found the technical features of the tutorial easy to use and understand. The mean global rating on the user-friendliness item was 5.7 (between good and excellent; range: 1=worst imaginable to 7=best imaginable).

    Figure 1. System Usability Scale ratings of user satisfaction with technical features of Web-based tutorial.
    User Satisfaction: Clinical Content

    Figure 2 illustrates user satisfaction with the clinical content of the tutorial. Clinicians found the concepts clearly presented in an interesting fashion and found the content useful for helping them treat depressed clients. The mean global rating of user satisfaction with the clinical content was 3.2 (range: 1=very dissatisfied to 4=very satisfied).

    User satisfaction with the tutorial was also evaluated by whether clinicians felt the learning objectives of each module were met. Nineteen learning objectives were identified a priori for the seven modules (Table 1). Trainees rated whether learning objectives were met after completing each module. They rated the learning objectives as having been met 95% (18/19) of the time, on average.

    Global ratings of how much the trainee learned (scale range from 1=very little to 5=a great deal) were also obtained, as required by continuing education accreditation agencies. The mean rating of trainees’ learning from the Web-based tutorial was 3.91.

    Figure 2. Ratings of user satisfaction: clinical content of Web-based tutorial.
    Figure 3. Ratings of user satisfaction: applied training via videoconference.

    Applied Training

    User satisfaction with the applied training session via videoconference appears in Figure 3. Overall, trainees felt the program enhanced their professional expertise and felt able to apply the skills learned to actual treatment. Mean global rating of user satisfaction with the applied training was 3.5 (range: 1=very dissatisfied to 4=very satisfied). Mean rating of learning from the applied training was 3.94 (scale range from 1=very little to 5=a great deal).

    Web Portal

    Figure 4 shows user satisfaction with the Web portal. Trainees found the website helpful in preparing for IPT sessions, identifying the client’s IPT focus area, developing an interpersonal formulation, and monitoring changes in client symptoms. The overall satisfaction rating was 3.3 (range: 1=very dissatisfied to 4=very satisfied).

    Figure 4. Ratings of user satisfaction: posttraining Web portal.

    Duration of Training

    The mean time trainees took to complete the Web-based tutorial was 3.3 hours (SD=0.8, range 2.3-5.0 hours). The average module took 29.1 min (SD 18.3) to complete. The tutorial was completed over a mean of 27.4 days (SD=22.9, range 1-66 days). The mean duration of participation in the entire training protocol (from start of the Web-based tutorial to evaluation of the Web portal) was 91 days (SD 18.5).


    Discussion

    Principal Findings

    This pilot study provides evidence supporting the efficacy and feasibility of this technologically advanced, three-part therapist training intervention in IPT for major depression. Results supported both our hypotheses: the tutorial increased trainee knowledge of IPT, and trainees reported high levels of satisfaction with the three training components. User satisfaction is critically important: if trainees do not like a training program, find it too difficult to use, or deem it not useful, they will not complete it. In our study, trainees reported high satisfaction with both the technical aspects and the clinical content of the training components, with completion rates of 85% for the Web-based tutorial and 69% for the live applied training. This ranks somewhat higher than average compared with other Web-based trainings [47]. Although we could not assess the total number of potential participants who received notification of the study training opportunity, the ease of recruitment further suggests demand for and interest in this training vehicle (enrollment was limited to 35 participants due to budgetary constraints).

    If successfully disseminated, this intervention may assist academic training programs in solving the dual challenges they face in expanding training curricula to include EBTs and ensuring the training is effective. IPT in particular is an EBT in which most practitioners do not receive training, despite its strong empirical standing. As no treatment is universally effective, expanding training options to include multiple EBTs helps produce more well-rounded clinicians and provides more treatment options for patients, some of whom may prefer or respond better to IPT than to other EBTs. IPT has been far less disseminated than CBT (which has comparable supportive evidence) and psychodynamic therapy (which has far less empirical support). IPT uses the medical model of illness, which makes it very compatible with clients treated with a combination of psychotherapy and medication.

    From a practical standpoint, the model this study used augments traditional approaches to training rather than replacing them [48-50]. Web-based tutorials can lay conceptual groundwork, like a textbook, but with the added advantages the technology affords in enhancing learning. Through video recordings and illustrated vignettes, trainees can observe demonstrations of skills being correctly implemented in a variety of situations. Ongoing self-tests incorporated in the tutorial help ensure and document that participants understand concepts, which is particularly important in evidence-based practice. Videoconferencing can overcome logistical barriers to accessing domain experts and be used for supervision, including role playing. The use of role plays with supervisors when learning new skills has been found superior to simple discussion [51]. Evaluating clinical skills by directly observing trainees applying them is critical to success but underutilized. In our sample, only 1 trainee reported prior live skills observation as part of their IPT training, a number consistent with the 10% rate reported in other studies [21,29]. Live observation has been found to enhance learning compared with observing videotapes of trainees [52].

    The technology may also enhance posttraining supervision in several ways. Videoconferencing facilitates access to training supervisors. Access to a training supervisor following training has been found to be an essential ingredient of posttraining success, superior to other posttraining options such as peer consultation [53]. The technology facilitates ongoing case discussion following training, which is critical to ensuring therapist efficacy with patients seen posttraining. The Web portal helps transfer learning from classroom to practice by providing a tool to help structure sessions, monitor application of and adherence to techniques, and monitor treatment outcome. It may also help prevent “therapist drift,” as trainees could return to the website to refresh their knowledge and for ongoing retesting and monitoring [54]. Although not included in this study, evaluation of ongoing posttraining therapist supervision using videoconferencing and Web portals would be informative.

    Results from this study are consistent with other studies on the use of these technologies for training clinicians in other EBTs such as CBT [29,55]. This is the first application of this technology to IPT training to be empirically evaluated that we are aware of (there has been one Web-based training pilot for Interpersonal and Social Rhythm Therapy [IPSRT], an adaptation of IPT for patients with bipolar disorder [56]). Although IPT is less structured than CBT, there was no reason to think the methodology would be less successful. In this study, we did not formally assess clinical competence, as the study goal was to assess user satisfaction with the methodology and improvement in conceptual knowledge. Thus, the degree of clinical competence the trainees attained is unknown. The number of live applied sessions necessary to achieve varying levels of competence in IPT using this methodology needs exploration, as does the differential impact of didactic and applied training on clinical skills. In a prior CBT study, the Web-based tutorial sufficed to improve clinical skills from poor to minimally acceptable, but adding four 1-hour videoconference role-play sessions further improved rated clinical skills to between adequate and good [29]. Competence was assessed using role plays conducted via videoconference, which were recorded and evaluated for clinical skills by a blind rater. Such a methodology may be one way to assess competence in Web-based training. Further studies are needed to explore the relationship between the number of applied sessions and levels of competence, and the relative impact of didactic and applied online training in IPT.

    Limitations

    Limitations of this pilot study include lack of randomization and a control group, such as comparison to current standard methods of clinical instruction. The small sample size limits generalizability of results. In addition, since we did not collect patient data, it is impossible to know how the training program affected clinical practice. Although trainees received no feedback following pretesting on conceptual knowledge, there is the potential for practice effects. In the absence of a control group, it is difficult to determine the extent to which improvement in scores on the tutorial was due to training as opposed to induction bias. However, some studies have found pretests increase learning by orienting learners to subsequent information [35].

    Conclusions

    Future studies could include a larger sample size, as well as a cohort of recent graduates from academic internships who are interested in additional psychotherapy training following graduation. Such training could count toward continuing education credits [48]. In addition, further study of the stability of training is warranted. In summary, results support the use of these technologies in training clinicians in EBTs and warrant further study to empirically evaluate these training methods compared to and in conjunction with current methodologies.

    Acknowledgments

    The authors would like to thank Richard DeVouno for developing the eLearning component of the Web-based tutorial. This study was funded with federal funds from the National Institute of Mental Health, National Institutes of Health, Department of Health and Human Services, under Small Business Innovation Research (SBIR) grant number R43MH106169.

    Conflicts of Interest

    Drs Kobak, Lipsitz, and Markowitz have intellectual property rights and a proprietary interest in the Web-based training program described in this project. Dr. Markowitz receives salary support from the National Institute of Mental Health, the Earl Mack Foundation, and the New York State Psychiatric Institute; minor book royalties from American Psychiatric Publishing, Basic Books, and Oxford University Press; and an editorial stipend from Elsevier Press.

    Multimedia Appendix 1

    Example of Web-based tutorial multimedia content.

    PPTX File, 10MB

    Multimedia Appendix 2

    Example of clinical vignettes and challenge questions.

    PPTX File, 2MB

    Multimedia Appendix 3

    Example of pre-session checklist from IPT Web portal.

    PNG File, 232KB

    Multimedia Appendix 4

    Example of post-session checklist from IPT Web portal.

    PNG File, 106KB

    References

    1. Institute of Medicine. In: England MJ, Butler AS, Gonzalez ML, editors. Psychosocial interventions for mental and substance use disorders: A framework for establishing evidence-based standards. Washington, DC: The National Academies Press; 2015.
    2. Accreditation Council for Graduate Medical Education. ACGME. 2016. ACGME Program Requirements for Graduate Medical Education in Psychiatry   URL: https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/400_psychiatry_2016.pdf [accessed 2017-06-26] [WebCite Cache]
    3. American Psychological Association. APA. Washington, D.C: American Psychological Association; 2006. Guidelines and Principles for Accreditation of Programs in Professional Psychology   URL: http://www.apa.org/ed/accreditation/about/policies/guiding-principles.pdf [accessed 2017-06-26] [WebCite Cache]
    4. Council on Social Work Education. CSWE. Washington, DC: National Association of Social Workers; 2015. Educational Policy and Accreditation Standards   URL: https:/​/www.​cswe.org/​getattachment/​Accreditation/​Accreditation-Process/​2015-EPAS/​2015EPAS_Web_FINAL.​pdf.​aspx [accessed 2017-06-26] [WebCite Cache]
    5. Royal College of Physicians and Surgeons of Canada. Royalcollege. 2015. Program requirements for graduate medical education in psychiatry   URL: http://www.royalcollege.ca/cs/groups/public/documents/document/mdaw/mdg4/~edisp/088025.pdf [accessed 2017-06-26] [WebCite Cache]
    6. Weerasekera P, Manring J, Lynn DJ. Psychotherapy training for residents: reconciling requirements with evidence-based, competency-focused practice. Acad Psychiatry 2010;34(1):5-12. [CrossRef] [Medline]
    7. APA Presidential Task Force on Evidence-Based Practice. Evidence-based practice in psychology. Am Psychol 2006;61(4):271-285. [CrossRef] [Medline]
    8. National Institute of Mental Health. Socialworkpolicy. Washington, DC: Institute for the Advancement of Social Work Research; 2007. Partnerships to Integrate Evidence-Based Mental Health Practices into Social Work Education and Research. Institute for the Advancement of Social Work Research   URL: http://www.socialworkpolicy.org/documents/EvidenceBasedPracticeFinal.pdf [accessed 2017-06-26] [WebCite Cache]
    9. van Schaik DJ, Klijn AF, van Hout HP, van Marwijk HW, Beekman AT, de Haan M, et al. Patients' preferences in the treatment of depressive disorder in primary care. Gen Hosp Psychiatry 2004;26(3):184-189. [CrossRef] [Medline]
    10. Kovach JG, Dubin WR, Combs CJ. Psychotherapy training: residents' perceptions and experiences. Acad Psychiatry 2015 Oct;39(5):567-574. [CrossRef] [Medline]
    11. Klerman GL, Dimascio A, Weissman M, Prusoff B, Paykel ES. Treatment of depression by drugs and psychotherapy. Am J Psychiatry 1974 Feb;131(2):186-191. [CrossRef] [Medline]
    12. Weissman MM, Prusoff BA, Dimascio A, Neu C, Goklaney M, Klerman GL. The efficacy of drugs and psychotherapy in the treatment of acute depressive episodes. Am J Psychiatry 1979 Apr;136(4B):555-558. [Medline]
    13. Elkin I, Shea MT, Watkins JT, Imber SD, Sotsky SM, Collins JF, et al. National Institute of Mental Health Treatment of Depression Collaborative Research Program. General effectiveness of treatments. Arch Gen Psychiatry 1989 Nov;46(11):971-82; discussion 983. [Medline]
    14. American Psychiatric Association. Psychiatryonline. Washington, D.C: American Psychiatric Association; 2010. Practice guideline for the treatment of patients with major depressive disorder   URL: https://psychiatryonline.org/pb/assets/raw/sitewide/practice_guidelines/guidelines/mdd.pdf [accessed 2017-06-26] [WebCite Cache]
    15. Trangle M, Gursky J, Haight R, Hadwig J, Hinnekamp T, Kessler D, et al. Adult Depression in Primary Care. Bloomington, MN: Institute for Clinical Systems Improvement; 2016.
    16. National Institute for Health and Clinical Excellence. NICE. London: National Institute of Health and Clinical Excellence; 2009. Depression: The treatment and management of depression in adults   URL: https://www.nice.org.uk/guidance/cg90 [accessed 2017-06-26] [WebCite Cache]
    17. Segal ZV, Whitney DK, Lam RW, CANMAT Depression Work Group. Clinical guidelines for the treatment of depressive disorders. III. Psychotherapy. Can J Psychiatry 2001 Jun;46(Suppl 1):29S-37S. [Medline]
    18. Cuijpers P, Donker T, Weissman MM, Ravitz P, Cristea IA. Interpersonal psychotherapy for mental health problems: a comprehensive meta-analysis. Am J Psychiatry 2016 Jul 01;173(7):680-687. [CrossRef] [Medline]
    19. Jakobsen JC, Hansen JL, Simonsen S, Simonsen E, Gluud C. Effects of cognitive therapy versus interpersonal psychotherapy in patients with major depressive disorder: a systematic review of randomized clinical trials with meta-analyses and trial sequential analyses. Psychol Med 2012 Jul;42(7):1343-1357. [CrossRef] [Medline]
    20. Markowitz JC, Weissman MM. Interpersonal psychotherapy: principles and applications. World Psychiatry 2004 Oct;3(3):136-139 [FREE Full text] [Medline]
    21. Weerasekera P. The state of psychotherapy supervision: recommendations for future training. Int Rev Psychiatry 2013 Jun;25(3):255-264. [CrossRef] [Medline]
    22. Khanna M, Kendall P. Bringing technology to training: web-based therapist training to promote the development of competent cognitive-behavioral therapists. Cogn Behav Pract 2015 Aug;22(3):291-301. [CrossRef]
    23. Gardner H. Multiple intelligences: the theory in practice. New York: Basic Books; 1993.
    24. Sholomskas DE, Carroll KM. One small step for manuals: computer-assisted training in twelve-step facilitation. J Stud Alcohol 2006 Nov;67(6):939-945 [FREE Full text] [Medline]
    25. Dimeff LA, Woodcock EA, Harned MS, Beadnell B. Can dialectical behavior therapy be learned in highly structured learning environments? Results from a randomized controlled dissemination trial. Behav Ther 2011 Jun;42(2):263-275. [CrossRef] [Medline]
    26. Dimeff LA, Harned MS, Woodcock EA, Skutch JM, Koerner K, Linehan MM. Investigating bang for your training buck: a randomized controlled trial comparing three methods of training clinicians in two core strategies of dialectical behavior therapy. Behav Ther 2015 May;46(3):283-295. [CrossRef] [Medline]
    27. Gaggioli A, Pallavicini F, Morganti L, Serino S, Scaratti C, Briguglio M, et al. Experiential virtual scenarios with real-time monitoring (interreality) for the management of psychological stress: a block randomized controlled trial. J Med Internet Res 2014;16(7):e167 [FREE Full text] [CrossRef] [Medline]
    28. Pantziaras I, Fors U, Ekblad S. Training with virtual patients in transcultural psychiatry: do the learners actually learn? J Med Internet Res 2015 Feb 16;17(2):e46 [FREE Full text] [CrossRef] [Medline]
    29. Kobak K, Wolitzky-Taylor K, Craske MG, Rose RD. Therapist training on cognitive behavior therapy for anxiety disorders using internet-based technologies. Cogn Ther Res 2016 Nov 15;41(2):252-265. [CrossRef]
    30. Kobak KA, Engelhardt N, Lipsitz JD. Enriched rater training using Internet based technologies: a comparison to traditional rater training in a multi-site depression trial. J Psychiatr Res 2006 Apr;40(3):192-199. [CrossRef] [Medline]
    31. Kobak KA, Lipsitz J, Williams JB, Engelhardt N, Jeglic E, Bellew KM. Are the effects of rater training sustainable? Results from a multicenter clinical trial. J Clin Psychopharmacol 2007 Oct;27(5):534-535. [CrossRef] [Medline]
    32. Edmunds JM, Beidas RS, Kendall PC. Dissemination and implementation of evidence-based practices: training and consultation as implementation strategies. Clin Psychol (New York) 2013 Jun 01;20(2):152-165 [FREE Full text] [CrossRef] [Medline]
    33. Beidas RS, Edmunds JM, Marcus SC, Kendall PC. Training and consultation to promote implementation of an empirically supported treatment: a randomized trial. Psychiatr Serv 2012 Jul;63(7):660-665 [FREE Full text] [CrossRef] [Medline]
    34. Vincent A, Ross D. Learning style awareness: a basis for developing teaching and learning strategies. Journal of Research on Technology in Education 2001;33(5):-.
    35. Roediger H, Karpicke J. Test-enhanced learning. Taking memory tests improves long-term retention. Psychol Sci 2006;17(3):249-254. [CrossRef]
    36. Mayer R, Moreno R. Nine ways to reduce cognitive load in multimedia learning. Educ Psychol 2003 Mar;38(1):43-52. [CrossRef]
    37. Weissman M, Markowitz J, Klerman G. Clinician's quick guide to interpersonal psychotherapy. New York: Oxford University Press; 2007.
    38. Kobak KA, Craske MG, Rose RD, Wolitsky-Taylor K. Web-based therapist training on cognitive behavior therapy for anxiety disorders: a pilot study. Psychotherapy (Chic) 2013 Jun;50(2):235-247 [FREE Full text] [CrossRef] [Medline]
    39. Wainer AL, Ingersoll BR. Disseminating ASD interventions: a pilot study of a distance learning program for parents and professionals. J Autism Dev Disord 2013 Jan;43(1):11-24. [CrossRef] [Medline]
    40. Kobak KA, Lipsitz JD, Williams JB, Engelhardt N, Bellew KM. A new approach to rater training and certification in a multicenter clinical trial. J Clin Psychopharmacol 2005 Oct;25(5):407-412. [Medline]
    41. Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, Carroll KM. We don't train in vain: a dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. J Consult Clin Psychol 2005 Feb;73(1):106-115 [FREE Full text] [CrossRef] [Medline]
    42. Williams JB, Kobak KA, Bech P, Engelhardt N, Evans K, Lipsitz J, et al. The GRID-HAMD: standardization of the Hamilton Depression Rating Scale. Int Clin Psychopharmacol 2008 May;23(3):120-129. [CrossRef] [Medline]
    43. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: Adding an adjective rating scale. J Usability Stud 2009;4(3):114-123 [FREE Full text]
    44. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. Int J Hum Comput Interact 2008 Jul 30;24(6):574-594. [CrossRef]
    45. Kobak KA, Opler MG, Engelhardt N. PANSS rater training using Internet and videoconference: results from a pilot study. Schizophr Res 2007 May;92(1-3):63-67. [CrossRef] [Medline]
    46. Cohen J. A power primer. Psychol Bull 1992;112(1):155-159. [CrossRef]
    47. Park J, Choi H. Factors influencing adult learners' decision to drop out or persist in online learning. Educational Technology & Society 2009;12(4):207-217 [FREE Full text]
    48. Weerasekera P. Psychotherapy Training e-Resources (PTeR): on-line psychotherapy education. Acad Psychiatry 2013 Jan 01;37(1):51-54. [CrossRef] [Medline]
    49. Hickey C, McAleer S. Competence in psychotherapy: the role of e-learning. Acad Psychiatry 2017 Feb;41(1):20-23. [CrossRef] [Medline]
    50. Tan A, Philipp D, Malat J, Feder V, Kulkarni C, Lawson A, et al. Lost in transition: examining transitions in psychotherapy training. Acad Psychiatry 2015 Oct;39(5):580-584. [CrossRef] [Medline]
    51. Bearman SK, Weisz JR, Chorpita BF, Hoagwood K, Ward A, Ugueto AM, Research Network on Youth Mental Health. More practice, less preach? the role of supervision processes and therapist characteristics in EBP implementation. Adm Policy Ment Health 2013 Nov;40(6):518-529 [FREE Full text] [CrossRef] [Medline]
    52. Smith JL, Carpenter KM, Amrhein PC, Brooks AC, Levin D, Schreiber EA, et al. Training substance abuse clinicians in motivational interviewing using live supervision via teleconferencing. J Consult Clin Psychol 2012 Jun;80(3):450-464 [FREE Full text] [CrossRef] [Medline]
    53. Chu BC, Carpenter AL, Wyszynski CM, Conklin PH, Comer JS. Scalable options for extended skill building following didactic training in cognitive-behavioral therapy for anxious youth: a pilot randomized trial. J Clin Child Adolesc Psychol 2017;46(3):401-410. [CrossRef] [Medline]
    54. Fairburn CG, Wilson GT. The dissemination and implementation of psychological treatments: problems and solutions. Int J Eat Disord 2013 Jul;46(5):516-521 [FREE Full text] [CrossRef] [Medline]
    55. Kobak KA, Mundt JC, Kennard B. Integrating technology into cognitive behavior therapy for adolescent depression: a pilot study. Ann Gen Psychiatry 2015;14:37 [FREE Full text] [CrossRef] [Medline]
    56. Stein BD, Celedonia KL, Swartz HA, DeRosier ME, Sorbero MJ, Brindley RA, et al. Implementing a web-based intervention to train community clinicians in an evidence-based psychotherapy: a pilot study. Psychiatr Serv 2015 Sep;66(9):988-991 [FREE Full text] [CrossRef] [Medline]


    Abbreviations

    CBT: cognitive behavioral therapy
    EBT: evidence-based therapy
    IPT: interpersonal psychotherapy
    SUS: System Usability Scale


    Edited by G Eysenbach; submitted 02.05.17; peer-reviewed by J Greist, H Swartz; comments to author 31.05.17; revised version received 04.06.17; accepted 05.06.17; published 17.07.17

    ©Kenneth A Kobak, Joshua D Lipsitz, John C Markowitz, Kathryn L Bleiberg. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 17.07.2017.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.