MEFANET Journal 2017; 5(1): 19-27
ORIGINAL ARTICLE
Evaluation of student and tutor response to the simultaneous implementation of a new PBL curriculum in Georgia, Kazakhstan and Ukraine, based on the medical curriculum of St George’s, University of London
Luke Andrew Woodham, Ella Poulton, Trupti Jivram, Sheetal Kavia, Aurora Sese Hernandez, Christian Simon Sahakian, Terry Poulton*
St George’s, University of London, London, United Kingdom
* Corresponding author: tpoulton@sgul.ac.uk
Abstract
Article history:
Received 14 May 2016
Revised 10 April 2017
Accepted 11 April 2017
Available online 24 April 2017
Peer review:
Martin Komenda, Jakub Gregor
Background: In 2012, the European Commission funded a three-year TEMPUS project, ePBLnet, which set out to replace the traditional didactic medical curricula of 6 medical schools in Georgia, Kazakhstan and Ukraine with a Problem-Based Learning (PBL) curriculum based on that of St George’s, University of London (SGUL). SGUL has experience in adapting its curriculum to other language and cultural environments, but this adaptation represented a much larger step in complexity and degree of cultural change.
Objective: To explore the outcomes of the implementation of PBL in 6 Medical Schools in Georgia, Kazakhstan and Ukraine, from the point of view of the PBL tutors and the PBL students in accordance with the initial evaluation plan of the project.
Methods: Two surveys were created and distributed amongst the PBL tutors and PBL students to analyse the impact of the PBL methodology. A total of 33 tutors and 144 students from the 6 institutions completed the surveys. The surveys were created and distributed online, and were available in Russian and English to overcome language and distance barriers.
Results: The results show that the ePBLnet project has created a solid foundation amongst both tutors and students for the successful implementation of PBL in all 6 institutions. Both the students and the tutors considered that the implementation of the ePBLnet project had been of high quality. Furthermore, the data support the assertion that PBL increases student engagement.
Conclusions: The outcomes of this implementation have been highly successful, and are being used to justify further use of PBL in countries with cultural similarities. Further evidence needs to be collected to explore whether learning is enhanced in comparison with traditional methods.
Keywords
Technology transfer; medical education; problem-based learning
Introduction
Lecture-based learning approaches were dominant in most classrooms in traditional tertiary education for much of the twentieth century. This was particularly true in the Soviet Union, where for several generations a centrally controlled, content-driven approach governed admission, curricular and pedagogic policies. Training was based on scientific knowledge and specialisation [1], which was regarded as the most efficient and effective approach for preparing students for their future working environments in medicine and healthcare. From this position students would move directly to an apprenticeship phase, in clinical attachments.
Post-Soviet cultures retained a legacy of this didactic teaching and common structure long after the dissolution of the Soviet Union. This was true in countries as regionally separated as Georgia in the Caucasus, Kazakhstan in Central Asia and Ukraine in Eastern Europe, where the curricula still retained a common traditional structure. However, within all these regions it was recognised that conventional methods of teaching often failed to motivate students or support them as active learners [2]. Teaching methods based solely on the acquisition of knowledge no longer appeared to fully prepare students with the skills and attributes they would require in their future working environments. Gradually, training has moved away from a total concentration on scientific knowledge towards a greater concentration on clinical management, skills, and practice or team-working competencies. Many post-Soviet states followed these developments. In Central Asia, the Caucasus and Ukraine, efforts continue to be made to accommodate international standards of medical education, and reforms were assisted by international agencies and promoted by the formulation of regional guidelines [3,4].
Curricular innovations in medicine have more recently been built around enquiry-based, collaborative approaches to learning, especially Problem-Based Learning (PBL), in which students work in teams to explore, manage or solve a problem. Moreover, recent curricula have benefited from technological developments that introduce interactive forms of PBL using ‘virtual patients’ (VPs) [5].
Many teachers regard the use of PBL as controversial. A systematic review of PBL in undergraduate, preclinical medical education has shown inconsistent results concerning the effectiveness of PBL relative to more traditional methods [6]. In particular, there are concerns that basic knowledge may not be adequately acquired in such a system [7,8]. Moreover, there is some evidence that a learner-directed rather than teacher-directed educational system may bring its own issues when PBL is introduced into a previously didactic system.
Despite this evidence, PBL has been widely supported as an active learning alternative to unidirectional teaching for improving medical curricula, and specifically for developing team working and clinical reasoning. Both qualities are considered fundamental to clinical practice.
St George’s, University of London (SGUL) set out to address both of these potential issues: the knowledge concerns and the motivation to learn. SGUL established a more immersive PBL experience in which students had the opportunity to manage the patient in a more authentic way. Cases were converted into branched VPs, which led to the transformation of traditional PBL into decision-PBL (D-PBL) [5,9].
Following this, in a European Commission TEMPUS-funded project developed by SGUL and coordinated by the Aristotle University of Thessaloniki (AUTH), a consortium of 9 universities across Eurasia began the implementation of PBL in 6 medical schools in Ukraine, Kazakhstan and Georgia. The focus was on competence-based learning, built around PBL and VPs using a more immersive form of interactive PBL (D-PBL). These countries, in common with many post-Soviet states, still shared common curriculum characteristics derived from the former Soviet Union’s centrist policies.
This paper evaluates the implementation of interactive PBL in these post-Soviet countries. The evaluation is based on the stakeholders at the end-point of the curriculum change: PBL students and PBL tutors. It explores the experience of the students during the implementation of PBL and considers how the use of technology impacts upon the student experience. It also analyses the experience of the tutors, in terms of the impact upon the skills required by staff and the effectiveness of staff training, and considers whether a PBL curriculum alters the facilitation role of the tutor and whether PBL tutors have sufficient technology skills for the new course.
Methods
The ePBLnet evaluation covered all curriculum development activities and project outputs, aiming to provide a summary of project progress and capture the experiences of key stakeholders, as well as to identify any unintended outcomes that resulted from the project work. The evaluation report was the foundation for disseminating the key findings and recommendations that emerged from the project, with this information providing a basis for future work.
We adapted the evaluation plan from an existing methodology for project evaluation [10]. The evaluation was primarily summative, to assess the effectiveness of the project and its outcomes. We used mixed-methods, with quantitative methods to gather feedback from the large student population, and qualitative methods to gather more in-depth opinions from smaller tutor and partner groups.
We initially created a conceptual model of the project, identifying the project inputs and long-term outputs, and mapped the project activities to the key short-term outputs and deliverables. This provided a clear overview of the project and enabled the key stakeholders and evaluation questions to be identified; the questions relating to students and tutors are shown in Table 1. The conceptual model was primarily created by a member of the project team (LW), and the stakeholders and key evaluation questions were derived from this model. The nature of the questions was based upon experiences of a previous curriculum transformation project, the JISC-funded (Joint Information Systems Committee) project Generation 4 [11].
Table 1. Summary of student and tutor stakeholder evaluation questions
Stakeholders | Audience’s key values, interests, expectations | Key evaluation questions
Students | Student experience, student performance | Does the use of PBL increase student motivation and engagement?
PBL Tutors | Training requirements | Does the PBL curriculum impact upon the skills required by staff?
Having identified the key stakeholders and evaluation questions, we devised a strategy to ensure that we were able to collect sufficient data to evaluate the project effectively. This strategy was heavily informed by practical concerns; two key difficulties were identified in the data collection and data analysis stages of the project: the geographical separation of the partners, and the barriers caused by language differences.
To address this, all data collection was performed using online tools, with instruments tailored to the requirements of both the particular evaluation activity and the target stakeholder group. The use of online tools for data collection negated the challenge represented by the geographical separation of partners, as all data were stored in a central online repository. The evaluation instruments were developed in collaboration with all project partners: the partner responsible for each evaluation activity developed a first draft of the questionnaire/interview question stem and solicited feedback from the other partners. The instruments were adapted from an existing validated resource [12] designed as an evaluation instrument for VP activities. It was agreed that the instruments would limit the use of free-text response questions, reducing the challenge of data analysis across multiple languages. Where translation was required, partners were responsible for translating responses into English as a common language for analysis.
Evaluating the project from the perspective of students and PBL tutors required gathering data from a relatively large sample of participants, for which we used an online questionnaire developed using SurveyMonkey [13]. This tool was chosen for its clear advantages over paper responses: the ability to store response data in a single online location, reducing lost or incomplete responses; provision of mandatory questions; and data validation to ensure correctly formatted and legible responses. The questions were written in English, with a Russian translation provided directly below each question. Questions were predominantly closed-ended, represented as Likert items and multiple-choice questions, thus limiting the impact of the language barrier for both completion and analysis by minimising the number of free-text responses required.
Table 2. Evaluation participants and related evaluation instruments
Participant (evaluation focus) | Data collection methods | Evaluation instrument
Students (student experience) | Online | Student experience survey
PBL tutors (tutor experience) | Online | Tutor experience survey
The PBL tutor online survey comprised 9 questions, to be completed by the tutors after the implementation of interactive PBL and the delivery of the PBL sessions. The student online survey comprised 16 questions, to be completed by the students at the end of the implementation of the curriculum changes and the delivery of the PBL sessions. In both cases the questions were designed to address the key research questions described above.
We analysed the quantitative data collected from the online surveys using SurveyMonkey’s online tools. These tools provided aggregated responses, calculated means and percentage response rates, and constructed tables and charts for visual analysis and confirmation of trends in the student and tutor experience. Bar charts were used for visual analysis, but results are primarily presented in frequency tables, since these provide the most complete view of the data. We used descriptive statistics to summarise the data: means were calculated to reflect the central tendency of responses, and standard deviations to indicate the variability of responses. We provided full frequency data alongside means and standard deviations, in line with evidence that this gives a full picture of the responses and that the use of parametric methods is appropriate for ordinal Likert-type data [14,15].
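As a minimal sketch (not part of the original analysis pipeline, which used SurveyMonkey's built-in tools), the reported means and standard deviations can be reproduced directly from the frequency counts in the results tables. The example below recovers the statistics for Q1 of the student survey (counts 1, 3, 10, 78, 47), assuming the population standard deviation was used:

```python
import math

def likert_stats(counts):
    """Return (mean, population SD) for a 1-5 Likert item given its
    per-option response counts, ordered from strongly disagree (1)
    to strongly agree (5)."""
    n = sum(counts)
    mean = sum(score * c for score, c in enumerate(counts, start=1)) / n
    var = sum(c * (score - mean) ** 2
              for score, c in enumerate(counts, start=1)) / n
    return mean, math.sqrt(var)

# Q1 of the student survey (Table 4): 1, 3, 10, 78, 47 responses
mean, sd = likert_stats([1, 3, 10, 78, 47])
print(round(mean, 1), round(sd, 3))  # 4.2 0.722
```

The reproduced values match the published m: 4.2 and SD: 0.722 for Q1, which suggests the paper reports the population rather than the sample standard deviation.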
A process of manifest content analysis was used to analyse the limited number of open-ended responses. Three researchers (AS, EP and CS) identified a schema of codes based upon the conceptual model and the key evaluation questions identified in the evaluation plan. Each reviewed the open-ended responses, coding quotations that addressed the areas classified in the codes. Having done so individually, they compared and merged their analyses, resolving discrepancies by discussion. This process identified key quotations in the open-ended responses that provided additional context to the findings in the quantitative data.
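The compare-and-merge step described above can be sketched as follows. This is an illustration only: the response IDs and code labels below are hypothetical, and the actual coding was done manually by the three researchers.

```python
# Hypothetical example: each coder maps a response ID to the code they
# assigned from the agreed schema.
codings = {
    "AS": {"r1": "engagement", "r2": "workload", "r3": "technology"},
    "EP": {"r1": "engagement", "r2": "workload", "r3": "training"},
    "CS": {"r1": "engagement", "r2": "workload", "r3": "technology"},
}

merged, discrepancies = {}, []
for rid in codings["AS"]:
    assigned = {coder: codes[rid] for coder, codes in codings.items()}
    if len(set(assigned.values())) == 1:
        merged[rid] = assigned["AS"]           # full agreement: accept the code
    else:
        discrepancies.append((rid, assigned))  # to be resolved by discussion

print(merged)         # {'r1': 'engagement', 'r2': 'workload'}
print(discrepancies)  # [('r3', {'AS': 'technology', 'EP': 'training', 'CS': 'technology'})]
```

Only the unanimous codings are accepted automatically; every disagreement is surfaced with each coder's assignment, mirroring the discussion-based resolution used in the study.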
Results
Students
A total of 144 students from 6 partner institutions of the ePBLnet project completed the online questionnaire. The distribution within the Partner Institutions (Ukraine, Kazakhstan and Georgia) is shown in Table 3.
Table 3. Distribution of student responses per institution
Name of the institution | Number of students
David Tvildiani Medical University (DTMU) | 24
Akaki Tsereteli State University (ATSU) | 13
Sumy State University (SSU) | 15
Zaporozhye State Medical University (ZSMU) | 39
Karaganda State Medical University (KSMU) | 28
JSC Astana Medical University (AMU) | 25
For ease of analysis, and based on the identified areas/directions of research, the 16 survey questions were collated into 5 thematic groups: engagement; improving performance; workload; use of technology; and quality and overall evaluation (shown in Table 4). Likert responses were coded numerically from 1 (strongly disagree) to 5 (strongly agree) for the purpose of generating descriptive statistics.
Table 4. Results of the survey developed to evaluate student response
Question | Statement | Strongly disagree (1) | Disagree (2) | Not sure (3) | Agree (4) | Strongly agree (5) | Mean | Standard deviation
ENGAGEMENT
Q1 | I felt I had to make the same decisions a doctor would make in real life | 1 | 3 | 10 | 78 | 47 | 4.2 | 0.722
Q2 | I felt I were the doctor caring for the patient | 1 | 3 | 25 | 74 | 36 | 4.0 | 0.768
Q3 | I was actively engaged in gathering the information I needed to characterize the patient’s problem | 0 | 0 | 4 | 66 | 69 | 4.5 | 0.554
Q4 | I was actively engaged in revising my initial diagnosis as new information became available | 1 | 1 | 7 | 67 | 63 | 4.4 | 0.680
Q5 | I was actively engaged in creating a short summary of the patient’s problem using medical terms | 0 | 3 | 15 | 74 | 47 | 4.2 | 0.705
Q6 | I was actively thinking about how the details of the case supported my differential diagnosis | 1 | 1 | 11 | 77 | 49 | 4.2 | 0.685
IMPROVES FUTURE PERFORMANCE IN REAL LIFE
Q10 | I feel better prepared to confirm a diagnosis and exclude differential diagnoses in a real life patient with this complaint | 1 | 4 | 11 | 69 | 54 | 4.2 | 0.771
Q11 | After completing the cases I feel better prepared to care for a real life patient with this complaint | 1 | 1 | 15 | 71 | 51 | 4.2 | 0.720
WORKLOAD
Q13 | Participating in interactive PBL was a heavy workload | 8 | 28 | 42 | 40 | 21 | 3.3 | 1.118
USE OF TECHNOLOGY
Q14 | The use of technology in interactive PBL was effective and worked well | 1 | 2 | 12 | 78 | 46 | 4.2 | 0.708
Q15 | The technology in interactive PBL was easy-to-use and reliable | 0 | 2 | 27 | 81 | 29 | 4.0 | 0.678
QUALITY AND OVERALL EVALUATION
Q7 | I felt that the cases were at the appropriate level of difficulty for my level of training | 2 | 12 | 33 | 61 | 31 | 3.8 | 0.939
Q8 | The decisions I needed to make while working through the cases were helpful in enhancing my diagnostic reasoning | 1 | 2 | 7 | 58 | 71 | 4.4 | 0.718
Q9 | The feedback I received from the case was helpful in enhancing my diagnostic reasoning | 1 | 1 | 8 | 57 | 72 | 4.4 | 0.699
Q12 | Overall, working through the cases was a worthwhile learning experience | 1 | 1 | 3 | 50 | 84 | 4.5 | 0.648
Q16 | I would be keen to participate in further interactive PBL sessions in the future | 1 | 0 | 5 | 56 | 77 | 4.5 | 0.639
In general the students felt both engaged and motivated while participating in the PBL sessions. The majority of students considered that they were making the same decisions a doctor would make in real life (m: 4.2, SD: 0.722), and felt as if they were doctors caring for the patient (m: 4.0, SD: 0.768). The students also reported feeling engaged while: a) gathering information to characterise the patient’s problem (m: 4.5, SD: 0.554); b) revising the initial diagnosis (m: 4.4, SD: 0.680); c) creating a short summary of the patient’s problem using medical terms (m: 4.2, SD: 0.705); and d) developing the differential diagnosis (m: 4.2, SD: 0.685).
Students believed after participating in the PBL sessions they were better prepared for real life, in terms of delivering a differential diagnosis in the PBL cases (m: 4.2, SD: 0.771), and in caring for a real patient (m: 4.2, SD: 0.720).
The students were satisfied with the implementation of the PBL considering the following areas: use of the technology, decisions to be made during the PBL, and the feedback received. Students reported that technology was used in an effective manner (m: 4.2, SD: 0.708) and that it was reliable and easy to use (m: 4.0, SD: 0.678). Students also reported that the decision making and the feedback received were helpful to enhance their diagnostic reasoning (m: 4.4, SD: 0.718 and m: 4.4, SD: 0.699 respectively). The results also indicated students thought that “working through cases was a worthwhile learning experience” (m: 4.5, SD: 0.648) and they “would be keen to participate in further interactive PBL sessions” (m: 4.5, SD: 0.639).
The students showed some disagreement regarding "the adequacy of the level of difficulty (of the PBL cases) for their level of training". This item received the lowest mean score within its grouping (m: 3.8) and a higher standard deviation (SD: 0.939).
The lowest mean score amongst all the questions in the survey was given to Question 13 (Q13), which was anticipated since this question was negatively phrased. The results for this question indicate that the students considered that "participating in interactive PBL was a heavy workload" (m: 3.3). The standard deviation shows low consistency among the responses to this item (SD: 1.118).
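Because Q13 is negatively phrased, its mean is not directly comparable with the positively phrased items. A common technique, shown here only as an illustration and not applied in the original analysis, is to reverse-score such an item (new score = 6 − old score) before comparing means:

```python
def reverse_counts(counts):
    """Reverse a 1-5 Likert frequency list so that agreement with a
    negatively phrased statement maps to the low end of the scale."""
    return counts[::-1]

q13 = [8, 28, 42, 40, 21]           # strongly disagree .. strongly agree
q13_reversed = reverse_counts(q13)  # [21, 40, 42, 28, 8]

n = sum(q13_reversed)
mean_rev = sum(s * c for s, c in enumerate(q13_reversed, start=1)) / n
print(round(mean_rev, 1))  # 2.7, i.e. 6 - 3.3
```

On the reversed scale the item sits below the neutral midpoint, consistent with the report that a substantial fraction of students perceived a heavy workload.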
Tutors
A total of 33 PBL tutors from the partner institutions participating in the ePBLnet project completed the online questionnaire. The response rate from each institution is shown in Table 5.
Table 5. Distribution of tutor responses per institution
Name of the institution | Number of tutors
David Tvildiani Medical University (DTMU) | 6
Akaki Tsereteli State University (ATSU) | 4
Sumy State University (SSU) | 2
Zaporozhye State Medical University (ZSMU) | 7
Karaganda State Medical University (KSMU) | 5
JSC Astana Medical University (AMU) | 9
Based on the identified areas/directions of research, the 9 questions were collated into 6 thematic groups: student engagement; learning objectives; difficulty of tutor facilitation; use of technology; tutor training and resources; and willingness to further use PBL.
Table 6. Results of the survey developed to evaluate tutor response
Question | Statement | Strongly disagree (1) | Disagree (2) | Not sure (3) | Agree (4) | Strongly agree (5) | Mean | Standard deviation
STUDENT ENGAGEMENT
Q1 | The interactive PBL sessions provoked high-quality discussion amongst the group | 0 | 0 | 1 | 23 | 14 | 4.3 | 0.527
Q2 | The group found the interactive PBL sessions engaging | 0 | 0 | 2 | 21 | 15 | 4.3 | 0.575
LEARNING OBJECTIVES
Q3 | The cases met all the required learning objectives | 0 | 3 | 4 | 26 | 5 | 3.9 | 0.732
DIFFICULTY OF TUTOR FACILITATION
Q4 | The use of interactive PBL made tutoring the session difficult | 5 | 21 | 7 | 4 | 1 | 2.3 | 0.926
USE OF TECHNOLOGY
Q5 | The technology used to support the interactive PBL was effective | 0 | 0 | 1 | 33 | 4 | 4.1 | 0.354
Q6 | The technology used to support the interactive PBL was easy to use and reliable | 0 | 2 | 4 | 25 | 7 | 4.0 | 0.707
TUTOR TRAINING AND RESOURCES
Q7 | I was provided with all the resources I needed to tutor the interactive PBL sessions effectively | 0 | 3 | 4 | 22 | 9 | 4.0 | 0.811
Q8 | I have received appropriate levels of training and support to be able to tutor the PBL sessions effectively | 0 | 0 | 1 | 26 | 11 | 4.2 | 0.497
WILLINGNESS TO FURTHER USE PBL
Q9 | I would be happy to tutor further interactive PBL sessions in the future | 0 | 0 | 0 | 19 | 19 | 4.5 | 0.500
The majority of tutors believed “the group found the interactive PBL sessions engaging” (m: 4.3, SD: 0.575), and the “PBL sessions provoked high-quality discussion amongst the group” (m: 4.3, SD: 0.527).
The responses suggested that the tutors agreed that the technology supporting PBL was effective (m: 4.1, SD: 0.354), reliable and easy to use (m: 4.0, SD: 0.707), and that they were given the necessary resources (m: 4.0, SD: 0.811) as well as appropriate training and support to implement PBL effectively (m: 4.2, SD: 0.497). Furthermore, most tutors felt that the PBL cases "met all the required learning objectives" (m: 3.9, SD: 0.732).
An important finding from the tutor responses is the low mean score obtained for Question 4 (Q4), "the use of interactive PBL made tutoring the session difficult" (m: 2.3), together with the higher standard deviation (SD: 0.926), which reflects differing opinions amongst the tutors. Nevertheless, the tutors were willing to continue tutoring further PBL sessions (m: 4.5, SD: 0.500).
Discussion
Since the first introduction of PBL in the 1960s, many studies have been performed to test the premise that PBL methodology results in enhanced learning. This study does not set out to add to that data, but rather to explore the impact of implementing PBL within the 6 institutions. Sharing of existing curricula and PBL cases is common, and SGUL itself adapted its own curriculum from Flinders University [16], so will have faced similarly challenging circumstances; here, however, all changes were carried out at the same time across a range of culturally distinct institutions.
This study focuses on the evaluation of the PBL experience as reported by students and tutors, and assumes that the repurposing of the PBL cases was successful during the implementation of the ePBLnet project. Research has shown that repurposing VPs from another culture and language is effective, not only because it improves cost-effectiveness, but also because case adaptors have regarded it as a learning process for the creation of VPs for use in their more general teaching [17]. Further research has shown that repurposing VPs from a different culture and language does not produce a significant difference in exam scores compared with students who used cases originally created in the same language and culture [18].
Given the previous assumption, this article has been constructed around the data during the implementation of the ePBLnet project, but before the longer-term success of this new curriculum project can be tested. It is therefore deliberately focused on the data reported from the students and the PBL tutors as the final end-point of that curriculum change. Though the process of creation is critical, at this stage the acceptability of change within the institution is measured chiefly by the experience of the students on the course, and the staff who run it.
The most significant and best-supported finding of this study is that participation in the PBL sessions was an engaging experience for the students, which can be considered an indicator of the effectiveness of the interactive PBL; both students and tutors stated this.
Bearing in mind that the students come from didactic educational cultures, it is perhaps surprising that both the tutors and students considered that the students were highly engaged during their participation in the interactive PBL. The tutors reported that PBL provoked high-quality discussion, which can be considered an indicator of student engagement. The students reported feeling better prepared for real-life clinical situations after participating in the PBL sessions. These two aspects (engagement and feeling better prepared for real life) are the most relevant and positive aspects of the results for the institutions, which have changed their curricula at considerable cost. The responses of course reflect the subjective opinions of the students, and cannot be extrapolated to suggest improved performance in real-life situations.
Some students reported that participating in the interactive PBL sessions involved a heavy workload, and tutors considered that the interactive tool may require personal and professional qualities that are not normally used in contact with students. There was some suggestion that the original educational level of the cases may not in all cases be perfectly matched to the students' current year of study. Nevertheless, they considered the tool to be effective and were willing to continue using it.
Tutors reported that they were provided with the necessary resources and training to be able to implement the interactive PBL training and achieve the learning objectives in their own culture and language. Both tutors and the students considered that the use of technology to implement the interactive PBL was effective, reliable and easy to use.
In summary, the ePBLnet project has created a solid foundation amongst tutors and students for the successful implementation of PBL, and this foundation is largely supported by the engagement and motivation of the students, despite some increase in workload and the increased difficulty reported by some tutors.
Further evidence needs to be collected to determine whether the increased motivation has any benefit in terms of acquired knowledge compared with traditional methods. Future research is needed to show whether students who participated in PBL show improved performance in real-life situations compared with those who followed the traditional curriculum.
Acknowledgements
We would like to thank the following institutions for their collaboration in the ePBLnet project and especially to their students and PBL tutors, who provided feedback to their PBL sessions to use this data for this article: David Tvildiani Medical University (DTMU); Akaki Tsereteli State University (ATSU); Sumy State University (SSU); Zaporozhye State Medical University (ZSMU); Karaganda State Medical University (KSMU); Astana Medical University (AMU). We would also like to thank the European Commission for funding the ePBLnet project and Aristotle University of Thessaloniki for coordinating the project.
Conflicts of Interest
The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.
References
1. Barr DA, Schmid R. Medical education in the former Soviet Union. Acad Med 1996; 71(2): 141-145.
2. Duch BJ, Groh SE, Allen DE. The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Publishing; 2001.
3. Conaboy KA, Nugmanova Z, Yeguebaeva S, Jaeger F, Daugherty RM. Central Asian republics: a case study for medical education reform. J Contin Educ Health Prof 2005; 25(1): 52-64.
4. Luck J, Peabody JW, DeMaria LM, Alvarado CS, Menon R. Patient and provider perspectives on quality and health system effectiveness in a transition economy: evidence from Ukraine. Soc Sci Med 2014; 114: 57-65.
5. Poulton T, Conradi E, Kavia S, Round J, Hilton S. The replacement of ‘paper’ cases by interactive online virtual patients in problem-based learning. Med Teach 2009; 31(8): 752-758.
6. Hartling L, Spooner C, Tjosvold L, Oswald A. Problem-based learning in pre-clinical medical education: 22 years of outcome research. Med Teach 2010; 32(1): 28-35.
7. Albanese M, Mitchell S. Problem-based learning: A review of literature on its outcomes and implementation issues. Acad Med 1993; 68(1): 52-81.
8. Nandi PL, Chan JN, Chan CP, Chan P, Chan LP. Undergraduate medical education: comparison of problem-based learning and conventional teaching. Hong Kong Med J 2000; 6(3): 301-306.
9. Poulton T, Ellaway RH, Round J, Jivram T, Kavia S, Hilton S. Exploring the efficacy of replacing linear paper-based patient cases in problem-based learning with dynamic Web-based virtual patients: randomized controlled trial. J Med Internet Res 2014; 16(11): e240.
10. Frechtling J, Frierson H, Hood S, Hughes G. The 2002 User-Friendly Handbook for Project Evaluation. National Science Foundation, Directorate for Education and Human Resources; 2002.
11. Bakrania T, Poulton T, Beaumont C. G4 - JISC Final Report. [Online]. 2010. Available at WWW: <http://www.webcitation.org/6T88lKXia>.
12. Huwendiek S, De Leng B, Kononowicz A, Kunzmann R, Muijtjens AMM, Van Der Vleuten CPM, Hoffmann GF, Tönshoff B, Dolmans DHJM. Exploring the validity and reliability of a questionnaire for evaluating virtual patient design with a special emphasis on fostering clinical reasoning. Med Teach 2015; 37(8): 775-782.
13. SurveyMonkey: Free online survey software & questionnaire tool. [Online]. [cit. 15-Jul-2014]. Available at WWW: <https://www.surveymonkey.com/>.
14. Sullivan GM, Artino AR. Analyzing and interpreting data from likert-type scales. J Grad Med Educ 2013; 5(4): 541-542.
15. Norman G. Likert scales, levels of measurement and the ‘laws’ of statistics. Adv Health Sci Educ Theory Pract 2010; 15(5): 625-632.
16. Prideaux D, McCrorie P. Models for the development of graduate entry medical courses: two case studies. Med Educ 2004; 38(11): 1169-1175.
17. Muntean V, Calinici T, Tigan S, Fors UGH. Language, culture and international exchange of virtual patients. BMC Med Educ 2013; 13(1): 21.
18. Kononowicz A, Stachoń AJ, Guratowska M, Krawczyk P. To Start From Scratch or To Repurpose: That Is the Question. Bio-Algorithms and Med-Systems 2010; 6(11): 57-63.
Please cite as:
Woodham LA, Poulton E, Jivram T, Kavia S, Sese Hernandez A, Sahakian CS, Poulton T. Evaluation of student and tutor response to the simultaneous implementation of a new PBL curriculum in Georgia, Kazakhstan and Ukraine, based on the medical curriculum of St George’s, University of London. MEFANET Journal 2017; 5(1): 19-27. Available at WWW: http://mj.mefanet.cz/mj-20160514.
This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License (http://creativecommons.org/licenses/by-nc-sa/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the MEFANET Journal, is properly cited. The complete bibliographic information, a link to the original publication on http://www.mj.mefanet.cz/, as well as this copyright and license information must be included.