Abstract
Written evaluations such as multiple-choice and true-or-false tests have been shown to be valid, reliable, acceptable and to have a high impact in education, but these methods do not evaluate clinical reasoning or the ability to perform certain skills.

Summary of work: To remedy this, the Script Concordance Test ("the Script") is a tool designed to evaluate clinical reasoning: it places examinees in authentic clinical situations where they must interpret data in order to make decisions. The OSCE (Objective Structured Clinical Examination), in turn, evaluates clinical proficiency. At Maimonides University we use these three evaluation instruments separately, but we have never applied them in combination. The aim was to design a global evaluation technique ("Multiple Education Evaluation Process") based on our experience using different evaluation tools with second- and third-year Medicine students at Maimonides University.

Summary of results: To evaluate second-year Pneumonology we used a written exam and the OSCE. Of 38 students, 74% passed the written exam and 58% passed the OSCE; 16% passed the written exam but not the OSCE, and all students who passed the OSCE also passed the written exam. To evaluate third-year Nephrology we used a written exam and the Script. Of 28 students, 64% passed the written exam and 75% passed the Script; 18% passed the written exam but not the Script, and 29% passed the Script but not the written exam.

Conclusions: As we had supposed, there was no relationship between the results of the different types of exams. We believe that the "Multiple Education Evaluation Process" will give us a better opportunity to evaluate in detail the knowledge, abilities and reasoning skills of our students.
Author(s): M Barrios, G Trigo, P Echegoyen, D Bonino, J Matz