
Integrated online formative assessments in the biomedical sciences for medical students: benefits for learning

Abstract

Background

Online formative assessments have a sound theoretical basis, and are prevalent and popular in higher education settings, but data to establish their educational benefits are lacking. This study attempts to determine whether participation and performance in integrated online formative assessments in the biomedical sciences have measurable effects on learning by junior medical students.

Methods

Students enrolled in Phase 1 (Years 1 and 2) of an undergraduate Medicine program were studied over two consecutive years, 2006 and 2007. In seven consecutive courses, end-of-course (EOC) summative examination marks were analysed with respect to the effect of participation and performance in voluntary online formative assessments. Online evaluation surveys were utilized to gather students' perceptions regarding online formative assessments.

Results

Students rated online assessments highly on all measures. Participation in formative assessments had a statistically significant positive relationship with EOC marks in all courses. The mean difference in EOC marks for those who participated in formative assessments ranged from 6.3% (95% confidence interval 1.6 to 11.0; p = 0.009) in Course 5 to 3.2% (95% confidence interval 0.2 to 6.2; p = 0.037) in Course 2. For all courses, performance in formative assessments correlated significantly with EOC marks (p < 0.001 for each course). The variance in EOC marks that could be explained by performance in the formative assessments ranged from 21.8% in Course 6 to 4.1% in Course 7.

Conclusion

The results support the contention that well designed formative assessments can have significant positive effects on learning. There is untapped potential for use of formative assessments to assist learning by medical students and postgraduate medical trainees.


Background

Assessment has sufficiently powerful effects on learning to be the de facto curriculum [1]. This includes not only what is learnt, but also students' approaches to learning [2–4]. Formative assessments are designed for the purpose of giving feedback on performance and suggestions for improvement, and are intended to promote students' learning [5, 6]. Formative assessments that provide timely, relevant and supportive feedback (not just grades) can contribute to improved learning outcomes [7]. In contrast, summative assessments are predominantly utilized for grading and certification at the end of a period of study, often without providing feedback to students on their performance. Indeed, one of the major weaknesses of most modern higher education programs, as evidenced by course evaluation surveys, is failure to provide adequate feedback to students on their learning [8]. It should be noted that the provision of diagnostic and remedial feedback has been found to be one of the most potent influences on student achievement [9]. If the purpose of assessment is to foster better learning outcomes, it could be argued that formative assessment is the most important assessment practice [10].

Paper-based formative assessments have a number of limitations [11]: students must be gathered together and invigilated; individualized feedback is time-consuming, and might not be feasible with large class sizes [12]; and analysis of question reliability and validity can be tedious. In contrast, Web-based formative assessments offer clear advantages for students: immediacy of feedback; flexibility in time and place of undertaking the assessment; feedback can provide links to learning resources, thereby providing motivation to study; opportunity for repetition; and interactivity [11]. Furthermore, a comparative study has reported that online formative assessments might be of greater benefit for learning than paper-based equivalents [13]. These are persuasive arguments for moving from paper-based to online formative assessments.

Web-based formative assessments also support equity and inclusiveness by allowing students to attempt each assessment anonymously on multiple occasions, at any time, and from virtually anywhere. This permits students with family responsibilities and work commitments to access the assessments at times that are most convenient for them. Many students, particularly those from culturally and linguistically diverse backgrounds, fear embarrassment if found to be in error, which might inhibit their propensity to clarify misconceptions directly with a member of academic staff. Online formative assessments provide a safe environment, where trial and error is permitted.

Although online formative assessments such as quizzes and practice examinations are becoming increasingly common in higher education, there is little formal evidence of their educational effectiveness and available data are both contradictory and inconclusive [12]. Nevertheless, such assessments have proved to be popular with pre-clinical medical students [11, 14], medical students in clinical attachments [15], students of dentistry [16, 17], as well as medical specialist trainees [18]. Thus in modern self-directed medical curricula, formative assessments may be perceived as "a safety net in a self-directed learning course" [5]. However, the potential of these assessments to assist learning by postgraduate trainees has not yet been fully exploited.

We evaluated the impact of online formative assessments on learning by the cohort of students enrolled in the initial 2 years of our undergraduate Medicine program [19] in 2006 and 2007. The program employs vertically integrated scenario-based learning for first and second year students during Phase 1, which comprises a sequence of nine 8-week courses. Following an introductory Foundations course, these are based on domains related to the life cycle: Beginnings, Growth and Development; Health Maintenance; Ageing and Endings; as well as Society and Health. All courses are interdisciplinary: biomedical sciences are integrated with one another, with the social and psychological sciences, and with early clinical experience. Learning is assessed in cross-disciplinary end of course (EOC) written and practical examinations, which are aligned with desired graduate capabilities and support integrated learning [20]. For the purposes of this study, courses in 2006 and 2007 (excluding Foundations, which has no summative assessment) were numbered 1 to 7 in chronological order.

Methods

Design and implementation of formative assessments

One of the authors (GMV) developed integrated formative assessments in the biomedical sciences with automated individualized feedback, which were embedded in each of the sequential 8-week courses in the Medicine program, to facilitate take-up by students [21]. There was one formative assessment per course, which was intended to cover material presented throughout that course. Each assessment was made continuously available from Week 5 or 6 until the EOC examination in Week 8. The formative assessments were based on clinical scenarios familiar to the students, providing an authentic context for learning, as well as emphasising the curriculum goals of integration between the biomedical sciences and integration of the biomedical with the clinical, social and behavioural sciences. All assessments were peer-reviewed by subject matter experts. The frequent use of image-based questions (Figures 1 and 2) was intended to increase the level of engagement and interactivity for students.

Figure 1. Screenshot of scenario-based extended matching questions, employing drop-down lists of alternative answers to test the interpretation of diagnostic investigations, as well as correlation with anatomical and pathological concepts underlying coronary artery disease.

Figure 2. Screenshot of a drag-and-drop question, employing multiple "hot spots" with drag-and-drop markers in an image map to test integrated understanding of Anatomy and Pathology in the context of coronary artery disease.

The software tool employed was Questionmark Perception™ (Questionmark, UK). This system is particularly easy to use for developing a wide variety of question types, including drag-and-drop; extended matching (selection and matrix questions); true/false; multiple choice (MCQs); multiple response; as well as text match and short-answer or essay questions. It also provides the capacity to add high-quality graphics, audio or other multimedia to questions and/or feedback. In order to maximize the impact of formative assessments on learning by students, a mix of short answer questions and objective items (primarily MCQs) was employed. This approximates the format of the EOC summative examinations [5], which consist of a combination of four short-answer questions and two blocks of objective items (MCQs), one of which addresses the practical component of the course. However, no questions from the formative assessments were included in the EOC examinations.

Access to the formative assessments was provided via secure links from the university's eLearning website and was made continuously available for several weeks through to the EOC summative examination. Students were able to repeat each formative assessment on multiple occasions if they wished.

Data collection and statistical analysis

Our study was approved by the Human Research Ethics Committee of The University of New South Wales. Consent for participation in the feedback component of the study was implied by response to the anonymous online survey. Students completed online formative assessments and EOC examinations as part of their learning in Phase 1 of the Medicine program, and the relationships between participation and performance in online formative assessments and EOC examinations were analysed retrospectively. The academic standing of students was not influenced by this study, and students' identities were masked from the investigators in adherence with ethical principles; therefore, individual consent was not sought from students.

Participation and performance statistics were gathered via web-based reports from the Questionmark Perception™ results database. Performance in formative assessments was analysed according to each student's best attempt at the assessment. The relationship between these data and EOC marks in seven consecutive Phase 1 courses in 2006 and 2007 was analysed using Student's t-tests, ANOVA and Bonferroni multiple comparisons as appropriate. Regression analyses were used to estimate the component of variation in EOC marks that could be explained by performance in formative assessments. Students who commenced the program in 2004 (n = 8), who were enrolled in Phase 1 in 2006 and for whom data were available, were included in the overall evaluation, but were excluded from the stratified analysis, which focussed on students commencing in 2005 (n = 234), 2006 (n = 235) and 2007 (n = 272).
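To make this analysis pipeline concrete, the sketch below shows, in Python, how the participation effect (an independent-samples t-test on EOC marks) and the variance explained (the squared correlation from a simple linear regression) might be computed. This is an illustration only: the original analysis code is not published, and the file and column names here are hypothetical stand-ins for the Questionmark Perception exports and EOC mark records described above.

```python
# Illustrative sketch only; "course_marks.csv" and all column names are
# hypothetical stand-ins for the records described in the Methods section.
import pandas as pd
from scipy import stats

df = pd.read_csv("course_marks.csv")  # one row per student for one course

# Effect of participation: compare EOC marks of participants vs. abstainers.
participants = df.loc[df["attempted_formative"] == 1, "eoc_mark"]
abstainers = df.loc[df["attempted_formative"] == 0, "eoc_mark"]
t_stat, p_value = stats.ttest_ind(participants, abstainers)
print(f"mean difference = {participants.mean() - abstainers.mean():.1f}%, "
      f"p = {p_value:.3f}")

# Variance in EOC marks explained by formative performance (best attempt):
# the square of Pearson's r from a simple linear regression.
attempted = df[df["attempted_formative"] == 1]
fit = stats.linregress(attempted["formative_best_mark"], attempted["eoc_mark"])
print(f"r = {fit.rvalue:.2f}, variance explained = {100 * fit.rvalue ** 2:.1f}%")
```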

Student perceptions of the value of online formative assessments were sought via online surveys at the conclusion of the final course in each year, using Likert scales and free-text comments. Comparisons of survey responses between years were performed using a Mann-Whitney U test.
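A minimal sketch of the between-year survey comparison, assuming the Likert responses are coded as integers from 1 to 5 (the values below are invented purely for illustration; the actual survey data are not published):

```python
# Hypothetical Likert-scale responses (1-5) from two survey years.
from scipy import stats

ratings_2006 = [4, 3, 5, 4, 4, 2, 5, 3]
ratings_2007 = [5, 4, 5, 4, 5, 3, 5, 4]
u_stat, p_value = stats.mannwhitneyu(ratings_2006, ratings_2007,
                                     alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")
```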

Results

Participation and performance

Online formative assessments with automated individualized feedback have proved to be very popular with students – the participation rate for these voluntary assessments has, on average, been greater than 75% in all courses to date. Many students attempted the assessments on multiple occasions, until they achieved mastery of the material. For example, in Course 5 in 2007, of the 466 students who undertook the formative assessment, 233 (50%) made more than one attempt. The mean time students took to complete the formative assessment in each course was 32 ± 5 minutes (range 16 to 50 minutes).

Participation in formative assessments had a statistically significant positive relationship with EOC exam marks in all courses (Table 1). The mean difference in EOC examination marks for those who participated in formative assessments ranged from 6.3% (95% confidence interval 1.6 to 11.0; p = 0.009) in Course 5 to 3.2% (95% confidence interval 0.2 to 6.2; p = 0.037) in Course 2 (Table 1). Interestingly, although any attempt at the formative assessments had a positive association with EOC examination marks, repeated attempts at each formative assessment added little further benefit (Table 2).

Table 1 All students enrolled in Phase 1 in 2006 and 2007 – Effect of participation in online formative assessments on End of Course examination marks
Table 2 All students enrolled in Phase 1 in 2006 and 2007 – Relationship between single and multiple attempts at online formative assessments and End of Course examination marks

For all courses, performance in formative assessments had significant but moderate correlations with EOC examination marks (p < 0.001). The variance in EOC examination marks that could be explained by marks in the formative assessments ranged from 21.8% in Course 6 to 4.1% in Course 7 (Table 3).

Table 3 All students enrolled in Phase 1 in 2006 and 2007 – Correlation between formative online assessment score and EOC examination mark
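Assuming the reported percentages are $R^2$ values from simple linear regression, as the Methods section indicates, the underlying correlation coefficients can be recovered as $r = \sqrt{R^2}$:

$$ r_{\text{Course 6}} \approx \sqrt{0.218} \approx 0.47, \qquad r_{\text{Course 7}} \approx \sqrt{0.041} \approx 0.20, $$

which is consistent with the description of these correlations as significant but moderate.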

Perceptions

Student response rates for annual online evaluation surveys from 2005–2007 were 52.1%, 51.5% and 48.5% (n = 237, 238 and 246) respectively. These response rates (which are underestimates, because they are calculated on the entire cohort rather than on the number of students who completed the online formative assessment from which the survey was linked) are substantially higher than those usually achieved in online course evaluation surveys, and approach those for paper-based surveys [22]. Data from student evaluations (Figures 3 and 4) demonstrated that, as well as being challenging and enjoyable, the formative assessments were highly valued by students, both as a means of gaining feedback on learning and in planning their future study. From 2005 to 2007, there were significant increases in students' positive perceptions of online formative assessments. In 2007, these included significantly higher scores for their utility as a guide to study for the end of course examinations (p < 0.001), as well as their overall value as a learning tool (p < 0.005). Improvements implemented as a consequence of student feedback obtained in 2005 and 2006 are likely to have contributed to these increasingly positive perceptions. Examples of such improvements include: elimination of negative marking for incorrect responses; addition of more short-answer questions (similar to the format of EOC examinations) with suggested marking schemes included in the feedback; and, whenever possible, making formative assessments available earlier in each course to help guide students' study.

Figure 3. Student evaluation of online formative assessments in 2007, based on an online survey with 246 respondents out of 507 students who could potentially have accessed the survey (response rate 48.5%).

Figure 4. Comparison of student evaluations of online formative assessments from 2005 to 2007. Response rates for the online surveys were 52.1%, 51.5% and 48.5% (n = 237, 238 and 246) respectively. Statistical analysis by Mann-Whitney U tests: # p < 0.005 and ## p < 0.001 compared with the 2005 cohort; * p < 0.005 and ** p < 0.001 compared with the 2006 cohort.

Open-ended feedback comments by students in 2007 provided evidence that the formative assessments were achieving their aims. Relevant examples included assertions that each assessment "provides an opportunity to correct misconceptions and actually learn/relearn concepts." and that there "is integration of a variety of concepts learnt throughout the course", which "gives good insight into students' current performance and areas of potential improvement which can be addressed for the actual exam."

Year 1 vs. Year 2 students in vertically integrated courses

In order to determine whether students benefited more from formative assessments in our vertically integrated medical program during their first or second year, the analysis was stratified according to year of commencement (Table 4). We found no consistent association between the length of time students had been enrolled in the program and the impact on learning of participation in formative assessments. However, there was a tendency for students in the first year of the program to derive greater benefit from participating in formative assessments. For students who commenced in 2006, EOC examination marks were significantly influenced by attempting formative assessments only in their first two courses: Course 1 (mean difference 6.9%; 95% confidence interval 2.1–11.6; p = 0.006); and Course 2 (6.3%; 1.4–11.1; p = 0.011). Similarly, for students who commenced in 2007, participation in formative assessments significantly influenced EOC examination marks in two of their first three courses: Course 5 (12.3%; 7.6–17.0; p < 0.001); and Course 7 (4.5%; 2.1–7.0; p < 0.001). It is not clear why first year students in 2007 derived less benefit than second year students from attempting the formative assessment for Course 2. This discrepancy might be explained by the difficulty of the course content, which resulted in an unusually high failure rate, predominantly affecting first year students.

Table 4 Analysis of effect of participation in online formative assessments in 2006 and 2007 by year commenced in program

The statistically significant relationship between performance in formative assessments and marks in EOC examinations was maintained regardless of year of commencement in the program (data not shown).

Discussion

Our data indicate that the online formative assessments in Phase 1 of our undergraduate Medicine program have been effective in promoting learning by students. Students who participated in formative assessments were likely to achieve higher marks in EOC examinations. Better performance in each of the formative assessments was also consistently associated with higher marks in the respective EOC examinations. There was a trend, although not consistent across all courses, for first year students to derive greater benefit from the formative assessments at the commencement of their program, consistent with the notion that such assessments provide a "safety net" for novices in student-centred learning [5].

We believe this report provides much-needed evidence of a quantifiable effect of online formative assessments on learning. Our findings contrast with two recent studies in related settings. The first demonstrated the value of online formative quizzes in improving preparation and participation in classes [23], but reported no effect on summative examination results. The second was a randomized controlled trial of online formative assessments for medical students in clinical clerkships, which found no positive effect on learning [24]. It should be noted that the latter investigation, in contrast to our study, failed to attract substantial numbers of participating students.

Our findings also contrast with reports suggesting that online formative assessments utilising objective items such as multiple choice questions have no effect on student learning outcomes [23, 25], or even a negative effect on learning [26]. The authors of the latter study asserted that multiple choice questions may be unsuitable for formative assessments, because the "lures" or distractors create "false knowledge" [26]. However, these adverse findings were based on the use of multiple choice questions without feedback. In that context, it is not surprising that incorrect answers could be "learned".

Importantly, our results substantially extend the observations of Krasne et al. [14], who found that untimed "open-book" formative assessments were good predictors of performance in a summative examination, possibly related to factors such as the reduced pressure compared with a conventional examination format and an emphasis on assessment of higher order learning (e.g. application, evaluation, self-direction). Although our formative assessments employed multiple choice questions (as well as short-answer questions), they were in many respects similar in character to the "open-book" assessments described by Krasne and colleagues [14]. Unlike those assessments, however, ours were integrated across disciplines, broader in their scope, available for a longer period of time, and embedded throughout a program of study. These factors are likely to have increased their efficacy. Furthermore, recent data on the application of test-enhanced learning in higher education [27] validate our use of a combination of short-answer questions and multiple choice questions with immediate feedback in promoting retention of knowledge.

Student perceptions of the formative assessments were uniformly favourable, with consistent and increasingly positive evaluations in online feedback surveys. Correspondingly, there were high participation and repetition rates in the online assessments for each course. The fact that students on average completed the formative assessment within each course in less than one hour might have contributed to the popularity of the assessments, because they were perceived as an efficient means of study for time-poor students.

Our systematic approach to the design, development, implementation and continual improvement of online formative assessments is likely to have played a role in students' perceptions of the assessments, as well as their positive effect on student learning. For example, as part of a continuous improvement cycle, all items used in formative assessments were analysed with regard to difficulty, discrimination coefficient and correlation with overall assessment outcome. Those that correlated poorly were edited or eliminated from the next cycle, while concepts that proved difficult for students to comprehend were flagged for course conveners.
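The paper does not specify its item-analysis formulas, so the sketch below assumes the standard classical definitions: difficulty as the proportion of students answering an item correctly, and discrimination as the point-biserial correlation between an item and the total score on the remaining items. The flag threshold of 0.2 is a common rule of thumb, not one stated by the authors, and the data are invented.

```python
# Classical item analysis sketch under the assumptions stated above.
import numpy as np

def item_analysis(responses: np.ndarray) -> None:
    """responses: students x items matrix of 0/1 scores (hypothetical data)."""
    for i in range(responses.shape[1]):
        item = responses[:, i]
        rest = responses.sum(axis=1) - item  # total score excluding this item
        difficulty = item.mean()             # proportion answering correctly
        discrimination = np.corrcoef(item, rest)[0, 1]  # point-biserial r
        flag = "  <- review" if discrimination < 0.2 else ""
        print(f"item {i + 1}: difficulty={difficulty:.2f}, "
              f"discrimination={discrimination:.2f}{flag}")

# Toy example: 6 students, 3 items.
scores = np.array([[1, 1, 0], [1, 0, 1], [1, 1, 1],
                   [0, 0, 1], [1, 1, 1], [0, 0, 0]])
item_analysis(scores)
```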

A limitation of our study is that it is not possible to conclude whether there is a causal relationship between participation and/or performance in online formative assessments and EOC examination marks. It might be that "better" students, who were more highly motivated, were more likely to undertake the formative assessments [12, 16]. In our study, multiple attempts at each formative assessment were not associated with higher EOC examination marks. This might suggest that EOC examination performance was primarily influenced by the inherent attributes of the students, rather than by the effects of formative assessment with feedback. Nevertheless, although one reported study found no relationship between the benefit of online formative assessments and students' overall performance in a program as measured by grade point average [16], the design of our study cannot exclude such a relationship. Proving a causal relationship would require a design in which students were randomly assigned to a "control group" within a cohort. This would be inequitable, because a group of students would be deprived of the opportunity to undertake formative assessments during the trial period [12].

Implications for practice and future research

The results of our study reinforce the impact on learning of well-designed online formative assessments. The highly computer-literate students to whom these assessments were targeted, a population for which web delivery of learning materials and resources is now the default, expressed a very high level of satisfaction. This could in part be related to the graphically intensive approach we used, particularly in visual disciplines such as Anatomy and Pathology, which would be difficult to match in any other mode of delivery. We have evidence from feedback surveys that students pursued further reading and investigated linked resources, so the purpose of provoking further thought about the topics was clearly served.

Thus, while the effort and expense involved in this enterprise has been considerable, this investment is clearly justifiable because the assessments had a high take-up rate and evidently contributed to better learning outcomes for students.

These findings have important implications not only for education of junior medical students but also for continuing education of senior medical students in clinical attachments [15] and especially of junior doctors and specialist trainees. The latter two groups, who are notoriously time-poor, might be attracted to well-packaged formative assessments which they could undertake at their convenience. They might derive considerable benefit from non-threatening feedback on their knowledge and clinical decision-making.

From a research perspective, a question that remains of interest to us is whether the learning benefits of online formative assessments for junior medical students, which we have demonstrated, persist into senior years of medical programs, particularly with respect to understanding of the biomedical sciences. It would also be of interest to determine whether our formative assessments could have diagnostic value, i.e. are those students who perform poorly in formative assessments at their first attempt more likely to fail EOC examinations?

Conclusion

The results of this study support the contention that well designed formative assessments can have significant positive effects on learning. There is considerable untapped potential for use of formative assessments to assist learning by medical students and postgraduate medical trainees.

References

  1. Ramsden P: Learning to Teach in Higher Education. London: Routledge; 1992.
  2. Marton F, Saljo R: On qualitative differences in learning. I: Outcome and process. Br J Educ Psychol. 1976, 46: 4-11.
  3. Newble DI, Jaeger K: The effect of assessment and examinations on the learning of medical students. Med Educ. 1983, 17: 165-171.
  4. Entwistle N: Styles of Learning and Teaching. 2nd edition. Chichester: Wiley; 1987.
  5. Rolfe I, McPherson J: Formative assessment: how am I doing? Lancet. 1995, 345: 837-839. 10.1016/S0140-6736(95)92968-1.
  6. Relan A, Uijdehaage S: Web-based assessment for students' testing and self-monitoring. Acad Med. 2001, 76 (5): 551. 10.1097/00001888-200105000-00095.
  7. Gipps CV: What is the role for ICT-based assessment in universities? Stud High Educ. 2005, 30: 171-180. 10.1080/03075070500043176.
  8. Gibbs G, Simpson C: Conditions under which assessment supports students' learning. Learn Teach High Educ. 2004, 1: 3-31.
  9. Hattie JA: Identifying the salient facets of a model of student learning: a synthesis of meta-analyses. Int J Educ Res. 1987, 11: 187-212.
  10. Black P, Wiliam D: Assessment and classroom learning. Assessment Educ. 1998, 5: 7-74.
  11. Velan GM, Kumar RK, Dziegielewski M, Wakefield D: Web-based assessments in pathology with QuestionMark Perception. Pathology. 2002, 34: 282-284. 10.1080/00313020220131372.
  12. Johnson GM: Optional online quizzes: college student use and relationship to achievement. Can J Learn Tech [serial online]. 2006, http://www.cjlt.ca/index.php/cjlt/article/view/61/58
  13. Derouza E, Fleming M: A comparison of in-class quizzes vs. online quizzes on student exam performance. J Comput High Educ. 2003, 14: 121-134. 10.1007/BF02940941.
  14. Krasne S, Wimmers PF, Relan A, Drake TA: Differential effects of two types of formative assessment in predicting performance of first-year medical students. Adv Health Sci Educ. 2006, 11: 155-171. 10.1007/s10459-005-5290-9.
  15. Burch VC, Seggie JL, Gary NE: Formative assessment promotes learning in undergraduate clinical clerkships. S Afr Med J. 2006, 96: 430-433.
  16. Olson BL, McDonald JL: Influence of online formative assessment upon student learning in biomedical science courses. J Dent Educ. 2004, 68: 656-659.
  17. Henly D: Use of Web-based formative assessment to support student learning in a metabolism/nutrition unit. Eur J Dent Educ. 2003, 7: 116-122. 10.1034/j.1600-0579.2003.00310.x.
  18. Houghton G, Wall D: Trainers' evaluations of the West Midlands formative assessment package for GP registrar assessment. Med Teach. 2000, 22 (4): 399-405. 10.1080/014215900409519.
  19. McNeil HP, Hughes CS, Toohey SM, Dowton SB: An innovative outcomes-based medical education program built on adult learning principles. Med Teach. 2006, 28: 527-534. 10.1080/01421590600834229.
  20. Toohey S, Kumar RK: A new program of assessment for a new medical program. Focus Health Profession Educ. 2003, 5: 23-33.
  21. Hamilton NM, Furnace J, Duguid KP, Helms PJ, Simpson JG: Development and integration of CAL: a case study in medicine. Med Educ. 1999, 33: 298-305. 10.1046/j.1365-2923.1999.00270.x.
  22. Nulty DD: The adequacy of response rates to online and paper surveys: what can be done? Assessment Eval High Educ. 2007, 33: 1-13.
  23. Urtel MG, Bahamonde RE, Mikesky AE, Udry EM, Vessely JS: On-line quizzing and its effect on student engagement and academic performance. J Scholar Teach Learn. 2006, 6: 84-92.
  24. Palmer EJ, Devitt PG: Limitations of student-driven formative assessment in a clinical clerkship: a randomised controlled trial. BMC Med Educ. 2008, 8: 29. 10.1186/1472-6920-8-29.
  25. Peat M, Franklin S: Has student learning been improved by the use of online and offline formative assessment opportunities? Australas J Educ Tech. 2003, 19: 87-99.
  26. Roediger HL, Marsh EJ: The positive and negative consequences of multiple-choice testing. J Exp Psychol Learn. 2005, 31: 1155-1159. 10.1037/0278-7393.31.5.1155.
  27. McDaniel MA, Roediger HL, McDermott KB: Generalizing test-enhanced learning from the laboratory to the classroom. Psychon Bull Rev. 2007, 14 (2): 200-206.


Acknowledgements

The authors wish to thank Dr Alex Yueping Wang for his assistance with data entry and data analysis. No funding was required for this study.

Author information


Corresponding author

Correspondence to Gary M Velan.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

GMV designed, developed and implemented the integrated online formative assessments for medical students described in this study. He was responsible for the study design, data collection and drafting the text of this article, and is guarantor. PJ and HPM supported the development and implementation of the integrated online formative assessments, as well as editing the text of this article. RKK assisted with study design and data analysis, as well as editing the text of this article.


Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Velan, G.M., Jones, P., McNeil, H.P. et al. Integrated online formative assessments in the biomedical sciences for medical students: benefits for learning. BMC Med Educ 8, 52 (2008). https://doi.org/10.1186/1472-6920-8-52
