Is the Current BST ePortfolio fulfilling its Role in the Training of Clinical Medicine SHOs?

S Grennan1, S Crowley2, S Quidwai2, O Barrett2, M Kooblall1

1Trinity Centre for Health Sciences, AMNCH, Tallaght, Dublin 24

2Trinity Biomedical Sciences Institute, 152-160 Pearse St, Dublin 2


While the objective recording of clinical competencies in an electronic portfolio (ePortfolio) has become a key aspect of basic specialist training (BST), it continues to divide opinion. We surveyed medical trainees and their supervisors in the Dublin region, examining their views of the ePortfolio and workplace-based assessments (WPBAs). Responses were received from 27 of 149 (18.1%) SHOs and 24 of 307 (7.9%) consultants. Our results highlight significant dissatisfaction amongst trainees, with 20 (74.1%) stating that the ePortfolio is not an effective educational tool. Consultants held more mixed views: while 16 (66.7%) reported that feedback sessions were useful for trainee development, only 4 (16.7%) found the ePortfolio useful in highlighting trainees’ strengths and weaknesses. Although other studies have emphasised its educational potential, our results suggest that practical barriers, such as time constraints and a lack of training, lead to poor engagement and a negative view of the ePortfolio.


In Ireland, senior house officers (SHOs) undergoing basic specialist training (BST) are required to participate in workplace-based assessments (WPBAs) and complete an electronic portfolio (ePortfolio). BST is a curriculum-based programme of supervised clinical training run by the Royal College of Physicians of Ireland (RCPI) for trainees in the field of general internal medicine1. SHOs are assessed by their supervising consultant and attend an annual review of their ePortfolio, during which feedback is given. WPBAs consist of mini-clinical assessment exercises (Mini-CEXs), directly observed procedural skills (DOPS) and case-based discussions (CBDs). Mini-CEXs involve the trainer observing a fifteen-minute interaction between the trainee and a patient, assessing history-taking, physical examination skills, clinical judgement, professionalism, efficiency and overall clinical care. DOPS assess the capabilities of the trainee as they perform a practical procedure, while CBDs document conversations or presentations about the trainee’s cases and are used to assess clinical decision making2. Although subjective reports emphasise the positive educational benefits of WPBAs, a systematic review by Miller et al3 found no evidence that they lead to improved performance in clinical practice.

The ePortfolio is used to track the progress and competency of trainees. It aims to provide a more objective method of assessment and to ensure that the programme is being completed to the required standard2. Prior to the introduction of the ePortfolio in 2011, training was less formalised and centralised. While some argue that this approach was more flexible, it made it difficult to assess whether all trainees possessed the necessary competencies. Since the introduction of the ePortfolio into the Irish BST programme, few papers have examined its effectiveness or how it is viewed by doctors themselves. Our study aims to determine whether trainees and their supervisors in the Irish context see the ePortfolio as an important learning exercise, or merely as a time-consuming obligation. It is based on a similar British study by Tailor et al4, who concluded that, while the ePortfolio and WPBAs can have significant educational merit, improvements in mentoring and feedback are required.


Two online surveys were produced: the first aimed at SHOs completing BST in clinical medicine and the second at consultants supervising SHOs in training. The aforementioned British paper by Tailor et al4 was used as a template in designing the surveys, to enable direct comparison between the Irish and British responses. The SHO survey comprised 24 questions exploring the accessibility and effectiveness of the ePortfolio and its role in facilitating specialist training. It also examined perceptions of the WPBAs and whether they are fulfilling their role in promoting constructive criticism and feedback. The consultant survey comprised 19 questions addressing similar areas, including the usefulness of the ePortfolio and whether or not it plays an important role in medical training. Both surveys allowed for additional comments. Initially, an email with links to both surveys was sent to all clinical medicine SHOs and consultants in St. James’s Hospital and the Adelaide and Meath Hospital, both located in Dublin. Responses were collected over a one-month period between February and March 2015. Hard copies of the surveys were then distributed by hand to consultants and SHOs before and after grand rounds in both hospitals. No demographic data were requested, in order to ensure anonymity and limit responder bias. No statistical analysis was performed as the data collected were qualitative.


Of the 149 SHOs across both sites, 27 completed surveys were received (response rate 18.1%). These results are summarised in Table 1. Only 1 (3.7%) trainee believes that the ePortfolio is an effective educational tool, while 24 (88.9%) disagree that their development has benefited from using it. As shown in Table 2, the opinions of the 24 of 307 (response rate 7.9%) consultants follow a similar theme, with only 8 (33.3%) finding the ePortfolio to be an effective educational tool. Their views are less negative overall, though, with 12 (50%) consultants compared to 20 (74.1%) SHOs disagreeing that the ePortfolio is useful in highlighting the strengths and weaknesses of trainees. Some positives emerged: 16 (66.7%) consultants and 11 (40.7%) SHOs agree that feedback sessions are useful, while all 24 (100%) consultants believe that reflective practice plays an important role in medical training. Interestingly, consultants and trainees disagree over the supervision of WPBAs. Although 17 (63%) SHOs stated that their assessor is rarely/never present during WPBAs, 18 (75%) consultants stated that they are often/always present. In a qualitative analysis of the additional comments, the major theme was that the ePortfolio is “cumbersome” and “time consuming”, with both SHOs and consultants viewing it as a “box-ticking exercise”. Another theme evident from the SHO responses was a request for formal, structured training, whether “one hour tutorials on an aspect of the consultant’s speciality” or a “course day”, both of which could “provide a social platform to encourage education”.

Table 1: Summary of responses from SHOs


*Answer not received from all respondents.

Table 2: Summary of responses from Consultants


*Answer not received from all respondents.


The results of this survey highlight significant dissatisfaction amongst trainees and mixed views amongst their supervisors regarding the ePortfolio. Its effectiveness is called into question, as approximately half of consultants and three quarters of SHOs surveyed find the ePortfolio difficult to use and inadequate in highlighting the strengths and weaknesses of trainees. However, the small sample size and low response rate of this study mean that these results cannot be generalised.

Portfolios and WPBAs play a wide role in medical training. The RCPI ePortfolio was designed to provide a secure record of professional competencies and to support clinical feedback and reflective practice. However, our results suggest that the majority of supervisors and trainees feel that, in its current format, it does not encourage reflection on clinical practice. These findings are supported by two British studies: according to Tailor et al4, only 15% of trainees and 30% of supervisors view the ePortfolio as an effective educational tool, while Johnson et al5 found that 45% of trainees believe the ePortfolio does not facilitate rapid feedback. In spite of these results, other evidence in the literature suggests that portfolios and WPBAs can enhance educational development if implemented correctly with sufficient tutor support6. A Canadian study of obstetrics and gynaecology trainees found that keeping an online portfolio significantly increased the likelihood of trainees undertaking self-directed learning7. In another questionnaire, over two thirds of British foundation year 2 (FY2) doctors rated the effectiveness of the portfolio in meeting their educational requirements as 6 or higher on a 1-9 Likert scale8, while an observational study by Morris et al9 found that 65% of British FY1 trainees believed that carrying out DOPS would help further their careers. This contrasts starkly with the SHOs we surveyed: although 40% found the verbal feedback useful, almost 90% felt that their development had not benefited from completing the ePortfolio.

Practical difficulties such as time constraints, insufficient training and a lack of supervisor engagement may contribute to the negative view of the ePortfolio in Ireland. The majority of consultants stated that they struggle to fit the required assessments into their clinical schedule, a finding supported by Pereira et al10, who found that over 40% of surgical trainees felt that the time required to complete a mandatory online portfolio impacted negatively on their overall training. Insufficient teaching in how to use the ePortfolio may also be an issue: almost three quarters of supervisors had not received training on the ePortfolio within the last three years. Finally, while three quarters of consultants stated that they are always/often physically present during assessments, almost two thirds of trainees reported that their supervisor was rarely/never present. Admittedly, responder bias could play a role in these differing views. However, this discrepancy was not found in the corresponding British study, where only 13% of trainees reported that their supervisor was rarely/never present4. A lack of mentor engagement may therefore contribute to the greater dissatisfaction with the ePortfolio found amongst Irish trainees. Our results indicate the need for further research into the ePortfolio and WPBAs, focusing on how to improve the portfolio process so that, as well as ensuring competence and adequate training, it also facilitates feedback and engagement from both trainees and supervisors.

Unfortunately, our study is hampered by a number of major limitations, most notably the small sample size and low response rate. For this reason, and because of the narrow demographic area, our conclusions cannot be extrapolated to all training programmes in Ireland. Furthermore, while we did gather responses from trainees at various stages of their BST, no demographic data were recorded, in order to ensure anonymity and encourage openness; in hindsight, this lack of trainee and consultant data weakens the study significantly. Finally, since both surveys were initially emailed to participants, the responses received were more likely to come from those with stronger opinions on the topic, and this responder bias may have exaggerated the results. In conclusion, the SHOs in this survey are quite critical of the ePortfolio. While consultants accept the importance of reflective practice and hold more mixed views, many also find the ePortfolio and WPBAs ineffective. Although other studies have emphasised the educational potential of portfolios and their importance in ensuring physician competence, our results suggest that practical barriers, such as time constraints, lead to poor engagement and a negative view of the ePortfolio. While our conclusions are limited by the small sample size, our results indicate the need for a larger study in this area, which may examine how the portfolio process can be improved to ensure clinical competence while encouraging training, feedback and reflection.

Correspondence: S Grennan

Trinity Centre for Health Sciences, AMNCH, Tallaght, Dublin 24



  1. Royal College of Physicians Ireland. Basic Specialist Training. (accessed 18 March 2015).
  2. Royal College of Physicians Ireland. Basic Specialist Training in General Internal Medicine. (accessed 18 March 2015).
  3. Miller A, Archer J. Impact of workplace based assessments on doctors’ education and performance: a systematic review. British Medical Journal 2010; 341: c5064.
  4. Tailor A, Dubrey S, Das S. Opinions of the ePortfolio and workplace-based assessments: a survey of core medical trainees and their supervisors. Clinical Medicine 2014; 14: 510-6.
  5. Johnson S, Cai A, Riley P, Millar LM, McConkey H, Bannister C. A survey of Core Medical Trainees’ opinions on the ePortfolio record of educational activities: beneficial and cost-effective? The Journal of the Royal College of Physicians of Edinburgh 2012; 42: 15-20.
  6. Driessen E, van Tartwijk J, van der Vleuten C, Wass V. Portfolios in medical education: why do they meet with mixed success? A systematic review. Medical Education 2007; 41: 1224-33.
  7. Fung MF, Walker M, Fung KF, Temple L, Lajoie F, Bellemare G, Bryson SC. An internet-based learning portfolio in resident education: the KOALA multicentre programme. Medical Education 2000; 34: 474-9.
  8. Ryland I, Brown J, O’Brien M, Graham D, Gillies R, Chapman T, Shaw N. The portfolio: how was it for you? Views of F2 doctors from the Mersey Deanery Foundation pilot. Clinical Medicine 2006; 6: 378-80.
  9. Morris A, Hewitt J, Roberts CM. Practical experience of using directly observed procedures, mini clinical evaluation examinations, and peer observation in pre-registration house officer (FY1) trainees. Postgraduate Medical Journal 2006; 82: 285-8.
  10. Pereira EA, Dean BJ. British surgeons’ experiences of a mandatory online workplace-based assessment. Journal of the Royal Society of Medicine 2009; 102: 287-93.
