Ensuring competency in focused cardiac ultrasound: a systematic review of training programs

Abstract

Background

Focused cardiac ultrasound (FoCUS) is a valuable skill for rapid assessment of cardiac function and volume status. Despite recent widespread adoption among physicians, there are limited data on the optimal training methods for teaching FoCUS and on metrics for determining competency. We conducted a systematic review to gain insight into the optimal training strategies, including the type and duration of training, that would allow physicians to achieve basic competency in FoCUS.

Methods

Embase, PubMed, and Cochrane Library databases were searched from inception to June 2020. Included studies described standardized training programs for at least 5 medical students or physicians on adult FoCUS, followed by an assessment of competency relative to an expert. Data were extracted, and bias was assessed for each study.

Results

Data were extracted from 23 studies on 292 learners. Existing FoCUS training programs remain varied in duration and type of training. Learners achieved near perfect agreement (κ > 0.8) with expert echocardiographers on detecting left ventricular systolic dysfunction and pericardial effusion with 6 h each of didactics and hands-on training. Substantial agreement (κ > 0.6) could be achieved in half this time.

Conclusion

A short training program will allow most learners to achieve competency in detecting left ventricular systolic dysfunction and pericardial effusion by FoCUS. Additional training is necessary to ensure skill retention, improve efficiency in image acquisition, and detect other pathologies.

Background

Technological advancements have led to increasing availability of high quality, low-profile ultrasound devices at reduced costs [1]. One area that has seen tremendous growth is that of focused cardiac ultrasound (FoCUS), which describes point-of-care ultrasound that is intended to provide a rapid qualitative assessment of cardiac function. The use of FoCUS has expanded to a variety of practice settings, including emergency medicine, critical care, anesthesia, internal medicine, and primary care, owing largely to its relative ease of use [2]. Prior studies suggest that trainees and non-cardiologist physicians with limited prior ultrasonographic experience can gain proficiency in FoCUS with brief training, such as a 1-day workshop and 20–50 practice scans [3, 4]. FoCUS has proven useful for the assessment of ventricular function, valvular abnormalities, volume status, as well as for the detection of cardiac tamponade, aortic dissection or aneurysm, and pulmonary embolism [5]. The use of FoCUS has been shown to alter management in perioperative [6, 7], critical care [8, 9], and emergency [10, 11] settings and has been shown to improve outcomes in select patients [12].

While FoCUS can be beneficial for patient care and more effective allocation of healthcare resources, there is potential for harm with inappropriate use [13]. The implications of relying on a false negative exam could include delayed or missed diagnoses. Similarly, false positive findings or misinterpretations could lead to unwarranted testing or procedures and increased healthcare spending. Despite the potential for such consequences, formal training programs have not been widely embraced, and quality control metrics are often lacking [14, 15]. Surveys have revealed the fear of missed diagnoses and the lack of training or certification as important barriers to the adoption of FoCUS [16]. The adoption of robust parameters for assessing competency in image acquisition, analysis, and interpretation among physicians is needed to effectively train learners and ensure appropriate use [17].

Leaders in ultrasonography have recognized the need for training standards and have supported the development of structured certification programs for FoCUS as well as quantitative transthoracic echocardiography (TTE), shown in Table 1 [24, 31]. Current certifications in TTE require between 75 and 250 scans and passing one or more standardized examinations, while certification in FoCUS typically requires between 20 and 50 supervised scans. However, many of these recommendations are based on guidance developed for the use of FoCUS and/or TTE in emergency and critical care settings, and their applicability outside of these settings has not been well demonstrated. There is also no consensus on the optimal method of training in FoCUS or the appropriate metrics for determining skill development. Many small-scale studies have documented and compared strategies for FoCUS education and evaluation among various sub-populations and clinical environments [32]. Among these are studies on trainees and licensed physicians working in intensive care units, medical wards, emergency departments, and perioperative areas, for which very different scanning protocols are employed. The heterogeneity of studies has made it difficult to draw conclusions, and thus the type and duration of training that allow most learners to achieve competency in FoCUS remain undetermined. We conducted a systematic review and meta-analysis to examine existing strategies for FoCUS training and to gain insight into the optimal amount and type of training that will allow for attainment of basic competency in adult FoCUS.

Table 1 Published accreditations in focused cardiac ultrasound and transthoracic echocardiography

Methods

This systematic review conformed to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [33].

Data search

Our search strategy utilized PubMed, Embase, and Cochrane Library databases from inception until June 2020. The following search terms were used: “echocardiography” or “transthoracic echocardiography” or “TTE” or “bedside ultrasound” or “cardiac ultrasound”, and “doctors” or “physicians” or “residents” or “fellows” or “medical students” or “attending” or “intensivist” or “internist” or “hospitalist”, and “competence” or “competency” or “certification” or “accreditation” or “evaluation” or “assessment” or “curriculum”. These terms were identified in the title or abstract (PubMed and Embase) or in the title, abstract, or keywords (Cochrane). We also examined the lists of references from relevant studies and review articles for any additional articles that might have been missed in our initial search.

Inclusion and exclusion criteria

Studies were included only if standardized training on FoCUS was provided followed by a formal assessment of competence, such as by expert review or comparison to an expert-performed echocardiogram. An expert was designated as a physician or sonographer with extensive training and/or certification in adult echocardiography. Included studies were required to have at least 5 learners who were medical students, trainees, or attending physicians without expertise or formal certification in transthoracic or transesophageal echocardiography. For inclusion, each study was required to outline the type and duration of training, describe which parameters were assessed, and identify a comparator for assessment of competency. Studies within pediatric populations and on non-physician learners were excluded.

Study selection and data extraction

Titles and abstracts were assessed independently by two reviewers (LEG and PJL) and were included in the full text review if selected by either. The same two reviewers performed full text review, with discrepancies resolved by a third reviewer (MGC). Two authors (LEG and GAW) independently extracted the following data using a standardized form: number and training level of learners, ratio of learners to instructors during training, type and duration of training, total study duration, views and pathology taught, ultrasound device used, clinical setting, selection of subjects for assessment, parameters assessed, measurement of competency, and outcomes.

Risk of bias assessment

The ROBINS-I tool [34] was used to assess risk of bias in our cohort of non-randomized studies of interventions. Risk of bias in seven pre-specified categories was independently assessed by two reviewers (LEG and GAW), with disputes resolved through joint discussion with a third reviewer (MGC).

Study outcome

The primary outcome was the performance of medical students or physicians in acquiring and/or interpreting cardiac and hemodynamic parameters using FoCUS relative to that of expert echocardiographers.

Data analysis

Summary tables are provided for included studies, accompanied by a qualitative discussion and evaluation of risk of bias. The relationship between three training parameters (didactic hours, hands-on practice hours, and scans performed) and reported level of agreement (kappa coefficient, κ) between learners and expert echocardiographers on identifying cardiac pathology was assessed by linear regression (SPSS version 24, IBM Corp.). Analysis was performed for parameters in which assessments were relatively uniform across studies, and for data sets containing ≥ 8 studies in order to minimize the likelihood of sampling error. The Pearson correlation coefficient (r) and p value for the linear fit were reported. The kappa coefficient (κ) was interpreted as [35]: perfect agreement (κ = 1), near perfect agreement (κ = 0.81 to 1), substantial agreement (κ = 0.61 to 0.8), moderate agreement (κ = 0.41 to 0.6), fair agreement (κ = 0.21 to 0.4), and slight to no agreement (κ = 0 to 0.2).
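The published analysis was performed in SPSS; as a minimal illustration of the same statistics, the following Python sketch (assuming NumPy and SciPy are available; all values are hypothetical placeholders, not data from the included studies) computes Cohen's kappa, κ = (p_o − p_e)/(1 − p_e), from a learner-versus-expert agreement table and then the Pearson correlation and linear fit between training hours and reported κ.

# Illustrative sketch only; the review's analysis used SPSS version 24.
# All numbers below are hypothetical placeholders.
import numpy as np
from scipy import stats

def cohens_kappa(table: np.ndarray) -> float:
    """Cohen's kappa, (p_o - p_e) / (1 - p_e), from a KxK learner-vs-expert agreement table."""
    table = table.astype(float)
    n = table.sum()
    p_o = np.trace(table) / n                               # observed agreement
    p_e = (table.sum(axis=0) @ table.sum(axis=1)) / n ** 2  # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 2x2 agreement table over 40 exams:
# rows = expert, columns = learner (abnormal vs. normal LV systolic function).
agreement = np.array([[12, 3],
                      [2, 23]])
print(f"kappa = {cohens_kappa(agreement):.2f}")  # ~0.73, i.e., substantial agreement

# Hypothetical study-level data: didactic hours vs. reported kappa for LV systolic function.
didactic_hours = np.array([1, 2, 2, 3, 4, 4, 6, 6], dtype=float)
reported_kappa = np.array([0.45, 0.58, 0.62, 0.66, 0.71, 0.75, 0.82, 0.85])
r, p = stats.pearsonr(didactic_hours, reported_kappa)
fit = stats.linregress(didactic_hours, reported_kappa)
print(f"r = {r:.2f}, p = {p:.4f}; kappa ~ {fit.intercept:.2f} + {fit.slope:.2f} x hours")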

Results

Search results and study selection

Our search yielded 1479 unique studies, of which 1301 were excluded during title and abstract screening, leaving 178 studies for full-text review. Of these, 23 met inclusion criteria and were included in this systematic review (Fig. 1). Many excluded studies met multiple exclusion criteria.

Fig. 1 Flow diagram showing the selection of studies for inclusion

Quality of included studies

All studies included in the analysis were non-randomized, observational studies of an intervention and thus were expected to have a substantial and unavoidable bias due to confounding. We identified several consistent sources of bias in the selection of participants, amount of training received by learners within each training program, number of scans performed, pre-existing knowledge of the clinical status of patient subjects, and interobserver variability. Bias was assessed for each study using the ROBINS-I tool (Table 2) [34].

Table 2 Risk of bias for each included study assessed using the ROBINS-I tool [34]

Study participants

Data were collected on a total of 292 learners across 23 studies (see Tables 3 and 4). Participants ranged from medical students to subspecialty physicians with up to 29 years of attending-level experience [43]. The most represented group was internal medicine residents (n = 174, 59.6%), followed by critical care fellows (n = 32, 11.0%), hospitalists (n = 27, 9.25%), emergency medicine residents (n = 23, 7.88%), emergency medicine attendings (n = 15, 5.14%), medical students (n = 10, 3.42%), intensivists (n = 6, 2.05%), trauma surgeons (n = 6, 2.05%), and anesthesia residents (n = 5, 1.71%). For the majority of learners, participation was on a voluntary basis. At least 9 learners (3.08%) across all included studies had some prior training in echocardiography, but none had expertise or formal certification.

Table 3 Characteristics of the learner population, training program, device used, and study duration for 23 included studies
Table 4 Characteristics of ultrasound skill assessment and overall findings for 23 included studies

Training format and duration

All studies had a standardized training program that included some combination of didactic and practical hands-on learning. Where reported, the didactic component ranged from 45 min to 18 h and accounted for 7 to 80% of the dedicated training time. Didactics included a component of in-person lectures, review of pre-recorded cases, and/or bedside demonstration in 21 of 23 studies (91%) and consisted of remote learning only, with handouts or online modules, in 2 of 23 studies (8.7%). Practical learning was reported either as the duration of time spent in small groups or one-on-one performing supervised echocardiograms, or as the number of exams performed under supervision or independently with feedback. Where reported, the time spent on practical training ranged from 30 min to 20 h, or from 1 to 50 exams, with the exception of one study in which learners were encouraged to perform 100 independent exams as part of their training [51].

Subjects for assessment

Learners performed FoCUS on a total of 3794 subjects, which included 3785 patients, 4 healthy volunteers, and 5 simulated patient cases. Patients were examined in a variety of clinical settings, including the intensive care unit (n = 1077, 28.5%), inpatient medicine floor (n = 1002, 26.5%), intermediate care unit (n = 408, 10.8%), emergency department (n = 385, 10.2%), outpatient clinic (n = 257, 6.79%), and short-stay unit (n = 175, 4.62%). A total of 524 patients (13.8%) were on mechanical ventilation at the time of the exam. Clinical setting was not specified for 481 patients (12.7%). Most patients were selected for study inclusion based on having a clinical indication for FoCUS, and many were excluded due to the presence of injuries requiring immediate intervention, inability to tolerate repositioning, the sonographers’ inability to obtain adequate windows, or a prolonged duration (typically > 48 h) between learner and expert examinations.

Parameters for assessing competency

Learners were assessed on their skills in acquiring and/or interpreting images. Parameters of acquisition ability included whether or not learners were able to obtain adequate images to make a diagnosis, the time required to obtain images, a subjective assessment of image quality, or an efficiency score (quality/time). One study also reported self-perceived workload for performing FoCUS [44]. Parameters of interpretation ability included accuracy in quantitative measurements (chamber or vessel sizes, ejection fraction, E/A ratio) and diagnostic accuracy (normal or abnormal function, presence or absence of pathology). Competency in these areas was assessed by comparison against the performance of an expert echocardiographer. This was typically a board-certified cardiologist or a physician who had completed level 2 or 3 certification by the American Society of Echocardiography, although in two studies this was a cardiology fellow [50] or intensivist [55] with formal training and experience in echocardiography but without certification. Ideally, exams performed by learners would be compared to a similar exam performed by an expert, with both exams performed using either a portable or a traditional ultrasound. However, only in 8 of the 23 studies [40, 41, 45, 47, 49, 50, 56, 57] was the learner's exam compared to another focused exam performed on the same or a very similar type of device. Most studies compared a learner-performed FoCUS exam with a standard TTE, often with the learner using a portable device with limited functionality and poorer image resolution than a traditional ultrasound machine. One study [37] compared learner and expert performance on an ultrasound simulator, while two others examined healthy volunteers [40, 56].

Quantitative assessment of training parameters

Of the 23 studies included in this review, 11 calculated a kappa coefficient (κ) for inter-rater reliability between learner and expert interpretation of at least one cardiac ultrasound finding and could be included for quantitative analysis. The most frequently assessed pathologies were left ventricular (LV) systolic dysfunction and pericardial effusion, followed by regional wall motion abnormalities, valvular abnormalities, and hypovolemia. LV systolic function and the presence of pericardial effusion were assessed in at least 8 studies, providing the largest sample sizes for meta-analysis. The other parameters had limited sample sizes with measures that were relatively less uniform across studies. The level of agreement with experts on learner assessment of LV systolic function (Fig. 2, left panel) and pericardial effusion (Fig. 2, right panel) is shown based on the number of didactic hours (Fig. 2a), number of hands-on practice hours (Fig. 2b), and total number of exams performed (Fig. 2c). Learners achieved near perfect agreement (κ > 0.8) with expert echocardiographers on the assessment of LV systolic function after 6 didactic hours and 6 h of hands-on training, and substantial agreement (κ > 0.6) after 2 h of didactics and 2 h of hands-on training. There was no correlation between number of scans performed and agreement with experts on the identification of LV systolic dysfunction. Learners achieved substantial agreement (κ > 0.6) with experts on the identification of pericardial effusion after 3 h of didactics, 3 h of hands-on training, and at least 25 scans. For the assessment of LV systolic function, agreement between learners and experts correlated with the amount of time (1 to 6 h) spent on didactics (r = 0.79, p < 0.05) and performing hands-on practice (r = 0.82, p < 0.05). For the identification of pericardial effusion, agreement between learners and experts correlated with the amount of time (1 to 6 h) spent on didactics (r = 0.82, p < 0.005) and the number of scans performed in each study (r = 0.51, p < 0.05).

Fig. 2 Relationship between (a) number of didactic hours, (b) number of hands-on practice hours, and (c) number of scans performed during a standardized training phase and learner agreement with expert echocardiographers for the detection of left ventricular systolic dysfunction (left panel, navy) and pericardial effusions (right panel, light blue). The Pearson correlation coefficient (r) and p value for the linear fit are reported for each data set, and regression lines are shown with 95% confidence intervals (dashed lines). Agreement is expressed by the kappa coefficient, κ

Discussion

FoCUS is intended to provide qualitative or semi-quantitative assessment of major cardiac abnormalities, such as LV systolic dysfunction, pericardial effusion, or valvular abnormalities [59]. Because FoCUS is a goal-directed tool used to guide immediate clinical management, the data obtained need to be reliable. Thus, the development of a FoCUS training platform that ensures competency is necessary for safe and meaningful use. Our systematic review has shown that existing training programs vary substantially in their duration of training (45 min to over 20 h), type of training provided, skills taught, and clinical setting in which FoCUS skills were assessed. Our analysis also showed that a short duration of training, i.e., 2–3 h of didactics and 2–3 h of hands-on training, may be sufficient for most learners to achieve substantial agreement with experts in identifying two major cardiac abnormalities: LV systolic dysfunction and pericardial effusion. Meanwhile, near perfect agreement (κ > 0.8) for detecting these abnormalities could be achieved after 6 h of didactics and 6 h of hands-on training. Identification of other pathologies, particularly wall motion abnormalities, valvular lesions, and IVC enlargement, was often more difficult, and most learners were only able to achieve fair to moderate agreement with experts after brief training.

Many studies included in our review involved comparison of data obtained through FoCUS exams performed using a small portable or handheld device to data obtained from a TTE performed using ultrasound machines with high resolution and advanced features. FoCUS is not performed for the same diagnostic purpose, nor should it be expected to match the precision of a comprehensive TTE. Yet we felt that comparison to a well-established standard was likely to be the most reliable metric to assess learner competency and that results yielded from this higher benchmark should be interpreted within a margin of non-inferiority. FoCUS training should also include education on the intended use and inherent limitations of FoCUS versus TTE.

Our review examined the effect of three training parameters on learner performance. We showed that substantial agreement (κ > 0.6) between learners and experts on the assessment of LV systolic function could be achieved with only 2 h each of didactic and hands-on practice and a minimum (4–10) number of scans. Similarly, substantial agreement with experts on the identification of pericardial effusion could be achieved with only 3 h each of didactic and hands-on practice. The greater amount of time required for identifying pericardial effusions may be due to misidentification of pericardial fat as an effusion, or to the fact that small effusions can be missed in some views. Regardless, these findings are impressive, given that only moderate (κ > 0.4) to substantial (κ > 0.6) agreement exists between trained experts for assessments of LV function by FoCUS [60]. We also show that learner performance for identifying LV systolic dysfunction improves with time spent on didactics and time spent performing hands-on practice, at least for up to 6 h each, whereas the total number of scans performed did not correlate with improvement in identifying LV dysfunction. This may be due to the fact that there was already substantial agreement (κ > 0.6) between learners and experts after very few (4–10) scans. Also, identifying LV dysfunction by FoCUS is a skill that may be best taught through a combination of didactics and supervised practice, while the actual number of exams performed may be less important. In contrast, identification of pericardial effusion improved with time spent on didactics as well as with the number of scans performed, and substantial agreement with experts could be achieved after 25 scans. This suggests that the detection of pericardial effusion is a skill that is gained through additional experience rather than supervised practice and supports the completion of between 20 and 30 focused exams for achieving competency in FoCUS as recommended by existing governing bodies (Table 1). Overall, our quantitative findings confirm that learners may be able to achieve reasonable competency using ultrasound to assess LV function and identify pericardial effusion after a very short (4–6 h) duration of training that includes equal portions (2–3 h each) of didactic and hands-on learning. Our findings also suggest that a small number of scans (20–30) may be sufficient for learners to gain basic competency in FoCUS.

To our knowledge, ours is the first systematic review and meta-analysis to be published on training in FoCUS. A prior systematic review by Rajamani et al. [61] examined 42 studies with the aim of evaluating the quality of point-of-care ultrasound training programs and their ability to determine competence; roughly half of those studies did not include a comparator group against which to assess learner competency. Another prior systematic review by Kanji et al. [32] examined 15 studies in the critical care setting, most of which assessed learning based on pre- and post-training test scores and also did not include assessment of competency against an accepted standard, as was required in our review. In addition to requiring a comparator for assessing competency, we also took a broader approach in examining the training of a diverse group of learners. As FoCUS adoption continues to expand, we wanted to report findings that might inform guidelines for the education of providers from different backgrounds and skill levels.

Our review is the first to provide a quantitative evaluation of the impact of various training parameters on learner performance. While established curricula exist for FoCUS training in critical care and in emergency medicine, such standards do not currently exist for other specialties. By including a heterogeneous population of learners in our review, we hope that the findings may be generalizable to learners in other specialties such as internal medicine, anesthesiology, and general surgery, who may be examining patients in settings ranging from the outpatient clinic to the operating room to the postoperative hospital ward. Studies also ranged in their scope of training and parameters assessed, emphasizing that the determination of competency in performing and interpreting FoCUS is a challenging distinction that depends heavily on the clinical context. Because the goals of FoCUS vary with the clinical context in which it is applied, the specific metrics for competency will also vary [17]. For example, sensitivity for the detection of a reduction in left ventricular ejection fraction needs to be high in outpatient settings, such as in the study by Croft et al. [41], when determining the need for specialty referral and tailored management of chronic conditions. Meanwhile, a lower sensitivity is likely acceptable in the emergency department, such as in the studies by Farsi et al. [42] and Carrie et al. [39], when determining the presence of a cardiogenic cause for hemodynamic instability.

When considering the wide range of potential clinical applications for FoCUS, it is important to recognize that training clinicians with different skill levels for the use of FoCUS in a variety of settings is unlikely to be successful with a single standardized curriculum. Rather than content-based training that uses completion of a set of material as an endpoint, a competency-based program recognizes that learners will progress at different speeds and that some will require additional material to reach the same level of competency. Competency-based programs enable learners to move through topics at their own pace, progressing when they are comfortable with a new skill and deemed competent by their supervisor(s). This form of training has been successful for teaching other clinical skills such as central line placement and orotracheal intubation, in which clinical competency is not strictly linked to a number of lines placed or intubations performed and no formal accreditation is needed. The future practice of FoCUS may benefit from a convergence on competency-based training that is tailored to a particular application and/or specialty, rather than from pursuit of formal accreditation across specialties.

When considering the most effective ways to train physicians on the use of FoCUS, it is also important to recognize that the co-existent clinical demands on physician-learners can impede skill acquisition. Some of the strategies to support learners that were adopted by the studies in this review include offering one-on-one or small group sessions for additional supervised practice, providing supervision during clinical application, and establishing processes that give learners access to ongoing feedback from experts. Flexibility in training availability and integration of FoCUS practice with existing clinical workflows were two recurring strategies that seemed to cater to the needs of physician-learners.

The need to train new generations of physicians in adult FoCUS presents the opportunity for future study in this field. An important consideration when designing a training program is the prevention of skill decay, which has been noted to occur rapidly (within 1–3 months) after the completion of a brief training program [62]. One study [56] found that learners retained their imaging skills at 6 months post-training, but no included study reported data on skill retention beyond 6 months. The duration of the training phase may be inversely related to the rate of decay, suggesting that longitudinal support through deliberate practice and mentored review may help learners to retain their skills [56]. If ultrasound devices are readily available and easily accessible within clinical environments, physicians can develop ways to incorporate FoCUS into their daily practice. Training programs must find ways to support learners beyond the initial training period in a manner that is structured yet flexible.

Limitations

It is important for the reader to recognize that all of the studies identified were non-randomized, observational studies with critical levels of bias. First, selection bias was often evident in both the selection of participants, many of whom were volunteers, and the selection of patient subjects for exams. For example, patients requiring urgent evaluation and treatment are those who are also most likely to benefit from rapid, point-of-care ultrasound, and yet many of these patients were excluded from learner examinations. Three studies reduced subject selection bias by using standardized patients or an ultrasound simulator [37, 40, 56], but at the expense of external validity. Second, few studies [39, 44, 45, 50, 55] acknowledged exams performed by the same learner as dependent data points, and even fewer accounted for this through the use of linear modeling [45, 50]. Third, most studies were conducted in actual clinical settings, where time constraints, patient factors, and learner motivation are expected to introduce bias into the results. And lastly, while we report the minimum hours of training required for learners to assess LV systolic function and identify the presence of pericardial effusion, we were unable to determine the minimum training period required to achieve competency in other aspects of cardiac assessment due to insufficient data.

Conclusion

FoCUS is an important diagnostic tool and will likely soon be considered a standard skillset for any practicing physician. A formal training program that includes 2–3 h of didactic learning and 2–3 h of hands-on training and requires 20–30 scans is likely to be adequate for most learners to achieve competency in the detection of gross LV systolic dysfunction and pericardial effusion. Additional training is necessary for skill retention, efficiency in image acquisition, and the detection of more subtle abnormalities. The finding that reasonable proficiency can be obtained after only brief formal training should encourage physicians at any career level to pursue training in FoCUS.

Availability of data and materials

Aggregated data are available upon request.

Abbreviations

FoCUS: Focused cardiac ultrasound
κ: Kappa coefficient
LV: Left ventricular
NI: No information
r: Pearson correlation coefficient
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
TTE: Transthoracic echocardiography

References

  1. Clement GT. Perspectives in clinical uses of high-intensity focused ultrasound. Ultrasonics. 2004;42:1087–93.

  2. Nelson BP, Sanghvi A. Point-of-care cardiac ultrasound: feasibility of performance by noncardiologists. Glob Heart. 2013;8:293–7.

  3. Cowie B, Kluger R. Evaluation of systolic murmurs using transthoracic echocardiography by anaesthetic trainees. Anaesthesia. 2011;66:785–90.

  4. Frederiksen CA, Juhl-Olsen P, Andersen NH, Sloth E. Assessment of cardiac pathology by point-of-care ultrasonography performed by a novice examiner is comparable to the gold standard. Scand J Trauma Resusc Emerg Med. 2013;21:87.

  5. Spencer KT, Kimura BJ, Korcarz CE, Pellikka PA, Rahko PS, Siegel RJ. Focused cardiac ultrasound: recommendations from the American Society of Echocardiography. J Am Soc Echocardiogr. 2013;26:567–81.

  6. Canty DJ, Royse CF, Kilpatrick D, Bowman L, Royse AG. The impact of focused transthoracic echocardiography in the pre-operative clinic: transthoracic echocardiography in the pre-operative clinic. Anaesthesia. 2012;67:618–25.

  7. Kratz T, Steinfeldt T, Exner M, Dell´Orto MC, Timmesfeld N, Kratz C, et al. Impact of focused intraoperative transthoracic echocardiography by anesthesiologists on management in hemodynamically unstable high-risk noncardiac surgery patients. J Cardiothorac Vasc Anesth. 2017;31:602–9.

  8. Orme RML, Oram MP, McKinstry CE. Impact of echocardiography on patient management in the intensive care unit: an audit of district general hospital practice. Br J Anaesth. 2009;102:340–4.

  9. Hall DP, Jordan H, Alam S, Gillies MA. The impact of focused echocardiography using the focused intensive care Echo protocol on the management of critically ill patients, and comparison with full echocardiographic studies by BSE-accredited sonographers. J Intensive Care Soc. 2017;18:206–11.

  10. Jones AE, Tayal VS, Sullivan DM, Kline JA. Randomized, controlled trial of immediate versus delayed goal-directed ultrasound to identify the cause of nontraumatic hypotension in emergency department patients. Crit Care Med. 2004;32:1703–8.

  11. Ferrada P, Evans D, Wolfe L, Anand RJ, Vanguri P, Mayglothling J, et al. Findings of a randomized controlled trial using limited transthoracic echocardiogram (LTTE) as a hemodynamic monitoring tool in the trauma bay. J Trauma Acute Care Surg. 2014;76:31–7.

  12. Canty DJ, Royse CF, Kilpatrick D, Bowyer A, Royse AG. The impact on cardiac diagnosis and mortality of focused transthoracic echocardiography in hip fracture surgery patients with increased risk of cardiac disease: a retrospective cohort study. Anaesthesia. 2012;67:1202–9.

  13. Walton-Shirley M. Echocardiography: the good, the bad, and the ugly; 2018. [cited 2020 Jun 13]. Available from: http://www.medscape.com/viewarticle/904352.

  14. Conlin F, Roy Connelly N, Raghunathan K, Friderici J, Schwabauer A. Focused transthoracic cardiac ultrasound: a survey of training practices. J Cardiothorac Vasc Anesth. 2016;30:102–6.

  15. Macdonald MR, Hawkins NM, Balmain S, Dalzell J, McMurray JJV, Petrie MC. Transthoracic echocardiography: a survey of current practice in the UK. Q J Med. 2008;101:345–9.

  16. Conlin F, Connelly NR, Eaton MP, Broderick PJ, Friderici J, Adler AC. Perioperative use of focused transthoracic cardiac ultrasound: a survey of current practice and opinion. Anesth Analg. 2017;125:1878–82.

  17. Via G, Hussain A, Wells M, Reardon R, ElBarbary M, Noble VE, et al. International evidence-based recommendations for focused cardiac ultrasound. J Am Soc Echocardiogr. 2014;27:683.e1–683.e33.

  18. Point-of-care ultrasound certificate of completion | Certificate of Completion Program. Am Coll Chest Physicians. [cited 2020 Jun 28]. Available from: https://www.chestnet.org/Education/Advanced-Clinical-Training/Certificate-of-Completion-Program/SHM-COC.

  19. POCUS certificate of completion. [cited 2020 Jun 28]. Available from: https://www.hospitalmedicine.org/clinical-topics/ultrasound/pocus-certificate-of-completion/.

  20. Ultrasound guidelines: emergency, point-of-care, and clinical ultrasound guidelines in medicine. [cited 2020 Jun 29]. Available from: https://www.acep.org/patient-care/policy-statements/ultrasound-guidelines-emergency-point-of%2D%2Dcare-and-clinical-ultrasound-guidelines-in-medicine/.

  21. POCUS practice guidelines. SPOCUS. 2019 [cited 2020 Jun 28]. Available from: https://spocus.org/admin-resources/practice-guidelines/.

  22. Pustavoitau A, Blaivas M, Brown SM, Gutierrez C, Kirkpatrick AW, Kohl BA, et al. Recommendations for achieving and maintaining competence and credentialing in critical care ultrasound with focused cardiac ultrasound and advanced critical care echocardiography; 2016. [cited 2020 Oct 12]. Available from: https://journals.lww.com/ccmjournal/Documents/Critical%20Care%20Ultrasound.pdf.

  23. British Society of Echocardiography. Focused Intensive Care Echocardiography (FICE) accreditation pack. Transthoracic TTE Accreditation. [cited 2020 Oct 12]. Available from: https://www.bsecho.org/Public/Accreditation/Personal-accreditation/Transthoracic%2D%2DTTE-/Public/Accreditation/Accreditation-subpages/Personal-accreditation-subpages/Transthoracic%2D%2DTTE%2D%2Daccreditation.aspx?hkey=a36acc22-8b5c-4ebc-be7e-378ef6d8fc35.

  24. Expert Round Table on Ultrasound in ICU. International expert statement on training standards for critical care ultrasonography. Intensive Care Med. 2011;37:1077–83.

  25. Ryan T, Berlacher K, Lindner JR, Mankad SV, Rose GA, Wang A. COCATS 4 task force 5: training in echocardiography. J Am Coll Cardiol. 2015;65:1786–99.

  26. Diaz-Gomez JL, Perez-Protto S, Hargrave J, Builes A, Capdeville M, Festic E, et al. Impact of a focused transthoracic echocardiography training course for rescue applications among anesthesiology and critical care medicine practitioners: a prospective study. J Cardiothorac Vasc Anesth. 2015;29:576–81.

  27. Wharton G, Steeds R, Allen J, Phillips H, Jones R, Kanagala P, et al. A minimum dataset for a standard adult transthoracic echocardiogram: a guideline protocol from the British Society of Echocardiography. Echo Res Pract. 2015;2:G9–24.

  28. Fletcher SN, Grounds RM. Critical care echocardiography: cleared for take up. Br J Anaesth. 2012;109:490–2.

  29. European Diploma in advanced critical care EchoCardiography. Eur Soc Intensive Care Med. 2017. Available from: https://www.esicm.org/education/edec-2/.

  30. Pontone G, Moharem-Elgamal S, Maurovich-Horvat P, Gaemperli O, Pugliese F, Westwood M, et al. Training in cardiac computed tomography: EACVI certification process. Eur Heart J Cardiovasc Imaging. 2018;19:123–6.

  31. Expert Round Table on Echocardiography in ICU. International consensus statement on training standards for advanced critical care echocardiography. Intensive Care Med. 2014;40:654–66.

  32. Kanji HD, McCallum JL, Bhagirath KM, Neitzel AS. Curriculum development and evaluation of a hemodynamic critical care ultrasound: a systematic review of the literature. Crit Care Med. 2016;44:e742–50.

  33. Higgins J, Thomas J, Chandler J, Cumpston M, Li T, Page M, et al. Cochrane handbook for systematic reviews of interventions version 6.0; 2019. Available from: www.training.cochrane.org/handbook.

  34. Sterne JA, Hernán MA, Reeves BC, Savović J, Berkman ND, Viswanathan M, et al. ROBINS-I: a tool for assessing risk of bias in non-randomised studies of interventions. BMJ. 2016;355:i4919.

  35. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–74.

  36. Alexander JH, Peterson ED, Chen AY, Harding TM, Adams DB, Kisslo JA. Feasibility of point-of-care echocardiography by internal medicine house staff. Am Heart J. 2004;147:476–81.

  37. Beraud A-S, Rizk NW, Pearl RG, Liang DH, Patterson AJ. Focused transthoracic echocardiography during critical care medicine training: curriculum implementation and evaluation of proficiency. Crit Care Med. 2013;41:e179–81.

  38. Caronia J, Kutnick R, Sarzynski A, Panagopoulos G, Mahdavi R, Mina B. Focused transthoracic echocardiography performed and interpreted by medical residents in the critically ill. ICU Dir. 2013;4:177–82.

  39. Carrié C, Biais M, Lafitte S, Grenier N, Revel P, Janvier G. Goal-directed ultrasound in emergency medicine: evaluation of a specific training program using an ultrasonic stethoscope. Eur J Emerg Med. 2015;22:419–25.

  40. Chisholm CB, Dodge WR, Balise RR, Williams SR, Gharahbaghian L, Beraud A-S. Focused cardiac ultrasound training: how much is enough? J Emerg Med. 2013;44:818–22.

  41. Croft LB, Duvall WL, Goldman ME. A pilot study of the clinical impact of hand-carried cardiac ultrasound in the medical clinic. Echocardiography. 2006;23:439–46.

  42. Farsi D, Hajsadeghi S, Hajighanbari MJ, Mofidi M, Hafezimoghadam P, Rezai M, et al. Focused cardiac ultrasound (FOCUS) by emergency medicine residents in patients with suspected cardiovascular diseases. J Ultrasound. 2017;20:133–8.

  43. Ferrada P, Anand RJ, Whelan J, Aboutanos MA, Duane T, Malhotra A, et al. Limited transthoracic echocardiogram: so easy any trauma attending can do it. J Trauma. 2011;71:1327–31.

  44. Gaudet J, Waechter J, McLaughlin K, Ferland A, Godinez T, Bands C, et al. Focused critical care echocardiography: development and evaluation of an image acquisition assessment tool. Crit Care Med. 2016;44:e329–35.

  45. Hellmann DB, Whiting-O’Keefe Q, Shapiro EP, Martin LD, Martire C, Ziegelstein RC. The rate at which residents learn to use hand-held echocardiography at the bedside. Am J Med. 2005;118:1010–8.

  46. Johnson BK, Tierney DM, Rosborough TK, Harris KM, Newell MC. Internal medicine point-of-care ultrasound assessment of left ventricular function correlates with formal echocardiography. J Clin Ultrasound. 2016;44:92–9.

  47. Labbé V, Ederhy S, Pasquet B, Miguel-Montanes R, Rafat C, Hajage D, et al. Can we improve transthoracic echocardiography training in non-cardiologist residents? Experience of two training programs in the intensive care unit. Ann Intensive Care. 2016;6:44.

  48. Lucas BP, Candotti C, Margeta B, Evans AT, Mba B, Baru J, et al. Diagnostic accuracy of hospitalist-performed hand-carried ultrasound echocardiography after a brief training program. J Hosp Med. 2009;4:340–9.

  49. Manasia AR, Nagaraj HM, Kodali RB, Croft LB, Oropello JM, Kohli-Seth R, et al. Feasibility and potential clinical utility of goal-directed transthoracic echocardiography performed by noncardiologist intensivists using a small hand-carried device (SonoHeart) in critically ill patients. J Cardiothorac Vasc Anesth. 2005;19:155–9.

  50. Martin LD, Howell EE, Ziegelstein RC, Martire C, Shapiro EP, Hellmann DB. Hospitalist performance of cardiac hand-carried ultrasound after focused training. Am J Med. 2007;120:1000–4.

  51. Mjolstad OC, Andersen GN, Dalen H, Graven T, Skjetne K, Kleinau JO, et al. Feasibility and reliability of point-of-care pocket-size echocardiography performed by medical residents. Eur Heart J Cardiovasc Imaging. 2013;14:1195–202.

  52. Mozzini C, Garbin U, Fratta Pasini AM, Cominacini L. Short training in focused cardiac ultrasound in an internal medicine department: what realistic skill targets could be achieved? Intern Emerg Med. 2014;10:73–80.

  53. Ruddox V, Stokke TM, Edvardsen T, Hjelmesaeth J, Aune E, Baekkevar M, et al. The diagnostic accuracy of pocket-size cardiac ultrasound performed by unselected residents with minimal training. Int J Cardiovasc Imaging. 2013;29:1749–57.

  54. Ruddox V, Norum IB, Stokke TM, Edvardsen T, Otterstad JE. Focused cardiac ultrasound by unselected residents—the challenges. BMC Med Imaging. 2017;17:22.

  55. See KC, Ong V, Ng J, Tan RA, Phua J. Basic critical care echocardiography by pulmonary fellows: learning trajectory and prognostic impact using a minimally resourced training model. Crit Care Med. 2014;42:2169–77.

  56. Smith CJ, Morad A, Balwanz C, Lyden E, Matthias T. Prospective evaluation of cardiac ultrasound performance by general internal medicine physicians during a 6-month faculty development curriculum. Crit Ultrasound J. 2018;10:9.

  57. Vignon P, Mücke F, Bellec F, Marin B, Croce J, Brouqui T, et al. Basic critical care echocardiography: validation of a curriculum dedicated to noncardiologist residents. Crit Care Med. 2011;39:636–42.

  58. Yan BP, Fok JCY, Wong THY, Tse G, Lee APW, Yang XS, et al. Junior medical student performed focused cardiac ultrasound after brief training to detect significant valvular heart disease. IJC Heart Vasc. 2018;19:41–5.

  59. Neskovic AN, Edvardsen T, Galderisi M, Garbi M, Gullace G, Jurcut R, et al. Focus cardiac ultrasound: the European Association of Cardiovascular Imaging viewpoint. Eur Heart J Cardiovasc Imaging. 2014;15:956–60.

  60. De Geer L, Oscarsson A, Engvall J. Variability in echocardiographic measurements of left ventricular function in septic shock patients. Cardiovasc Ultrasound. 2015;13:19.

  61. Rajamani A, Shetty K, Parmar J, Huang S, Ng J, Gunawan S, et al. Longitudinal competence programs for basic point-of-care ultrasound in critical care: a systematic review. Chest. 2020;158:1079–89.

  62. Yamamoto R, Clanton D, Willis RE, Jonas RB, Cestero RF. Rapid decay of transthoracic echocardiography skills at 1 month: a prospective observational study. J Surg Educ. 2018;75:503–9.

Acknowledgements

None

Funding

No funding received

Author information

Contributions

PJL and MGC conceived the initial study design; LEG, PJL, and MGC identified and screened studies for inclusion; LEG and GAW extracted the data; LEG, GAW, and MGC performed bias assessments; LEG, GAW, PJL, SMB, EAB, and MGC wrote and reviewed the manuscript. LEG and MGC take full responsibility for the submitted work. The authors read and approved the final manuscript.

Authors’ information

EAB is the program director for the Critical Care Fellowship in Anesthesiology and the associate director of the Surgical Intensive Care Unit at Massachusetts General Hospital. MGC is the associate program director for the Critical Care Fellowship in Anesthesiology at Massachusetts General Hospital.

Corresponding author

Correspondence to Lauren E. Gibson.

Ethics declarations

Ethics approval and consent to participate

Not applicable

Consent for publication

Not applicable

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Lauren E Gibson and Gabrielle A White-Dzuro are co-first authors.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Gibson, L.E., White-Dzuro, G.A., Lindsay, P.J. et al. Ensuring competency in focused cardiac ultrasound: a systematic review of training programs. J Intensive Care 8, 93 (2020). https://doi.org/10.1186/s40560-020-00503-x
