Bedside sonography, usually performed by non-radiologists, is increasingly employed in emergency departments and inpatient settings to expedite care. This randomized controlled trial suggests it may not be ready for prime time in pediatric blunt abdominal trauma.
Source: Holmes JF, Kelley KM, Wootton-Gorges SL, et al. Effect of abdominal ultrasound on clinical care, outcomes, and resource use among children with blunt torso trauma: A randomized clinical trial. JAMA. 2017; 317(22):2290-2296; doi:10.1001/jama.2017.6322. See AAP Grand Rounds commentary by Dr. Michelle Stevenson (subscription required).
A single emergency department site randomized over 900 hemodynamically stable children under 18 years of age to receive either a bedside abdominal sonogram (Focused Assessment with Sonography for Trauma, FAST) or standardized trauma evaluation without FAST. The inclusion/exclusion criteria were designed to select a group at low risk (5%) of intraabdominal injury. The investigators found no effect of FAST on the rate of abdominal CT use, missed injuries, emergency department length of stay, or hospital charges. In other words, at least in this 1 clinical setting in a low-risk population, FAST didn't help.
The study itself was well-designed, and I wanted to highlight a few features. First, the randomization was carried out in blocks of 20, an example of block (restricted) randomization. (Note that this is distinct from cluster randomization, in which whole groups such as clinics are randomized as units.) Blocking is particularly important for illnesses affected by seasonality or other clustering of circumstances, because if by bad luck a long string of subjects were randomized to the same intervention at a time when an unusual situation was present, it could skew the results. For example, trauma can be seasonal, with more episodes in summer or during holidays, or can be affected by other seasonal trends, such as a busy influenza season in which emergency physicians may be busier than usual and thus spend less time interpreting a bedside sonogram. Breaking the randomization sequence into balanced blocks lessens the likelihood that a long run of assignments to the same intervention will occur.
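To make the mechanics concrete, here is a minimal sketch of how a blocked randomization schedule of this kind could be generated. The function name, the two-arm labels, and the even split within each block are illustrative assumptions, not details taken from the study protocol.

```python
import random

def block_randomize(n_subjects, block_size=20, arms=("FAST", "control")):
    """Generate a blocked randomization schedule: within each block,
    each arm appears an equal number of times, in shuffled order,
    so no long run of same-arm assignments can accumulate."""
    per_arm = block_size // len(arms)  # e.g. 10 per arm in a block of 20
    assignments = []
    while len(assignments) < n_subjects:
        block = list(arms) * per_arm   # balanced block: 10 FAST + 10 control
        random.shuffle(block)          # random order within the block
        assignments.extend(block)
    return assignments[:n_subjects]

schedule = block_randomize(925)
```

Every complete block of 20 in the resulting schedule contains exactly 10 assignments to each arm, which is precisely the guard against seasonal clustering described above.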
A second key feature of this study is the authors' attempt to determine the accuracy of FAST interpretations by the treating physicians, all of whom had been certified in FAST interpretation by a standardized protocol. For the current study, all FAST exams were reread by 1 of 2 "experienced" emergency department sonographers who were blinded to the treating physicians' original interpretations. The researchers measured the degree of agreement with the kappa statistic. Statisticians argue over how to judge a particular value of kappa, but by commonly cited benchmarks the value of 0.45 in this study represents only moderate agreement. This relatively low inter-rater reliability suggests that a major determinant of the study's negative findings could be that emergency physicians need better training in FAST exam interpretation. Check out the accompanying editorial for more on this.
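For readers unfamiliar with kappa: it corrects the observed agreement between two raters for the agreement expected by chance alone, so a kappa of 0 means no better than chance and 1 means perfect agreement. A minimal sketch of the calculation (the 2x2 counts below are hypothetical, not data from this study):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table, where
    table[i][j] = number of exams rater A scored as category i
    and rater B scored as category j."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of exams on the diagonal
    observed = sum(table[i][i] for i in range(len(table))) / n
    # Chance agreement: product of each rater's marginal proportions
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    expected = sum(r * c for r, c in zip(row_tot, col_tot)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical: 80 exams both read negative, 45 both positive, 25 discordant
kappa = cohens_kappa([[80, 10], [15, 45]])
```

The point of the chance correction is that two raters reviewing mostly negative exams will agree often just by guessing "negative," so raw percent agreement overstates reliability; kappa discounts exactly that.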