Page 47 - Simplicity is Key in CRT

Absolute intra-observer agreement for clinical judgement of LBBB was good (P=0.93 (0.05)), but relative agreement was only moderate (k=0.76 (0.14)) (Table 2). Agreement did not differ between patients with QRS duration below and above 150 ms (P=0.96 (0.09) and P=0.88 (0.16), respectively). Kappa coefficients led to similar conclusions.
Table 2. Intra- and inter-observer agreement in LBBB classification
Inter-observer agreement in LBBB classification
Absolute inter-observer agreement was good for all LBBB definitions (P range 0.81 – 0.88), whereas relative agreement (kappa) was minimal to weak (k range 0.19 – 0.44) (Table 2). Agreement for the AHA/ACC/HRS definition was reduced by variability in scoring notching/slurring of the R-wave in leads I, aVL, V5 and V6, the absence of a Q-wave in leads I, aVL, V5 and V6, and the R-peak time criteria (P=0.73 (0.06), P=0.75 (0.07) and P=0.71 (0.07), respectively).
The same trend was visible for clinical judgement of LBBB. Whereas absolute inter-observer agreement of clinical judgement was good (P=0.81 (0.08)), relative agreement was weak (k=0.35 (0.20)) (Table 2). QRS duration did not influence inter-observer variability of clinical judgement (P=0.73 (0.34) and P=0.82 (0.19) for QRS duration below and above 150 ms, respectively).
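The pattern above, where absolute agreement (P) is high while the chance-corrected kappa is low, follows directly from how the two statistics are defined. A minimal sketch of both computations on illustrative (made-up, not study) observer ratings shows how a skewed LBBB prevalence inflates expected chance agreement and depresses kappa:

```python
# Hedged sketch: absolute (proportion) vs relative (Cohen's kappa) agreement
# between two observers' binary LBBB calls. Ratings are illustrative only,
# not data from the study.

def proportion_agreement(a, b):
    """Fraction of cases where both observers give the same call (P)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(a)
    p_o = proportion_agreement(a, b)
    # Expected chance agreement p_e from each observer's marginal LBBB rate.
    pa, pb = sum(a) / n, sum(b) / n
    p_e = pa * pb + (1 - pa) * (1 - pb)
    return (p_o - p_e) / (1 - p_e)

# Both observers call LBBB in 9 of 10 cases, but disagree on which cases:
# absolute agreement is high, yet kappa is near (here, below) zero.
obs1 = [1, 1, 1, 1, 1, 1, 1, 1, 1, 0]
obs2 = [1, 1, 1, 1, 1, 1, 1, 1, 0, 1]
print(proportion_agreement(obs1, obs2))      # 0.8
print(round(cohens_kappa(obs1, obs2), 2))    # -0.11
```

With 90% LBBB prevalence for both observers, expected chance agreement is already 0.82, so even 80% observed agreement yields a negative kappa.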
Inter-definition agreement in LBBB classification
The highest inter-definition agreement was observed between the ESC and MADIT definitions (P=0.95 (0.07)), whereas the lowest agreement was seen between the AHA/ACC/HRS definition and the ESC, MADIT and Strauss criteria (P=0.40 (0.22), P=0.44 (0.23) and P=0.50 (0.23), respectively).
Correspondence of clinical judgement with available definitions
As shown in Figure 2, clinical judgement of the presence of LBBB correlated only modestly (phi coefficient range 0.10 – 0.68) with LBBB according to the available definitions. Clinical judgement correlated best with the Strauss definition (phi=0.52 (0.10)) and worst with the AHA/ACC/HRS definition (phi=0.30 (0.10)).
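The phi coefficient used here is the Pearson correlation applied to two binary classifications, computed from the 2x2 cross-table of clinical judgement versus a definition. A minimal sketch with illustrative (made-up) calls:

```python
import math

# Hedged sketch: phi coefficient between clinical judgement and a
# definition-based binary LBBB classification. Inputs are illustrative,
# not data from the study.

def phi_coefficient(a, b):
    """Phi for two binary 0/1 vectors, from the 2x2 contingency table."""
    n11 = sum(x == 1 and y == 1 for x, y in zip(a, b))
    n00 = sum(x == 0 and y == 0 for x, y in zip(a, b))
    n10 = sum(x == 1 and y == 0 for x, y in zip(a, b))
    n01 = sum(x == 0 and y == 1 for x, y in zip(a, b))
    num = n11 * n00 - n10 * n01
    den = math.sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return num / den

clinical   = [1, 1, 1, 0, 0, 1, 0, 0]
definition = [1, 1, 0, 0, 0, 1, 0, 1]
print(phi_coefficient(clinical, definition))  # 0.5
```

Like kappa, phi corrects for the marginal prevalence of LBBB calls, which is why it can be modest even when the raw overlap between clinical judgement and a definition looks substantial.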
| Criterion          | Prevalence | Intra-observer P | Intra-observer kappa | Inter-observer P | Inter-observer kappa |
|--------------------|------------|------------------|----------------------|------------------|----------------------|
| ESC                | 0.75±0.29  | 0.94±0.05        | 0.67±0.22            | 0.85±0.08        | 0.27±0.25            |
| AHA/ACC/HRS        | 0.20±0.27  | 0.87±0.08        | 0.47±0.28            | 0.81±0.09        | 0.19±0.25            |
| MADIT              | 0.71±0.31  | 0.95±0.04        | 0.74±0.19            | 0.88±0.07        | 0.44±0.22            |
| Strauss            | 0.65±0.32  | 0.92±0.06        | 0.65±0.22            | 0.85±0.08        | 0.40±0.22            |
| Clinical judgment  | 0.50±0.35  | 0.93±0.05        | 0.76±0.14            | 0.81±0.08        | 0.35±0.20            |