Evaluation of intra- and interobserver reliability of the AO classification for wrist fractures

ABSTRACT Objective: This study evaluated the intraobserver and interobserver reliability of the AO classification for standard radiographs of wrist fractures. Methods: Thirty observers, divided into three groups (orthopedic surgery senior residents, orthopedic surgeons, and hand surgeons), classified 52 wrist fractures using only plain radiographs. After four weeks, the same observers reevaluated the initial 52 radiographs in randomized order. Interobserver, intergroup, and intraobserver agreement was measured with the Kappa index, and Kappa values were interpreted as proposed by Landis and Koch. Results: The global interobserver agreement level of the AO classification was fair (0.30). All three groups showed fair global interobserver agreement (residents, 0.27; orthopedic surgeons, 0.30; hand surgeons, 0.33). The global intraobserver agreement level was moderate. The hand surgeon group achieved the highest intraobserver agreement, although only moderate (0.50). The resident group obtained fair levels (0.30), as did the orthopedic surgeon group (0.33). Conclusion: These data suggest fair interobserver agreement and moderate intraobserver agreement for the AO classification of wrist fractures.
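The agreement levels quoted in the abstract follow the Landis and Koch interpretation of the Kappa statistic. A minimal sketch of how Cohen's kappa for two raters and the Landis–Koch labels can be computed (the function names and example ratings are hypothetical, not from the study's data):

```python
def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same set of cases."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    labels = sorted(set(ratings_a) | set(ratings_b))
    # Observed agreement: fraction of cases both raters classified identically.
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies.
    p_expected = sum(
        (ratings_a.count(l) / n) * (ratings_b.count(l) / n) for l in labels
    )
    return (p_observed - p_expected) / (1 - p_expected)

def landis_koch(kappa):
    """Landis & Koch (1977) qualitative interpretation of a kappa value."""
    if kappa < 0:
        return "poor"
    for upper, label in [(0.20, "slight"), (0.40, "fair"),
                         (0.60, "moderate"), (0.80, "substantial"),
                         (1.00, "almost perfect")]:
        if kappa <= upper:
            return label

# Illustrative ratings for six cases (hypothetical, not study data):
rater_1 = ["A", "A", "B", "B", "C", "C"]
rater_2 = ["A", "A", "B", "C", "C", "C"]
k = cohen_kappa(rater_1, rater_2)  # 0.75
print(k, landis_koch(k))
```

On this scale, the study's interobserver kappa of 0.30 falls in the "fair" band (0.21–0.40) and the hand surgeons' intraobserver kappa of 0.50 in the "moderate" band (0.41–0.60), matching the wording of the results.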

Bibliographic Details
Main Authors: Tenório,Pedro Henrique de Magalhães, Vieira,Marcelo Marques, Alberti,Abner, Abreu,Marcos Felipe Marcatto de, Nakamoto,João Carlos, Cliquet Júnior,Alberto
Format: Digital journal
Language:English
Published: Sociedade Brasileira de Ortopedia e Traumatologia 2018
Online Access:http://old.scielo.br/scielo.php?script=sci_arttext&pid=S0102-36162018000600703