Automatic Classification of Screen Gaze and Dialogue in Doctor-Patient-Computer Interactions: Computational Ethnography Algorithm Development and Validation (Preprint)
Preprint
DOI: 10.2196/preprints.25218
Publication Date: 2021-05-10
AUTHORS (5)
ABSTRACT
BACKGROUND: The study of doctor-patient-computer interactions is a key research area for examining doctor-patient relationships; however, studying these interactions is costly and obtrusive, as researchers usually set up complex recording mechanisms or intrude on consultations to collect the data and then analyze it manually.
OBJECTIVE: We aimed to facilitate the study of human-computer and human-human interactions in clinics by providing a computational ethnography tool: an unobtrusive automatic classifier of screen gaze and dialogue combinations in doctor-patient-computer interactions.
METHODS: The classifier's input is video of the doctor taken with the computer's internal camera and microphone. By estimating key points on the doctor's face and the presence of voice activity, we estimate the type of interaction taking place. The classification output for each segment is 1 of 4 classes: (1) screen gaze and dialogue, wherein the doctor is gazing at the computer while conversing with the patient; (2) dialogue, wherein the doctor is gazing away from the computer while conversing with the patient; (3) screen gaze, wherein the doctor is gazing at the computer without dialogue; and (4) other, wherein no screen gaze or dialogue is detected. We evaluated the classifier on 30 minutes of video provided by 5 participants simulating consultations in both semi-inclusive and fully inclusive layouts.
RESULTS: The classifier achieved an overall accuracy of 0.83, a performance similar to that of a human coder. Like the human coder, it was more accurate for fully inclusive layouts than for semi-inclusive layouts.
CONCLUSIONS: The proposed classifier can be used by researchers, care providers, designers, medical educators, and others interested in exploring and answering questions related to doctor-patient-computer interactions.
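The abstract describes combining two per-segment detections (screen gaze and voice activity) into 4 classes. The sketch below is only an illustration of that final combination step, not the authors' implementation; the class names, function names, and the assumption that upstream gaze estimation and voice activity detection have already produced boolean flags are ours.

```python
from enum import Enum


class SegmentClass(Enum):
    SCREEN_GAZE_AND_DIALOGUE = 1  # doctor gazes at the screen while conversing
    DIALOGUE_GAZE_AWAY = 2        # dialogue while the doctor looks away from the screen
    SCREEN_GAZE_ONLY = 3          # screen gaze without dialogue
    OTHER = 4                     # neither screen gaze nor dialogue detected


def classify_segment(screen_gaze: bool, voice_activity: bool) -> SegmentClass:
    """Map two per-segment detections to one of the four classes from the abstract.

    Assumes upstream detectors (gaze estimation from the webcam video and
    voice activity detection from the microphone audio) have already been run
    and reduced to boolean flags for the segment.
    """
    if screen_gaze and voice_activity:
        return SegmentClass.SCREEN_GAZE_AND_DIALOGUE
    if voice_activity:
        return SegmentClass.DIALOGUE_GAZE_AWAY
    if screen_gaze:
        return SegmentClass.SCREEN_GAZE_ONLY
    return SegmentClass.OTHER


# Example: a segment where the doctor looks at the screen while speaking
print(classify_segment(screen_gaze=True, voice_activity=True))
```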