Evaluation of divided attention using different stimulation models in event-related potentials


Medical and Biological Engineering and Computing, vol.57, no.9, pp.2069-2079, 2019 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 57 Issue: 9
  • Publication Date: 2019
  • Doi Number: 10.1007/s11517-019-02013-x
  • Journal Name: Medical and Biological Engineering and Computing
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.2069-2079
  • Keywords: Auditory stimulus, Classification, Divided attention, Event related potential, Visual stimulus
  • Istanbul Medipol University Affiliated: No


Divided attention is defined as focusing on multiple tasks at once, and it is described as one of the biggest problems of today's society. Standard examinations for assessing attention are questionnaires or physiological signals, such as evoked potentials and electroencephalography. Physiological recordings were obtained using visual, auditory, and combined auditory-visual stimuli from 48 participants (university students aged 18-25) to identify differences between sustained and divided attention. A Fourier-based filter was used to isolate the 0.01-30 Hz frequency band. Fractal dimensions, entropy values, power spectral densities, and Hjorth parameters from the electroencephalography, together with P300 components from the evoked potentials, were calculated as features. To reduce the size of the feature set, features that contributed little discriminative detail were eliminated. The visual and auditory stimuli in selective attention were compared with the divided attention state, and the best accuracy, 88.89%, was obtained with a support vector machine using a linear kernel. The results show that divided attention can be harder to distinguish from selective attention, but that successful classification is achievable with appropriate methods. Unlike much of the literature, this study examines the underlying structure of attention types by working with a completely healthy group with high attention capacity.
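As a rough illustration of two processing steps the abstract names, the sketch below applies a simple Fourier-based band-pass filter (zeroing FFT bins outside the 0.01-30 Hz band used in the study) and computes the three Hjorth parameters. The sampling rate and the synthetic test signal are placeholders, not values from the paper, and this is only one plausible implementation of a "Fourier-based filter".

```python
import numpy as np

def fourier_bandpass(x, fs, lo=0.01, hi=30.0):
    """Zero out FFT bins outside [lo, hi] Hz (a simple Fourier-based
    band-pass filter, analogous to the 0.01-30 Hz band in the study)."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(x))

def hjorth_parameters(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    dx = np.diff(x)          # first derivative (discrete)
    ddx = np.diff(dx)        # second derivative (discrete)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / activity)
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

# Placeholder example: 4 s of synthetic "EEG" at an assumed 256 Hz rate
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

filtered = fourier_bandpass(signal, fs)
activity, mobility, complexity = hjorth_parameters(filtered)
```

The Hjorth values (and the other features named in the abstract) would then form the feature vectors fed to a linear-kernel support vector machine for the selective-versus-divided attention classification.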