Login is moving

Authentication for nemar.org is migrating from the legacy system to a new Cloudflare-backed identity service. Until that ships, sign in via the CLI:

npm install -g nemar-cli
nemar login
nm000142 NEMAR-native dataset

Ear-EEG motor execution dataset from Wu et al. (2020)

This dataset comprises ear-EEG recordings from 6 healthy participants performing motor execution tasks (fist clenching) with simultaneous scalp and ear-based EEG acquisition. The study employed a two-class motor imagery paradigm with visual and auditory cues, recording 1,114 trials across left-hand and right-hand movement conditions at a 1000 Hz sampling rate using 122 scalp channels. The dataset serves as a benchmark for investigating in-ear EEG sensing capabilities for motor task classification in brain-computer interface applications.

EEG

Compute on this dataset

Two routes are available today, with a third (in-browser one-click submission) landing soon.

  1. NeuroScience Gateway (NSG) portal.

    NSG runs EEGLAB / Brainstorm / MNE pipelines on supercomputing time donated by SDSC. Create an account, point a job at this dataset's S3 prefix (s3://nemar/nm000142), and submit.
    nsgportal.org

  2. Local processing with nemar-cli.

    Pull the dataset to your machine and run any toolbox locally. The clone honors the dataset's published version pinning.

    npm install -g nemar-cli
    nemar dataset clone nm000142
    cd nm000142 && nemar dataset get
  3. Just the files.

    rclone, aria2c, or any HTTPS client works against data.nemar.org/nm000142/ — the manifest carries presigned S3 URLs.
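The plain-HTTPS route above can be scripted in any language. A minimal Python sketch, assuming public GET access on data.nemar.org (the `participants.tsv` path is a standard BIDS file name used here for illustration, not a confirmed entry in this dataset's manifest):

```python
# Build per-file HTTPS URLs under the dataset prefix and download them.
# Assumption: data.nemar.org serves files directly under /nm000142/.
from urllib.parse import urljoin
from urllib.request import urlretrieve

BASE = "https://data.nemar.org/nm000142/"

def file_url(relative_path: str) -> str:
    """Return the HTTPS URL for one file in the dataset tree."""
    return urljoin(BASE, relative_path)

def fetch(relative_path: str, dest: str) -> None:
    """Download a single file (network call; not invoked in this sketch)."""
    urlretrieve(file_url(relative_path), dest)

print(file_url("participants.tsv"))
# → https://data.nemar.org/nm000142/participants.tsv
```

For bulk pulls, rclone's on-the-fly HTTP remote (`rclone copy :http: ./nm000142 --http-url https://data.nemar.org/nm000142/`) avoids writing any code at all.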

Direct compute access is coming soon. One-click NSG submission from this page is scoped for a follow-up phase. Tracked on nemarOrg/website#6.

DOI: https://doi.org/10.82901/nemar.nm000142

Dataset Overview

  • Code: Wu2020
  • Paradigm: imagery
  • DOI: 10.1088/1741-2552/abc1b6
  • Subjects: 6
  • Sessions per subject: 1
  • Events: lefthand=1, righthand=2
  • Trial interval: [0, 4] s
  • File format: Curry

Acquisition

  • Sampling rate: 1000.0 Hz
  • Number of channels: 122
  • Channel types: eeg=122, misc=10
  • Montage: standard_1005
  • Hardware: Neuroscan SynAmps2
  • Reference: scalp REF
  • Ground: scalp GRD
  • Sensor type: Ag/AgCl
  • Line frequency: 50.0 Hz
  • Online filters: bandpass 0.5–100 Hz

Participants

  • Number of subjects: 6
  • Health status: healthy
  • Age (years): mean=25.0, min=22.0, max=28.0
  • Gender distribution: female=4, male=2
  • Handedness: right-handed
  • Species: human

Experimental Protocol

  • Paradigm: imagery
  • Number of classes: 2
  • Class labels: lefthand, righthand
  • Trial duration: 4.0 s
  • Study design: Motor execution (fist clenching) with simultaneous scalp and ear-EEG recording
  • Feedback type: none
  • Stimulus type: arrow cues
  • Stimulus modalities: visual, auditory
  • Primary modality: visual
  • Synchronicity: synchronous
  • Mode: offline
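The protocol and acquisition metadata above fully determine the epoching parameters for analysis. A small sketch (variable names are illustrative, not a NEMAR or MOABB API):

```python
# Derive epoch length in samples from the listed trial interval and
# sampling rate; map the event codes from the Dataset Overview.
SFREQ = 1000.0                 # Hz, from Acquisition
TRIAL_INTERVAL = (0.0, 4.0)    # seconds relative to cue onset
EVENT_ID = {"lefthand": 1, "righthand": 2}

n_samples = int((TRIAL_INTERVAL[1] - TRIAL_INTERVAL[0]) * SFREQ)
print(n_samples)  # 4000 samples per 4 s trial at 1000 Hz
```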

HED Event Annotations

Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser

  left_hand
    ├─ Sensory-event, Experimental-stimulus, Visual-presentation
    └─ Agent-action
       └─ Imagine
          ├─ Move
          └─ Left, Hand

  right_hand
    ├─ Sensory-event, Experimental-stimulus, Visual-presentation
    └─ Agent-action
       └─ Imagine
          ├─ Move
          └─ Right, Hand

Paradigm-Specific Parameters

  • Detected paradigm: motor_imagery
  • Imagery tasks: lefthand, righthand

Data Structure

  • Trials: 1114
  • Trials per subject: S1=240, S2=160, S3=160, S4=80, S5=234, S6=240 (total 1114)
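The per-subject counts above can be checked against the stated total:

```python
# Sanity-check that the per-subject trial counts sum to the listed total.
TRIALS_PER_SUBJECT = {"S1": 240, "S2": 160, "S3": 160, "S4": 80, "S5": 234, "S6": 240}
total = sum(TRIALS_PER_SUBJECT.values())
print(total)  # 1114
```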

Signal Processing

  • Classifiers: EEGNet

Cross-Validation

  • Evaluation type: within_subject

BCI Application

  • Applications: motor_control
  • Environment: laboratory
  • Online feedback: False

Tags

  • Pathology: Healthy
  • Modality: Motor
  • Type: Research

Documentation

  • DOI: 10.1088/1741-2552/abc1b6
  • License: CC-BY-4.0
  • Investigators: Xiaoli Wu, Wenhui Zhang, Zhibo Fu, Roy T.H. Cheung, Rosa H.M. Chan
  • Institution: City University of Hong Kong
  • Country: HK
  • Repository: Zenodo
  • Data URL: https://zenodo.org/records/18961128
  • Publication year: 2020

References

Wu, X., Zhang, W., Fu, Z., Cheung, R. T. H., & Chan, R. H. M. (2020). An investigation of in-ear sensing for motor task classification. Journal of Neural Engineering, 17(6), 066029. https://doi.org/10.1088/1741-2552/abc1b6

Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A., & Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4(44), 1896. https://doi.org/10.21105/joss.01896

Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8


Generated by MOABB 1.4.3 (Mother of All BCI Benchmarks) https://github.com/NeuroTechX/moabb

Files

12 top-level entries · 4.93 GB total