nm000128 NEMAR-native dataset

Dong2023 – 59-subject 40-class SSVEP dataset

A 59-subject EEG dataset comprising 8-channel recordings from healthy adolescents (aged 10-16 years) performing a 40-target steady-state visually evoked potential (SSVEP) brain-computer interface task using joint frequency and phase modulation stimuli. The dataset contains 160 trials per subject recorded at 250 Hz with occipital electrode montage, designed for benchmarking SSVEP-based BCI classification algorithms.

EEG

Compute on this dataset

Two routes today, with a third (in-browser one-click submission) landing soon.

  1. NeuroScience Gateway (NSG) portal.

    NSG runs EEGLAB / Brainstorm / MNE pipelines on supercomputing time donated by SDSC. Create an account, point a job at this dataset's S3 prefix (s3://nemar/nm000128), and submit.
    nsgportal.org →

  2. Local processing with nemar-cli.

    Pull the dataset to your machine and run any toolbox locally. The clone honors the dataset's published version pin.

    npm install -g nemar-cli
    nemar dataset clone nm000128
    cd nm000128 && nemar dataset get
  3. Just the files.

    rclone, aria2c, or any HTTPS client works against data.nemar.org/nm000128/ — the manifest carries presigned S3 URLs.

Direct compute access is coming soon. One-click NSG submission from this page is scoped for a follow-up phase. Tracked on nemarOrg/website#6.

DOI: https://doi.org/10.82901/nemar.nm000128


Dataset Overview

  • Code: Dong2023
  • Paradigm: ssvep
  • DOI: 10.26599/BSA.2023.9050020
  • Subjects: 59
  • Sessions per subject: 1
  • Events: stimulus frequency (Hz) → event code, in 0.2 Hz steps: 8 → 1, 8.2 → 2, 8.4 → 3, …, 15.8 → 40
  • Trial interval: [0.5, 4.5] s
  • File format: MAT
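
The event codes map one-to-one onto the stimulus frequencies: code k corresponds to 8 + 0.2·(k − 1) Hz, and with the [0.5, 4.5] s trial interval each epoch spans 4 s of data. A small sketch (plain Python, values taken directly from the metadata above) reconstructs both:

```python
# Frequencies run 8.0-15.8 Hz in 0.2 Hz steps; event codes run 1-40.
freqs = [round(8.0 + 0.2 * k, 1) for k in range(40)]
event_id = {f: k + 1 for k, f in enumerate(freqs)}

# Samples per trial: the 4 s window [0.5, 4.5] s at 250 Hz gives 1000 samples.
sfreq = 250.0
tmin, tmax = 0.5, 4.5
n_samples = int((tmax - tmin) * sfreq)
```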

Acquisition

  • Sampling rate: 250.0 Hz
  • Number of channels: 8
  • Channel types: eeg=8
  • Channel names: POz, PO3, PO4, PO7, PO8, Oz, O1, O2
  • Montage: standard_1005
  • Hardware: NeuSenW (Neuracle)
  • Reference: Fp1
  • Ground: Fp2
  • Sensor type: semi-dry (pre-gelled)
  • Line frequency: 50.0 Hz
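
With 50 Hz mains and a 250 Hz sampling rate, the line component sits well below the Nyquist frequency (125 Hz) and close to the upper stimulation harmonics, so it is commonly removed with a narrow notch filter. A minimal SciPy sketch on synthetic data (not part of the NEMAR tooling; the filter Q is an arbitrary illustrative choice):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 250.0   # sampling rate (Hz), from the acquisition metadata
f0 = 50.0    # mains line frequency (Hz)

# Narrow IIR notch at 50 Hz; Q = 30 is an arbitrary choice.
b, a = iirnotch(f0, Q=30.0, fs=fs)

# Synthetic single-channel trace: a 12 Hz SSVEP-like component plus line noise.
t = np.arange(0, 4.0, 1.0 / fs)
x = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.sin(2 * np.pi * f0 * t)
y = filtfilt(b, a, x)  # zero-phase filtering

spec = np.abs(np.fft.rfft(y))
faxis = np.fft.rfftfreq(len(y), 1.0 / fs)
# The 50 Hz bin is strongly attenuated while the 12 Hz bin is preserved.
```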

Participants

  • Number of subjects: 59
  • Health status: healthy
  • Age: mean=12.4, min=10, max=16
  • Gender distribution: male=37, female=22

Experimental Protocol

  • Paradigm: ssvep
  • Task type: SSVEP speller
  • Number of classes: 40
  • Class labels: stimulus frequencies from 8.0 to 15.8 Hz in 0.2 Hz steps (40 classes)
  • Trial duration: 4.0 s
  • Feedback type: visual
  • Stimulus type: JFPM visual flicker
  • Stimulus modalities: visual
  • Primary modality: visual
  • Synchronicity: synchronous
  • Mode: offline
  • Predefined training/test split: none

HED Event Annotations

Schema: HED 8.4.0 | Browse: https://www.hedtags.org/hed-schema-browser

  Each of the 40 event types carries the same four HED tags and differs only
  in its Label, where the decimal point in the stimulus frequency is replaced
  by an underscore (8.2 → Label/8_2, …, 15.8 → Label/15_8). For example:

  8
    ├─ Sensory-event
    ├─ Experimental-stimulus
    ├─ Visual-presentation
    └─ Label/8

  8.2
    ├─ Sensory-event
    ├─ Experimental-stimulus
    ├─ Visual-presentation
    └─ Label/8_2

Paradigm-Specific Parameters

  • Detected paradigm: ssvep
  • Stimulus frequencies: [8.0, 8.2, 8.4, 8.6, 8.8, 9.0, 9.2, 9.4, 9.6, 9.8, 10.0, 10.2, 10.4, 10.6, 10.8, 11.0, 11.2, 11.4, 11.6, 11.8, 12.0, 12.2, 12.4, 12.6, 12.8, 13.0, 13.2, 13.4, 13.6, 13.8, 14.0, 14.2, 14.4, 14.6, 14.8, 15.0, 15.2, 15.4, 15.6, 15.8] Hz
  • Frequency resolution: 0.2 Hz

Data Structure

  • Trials: 160
  • Blocks per session: 4

Preprocessing

  • Data state: epoched
  • Downsampled to: 250.0 Hz

Signal Processing

  • Classifiers: FBCCA, eTRCA, msTRCA
  • Spatial filters: CCA, TRCA
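
The standard CCA decoder correlates each multichannel epoch against sine/cosine reference templates at every candidate frequency and picks the frequency with the highest canonical correlation. A minimal NumPy sketch of that technique on synthetic data (a generic illustration, not the benchmark code used for this dataset):

```python
import numpy as np

def canon_corr(X, Y):
    """First canonical correlation between X (n, p) and Y (n, q)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_reference(freq, fs, n_samples, n_harmonics=2):
    """Sine/cosine templates at freq and its harmonics, shape (n_samples, 2*n_harmonics)."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols += [np.sin(2 * np.pi * h * freq * t), np.cos(2 * np.pi * h * freq * t)]
    return np.column_stack(cols)

def cca_classify(epoch, freqs, fs):
    """epoch: (n_samples, n_channels); returns the best-matching frequency."""
    scores = [canon_corr(epoch, ssvep_reference(f, fs, len(epoch))) for f in freqs]
    return freqs[int(np.argmax(scores))]

# Synthetic check: an 8-channel, 4 s epoch dominated by a 12 Hz component.
rng = np.random.default_rng(0)
fs, n = 250.0, 1000
t = np.arange(n) / fs
epoch = np.outer(np.sin(2 * np.pi * 12.0 * t), rng.uniform(0.5, 1.0, 8))
epoch += 0.3 * rng.standard_normal((n, 8))
freqs = [round(8.0 + 0.2 * k, 1) for k in range(40)]
best = cca_classify(epoch, freqs, fs)  # expected to recover 12.0 on this synthetic epoch
```

FBCCA and (ms)eTRCA extend this idea with filter-bank sub-bands and subject-specific templates, respectively.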

Cross-Validation

  • Method: leave-one-block-out
  • Folds: 4
  • Evaluation type: within_subject
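
Leave-one-block-out here means four folds per subject: each fold trains on three blocks (120 trials) and tests on the held-out block (40 trials). A sketch of the fold indices in plain NumPy (block labels are assumed contiguous and ordered, which is an illustration rather than the dataset's actual trial order):

```python
import numpy as np

n_trials, n_blocks = 160, 4
blocks = np.repeat(np.arange(n_blocks), n_trials // n_blocks)  # block label per trial

folds = []
for held_out in range(n_blocks):
    test_idx = np.flatnonzero(blocks == held_out)   # 40 trials from one block
    train_idx = np.flatnonzero(blocks != held_out)  # 120 trials from the other three
    folds.append((train_idx, test_idx))
```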

BCI Application

  • Environment: non-shielded
  • Online feedback: True

Tags

  • Pathology: healthy
  • Modality: visual
  • Type: perception

Documentation

  • DOI: 10.26599/BSA.2023.9050020
  • License: CC BY-NC 4.0
  • Investigators: Yue Dong, Sen Tian
  • Senior author: Yue Dong
  • Institution: Jiangsu JITRI Brain Machine Fusion Intelligence Institute
  • Country: CN
  • Repository: Zenodo
  • Data URL: https://zenodo.org/records/18847318
  • Publication year: 2023

References

Y. Dong and S. Tian, "A large database towards user-friendly SSVEP-based BCI," Brain Science Advances, vol. 9, no. 4, pp. 297-309, 2023. https://doi.org/10.26599/BSA.2023.9050020

Appelhoff, S., Sanderson, M., Brooks, T., Vliet, M., Quentin, R., Holdgraf, C., Chaumon, M., Mikulan, E., Tavabi, K., Hochenberger, R., Welke, D., Brunner, C., Rockhill, A., Larson, E., Gramfort, A., & Jas, M. (2019). MNE-BIDS: Organizing electrophysiological data into the BIDS format and facilitating their analysis. Journal of Open Source Software, 4, 1896. https://doi.org/10.21105/joss.01896

Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6, 103. https://doi.org/10.1038/s41597-019-0104-8


Generated by MOABB 1.4.3 (Mother of All BCI Benchmarks) https://github.com/NeuroTechX/moabb

Files

66 top-level entries · 397 MB total