on005628 NEMAR · OpenNeuro mirror

Dataset of Visual and Audiovisual Stimuli in Virtual Reality from the Edzna Archaeological Site

Imported from OpenNeuro ds005628

Compute on this dataset

Two routes are available today, with a third (in-browser one-click submission) landing soon.

  1. Neuroscience Gateway (NSG) portal.

    NSG runs EEGLAB / Brainstorm / MNE pipelines on supercomputing time donated by SDSC. Create an account, point a job at this dataset's S3 prefix (s3://nemar/on005628), and submit.
    nsgportal.org

  2. Local processing with nemar-cli.

    Pull the dataset to your machine and run any toolbox locally. Cloning honors the published version pinning.

    npm install -g nemar-cli
    nemar dataset clone on005628
    cd on005628 && nemar dataset get

  3. Just the files.

    rclone, aria2c, or any HTTPS client works against data.nemar.org/on005628/ — the manifest carries presigned S3 URLs.
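The plain-HTTPS route can be sketched as below. The base URL is the one stated above; the exact file paths come from the manifest, which this sketch does not parse, so the actual transfer commands are shown but left commented:

```shell
# Base URL stated on this page; the manifest under it carries presigned S3 URLs.
BASE_URL="https://data.nemar.org/on005628"

# Mirror the whole prefix with rclone's on-the-fly HTTP backend (uncomment to run):
#   rclone copy --http-url "$BASE_URL" :http: ./on005628

# Or hand aria2c a list of URLs taken from the manifest, one per line:
#   aria2c -x 4 -i urls.txt -d ./on005628

echo "Fetching from: $BASE_URL/"
```

Any HTTPS client works the same way; only the URL-list step differs per tool.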

Direct compute access is coming soon. One-click NSG submission from this page is scoped for a follow-up phase. Tracked on nemarOrg/website#6.

NEMAR-curated copy of OpenNeuro ds005628

This is the NEMAR-curated copy. We pull from OpenNeuro periodically; each pull is a major version bump (vN.0.0). Versions between pulls (vN.x.y) are NEMAR-side fixes and improvements.
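The pull-versus-fix convention above can be checked mechanically. A minimal sketch (the `classify_version` helper is illustrative, not part of any NEMAR tooling):

```python
import re

def classify_version(tag: str) -> str:
    """Classify a NEMAR dataset version tag per the scheme described above:
    vN.0.0 marks a fresh pull from OpenNeuro; vN.x.y with a nonzero x or y
    marks a NEMAR-side fix or improvement between pulls."""
    m = re.fullmatch(r"v(\d+)\.(\d+)\.(\d+)", tag)
    if not m:
        raise ValueError(f"not a vN.x.y tag: {tag!r}")
    major, minor, patch = map(int, m.groups())
    return "openneuro-pull" if minor == 0 and patch == 0 else "nemar-fix"

print(classify_version("v3.0.0"))  # openneuro-pull
print(classify_version("v3.1.2"))  # nemar-fix
```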

NEMAR DOI: https://doi.org/10.82901/nemar.on005628
OpenNeuro DOI: unknown

README

  • Authors
  • Juan Pablo Rosado-Aíza, Fernando José Domínguez-Morales, Tania Yareni Pech-Canul, Paola Guadalupe Vázquez-Rodríguez, Gustavo Navas-Reascos, Luz María Alonso-Valerdi, David I. Ibarra Zarate

Overview

  • Project name
  • Dataset of Visual and Audiovisual Stimuli in Virtual Reality from the Edzna Archaeological Site

  • Year that the project ran
  • 2024

  • Brief overview

The purpose of this dataset is to analyze user experience in a virtual reality (VR) environment, focusing on a comparative study between visual and audiovisual stimuli based on the archaeological site of Edzna, Mexico. The immersive experience allowed participants to explore the site without needing to be physically present, and the experiment was conducted in a museum setting, offering a unique experience that goes beyond traditional visual-only exhibits. The dataset includes both electroencephalography (EEG) recordings from eight channels (Fz, C3, Cz, C4, Pz, PO7, Oz, and PO8) and user responses to the User Experience Questionnaire (UEQ), providing the data needed for future studies on how immersive environments affect user perception.

The EEG data were collected using a Unicorn Hybrid Black EEG system at a sampling rate of 250 Hz. Participants were exposed to two conditions: a visual-only stimulus and an audiovisual stimulus, both representing scenes from the archaeological site in VR. Prior to exposure, a baseline measurement was taken to capture the participants' initial state. Data collection took place at MOSTLA, a digital innovation lab on the Tecnologico de Monterrey campus, and at the Museum of Contemporary Art (MARCO) in Monterrey.

Each EEG recording is shared in .set format and follows the BIDS structure. The recordings include eight channels of brainwave data for the baseline, visual, and audiovisual conditions. The signals are provided in both raw and preprocessed form. Additionally, an .xlsx file provides basic participant metadata, such as age, gender, and unique identifier, as well as the UEQ responses.

Each EEG file contains data segmented into the three phases of the experiment: baseline, visual stimulus, and audiovisual stimulus, allowing researchers to directly compare neural responses across conditions.

This dataset offers a comprehensive resource for researchers investigating the effects of immersive VR environments on user engagement and attention.

  • Description of the contents of the dataset
  • sub-N: raw data. sub-Np: preprocessed data.

Example: sub-1 holds the raw data of subject 1; sub-1p holds the preprocessed data of subject 1.
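The naming convention above can be expressed as a one-line helper; the `subject_dirs` function is illustrative, not shipped with the dataset:

```python
def subject_dirs(n: int) -> tuple:
    """Return (raw, preprocessed) top-level directory names for subject n,
    following the sub-N / sub-Np convention described above."""
    return f"sub-{n}", f"sub-{n}p"

print(subject_dirs(1))  # ('sub-1', 'sub-1p')
```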

Subjects

Data were collected from a total of 51 participants.

Apparatus

  • Unicorn Hybrid Black EEG system
  • VR headset
  • Headphones

Experimental location

MOSTLA, a digital innovation lab at Tecnologico de Monterrey, located at Av. Eugenio Garza Sada 2501 Sur, Tecnologico, 64849 Monterrey, N.L., Mexico; and MARCO, a contemporary art museum located at Zuazua y Jardón, Centro, 64000 Monterrey, N.L., Mexico.

Notes

All metadata, including the UEQ answers, can be obtained from the file metadata.xlsx.

The videos presented to the participants are available at:

Audiovisual video: https://youtu.be/FBWbtSFwVuo

Visual video: https://youtu.be/aLzzl0ygBnc

Files

109 top-level entries · 634 MB total