THINGS-EEG: Human electroencephalography recordings from 50 subjects for 22,248 images from 1,854 object concepts
  • Description

    The neural basis of object recognition and semantic knowledge has been the focus of a large body of research, but given the high dimensionality of object space, it is challenging to develop an overarching theory of how the brain organises object knowledge. To help understand how the brain allows us to recognise, categorise, and represent objects and object categories, there is growing interest in using large-scale image databases for neuroimaging experiments. Traditional image databases are based on manually selected object concepts, often with a single image per concept. In contrast, ‘big data’ stimulus sets typically consist of images that can vary significantly in quality and may be biased in content. To address this issue, recent work developed THINGS: a large stimulus set of 1,854 object concepts and 26,107 associated images (https://things-initiative.org/). In the current paper, we present THINGS-EEG, a dataset containing human electroencephalography responses from 50 subjects to all concepts and 22,248 images in the THINGS stimulus set. The THINGS-EEG dataset provides neuroimaging recordings for a systematic collection of objects and concepts and can therefore support a wide array of research into visual object processing in the human brain. This repository contains the code that was used to perform the analyses described in this paper: Grootswagers, T., Zhou, I., Robinson, A.K. et al. Human EEG recordings for 1,854 concepts presented in rapid serial visual presentation streams. Sci Data 9, 3 (2022). https://doi.org/10.1038/s41597-021-01102-7
    • THINGS images and concept descriptions obtained from: https://osf.io/jum2f (see also: https://things-initiative.org/)
    • The raw data, preprocessed data, and grand-average RDMs are publicly available on OpenNeuro: https://openneuro.org/datasets/ds003825
    • RDMs for single subjects are publicly available on figshare: https://doi.org/10.6084/m9.figshare.14721282 (note: OSF sometimes incorrectly lists this as private)

    See the README in the code folder for instructions on how to reproduce the figures in the paper.
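    As an illustration of the representational dissimilarity matrices (RDMs) released with the dataset, the sketch below builds a small RDM from synthetic EEG-like patterns using correlation distance. This is a generic illustration of what an RDM is (a concepts × concepts matrix of pattern dissimilarities), not the pipeline from the paper, which derives RDMs from pairwise decoding; all sizes and data here are made up.

    ```python
    # Illustrative sketch only: build an RDM from synthetic "EEG" patterns
    # using correlation distance (1 - Pearson correlation). The real
    # THINGS-EEG RDMs are 1,854 x 1,854 and come from pairwise decoding;
    # this toy example just shows the structure of such a matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    n_concepts, n_channels = 10, 64        # tiny toy sizes, not the real 1,854
    patterns = rng.standard_normal((n_concepts, n_channels))

    # Correlation distance between every pair of concept patterns.
    rdm = 1.0 - np.corrcoef(patterns)

    print(rdm.shape)                       # one row/column per concept
    print(np.allclose(rdm, rdm.T))         # RDMs are symmetric
    print(np.allclose(np.diag(rdm), 0.0))  # zero self-dissimilarity
    ```

    Each off-diagonal entry quantifies how dissimilar the neural patterns for two concepts are, which is what enables comparisons with models or other modalities.
    
    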


    • Data type dataset
    • Keywords
      • object recognition
    • Funding source
    • Grant number(s)
      • -
    • FoR codes
    • SEO codes
      Temporal (time) coverage
    • Start date
    • End date
    • Time period
       
      Spatial (location, mapping) coverage
    • Locations
      Data Locations

      Type: URL
      Location: https://osf.io/hd6zk/
      Notes:
      The Data Manager is: Tijl Grootswagers
      Access conditions Open
    • Related publications
    • Related website
        Name
      • URL
      • Notes
    • Related metadata (including standards, codebooks, vocabularies, thesauri, ontologies)
    • Related data
        Name
      • URL
      • Notes
    • Related services
        Name
      • URL
      • Notes
      The data will be licensed under
    • Other license
    • Statement of rights in data Copyright Western Sydney University
      Citation Grootswagers, Tijl (2021): THINGS-EEG: Human electroencephalography recordings from 50 subjects for 22,248 images from 1,854 object concepts. OSF. https://doi.org/10.17605/OSF.IO/HD6ZK