MobileEYE: Deep-Learning-based Mobile Device Eye Tracking Solution for Dynamic Visuals
  • Description

    This repository contains two zipped files - Models and Codes - related to the development and evaluation of the MobileEYE algorithm.

    1. Models

    These models were optimised for various inference environments, including mobile and edge devices. A hedged sketch of the optimisation steps follows the list below.

    - Baseline Models: The original deep learning models developed during the first phase of the research without any optimisations.

    - Pruned Models: Models that have undergone pruning, which removes less important weights to reduce model size and computation.

    - Quantized Models: Models optimised through quantisation, which lowers weight precision to shrink the memory footprint and speed up inference.

    - ONNX Models: Models exported in ONNX format, primarily for integration with Flutter applications.
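
    The record does not state the training framework, so the following is only a minimal sketch of the three optimisation steps above, assuming PyTorch; the placeholder network, pruning ratio, and file name are illustrative, not the actual MobileEYE model.

      # Hedged sketch: prune, quantise, and export a placeholder gaze model.
      import torch
      import torch.nn as nn
      import torch.nn.utils.prune as prune

      # Tiny stand-in for a baseline model mapping a 64x64 face crop to (x, y).
      model = nn.Sequential(
          nn.Conv2d(3, 16, 3, padding=1),
          nn.ReLU(),
          nn.Flatten(),
          nn.Linear(16 * 64 * 64, 2),
      )

      # 1. Pruning: zero out the 30% smallest-magnitude weights per layer.
      for module in model.modules():
          if isinstance(module, (nn.Conv2d, nn.Linear)):
              prune.l1_unstructured(module, name="weight", amount=0.3)
              prune.remove(module, "weight")  # make the pruning permanent

      # 2. Quantisation: dynamic int8 quantisation of the linear layers.
      quantised = torch.quantization.quantize_dynamic(
          model, {nn.Linear}, dtype=torch.qint8
      )

      # 3. ONNX export (of the pruned float model; quantised models usually
      # need a dedicated export path) for integration with Flutter apps.
      dummy_input = torch.randn(1, 3, 64, 64)
      torch.onnx.export(model, dummy_input, "mobileeye_example.onnx",
                        opset_version=13)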

    2. Codes

    This folder contains the source code for each phase and iteration of the study, organised according to the sub-research questions (SRQs) it addresses.

    - SRQ_1_Iteration_1: Source code for deep learning models developed in the first iteration to answer the first sub-research question (SRQ1). These models were tested on static visual stimuli.

    - SRQ_1_Iteration_2: Code for the second iteration of deep learning models developed to answer SRQ1, focusing on dynamic visual stimuli.

    - SRQ_2_Iteration_1: Contains the Android application for on-device inference, and Python programs for cloud-based and edge-based inference, addressing the first iteration of the second sub-research question (SRQ2).

    - SRQ_2_Iteration_2_n_3: Python programs for edge-based inference on real edge devices, along with a Flutter application for on-device inference on the Samsung Galaxy S22; both were used in the second and third iterations for SRQ2 (see the inference sketch after this list).

    - SRQ_3: Python programs for data analysis, the Python-Flask application, and the Web Application used for data collection (a minimal Flask sketch also follows this list). It also includes participant details in an Excel sheet.
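
    As a hedged sketch of what edge-based inference with the exported ONNX models could look like in Python with onnxruntime; the file name, input size, and output layout are assumptions for illustration:

      # Hypothetical edge inference: face crop in, (x, y) gaze estimate out.
      import numpy as np
      import onnxruntime as ort

      session = ort.InferenceSession("mobileeye_example.onnx",
                                     providers=["CPUExecutionProvider"])
      input_name = session.get_inputs()[0].name

      def predict_gaze(face_crop):
          """face_crop: float32 array of shape (3, 64, 64), values in [0, 1]."""
          batch = face_crop[np.newaxis, ...].astype(np.float32)
          (xy,) = session.run(None, {input_name: batch})
          return float(xy[0][0]), float(xy[0][1])

      # Random pixels standing in for a camera frame.
      print(predict_gaze(np.random.rand(3, 64, 64).astype(np.float32)))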
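
    Similarly, a minimal sketch of a Flask data-collection endpoint of the kind the SRQ_3 folder describes; the route, payload fields, and CSV log are illustrative assumptions, not the actual application:

      # Hypothetical gaze-sample collection endpoint (illustrative only).
      import csv
      from flask import Flask, request, jsonify

      app = Flask(__name__)

      @app.route("/gaze", methods=["POST"])
      def collect_gaze():
          # Expected payload, e.g. {"participant": "P01", "x": 0.42, "y": 0.77}
          record = request.get_json(force=True)
          with open("gaze_log.csv", "a", newline="") as f:
              csv.writer(f).writerow(
                  [record["participant"], record["x"], record["y"]])
          return jsonify(status="ok")

      if __name__ == "__main__":
          app.run(host="0.0.0.0", port=5000)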

    This project has extended ethics consent, but the data cannot be published openly. To discuss the data or project, please contact Nishan Gunawardena <19937931@student.westernsydney.edu.au>, ORCID 0000-0002-4629-7335.


    • Data type: dataset
    • Keywords
      • Eye tracking
      • Edge intelligence
      • Deep learning
      • Model optimisation
      • Hybrid models
      • Mobile Devices
    • Funding source
    • Grant number(s): -
    • FoR codes
      • 461103 - Deep learning
      • 460601 - Cloud computing
      • 460611 - Performance evaluation
      • 460304 - Computer vision
      • 460806 - Human-computer interaction
    • SEO codes
    • Temporal (time) coverage
      • Start date
      • End date
      • Time period
    • Spatial (location, mapping) coverage
      • Locations
    • Data locations
      • The Data Manager is: Kahandawala Gunawardena
    • Access conditions: Conditional
    • Related publications
    • Related website
    • Related metadata (including standards, codebooks, vocabularies, thesauri, ontologies)
    • Related data
    • Related services
    • The data will be licensed under
    • Other license
    • Statement of rights in data: Copyright Western Sydney University
    • Citation: Gunawardena, Kahandawala; Ginige, Jeewani; Javadi, Bahman; Lui, Gough (2024): MobileEYE: Deep-Learning-based Mobile Device Eye Tracking Solution for Dynamic Visuals. Western Sydney University. https://doi.org/10.26183/0ryn-p137