Mobile phones are ubiquitous in higher education and are already integrated into students’ daily routines. Beyond their role in communication, they provide a rich source of sensor data (e.g., microphone, accelerometer, gyroscope, screen interaction patterns, GPS, app usage). These data streams can act as proxies for environmental conditions (e.g., noise, movement, crowding) and user states (e.g., focus, distraction, stress).
For students with sensory processing differences, mobile phones could act as real-time companions, monitoring their environments and behaviours to identify when they may be experiencing overload or disengagement, and offering adaptive supports. Harnessing mobile phone data for this purpose may significantly improve inclusivity and engagement in academic contexts.
Aim:
To explore how mobile phone data can be used to detect sensory processing challenges in learners and provide personalised support to enhance learning engagement, focus, and well-being.
Objectives:
- Data Capture & Integration
  - Collect multimodal data from mobile phone sensors (microphone for ambient noise, accelerometer/gyroscope for activity levels, screen-time patterns, notifications).
  - Annotate data with learner-reported experiences of sensory overload, distraction, or comfort.
- Sensory State Modelling
  - Use AI/ML techniques (e.g., supervised learning, time-series analysis) to identify patterns that indicate sensory stress, overload, or under-stimulation.
  - Explore cross-referencing phone sensor data with contextual information (location, time of day, lecture schedule).
- Design of Interventions
  - Prototype a mobile application that provides adaptive sensory support, for example:
    - Prompts to take breaks when stress indicators rise.
    - Recommendations for quieter study locations.
    - Adaptive filtering of distracting notifications during high-sensory-load contexts.
- Evaluation
  - Conduct a pilot study with students in real academic settings.
  - Measure system accuracy, user acceptance, and perceived impact on focus and comfort.
  - Address ethical issues around continuous monitoring, privacy, and consent.
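As a concrete illustration of the sensory state modelling objective, the sketch below classifies short windows of sensor readings with a minimal supervised learner. The feature set (noise level and acceleration magnitude), the toy sample values, the labels, and the nearest-centroid classifier are all illustrative assumptions for demonstration, not project specifications; a real pipeline would use richer features and a properly trained and validated model.

```python
# Illustrative sketch: classifying a learner's sensory state from windowed
# phone-sensor features. All values and labels below are hypothetical.
from statistics import mean, pstdev

def extract_features(window):
    """Summarise one sensor window, given as a list of
    (ambient_noise_db, accel_magnitude) samples."""
    noise = [s[0] for s in window]
    accel = [s[1] for s in window]
    return (mean(noise), pstdev(noise), mean(accel), pstdev(accel))

class NearestCentroid:
    """Minimal supervised classifier: predict the label whose class
    centroid is closest to the feature vector."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            rows = [x for x, lab in zip(X, y) if lab == label]
            self.centroids[label] = tuple(mean(col) for col in zip(*rows))
        return self

    def predict(self, x):
        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lab: dist(self.centroids[lab]))

# Toy labelled windows: 'overload' = loud and restless, 'comfort' = quiet and still.
windows = [
    ([(78, 2.1), (82, 2.4), (80, 2.2)], "overload"),
    ([(75, 1.9), (79, 2.3), (81, 2.5)], "overload"),
    ([(42, 0.3), (40, 0.2), (44, 0.4)], "comfort"),
    ([(38, 0.2), (41, 0.3), (39, 0.1)], "comfort"),
]
X = [extract_features(w) for w, _ in windows]
y = [label for _, label in windows]
model = NearestCentroid().fit(X, y)

# A loud, high-movement window is classified as overload.
print(model.predict(extract_features([(77, 2.0), (83, 2.6), (79, 2.2)])))  # overload
```

In practice the same windowing-and-features structure carries over to stronger models (e.g., gradient boosting or recurrent networks over raw time series), with labels supplied by the learner self-reports described above.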
Methodology:
- Use Android/iOS APIs to capture sensor and usage data.
- Collect parallel experience sampling data (self-reports on stress, distraction, sensory comfort).
- Apply data analysis pipelines and ML models to build predictive systems.
- Develop a functional prototype app integrating monitoring and intervention.
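One core step in this methodology is aligning experience-sampling self-reports with the sensor stream to produce labelled training data. The sketch below pairs each sensor window with the nearest self-report by timestamp; the 300-second tolerance and the data shapes are assumptions for illustration.

```python
# Illustrative sketch: labelling sensor windows with experience-sampling
# self-reports by timestamp proximity. Timestamps are Unix seconds;
# the tolerance value is an assumption, not a validated choice.
def label_windows(windows, reports, tolerance=300):
    """windows: list of (timestamp, features); reports: list of (timestamp, label).
    Returns (features, label) pairs for windows within `tolerance` seconds
    of a self-report; windows with no nearby report are dropped."""
    labelled = []
    for w_ts, feats in windows:
        # Find the self-report closest in time to this window.
        r_ts, label = min(reports, key=lambda r: abs(r[0] - w_ts))
        if abs(r_ts - w_ts) <= tolerance:
            labelled.append((feats, label))
    return labelled

windows = [(1000, (80.0, 2.3)), (2000, (41.0, 0.3)), (9000, (60.0, 1.0))]
reports = [(1100, "overload"), (1900, "comfort")]
print(label_windows(windows, reports))
# [((80.0, 2.3), 'overload'), ((41.0, 0.3), 'comfort')]
```

Dropping windows far from any self-report keeps the labels trustworthy at the cost of discarding data; interpolation or semi-supervised labelling are possible alternatives.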
Expected Outcomes:
- A model linking mobile phone sensor data to sensory processing states.
- A mobile prototype that delivers personalised support for learners.
- Insights into how phone-based interventions can promote inclusivity and engagement.
- Ethical guidelines for responsibly using personal device data in education.
Learning Context & Inclusion:
This project addresses sensory challenges such as:
- Distracting environments (e.g., noisy cafes, crowded lecture halls).
- Overstimulation from notifications and continuous screen use.
- Difficulty maintaining focus in different academic contexts.
Potential supports include distraction management tools, location-aware suggestions for study spaces, adaptive notification filtering, and personalised reminders to regulate sensory load.
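The adaptive notification filtering mentioned above could follow a simple rule of this shape: defer low-priority notifications while the modelled sensory load is high. The load scale (0 to 1), the threshold, and the priority categories below are hypothetical placeholders, not values validated by the project.

```python
# Hypothetical sketch of adaptive notification filtering: defer low-priority
# notifications while inferred sensory load is high. The threshold and the
# 0-1 load scale are illustrative assumptions.
def should_deliver(notification_priority, sensory_load, load_threshold=0.7):
    """Always deliver high-priority notifications; defer the rest whenever
    the modelled sensory load (0..1) exceeds the threshold."""
    if notification_priority == "high":
        return True
    return sensory_load <= load_threshold

print(should_deliver("low", 0.9))   # deferred during overload
print(should_deliver("high", 0.9))  # urgent messages still delivered
print(should_deliver("low", 0.4))   # delivered in calm contexts
```

On-device, such a rule would sit between the sensory-state model and the platform's notification APIs, with the threshold personalised per learner.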