UROP Proceedings 2024-25

School of Engineering
Department of Computer Science and Engineering

LLM-Assisted Multimodal Sensing on Mobile Devices
Supervisor: OUYANG Xiaomin / CSE
Student: CHEN Hongyu / MATH-GM
Course: UROP 1000, Summer

This project, "LLM-Assisted Multimodal Sensing on Mobile Devices", focuses on designing a mobile application for multimodal data acquisition and implementing LLM-driven data processing pipelines. Building on prior work, the key enhancements include: a QASession module that enables user feedback on LLM-generated activity sequences and on analytical queries derived from sensor data; an atomic activity segmentation framework that decomposes holistic daily activities into discrete, multidimensional time-series components; and a custom QA module that lets users author their own question-answer pairs as supplementary data. These features support fine-grained activity analysis and iterative LLM optimization via human-in-the-loop refinement. The data ingestion architecture, LLM integration protocols, and feedback interfaces are elaborated in the subsequent sections.

LLM-Assisted Multimodal Sensing on Mobile Devices
Supervisor: OUYANG Xiaomin / CSE
Student: LIU Kwun Ho / DSCT
Course: UROP 1100, Spring; UROP 2100, Summer

With the growing popularity of mobile devices such as phones, tablets, and smartwatches, sensor data from these devices can be obtained for a variety of tasks. For instance, it can be used for human activity recognition tasks such as gesture detection, action recognition, and fall detection. This study describes the development and deployment of MobiBox, a portable, lightweight smartphone application for persistent health analysis and behavioral data collection. In this project, a large language model (GPT-4o mini) is adopted to analyze and contextualize the collected data, such as inertial measurement unit (IMU) data, GPS data, and app usage data.
Q&A and atomic activity sessions are introduced to support holistic human activity recognition.

LLM-Assisted Multimodal Sensing on Mobile Devices
Supervisor: OUYANG Xiaomin / CSE
Student: WU Yu-chen / COMP
Course: UROP 1100, Spring

Nowadays, as mobile devices such as smartphones and smartwatches become increasingly ubiquitous, we can acquire sensor data from them for human activity recognition (HAR) tasks such as gesture detection, fall detection, and action recognition. This report presents the design and implementation of MobiBox, a portable smartphone application for long-term dynamic health analysis and persistent behavioral data collection. To leverage the contextual information in the collected data, such as inertial measurement unit (IMU) data, GPS data, and app usage data, we adopt large language models (LLMs) in our implementation.
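To illustrate the atomic activity segmentation and LLM Q&A ideas shared by these projects, the following is a minimal Python sketch: it splits an IMU magnitude stream into fixed-length "atomic" windows and renders them as a prompt an LLM could label. All names (`AtomicSegment`, `segment_imu`, `build_qa_prompt`) and the window length are hypothetical choices for illustration, not the MobiBox implementation.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class AtomicSegment:
    start: float       # segment start time (s)
    end: float         # segment end time (s)
    mean_accel: float  # mean acceleration magnitude over the window (m/s^2)


def segment_imu(timestamps: List[float], accel_mag: List[float],
                window_s: float = 2.0) -> List[AtomicSegment]:
    """Split an IMU magnitude stream into fixed-length atomic windows."""
    segments: List[AtomicSegment] = []
    if not timestamps:
        return segments
    t0 = timestamps[0]
    bucket: List[float] = []
    for t, a in zip(timestamps, accel_mag):
        if t - t0 >= window_s and bucket:
            # Close the current window and start a new one at time t.
            segments.append(AtomicSegment(t0, t, sum(bucket) / len(bucket)))
            t0, bucket = t, []
        bucket.append(a)
    if bucket:
        segments.append(AtomicSegment(t0, timestamps[-1],
                                      sum(bucket) / len(bucket)))
    return segments


def build_qa_prompt(segments: List[AtomicSegment]) -> str:
    """Render atomic segments as a prompt asking an LLM to label activities."""
    lines = [f"[{s.start:.1f}-{s.end:.1f}s] mean |accel| = {s.mean_accel:.2f} m/s^2"
             for s in segments]
    return ("Label each atomic segment with a likely activity "
            "(e.g. walking, sitting, running):\n" + "\n".join(lines))
```

A human-in-the-loop step, as in the QASession module, would then show the LLM's per-segment labels back to the user for correction before they are fed into further analysis.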
