Academy of Interdisciplinary Studies
Division of Integrative Systems and Design

On-Device Motion Intelligence: Imminent Hand Motion Prediction for Adaptive Smartphone UI with Zero-Permission Sensors

Supervisor: SHAO Qijia / ISD
Student: XU Ziyi / ISD
Course: UROP 1100, Summer

Modern smartphones increasingly challenge one-handed operation as screen sizes grow, while existing interaction aids often require intrusive sensors or disrupt natural use. This project develops an AI-driven framework that utilizes zero-permission inertial signals to predict imminent hand motions and enable real-time adaptive smartphone interfaces. By modelling the physical constraints of human hand dynamics and exploiting embedded IMU sensors, we show that upcoming tap intentions and touch positions can be reliably predicted within very short horizons (≈200–300 ms). Our lightweight on-device AI model (0.27 MB) captures both the temporal evolution and spatial patterns of hand movement, enabling fast, private, and energy-efficient inference without accessing any screen content. Empirical evaluations demonstrate robust performance, achieving an average 300 ms prediction lead time and 0.83 positional accuracy, which supports seamless UI adaptation during one-handed use. This work opens a promising direction for anticipatory, low-overhead, and personalized mobile interaction powered by on-device motion intelligence.
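To make the pipeline concrete, the sketch below illustrates the general shape of such an on-device predictor: a sliding window of 6-axis IMU samples is mapped to a tap-imminence probability and a normalised screen position. This is a minimal stand-in only; the abstract does not specify the model architecture, window length, or sampling rate, so the two-layer MLP, the 50-sample window, and the 100 Hz rate assumed here are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

WINDOW = 50    # assumed: 50 IMU samples (~0.5 s at an assumed 100 Hz)
CHANNELS = 6   # assumed: 3-axis accelerometer + 3-axis gyroscope

# Randomly initialised weights stand in for a trained lightweight model;
# a real deployment would load quantised parameters (~0.27 MB on device).
HIDDEN = 32
W1 = rng.normal(0.0, 0.1, (WINDOW * CHANNELS, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, 3))  # outputs: [tap_logit, x, y]
b2 = np.zeros(3)

def predict(imu_window: np.ndarray):
    """Map one (WINDOW, CHANNELS) IMU window to a tap probability and a
    normalised (x, y) touch position on the screen."""
    h = np.tanh(imu_window.reshape(-1) @ W1 + b1)
    tap_logit, x, y = h @ W2 + b2
    tap_prob = 1.0 / (1.0 + np.exp(-tap_logit))  # sigmoid
    return float(tap_prob), float(np.clip(x, 0.0, 1.0)), float(np.clip(y, 0.0, 1.0))

# Fake IMU burst standing in for a real sensor stream.
window = rng.normal(0.0, 1.0, (WINDOW, CHANNELS))
p, x, y = predict(window)
```

In a real system the window would slide as new IMU samples arrive, and the UI would adapt (e.g. shift a reachability target toward the predicted position) whenever `p` crosses a threshold, roughly 200–300 ms before the finger lands.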