Vibrations convey rich information about interactions between people, objects, and environments. Exploiting ubiquitous sensors and machine learning, I have developed systems that sense, understand, and mediate subtle, low-amplitude vibrational interactions between people and the physical world.
In this presentation, I will focus on using subtle human body vibrations to translate freestyle finger writing and typing into input for wearable computing devices such as smartwatches and smart glasses. The challenges addressed include collecting and labeling a large vibration dataset; filtering human-activity noise out of finger-typing and writing vibration signals through signal processing; designing a novel adversarial neural network to overcome human variation in typing strength, writing style, hand shape, and smartwatch wrist position; and adopting a recurrent neural aligner to enable recognition of both continuous and discrete finger movements.
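As a rough illustration of the noise-filtering step, low-frequency body motion (walking, arm swings) and higher-frequency tap-induced vibrations occupy largely separate frequency bands, so a band-pass filter can suppress much of the activity noise. The sketch below is a minimal, hypothetical example of this idea using an FFT-based band-pass; the sampling rate, cutoff frequencies, and synthetic signal are illustrative assumptions, not the actual pipeline from this work.

```python
import numpy as np

def bandpass_fft(signal, fs, low_hz, high_hz):
    """Crude band-pass: zero FFT bins outside [low_hz, high_hz]."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low_hz) & (freqs <= high_hz)
    return np.fft.irfft(spectrum * mask, n=len(signal))

# Synthetic mixture: 2 Hz "body motion" drift + 40 Hz "tap" vibration
fs = 1000                      # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
motion = np.sin(2 * np.pi * 2 * t)          # low-frequency activity noise
tap = 0.3 * np.sin(2 * np.pi * 40 * t)      # tap-induced vibration
filtered = bandpass_fft(motion + tap, fs, low_hz=20, high_hz=200)
# `filtered` retains the 40 Hz component while the 2 Hz drift is removed
```

In practice a proper IIR/FIR filter (e.g. a Butterworth band-pass with zero-phase filtering) would be used instead of hard FFT masking, which can ring at band edges on non-stationary signals.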
I will also briefly discuss capturing vibrations from buildings, robots, and environments for ubiquitous computing applications such as the metaverse, robotic automation, smart health, smart homes, and security and privacy. The mission of this research is to integrate human, cyber, and physical experiences into an intelligent world of vibrational interactions.