Friday, February 15, 2013

Mobile interaction research at Google



Posted by Xiaojun Bi, Ciprian Chelba, Tom Ouyang, Kurt Partridge and Shumin Zhai

Google takes a hybrid approach to research: research happens across the entire company and affects everything we do. As one example, we have a group that focuses on mobile interaction research. With research backgrounds in human-computer interaction, machine learning, statistical language modeling, and ubiquitous computing, the group has focused on both foundational work and feature innovations for smart touchscreen keyboards. These innovations help us make things like typing messages on your Android device easier for hundreds of millions of people each day.

We work closely with world-class engineers, designers, product managers, and UX researchers across the company, which enables us to rapidly integrate the fruits of our research into the Android platform. The first major integration was the launch of Gesture Typing in Android 4.2.

Rapidly developed from basic concepts up to product code, and built on years of Android platform groundwork on input method editors (IMEs) and the input method framework (IMF), Gesture Typing uses novel algorithms to dynamically infer and display the user's intended word right at the fingertip. Often the intended word is displayed even before the user has finished gesturing, creating a magical experience for the user. Seamlessly integrated with touch tapping, Gesture Typing also supports two-thumb use.
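The post doesn't describe the actual decoding algorithm, but the general idea behind gesture keyboards can be illustrated with a toy sketch. The code below is a hypothetical, heavily simplified illustration (not Google's implementation): it scores candidate words by comparing the finger's trace against the "ideal" path through each word's key centers using dynamic time warping, then adds a simple word-frequency prior. The key layout, lexicon, and scoring weights are all made up for the example.

```python
# Hypothetical sketch of gesture-keyboard decoding (not the production
# algorithm): rank words by spatial fit of the trace to each word's
# ideal key path, combined with a word-frequency prior.
import math

# Toy key centers on a unit grid (x, y); a real keyboard uses pixel
# coordinates and staggered rows.
KEYS = {ch: (i % 10 * 1.0, i // 10 * 1.0)
        for i, ch in enumerate("qwertyuiopasdfghjklzxcvbnm")}

def ideal_path(word):
    """The sequence of key centers a perfect gesture would pass through."""
    return [KEYS[ch] for ch in word]

def dtw(trace, path):
    """Classic dynamic time warping distance between two point sequences."""
    n, m = len(trace), len(path)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(trace[i - 1], path[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def decode(trace, lexicon):
    """Return lexicon words ranked by spatial fit plus a frequency prior
    (lower combined score is better)."""
    scored = []
    for word, freq in lexicon.items():
        spatial = dtw(trace, ideal_path(word))
        scored.append((spatial - math.log(freq), word))
    return [word for _, word in sorted(scored)]

# A trace that slides over the keys t, h, e:
trace = [KEYS["t"], KEYS["h"], KEYS["e"]]
lexicon = {"the": 0.9, "toe": 0.05, "tie": 0.05}
print(decode(trace, lexicon)[0])  # best match: "the"
```

A production decoder evaluates this kind of spatial score incrementally as the gesture unfolds, which is what lets the intended word appear before the user lifts the finger.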

It is exciting and rewarding to do research inside a product team that enforces engineering and user experience discipline. At the same time, we as researchers also contribute to the broader research community; publications, whether in the form of papers, code, or data, bind a research community together. The following papers are based on our work over the last year, some with bright and hardworking student interns:

Octopus: Evaluating Touchscreen Keyboard Correction and Recognition Algorithms via “Remulation”

by Xiaojun Bi, Shiri Azenkot (U. of Washington), Kurt Partridge, Shumin Zhai

CHI 2013, in press (link to come)

FFitts Law: Modeling Finger Touch with Fitts’ Law

by Xiaojun Bi, Yang Li, Shumin Zhai

CHI 2013, in press (link to come)

Making Touchscreen Keyboards Adaptive to Keys, Hand Postures, and Individuals - A Hierarchical Spatial Backoff Model Approach

by Ying Yin (MIT), Tom Ouyang, Kurt Partridge, Shumin Zhai

CHI 2013, in press (link to come)

Bimanual Gesture Keyboard

by Xiaojun Bi, Ciprian Chelba, Tom Ouyang, Kurt Partridge, and Shumin Zhai

UIST 2012

Touch Behavior with Different Postures on Soft Smart Phone Keyboards

by Shiri Azenkot (U. of Washington) and Shumin Zhai

MobileHCI 2012
