Many recently proposed text entry methods for mobile touch screen devices are gesture-based: the traditional tapping interaction is replaced with a more natural gesture, performed with a pointer (pen or finger) on a soft keyboard. These methods need an effective technique for interpreting user gestures in order to correctly obtain the text the user intends to enter. In this paper we present a set of approaches for interpreting the gestures associated with KeyScretch, a recently introduced text entry method based on menu-augmented soft keyboards, along with a comparison of their performance. The evaluation shows that geometric sketch recognition techniques, combined with other calibrations, can significantly improve the accuracy achievable with the simple target-based method.