The Keystroke-Level Model (KLM) was developed to let us predict task times for a given sequence of interactions. It is a great measurement to have on hand when brainstorming potential UI solutions.
There's an issue, however: the Keystroke-Level Model deals primarily with, you guessed it, the mouse and keyboard. What about touch? We could re-run the underlying time-and-motion studies to derive operator times for touch devices.
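For context, a KLM prediction is just the sum of operator times for the sequence of actions a task requires. A minimal sketch, using the standard Card, Moran & Newell operator estimates (the keystroke time K varies with typing skill):

```python
# Classic KLM: total task time = sum of operator times (seconds).
# Values are the standard Card, Moran & Newell estimates.
KLM_OPERATORS = {
    "K": 0.28,  # keystroke (average non-secretarial typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(sequence):
    """Return the predicted task time for a string of KLM operators."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Example: mentally prepare, point at a field, click, then type two characters.
# M + P + B + B + K + K = 1.35 + 1.10 + 0.10 + 0.10 + 0.28 + 0.28 = 3.21 s
print(round(predict_time("MPBBKK"), 2))
```

The question, then, is what the operator set and timings should be once fingers replace the mouse and keyboard.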
Is there an updated Keystroke-level model for touch devices?
Answer
A Google Scholar search for "haptic klm" turned up these updates to KLM:
FLM (Fingerstroke-Level Model)
The traditional Keystroke-Level Model (KLM) was not applicable to predict the task performance in the touch-sensitive user interface. This case study thus proposed Fingerstroke Level Model (FLM), and analyzed the inter-network mirroring game 'Freestyle II' with FLM. The empirical study confirmed the effectiveness and efficiency of FLM, and suggested how HCI methods can improve the design of mobile gaming user interface.
There is also a video on this.
TLM (Touch-Level Model)
In this paper, we introduce new operators and other modifications to KLM-GOMS to accommodate modern touchscreen interfaces. We call these additions, together with updates to the existing KLM operators, the Touch Level Model (TLM). We propose that this model can be employed to model human task performance on a constrained-input touchscreen device and, with proper benchmarking, accurately predict actual user performance.
Touch-level model (TLM): evolving KLM-GOMS for touchscreen and mobile devices
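The mechanics stay the same as in KLM: sum the operator times for the task sequence, only now with touch operators. A minimal sketch of that idea follows; the touch operator names are in the spirit of TLM, but the time values are placeholders I invented for illustration, not benchmarked figures from the paper:

```python
# TLM-style sketch: KLM's additive model with touch operators.
# CAUTION: the touch timings below are illustrative assumptions,
# not values benchmarked by the TLM authors.
TLM_OPERATORS = {
    "M": 1.35,  # mental preparation (inherited from KLM)
    "T": 0.20,  # tap (assumed value)
    "S": 0.70,  # swipe (assumed value)
    "D": 1.00,  # drag (assumed value)
}

def predict_touch_time(sequence):
    """Return the predicted task time for a string of touch operators."""
    return sum(TLM_OPERATORS[op] for op in sequence)

# Example: mentally prepare, swipe to a home screen, tap an icon.
# M + S + T = 1.35 + 0.70 + 0.20 = 2.25 s
print(round(predict_touch_time("MST"), 2))
```

With proper benchmarking (as the abstract notes), the placeholder table would be replaced by measured operator times.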
The TLM paper looks interesting; I need to read it through.