There’s an interesting article in the Dec. 2010 issue of “Communications of the ACM” discussing the current status of using eye-tracking to control the interface of mobile phones.
I can imagine that what we’ll learn about the barriers to eye-tracking on mobile devices (e.g., the screen’s small size seems to be a hindrance to meaningful eye movement, since each glance may be interpreted as a command) will, like the speech-to-text app on my Droid, prove useful in a myriad of ways.
Citation: Goth, G. (2010). The eyes have it. Communications of the ACM, 53(12), 13–15.