Show simple item record

dc.contributor.author  Mwima, Tobiah
dc.date.accessioned  2015-10-05T08:37:46Z
dc.date.available  2015-10-05T08:37:46Z
dc.date.issued  2015-10-05
dc.identifier.uri  http://dspace.unza.zm/handle/123456789/4077
dc.description.abstract  The mouse and keyboard are the primary means of passing information from user to computer. Direct manipulation of objects via the mouse was a breakthrough in the design of more natural and intuitive user interfaces for computers. In real life, however, we have a rich set of communication methods at our disposal; when interacting with others we interpret, for example, their gestures, expressions and eye movements. This information can also be used to move human-computer interaction toward more natural and effective methods. In particular, eye gaze can be a valuable source of information for a computer system to be aware of, if it is to provide assistance when appropriate. The focus of this research is on examining how information acquired from a user's eye movements in human-computer interaction can be used to assist electronic book readers. For this purpose a simple prototype, called Assisted and Augmented Reading, will be developed. Enhancing the reading experience and awarding electronic book readers reward points based on tracked reading progress will serve as an encouraging and motivating technique to help increase literacy levels. The main hypothesis behind the development of this prototype is that the point of focus on the screen can be found by eye tracking using an ordinary webcam. The prototype will be implemented in C++, using the Open Computer Vision (OpenCV) library and a C++ IDE.  en_US
dc.language.iso  en  en_US
dc.subject  Assisted and Augmented Reading  en_US
dc.title  Assisted and Augmented Reading  en_US
dc.type  Thesis  en_US

