
Eye tracking

Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Eye trackers are used in research on the visual system, in psychology, in psycholinguistics, marketing, as an input device for human-computer interaction, and in product design. In addition, eye trackers are increasingly being used for assistive and rehabilitative applications such as controlling wheelchairs, robotic arms, and prostheses. Recently, eye tracking has been examined as a tool for the early detection of autism spectrum disorder. There are several methods for measuring eye movement, with the most popular variant using video images to extract eye position. Other methods use search coils or are based on the electrooculogram.

Eye tracking device
Scientists track eye movements in glaucoma patients to check vision impairment while driving.

History

 
Yarbus eye tracker from the 1960s

In the 1800s, studies of eye movement were made using direct observations. For example, Louis Émile Javal observed in 1879 that reading does not involve a smooth sweeping of the eyes along the text, as previously assumed, but a series of short stops (called fixations) and quick saccades.[1] This observation raised important questions about reading, questions which were explored during the 1900s: On which words do the eyes stop? For how long? When do they regress to already seen words?

 
An example of fixations and saccades over text. This is the typical pattern of eye movement during reading. The eyes never move smoothly over still text.

Edmund Huey[2] built an early eye tracker, using a sort of contact lens with a hole for the pupil. The lens was connected to an aluminum pointer that moved in response to the movement of the eye. Huey studied and quantified regressions (only a small proportion of saccades are regressions), and he showed that some words in a sentence are not fixated.

The first non-intrusive eye-trackers were built by Guy Thomas Buswell in Chicago, using beams of light that were reflected off the eye and then recorded on film. Buswell made systematic studies into reading[3][4] and picture viewing.[5]

In the 1950s, Alfred L. Yarbus[6] performed eye tracking research, and his 1967 book is often quoted. He showed that the task given to a subject has a very large influence on the subject's eye movement. He also wrote about the relation between fixations and interest:

All the records ... show conclusively that the character of the eye movement is either completely independent of or only very slightly dependent on the material of the picture and how it was made, provided that it is flat or nearly flat.[7]

The cyclical pattern in the examination of pictures "is dependent on not only what is shown on the picture, but also the problem facing the observer and the information that he hopes to gain from the picture."[8]

 
This study by Yarbus (1967) is often referred to as evidence on how the task given to a person influences his or her eye movement.

Records of eye movements show that the observer's attention is usually held only by certain elements of the picture.... Eye movement reflects the human thought processes; so the observer's thought may be followed to some extent from records of eye movement (the thought accompanying the examination of the particular object). It is easy to determine from these records which elements attract the observer's eye (and, consequently, his thought), in what order, and how often.[7]

The observer's attention is frequently drawn to elements which do not give important information but which, in his opinion, may do so. Often an observer will focus his attention on elements that are unusual in the particular circumstances, unfamiliar, incomprehensible, and so on.[9]

... when changing its points of fixation, the observer's eye repeatedly returns to the same elements of the picture. Additional time spent on perception is not used to examine the secondary elements, but to reexamine the most important elements.[10]

 
This study by Hunziker (1970)[11] on eye tracking in problem solving used simple 8 mm film to track eye movement by filming the subject through a glass plate on which the visual problem was displayed.[12][11]

In the 1970s, eye-tracking research expanded rapidly, particularly reading research. A good overview of the research in this period is given by Rayner.[13]

In 1980, Just and Carpenter[14] formulated the influential Strong eye-mind hypothesis, that "there is no appreciable lag between what is fixated and what is processed". If this hypothesis is correct, then when a subject looks at a word or object, he or she also processes it cognitively, and for exactly as long as the recorded fixation. The hypothesis is often taken for granted by researchers using eye-tracking. However, gaze-contingent techniques offer a way to disentangle overt and covert attention, differentiating what is fixated from what is processed.

During the 1980s, the eye-mind hypothesis was often questioned in light of covert attention,[15][16] the attention to something that one is not looking at, which people often do. If covert attention is common during eye-tracking recordings, the resulting scan-path and fixation patterns would often show not where attention has been, but only where the eye has been looking, failing to indicate cognitive processing.

The 1980s also saw the birth of using eye-tracking to answer questions related to human-computer interaction. Specifically, researchers investigated how users search for commands in computer menus.[17] Additionally, computers allowed researchers to use eye-tracking results in real time, primarily to help disabled users.[17]

More recently, there has been growth in using eye tracking to study how users interact with different computer interfaces. Specific questions researchers ask are related to how easy different interfaces are for users.[17] The results of the eye tracking research can lead to changes in design of the interface. Another recent area of research focuses on Web development. This can include how users react to drop-down menus or where they focus their attention on a website so the developer knows where to place an advertisement.[18]

According to Hoffman,[19] current consensus is that visual attention is always slightly (100 to 250 ms) ahead of the eye. But as soon as attention moves to a new position, the eyes will want to follow.[20]

Specific cognitive processes still cannot be inferred directly from a fixation on a particular object in a scene.[21] For instance, a fixation on a face in a picture may indicate recognition, liking, dislike, puzzlement etc. Therefore, eye tracking is often coupled with other methodologies, such as introspective verbal protocols.

Thanks to advancement in portable electronic devices, portable head-mounted eye trackers currently can achieve excellent performance and are being increasingly used in research and market applications targeting daily life settings.[22] These same advances have led to increases in the study of small eye movements that occur during fixation, both in the lab and in applied settings.[23]

 
The use of convolutional neural networks in eye-tracking allows new information to be identified by artificial intelligence.

In the 21st century, the use of artificial intelligence (AI) and artificial neural networks has become a viable way to complete eye-tracking tasks and analysis. In particular, the convolutional neural network lends itself to eye-tracking, as it is designed for image-centric tasks. With AI, eye-tracking tasks and studies can yield additional information that may not have been detected by human observers. The practice of deep learning also allows for a given neural network to improve at a given task when given enough sample data. This requires a relatively large supply of training data, however.[24]

The potential use cases for AI in eye-tracking cover a wide range of topics from medical applications[25] to driver safety[24] to game theory[26] and even education and training applications.[27][28][29]

Tracker types

Eye-trackers measure rotations of the eye in one of several ways, but principally they fall into one of three categories:

  1. measurement of the movement of an object (normally, a special contact lens) attached to the eye
  2. optical tracking without direct contact to the eye
  3. measurement of electric potentials using electrodes placed around the eyes.

Eye-attached tracking

The first type uses an attachment to the eye, such as a special contact lens with an embedded mirror or magnetic field sensor, and the movement of the attachment is measured with the assumption that it does not slip significantly as the eye rotates. Measurements with tight-fitting contact lenses have provided extremely sensitive recordings of eye movement, and magnetic search coils are the method of choice for researchers studying the dynamics and underlying physiology of eye movement. This method allows the measurement of eye movement in horizontal, vertical and torsion directions.[30]

Optical tracking

 
An eye-tracking head-mounted display. Each eye has an LED light source (gold-color metal) on the side of the display lens, and a camera under the display lens.

The second broad category uses some non-contact, optical method for measuring eye motion. Light, typically infrared, is reflected from the eye and sensed by a video camera or some other specially designed optical sensor. The information is then analyzed to extract eye rotation from changes in reflections. Video-based eye trackers typically use the corneal reflection (the first Purkinje image) and the center of the pupil as features to track over time. A more sensitive type of eye-tracker, the dual-Purkinje eye tracker,[31] uses reflections from the front of the cornea (first Purkinje image) and the back of the lens (fourth Purkinje image) as features to track. A still more sensitive method of tracking is to image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates. Optical methods, particularly those based on video recording, are widely used for gaze-tracking and are favored for being non-invasive and inexpensive.

Electric potential measurement

The third category uses electric potentials measured with electrodes placed around the eyes. The eyes are the origin of a steady electric potential field which can also be detected in total darkness and when the eyes are closed. It can be modelled as being generated by a dipole with its positive pole at the cornea and its negative pole at the retina. The electric signal that can be derived using two pairs of contact electrodes placed on the skin around one eye is called the electrooculogram (EOG). If the eyes move from the centre position towards the periphery, the retina approaches one electrode while the cornea approaches the opposing one. This change in the orientation of the dipole, and consequently of the electric potential field, results in a change in the measured EOG signal. Inversely, by analysing these changes, eye movement can be tracked. Due to the discretisation given by the common electrode setup, two separate movement components – a horizontal and a vertical – can be identified. A third EOG component is the radial EOG channel,[32] which is the average of the EOG channels referenced to some posterior scalp electrode. This radial EOG channel is sensitive to the saccadic spike potentials stemming from the extra-ocular muscles at the onset of saccades, and allows reliable detection of even miniature saccades.[33]

Due to potential drifts and variable relations between the EOG signal amplitudes and the saccade sizes, it is challenging to use EOG for measuring slow eye movement and detecting gaze direction. EOG is, however, a very robust technique for measuring saccadic eye movement associated with gaze shifts and detecting blinks. Contrary to video-based eye-trackers, EOG allows recording of eye movements even with eyes closed, and can thus be used in sleep research. It is a very light-weight approach that, in contrast to current video-based eye-trackers, requires low computational power, works under different lighting conditions and can be implemented as an embedded, self-contained wearable system.[34][35] It is thus the method of choice for measuring eye movement in mobile daily-life situations and REM phases during sleep. The major disadvantage of EOG is its relatively poor gaze-direction accuracy compared to a video tracker. That is, it is difficult to determine with good accuracy exactly where a subject is looking, though the time of eye movements can be determined.
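As an illustration of how such EOG recordings can be processed, the following minimal sketch (Python with NumPy) applies a simple velocity threshold to a single horizontal EOG channel to flag candidate saccades. The synthetic signal, sampling rate and threshold values are illustrative assumptions, not part of any standard protocol.

```python
# A minimal sketch (not a clinical tool) of detecting saccades from a
# horizontal EOG channel with a simple velocity threshold.
import numpy as np

def detect_saccades(eog_uv, fs_hz, velocity_threshold_uv_per_s=2000.0):
    """Return (start, end) sample indices of candidate saccades."""
    # Light smoothing to suppress high-frequency noise.
    kernel = np.ones(5) / 5.0
    smoothed = np.convolve(eog_uv, kernel, mode="same")
    # Numerical derivative approximates the eye-velocity-related signal.
    velocity = np.gradient(smoothed) * fs_hz
    above = np.abs(velocity) > velocity_threshold_uv_per_s
    # Group consecutive supra-threshold samples into events.
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            events.append((start, i))
            start = None
    if start is not None:
        events.append((start, len(above)))
    return events

# Synthetic example: a step-like deflection mimicking one saccade.
fs = 250  # Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
signal = np.where(t > 1.0, 150.0, 0.0) + np.random.normal(0, 3, t.size)
print(detect_saccades(signal, fs))
```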

Technologies and techniques

The most widely used current designs are video-based eye-trackers. A camera focuses on one or both eyes and records eye movement as the viewer looks at some kind of stimulus. Most modern eye-trackers use the center of the pupil and infrared / near-infrared non-collimated light to create corneal reflections (CR). The vector between the pupil center and the corneal reflections can be used to compute the point of regard on a surface or the gaze direction. A simple calibration procedure for the individual is usually needed before using the eye tracker.[36]

Two general types of infrared / near-infrared (also known as active light) eye-tracking techniques are used: bright-pupil and dark-pupil. Their difference is based on the location of the illumination source with respect to the optics. If the illumination is coaxial with the optical path, then the eye acts as a retroreflector as the light reflects off the retina creating a bright pupil effect similar to red eye. If the illumination source is offset from the optical path, then the pupil appears dark because the retroreflection from the retina is directed away from the camera.[37]

Bright-pupil tracking creates greater iris/pupil contrast, allowing more robust eye-tracking with all iris pigmentation, and greatly reduces interference caused by eyelashes and other obscuring features.[38] It also allows tracking in lighting conditions ranging from total darkness to very bright.
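One common way to exploit the two illumination modes described above is to alternate them frame by frame and subtract the resulting images, so that only the pupil (bright in one frame, dark in the other) remains. The sketch below illustrates this differencing idea on synthetic frames; the frame contents and threshold are illustrative assumptions.

```python
# A minimal sketch of bright-/dark-pupil frame differencing: with on-axis
# illumination the pupil retroreflects (bright), with off-axis illumination it
# stays dark, so subtracting the two frames isolates the pupil.
import numpy as np

def segment_pupil(bright_frame, dark_frame, diff_thresh=60):
    diff = bright_frame.astype(np.int16) - dark_frame.astype(np.int16)
    return diff > diff_thresh  # boolean pupil mask

yy, xx = np.mgrid[0:120, 0:160]
pupil = (xx - 80) ** 2 + (yy - 60) ** 2 < 15 ** 2
dark = np.full((120, 160), 100, dtype=np.uint8)
bright = dark.copy()
bright[pupil] = 220          # retroreflection brightens only the pupil
mask = segment_pupil(bright, dark)
print(mask.sum(), "pupil pixels isolated")
```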

Another, less used, method is known as passive light. It uses visible light for illumination, which may cause some distraction to users.[37] A further challenge with this method is that the contrast of the pupil is lower than in the active-light methods; therefore, the center of the iris is used for calculating the vector instead.[39] This calculation needs to detect the boundary of the iris and the white sclera (limbus tracking). Vertical eye movements present an additional challenge due to obstruction by the eyelids.[40]

Eye-tracking setups vary greatly. Some are head-mounted, some require the head to be stable (for example, with a chin rest), and some function remotely and automatically track the head during motion. Most use a sampling rate of at least 30 Hz. Although 50/60 Hz is more common, today many video-based eye trackers run at 240, 350 or even 1000/1250 Hz, speeds needed to capture fixational eye movements or correctly measure saccade dynamics.

Eye movements are typically divided into fixations and saccades – when the eye gaze pauses in a certain position, and when it moves to another position, respectively. The resulting series of fixations and saccades is called a scanpath. Smooth pursuit describes the eye following a moving object. Fixational eye movements include microsaccades: small, involuntary saccades that occur during attempted fixation. Most information from the eye is made available during a fixation or smooth pursuit, but not during a saccade.[41]
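One widely used way to separate fixations from saccades in recorded gaze data is a velocity-threshold (I-VT style) classification, sketched below on synthetic samples. The sampling rate, coordinates and the 30 deg/s threshold are illustrative assumptions.

```python
# A minimal sketch of velocity-threshold (I-VT style) classification of gaze
# samples into fixations and saccades.
import numpy as np

def classify_ivt(x_deg, y_deg, fs_hz, saccade_thresh_deg_s=30.0):
    vx = np.gradient(x_deg) * fs_hz
    vy = np.gradient(y_deg) * fs_hz
    speed = np.hypot(vx, vy)
    return np.where(speed > saccade_thresh_deg_s, "saccade", "fixation")

fs = 500  # Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
x = np.where(t < 0.5, 0.0, 5.0) + np.random.normal(0, 0.01, t.size)  # 5 deg jump
y = np.zeros_like(x)
labels = classify_ivt(x, y, fs)
print("saccade samples:", np.count_nonzero(labels == "saccade"))
```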

Scanpaths are useful for analyzing cognitive intent, interest, and salience. Other biological factors (some as simple as gender) may affect the scanpath as well. Eye tracking in human–computer interaction (HCI) typically investigates the scanpath for usability purposes, or as a method of input in gaze-contingent displays, also known as gaze-based interfaces.[42]

Data presentation

Interpretation of the data recorded by the various types of eye-trackers employs a variety of software that animates or visually represents it, so that the visual behavior of one or more users can be graphically summarized. The video is generally coded manually to identify the AOIs (areas of interest), or, more recently, coded automatically using artificial intelligence. Graphical presentation is rarely the basis of research results, since it is limited in terms of what can be analysed; research relying on eye-tracking, for example, usually requires quantitative measures of the eye movement events and their parameters. The following visualisations are the most commonly used:

Animated representations of a point on the interface: This method is used when the visual behavior is examined individually. It indicates where the user focused their gaze at each moment, complemented with a small path that indicates the previous saccade movements, as seen in the image.

Static representations of the saccade path: This is fairly similar to the one described above, with the difference that it is a static method. A higher level of expertise than with the animated representations is required to interpret it.

Heat maps: An alternative static representation, used mainly for the agglomerated analysis of the visual exploration patterns in a group of users. In these representations, the 'hot' zones, or zones with higher density, designate where the users focused their gaze (not their attention) with higher frequency. Heat maps are the best-known visualization technique for eye-tracking studies[43] (a minimal sketch of generating one from fixation data follows this list).

Blind zones maps, or focus maps: This method is a simplified version of the heat maps in which the zones less visually attended by the users are displayed clearly, allowing for an easier understanding of the most relevant information; that is to say, it provides more information about which zones were not seen by the users.

Saliency maps: Similar to heat maps, a saliency map illustrates areas of focus by brightly displaying the attention-grabbing objects over an initially black canvas. The more focus is given to a particular object, the brighter it will appear.[44]
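As referenced above, a heat map can be produced by accumulating a two-dimensional Gaussian for each fixation, weighted by fixation duration. The following minimal sketch illustrates this; the screen size, fixation data and kernel width are illustrative assumptions.

```python
# A minimal sketch of rendering an eye-tracking heat map: each fixation adds a
# 2-D Gaussian weighted by its duration.
import numpy as np

def heat_map(fixations, width, height, sigma_px=40.0):
    """fixations: list of (x, y, duration_ms). Returns a (height, width) array."""
    yy, xx = np.mgrid[0:height, 0:width]
    hm = np.zeros((height, width))
    for x, y, dur in fixations:
        hm += dur * np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma_px ** 2))
    return hm / hm.max() if hm.max() > 0 else hm  # normalise to [0, 1]

# Assumed demo data: three fixations on a 1024x768 stimulus.
demo = [(320, 240, 300), (340, 250, 450), (800, 100, 120)]
hm = heat_map(demo, width=1024, height=768)
print(hm.shape, round(float(hm.max()), 2))
```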

Eye-tracking vs. gaze-tracking

Eye-trackers necessarily measure the rotation of the eye with respect to some frame of reference. This is usually tied to the measuring system. Thus, if the measuring system is head-mounted, as with EOG or a video-based system mounted to a helmet, then eye-in-head angles are measured. To deduce the line of sight in world coordinates, the head must be kept in a constant position or its movements must be tracked as well. In these cases, head direction is added to eye-in-head direction to determine gaze direction. However, if the motion of the head is minor, the eye remains in a roughly constant position, so eye-in-head angles approximate gaze direction.[45]
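The following minimal sketch illustrates this composition for a head-mounted tracker: eye-in-head yaw/pitch angles are converted to a direction vector and rotated by the tracked head orientation to obtain a gaze direction in world coordinates. The angle conventions and values are illustrative assumptions.

```python
# A minimal sketch of combining tracked head orientation with eye-in-head
# angles (yaw/pitch only) to obtain a world gaze direction.
import numpy as np

def rot_yaw_pitch(yaw_deg, pitch_deg):
    y, p = np.radians([yaw_deg, pitch_deg])
    ry = np.array([[np.cos(y), 0, np.sin(y)],
                   [0, 1, 0],
                   [-np.sin(y), 0, np.cos(y)]])
    rp = np.array([[1, 0, 0],
                   [0, np.cos(p), -np.sin(p)],
                   [0, np.sin(p), np.cos(p)]])
    return ry @ rp

def world_gaze(head_yaw, head_pitch, eye_yaw, eye_pitch):
    forward = np.array([0.0, 0.0, 1.0])                  # straight-ahead unit vector
    eye_in_head = rot_yaw_pitch(eye_yaw, eye_pitch) @ forward
    return rot_yaw_pitch(head_yaw, head_pitch) @ eye_in_head

# Head yawed +20 deg, eye yawed -10 deg within the head: net gaze yaw is +10 deg.
g = world_gaze(20, 0, -10, 0)
print(round(float(np.degrees(np.arctan2(g[0], g[2]))), 1))
```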

If the measuring system is table-mounted, as with scleral search coils or table-mounted camera (remote) systems, then gaze angles are measured directly in world coordinates. Typically, in these situations head movements are prohibited. For example, the head position is fixed using a bite bar or a forehead support. Then a head-centered reference frame is identical to a world-centered reference frame. Or colloquially, the eye-in-head position directly determines the gaze direction.

Some results are available on human eye movements under natural conditions where head movements are allowed as well.[46] The relative position of eye and head, even with constant gaze direction, influences neuronal activity in higher visual areas.[47]

Practice

A great deal of research has gone into studies of the mechanisms and dynamics of eye rotation, but the goal of eye tracking is most often to estimate gaze direction. Users may be interested in what features of an image draw the eye, for example. It is important to realize that the eye tracker does not provide absolute gaze direction, but rather can measure only changes in gaze direction. To determine precisely what a subject is looking at, some calibration procedure is required in which the subject looks at a point or series of points, while the eye tracker records the value that corresponds to each gaze position. (Even those techniques that track features of the retina cannot provide exact gaze direction because there is no specific anatomical feature that marks the exact point where the visual axis meets the retina, if indeed there is such a single, stable point.) An accurate and reliable calibration is essential for obtaining valid and repeatable eye movement data, and this can be a significant challenge for non-verbal subjects or those who have unstable gaze.
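A minimal sketch of such a calibration step is shown below: raw tracker measurements recorded while the subject fixates known on-screen targets are mapped to screen coordinates with a least-squares polynomial fit. The quadratic feature set and the toy nine-point grid are illustrative assumptions; real systems use a variety of mapping models.

```python
# A minimal sketch of a calibration fit: map raw tracker measurements
# (e.g. pupil-CR vectors) to screen coordinates via least squares.
import numpy as np

def features(raw):
    x, y = raw[:, 0], raw[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit_calibration(raw_samples, target_points):
    """Return a coefficient matrix mapping raw features to screen (x, y)."""
    A = features(np.asarray(raw_samples, dtype=float))
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(target_points, dtype=float), rcond=None)
    return coeffs

def apply_calibration(coeffs, raw_samples):
    return features(np.asarray(raw_samples, dtype=float)) @ coeffs

# Toy 9-point calibration grid: the raw vectors are a scaled/offset copy of the
# screen targets, so the fit should recover the targets closely.
targets = np.array([[x, y] for y in (100, 400, 700) for x in (100, 500, 900)])
raw = targets * 0.01 + np.array([2.0, -1.0])
coeffs = fit_calibration(raw, targets)
print(np.round(apply_calibration(coeffs, raw[:3])))
```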

Each method of eye-tracking has advantages and disadvantages, and the choice of an eye-tracking system depends on considerations of cost and application. There are offline methods and online procedures like AttentionTracking. There is a trade-off between cost and sensitivity, with the most sensitive systems costing many tens of thousands of dollars and requiring considerable expertise to operate properly. Advances in computer and video technology have led to the development of relatively low-cost systems that are useful for many applications and fairly easy to use.[48] Interpretation of the results still requires some level of expertise, however, because a misaligned or poorly calibrated system can produce wildly erroneous data.

Eye-tracking while driving a car in a difficult situation

 
Frames from narrow road eye tracking described in this section[49]

The eye movements of two groups of drivers were filmed with a special head camera by a team at the Swiss Federal Institute of Technology: novice and experienced drivers had their eye movements recorded while approaching a bend on a narrow road. The series of images has been condensed from the original film frames[50] to show two eye fixations per image for better comprehension.

Each of these stills corresponds to approximately 0.5 seconds in real time.

The series of images shows an example of eye fixations #9 to #14 of a typical novice and of an experienced driver.

Comparison of the top images shows that the experienced driver checks the curve and even has fixation No. 9 to spare for looking aside, while the novice driver needs to check the road and estimate his distance to the parked car.

In the middle images, the experienced driver is now fully concentrating on the location where an oncoming car could be seen. The novice driver concentrates his view on the parked car.

In the bottom image the novice is busy estimating the distance between the left wall and the parked car, while the experienced driver can use their peripheral vision for that and still concentrate vision on the dangerous point of the curve: If a car appears there, the driver has to give way, i.e. stop to the right instead of passing the parked car.[51]

More recent studies have also used head-mounted eye tracking to measure eye movements during real-world driving conditions.[52][23]

Eye-tracking of younger and elderly people while walking

While walking, elderly subjects depend more on foveal vision than do younger subjects. Their walking speed is decreased by a limited visual field, probably caused by deteriorated peripheral vision.

Younger subjects make use of both their central and peripheral vision while walking. Their peripheral vision allows faster control over the process of walking.[53]

Applications

A wide variety of disciplines use eye-tracking techniques, including cognitive science; psychology (notably psycholinguistics and the visual world paradigm); human-computer interaction (HCI); human factors and ergonomics; marketing research; and medical research (neurological diagnosis).[54] Specific applications include the tracking of eye movement in language reading, music reading, human activity recognition, the perception of advertising, the playing of sports, distraction detection and cognitive-load estimation of drivers and pilots, and operating computers by people with severe motor impairment.[23] In the field of virtual reality, eye tracking is used in head-mounted displays for a variety of purposes, including reducing processing load by rendering only the graphical area within the user's gaze.[55]

Commercial applications

In recent years, the increased sophistication and accessibility of eye-tracking technologies have generated a great deal of interest in the commercial sector. Applications include web usability, advertising, sponsorship, package design and automotive engineering. In general, commercial eye-tracking studies function by presenting a target stimulus to a sample of consumers while an eye tracker records eye activity. Examples of target stimuli may include websites, television programs, sporting events, films and commercials, magazines and newspapers, packages, shelf displays, consumer systems (ATMs, checkout systems, kiosks) and software. The resulting data can be statistically analyzed and graphically rendered to provide evidence of specific visual patterns. By examining fixations, saccades, pupil dilation, blinks and a variety of other behaviors, researchers can determine a great deal about the effectiveness of a given medium or product. While some companies complete this type of research internally, there are many private companies that offer eye-tracking services and analysis.

One field of commercial eye-tracking research is web usability. While traditional usability techniques are often quite powerful in providing information on clicking and scrolling patterns, eye-tracking offers the ability to analyze user interaction between the clicks and how much time a user spends between clicks, thereby providing valuable insight into which features are the most eye-catching, which features cause confusion and which are ignored altogether. Specifically, eye-tracking can be used to assess search efficiency, branding, online advertisements, navigation usability, overall design and many other site components. Analyses may target a prototype or competitor site in addition to the main client site.

Eye-tracking is commonly used in a variety of different advertising media. Commercials, print ads, online ads and sponsored programs are all conducive to analysis with current eye-tracking technology. One example is the analysis of eye movements over advertisements in the Yellow Pages. One study focused on what particular features caused people to notice an ad, whether they viewed ads in a particular order and how viewing times varied. The study revealed that ad size, graphics, color, and copy all influence attention to advertisements. Knowing this allows researchers to assess in great detail how often a sample of consumers fixates on the target logo, product or ad. Hence an advertiser can quantify the success of a given campaign in terms of actual visual attention.[56] Another example of this is a study that found that in a search engine results page, authorship snippets received more attention than the paid ads or even the first organic result.[57]

Yet another example of commercial eye-tracking research comes from the field of recruitment. A study analyzed how recruiters screen LinkedIn profiles and presented results as heat maps.[58]

Safety applications

Scientists in 2017 constructed a Deep Integrated Neural Network (DINN) out of a Deep Neural Network and a convolutional neural network.[24] The goal was to use deep learning to examine images of drivers and determine their level of drowsiness by "classify[ing] eye states." With enough images, the proposed DINN could ideally determine when drivers blink, how often they blink, and for how long. From there, it could judge how tired a given driver appears to be, effectively conducting an eye-tracking exercise. The DINN was trained on data from over 2,400 subjects and correctly diagnosed their states 96%-99.5% of the time. Most other artificial intelligence models performed at rates above 90%.[24] This technology could ideally provide another avenue for driver drowsiness detection.
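Independently of any particular network, per-frame eye-state predictions of this kind are typically converted into drowsiness indicators such as blink count, blink duration and PERCLOS (the proportion of time the eyes are closed). The following minimal sketch illustrates that step; the frame labels, frame rate and cut-offs are illustrative assumptions, not the method of the cited study.

```python
# A minimal sketch of turning per-frame eye-state predictions (closed / open)
# into simple drowsiness indicators: blink count, mean blink duration, PERCLOS.
import numpy as np

def drowsiness_indicators(closed, fps):
    """closed: boolean array, True where the eyes are classified as closed."""
    closed = np.asarray(closed, dtype=bool)
    perclos = closed.mean()
    # A blink is a maximal run of consecutive 'closed' frames.
    edges = np.diff(closed.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if closed[0]:
        starts = np.r_[0, starts]
    if closed[-1]:
        ends = np.r_[ends, closed.size]
    durations_s = (ends - starts) / fps
    return {"blinks": len(starts),
            "mean_blink_s": float(durations_s.mean()) if len(starts) else 0.0,
            "perclos": float(perclos)}

frames = np.zeros(300, dtype=bool)       # 10 s of video at 30 fps (assumed)
frames[50:56] = frames[200:210] = True   # two blinks
print(drowsiness_indicators(frames, fps=30))
```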

Game theory applications

In a 2019 study, a Convolutional Neural Network (CNN) was constructed with the ability to identify individual chess pieces the same way other CNNs can identify facial features.[26] It was then fed eye-tracking input data from 30 chess players of various skill levels. With this data, the CNN used gaze estimation to determine parts of the chess board to which a player was paying close attention. It then generated a saliency map to illustrate those parts of the board. Ultimately, the CNN would combine its knowledge of the board and pieces with its saliency map to predict the players' next move. Regardless of the training dataset the neural network system was trained upon, it predicted the next move more accurately than if it had selected any possible move at random, and the saliency maps drawn for any given player and situation were more than 54% similar.[26]

Assistive technology

People with severe motor impairment can use eye tracking for interacting with computers,[59] as it is faster than single-switch scanning techniques and intuitive to operate.[60][61] Motor impairment caused by cerebral palsy[62] or amyotrophic lateral sclerosis often affects speech, and users with severe speech and motor impairment (SSMI) use a type of software known as an augmentative and alternative communication (AAC) aid,[63] which displays icons, words and letters on screen[64] and uses text-to-speech software to generate spoken output.[65] In recent times, researchers have also explored eye tracking to control robotic arms[66] and powered wheelchairs.[67] Eye tracking is also helpful in analysing visual search patterns,[68] detecting the presence of nystagmus and detecting early signs of learning disability by analysing eye gaze movement during reading.[69]
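Gaze-controlled AAC software commonly relies on dwell-time selection, in which an on-screen key is activated when the gaze remains on it for a preset time. The following minimal sketch illustrates the idea; the key layout, gaze samples and 800 ms dwell threshold are illustrative assumptions.

```python
# A minimal sketch of dwell-time selection for a gaze-controlled keyboard:
# a key fires when gaze stays inside it for the dwell duration.
from dataclasses import dataclass

@dataclass
class Key:
    label: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx, gy):
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

def dwell_select(gaze_samples, keys, fs_hz, dwell_s=0.8):
    """gaze_samples: iterable of (x, y). Yields labels of selected keys."""
    needed = int(dwell_s * fs_hz)
    current, count = None, 0
    for gx, gy in gaze_samples:
        hit = next((k for k in keys if k.contains(gx, gy)), None)
        if hit is current and hit is not None:
            count += 1
            if count == needed:
                yield hit.label          # selection fires once per dwell
        else:
            current, count = hit, 1 if hit else 0

keys = [Key("YES", 0, 0, 100, 100), Key("NO", 120, 0, 100, 100)]
samples = [(50, 50)] * 60 + [(160, 40)] * 20        # 1 s on YES, brief glance at NO
print(list(dwell_select(samples, keys, fs_hz=60)))  # -> ['YES']
```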

Aviation applications

Eye tracking has already been studied for flight safety by comparing scan paths and fixation duration to evaluate the progress of pilot trainees,[70] for estimating pilots' skills[71] and for analyzing a crew's joint attention and shared situational awareness.[72] Eye-tracking technology has also been explored for interaction with helmet-mounted display systems[73] and multi-functional displays[74] in military aircraft. Studies were conducted to investigate the utility of eye trackers for head-up target locking and head-up target acquisition in helmet-mounted display systems (HMDS).[73] Pilots' feedback suggested that even though the technology is promising, its hardware and software components have yet to mature.[73] Research on interacting with multi-functional displays in a simulator environment showed that eye tracking can significantly improve response times and perceived cognitive load compared with existing systems. Further research has also investigated using measurements of fixation and pupillary responses to estimate a pilot's cognitive load. Estimating cognitive load can help in designing next-generation adaptive cockpits with improved flight safety.[75] Eye tracking is also useful for detecting pilot fatigue.[76][23]

Automotive applications

More recently, eye-tracking technology has been investigated in the automotive domain in both passive and active ways. The National Highway Traffic Safety Administration measured glance duration for undertaking secondary tasks while driving and used it to promote safety by discouraging the introduction of excessively distracting devices in vehicles.[77] In addition to distraction detection, eye tracking is also used to interact with in-vehicle information systems (IVIS).[78] Though initial research[79] investigated the efficacy of eye-tracking systems for interaction with head-down displays (HDD), these still required drivers to take their eyes off the road while performing a secondary task. Recent studies have investigated eye-gaze-controlled interaction with head-up displays (HUD), which eliminates eyes-off-road distraction.[80] Eye tracking is also used to monitor drivers' cognitive load to detect potential distraction. Though researchers[81] have explored different methods to estimate drivers' cognitive load from various physiological parameters, the use of ocular parameters offers a new way to employ existing eye trackers to monitor drivers' cognitive load in addition to interaction with IVIS.[82][83]

Entertainment applications

The 2021 video game Before Your Eyes registers and reads the player's blinking, and uses it as the main way of interacting with the game.[84][85]

Engineering applications

The widespread use of eye-tracking technology has led to its adoption in empirical software engineering in recent years. Eye-tracking technology and data-analysis techniques are used to investigate the understandability of software engineering concepts, including the understandability of business process models[86] and of diagrams used in software engineering such as UML activity diagrams and EER diagrams.[87] Eye-tracking metrics such as fixation, scan-path, scan-path precision, scan-path recall, and fixations on areas of interest/relevant regions are computed, analyzed and interpreted in terms of model and diagram understandability. The findings are used to enhance the understandability of diagrams and models with appropriate model-related solutions and by improving person-related factors such as the working-memory capacity, cognitive load, learning style and strategy of the software engineers and modelers.
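A minimal sketch of the kind of area-of-interest (AOI) analysis described above is shown below: fixations are assigned to AOIs, and the recorded scan-path is compared against an expected set of relevant AOIs in a precision/recall fashion. The AOI definitions, fixation points and expected regions are illustrative assumptions, not those used in the cited studies.

```python
# A minimal sketch of AOI assignment and precision/recall-style scan-path metrics.

def fixations_to_aois(fixations, aois):
    """fixations: (x, y) points; aois: {name: (x, y, w, h)}. Returns AOI hits."""
    hits = []
    for fx, fy in fixations:
        for name, (x, y, w, h) in aois.items():
            if x <= fx <= x + w and y <= fy <= y + h:
                hits.append(name)
                break
    return hits

def precision_recall(observed_aois, relevant_aois):
    observed, relevant = set(observed_aois), set(relevant_aois)
    tp = len(observed & relevant)
    precision = tp / len(observed) if observed else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical diagram regions and fixations.
aois = {"start_node": (0, 0, 50, 50), "gateway": (100, 0, 50, 50), "end_node": (200, 0, 50, 50)}
fix = [(10, 10), (120, 20), (300, 300)]          # last fixation misses all AOIs
hits = fixations_to_aois(fix, aois)
print(hits, precision_recall(hits, ["start_node", "gateway", "end_node"]))
```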

Cartographic applications

Cartographic research has widely adopted eye tracking techniques. Researchers have used them to see how individuals perceive and interpret maps.[88] For example, eye tracking has been used to study differences in the perception of 2D and 3D visualization,[89][90] to compare map-reading strategies between novices and experts[91] or between students and their geography teachers,[92] and to evaluate the cartographic quality of maps.[93] In addition, cartographers have employed eye tracking to investigate various factors affecting map reading, including attributes such as color or symbol density.[94][95] Numerous studies about the usability of map applications have taken advantage of eye tracking, too.[96][97]

The cartographic community's daily engagement with visual and spatial data has positioned it to contribute significantly to eye tracking data visualization methods and tools.[98] For example, cartographers have developed methods for integrating eye tracking data with GIS, utilizing GIS software for further visualization and analysis.[99][100] The community has also delivered tools for visualizing eye tracking data,[101][98] as well as a toolbox for the identification of eye fixations based on the spatial component of eye-tracking data.[102]

Privacy concerns

With eye tracking projected to become a common feature in various consumer electronics, including smartphones,[103] laptops[104] and virtual reality headsets,[105][106] concerns have been raised about the technology's impact on consumer privacy.[107][108] With the aid of machine learning techniques, eye tracking data may indirectly reveal information about a user's ethnicity, personality traits, fears, emotions, interests, skills, and physical and mental health condition.[109] If such inferences are drawn without a user's awareness or approval, this can be classified as an inference attack. Eye activities are not always under volitional control, e.g., "stimulus-driven glances, pupil dilation, ocular tremor, and spontaneous blinks mostly occur without conscious effort, similar to digestion and breathing".[109] Therefore, it can be difficult for eye tracking users to estimate or control the amount of information they reveal about themselves.

Notes

  1. ^ Reported in Huey (1908/1968)
  2. ^ Huey, Edmund (1968) [originally published 1908]. The Psychology and Pedagogy of Reading (Reprint ed.). MIT Press.
  3. ^ Buswell, G.T. (1922). "Fundamental reading habits: a study of their development". Supplementary Educational Monographs. No. 21. Chicago: University of Chicago.
  4. ^ Buswell, G.T. (1937). "How adults read". Supplementary Educational Monographs. No. 45. Chicago: University of Chicago.
  5. ^ Buswell, G.T. (1935), How people look at pictures: a study of the psychology and perception in art, University of Chicago Press, Trove 12223957
  6. ^ Yarbus, Alfred L. (1967). Eye movements and vision (PDF). New York: Plenum Press. ISBN 978-1-4899-5379-7. Retrieved 24 March 2022.
  7. ^ a b Yarbus 1967, p. 190
  8. ^ Yarbus 1967, p. 194
  9. ^ Yarbus 1967, p. 191
  10. ^ Yarbus 1967, p. 193
  11. ^ a b "Visual Perception: Eye Movements in Problem Solving". www.learning-systems.ch.
  12. ^ [1] Archived 2011-07-06 at the Wayback Machine
  13. ^ Rayner 1978
  14. ^ Just & Carpenter 1980
  15. ^ Posner, Michael I. (1980). "Orienting of Attention". Quarterly Journal of Experimental Psychology. 32 (1). SAGE Publications: 3–25. doi:10.1080/00335558008248231. ISSN 0033-555X. PMID 7367577. S2CID 2842391.
  16. ^ Wright, R.D.; Ward, L.M. (2008). Orienting of Attention. Oxford University Press. ISBN 978-0-19-802997-7.
  17. ^ a b c Robert J. K. Jacob; Keith S. Karn (2003). "Eye Tracking in Human–Computer Interaction and Usability Research: Ready to Deliver the Promises". In Hyona; Radach; Deubel (eds.). The Mind's Eye: Cognitive and Applied Aspects of Eye Movement Research. Oxford, England: Elsevier Science BV. CiteSeerX 10.1.1.100.445. ISBN 0-444-51020-6.
  18. ^ Schiessl, Michael; Duda, Sabrina; Thölke, Andreas; Fischer, Rico. "Eye tracking and its application in usability and media research" (PDF).
  19. ^ Hoffman, James E. (2016). "Visual attention and eye movements". In Pashler, H. (ed.). Attention. Studies in Cognition. Taylor & Francis. pp. 119–153. ISBN 978-1-317-71549-8.
  20. ^ Deubel, Heiner (1996). "Saccade target selection and object recognition: Evidence for a common attentional mechanism". Vision Research. 36 (12): 1827–1837. doi:10.1016/0042-6989(95)00294-4. PMID 8759451. S2CID 16916037.
  21. ^ Holsanova, Jana (2007). "Användares interaktion med multimodala texter" [User interaction with multimodal texts]. In L. Gunnarsson; A.-M. Karlsson (eds.). Ett vidgat textbegrepp (in Swedish). pp. 41–58.
  22. ^ Cognolato M, Atzori M, Müller H (2018). "Head-mounted eye gaze tracking devices: An overview of modern devices and recent advances". Journal of Rehabilitation and Assistive Technologies Engineering. 5: 205566831877399. doi:10.1177/2055668318773991. PMC 6453044. PMID 31191938.
  23. ^ a b c d Alexander, Robert; Macknik, Stephen; Martinez-Conde, Susana (2020). "Microsaccades in applied environments: Real-world applications of fixational eye movement measurements". Journal of Eye Movement Research. 12 (6). doi:10.16910/jemr.12.6.15. PMC 7962687. PMID 33828760.
  24. ^ a b c d Zhao, Lei; Wang, Zengcai; Zhang, Guoxin; Qi, Yazhou; Wang, Xiaojin (15 November 2017). "Eye state recognition based on deep integrated neural network and transfer learning". Multimedia Tools and Applications. 77 (15): 19415–19438. doi:10.1007/s11042-017-5380-8. ISSN 1380-7501. S2CID 20691291.
  25. ^ Stember, J. N.; Celik, H.; Krupinski, E.; Chang, P. D.; Mutasa, S.; Wood, B. J.; Lignelli, A.; Moonis, G.; Schwartz, L. H.; Jambawalikar, S.; Bagci, U. (August 2019). "Eye Tracking for Deep Learning Segmentation Using Convolutional Neural Networks". Journal of Digital Imaging. 32 (4): 597–604. doi:10.1007/s10278-019-00220-4. ISSN 0897-1889. PMC 6646645. PMID 31044392.
  26. ^ a b c Louedec, Justin Le; Guntz, Thomas; Crowley, James L.; Vaufreydaz, Dominique (2019). "Deep learning investigation for chess player attention prediction using eye-tracking and game data". Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. New York, New York, USA: ACM Press. pp. 1–9. arXiv:1904.08155. Bibcode:2019arXiv190408155L. doi:10.1145/3314111.3319827. ISBN 978-1-4503-6709-7. S2CID 118688325.
  27. ^ Nadu, T (2015). "A review: Towards quality improvement in real time eye-tracking and gaze detection". International Journal of Applied Engineering Research. 10 (6).
  28. ^ Nückles, M (2021). "Investigating visual perception in teaching and learning with advanced eye-tracking methodologies: Rewards and challenges of an innovative research paradigm". Educational Psychology Review. 33 (1): 149–167. doi:10.1007/s10648-020-09567-5. S2CID 225345884.
  29. ^ Alexander, RG; Waite, S; Macknik, SL; Martinez-Conde, S (2020). "What do radiologists look for? Advances and limitations of perceptual learning in radiologic search". Journal of Vision. 20 (10): 17. doi:10.1167/jov.20.10.17. PMC 7571277. PMID 33057623.
  30. ^ Robinson, David A. (October 1963). "A Method of Measuring Eye Movement Using a Scleral Search Coil in a Magnetic Field". IEEE Transactions on Bio-medical Electronics. 10 (4). Institute of Electrical and Electronics Engineers: 137–145. doi:10.1109/tbmel.1963.4322822. ISSN 0096-0616. PMID 14121113.
  31. ^ Crane, H.D.; Steele, C.M. (1985). "Generation-V dual-Purkinje-image eyetracker". Applied Optics. 24 (4): 527–537. Bibcode:1985ApOpt..24..527C. doi:10.1364/AO.24.000527. PMID 18216982. S2CID 10595433.
  32. ^ Elbert, T., Lutzenberger, W., Rockstroh, B., Birbaumer, N., 1985. Removal of ocular artifacts from the EEG. A biophysical approach to the EOG. Electroencephalogr Clin Neurophysiol 60, 455-463.
  33. ^ Keren, A.S.; Yuval-Greenberg, S.; Deouell, L.Y. (2010). "Saccadic spike potentials in gamma-band EEG: Characterization, detection and suppression". NeuroImage. 49 (3): 2248–2263. doi:10.1016/j.neuroimage.2009.10.057. PMID 19874901. S2CID 7106696.
  34. ^ Bulling, A.; Roggen, D.; Tröster, G. (2009). "Wearable EOG goggles: Seamless sensing and context-awareness in everyday environments". Journal of Ambient Intelligence and Smart Environments. 1 (2): 157–171. doi:10.3233/AIS-2009-0020. hdl:20.500.11850/352886. S2CID 18423163.
  35. ^ Sopic, D., Aminifar, A., & Atienza, D. (2018). e-glass: A wearable system for real-time detection of epileptic seizures. In IEEE International Symposium on Circuits and Systems (ISCAS).
  36. ^ Witzner Hansen, Dan; Qiang Ji (March 2010). "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze". IEEE Trans. Pattern Anal. Mach. Intell. 32 (3): 478–500. doi:10.1109/tpami.2009.30. PMID 20075473. S2CID 16489508.
  37. ^ a b Gneo, Massimo; Schmid, Maurizio; Conforto, Silvia; D’Alessio, Tommaso (2012). "A free geometry model-independent neural eye-gaze tracking system". Journal of NeuroEngineering and Rehabilitation. 9 (1): 82. doi:10.1186/1743-0003-9-82. PMC 3543256. PMID 23158726.
  38. ^ The Eye: A Survey of Human Vision; Wikimedia Foundation
  39. ^ Sigut, J; Sidha, SA (February 2011). "Iris center corneal reflection method for gaze tracking using visible light". IEEE Transactions on Bio-Medical Engineering. 58 (2): 411–9. doi:10.1109/tbme.2010.2087330. PMID 20952326. S2CID 206611506.
  40. ^ Hua, H; Krishnaswamy, P; Rolland, JP (15 May 2006). "Video-based eyetracking methods and algorithms in head-mounted displays". Optics Express. 14 (10): 4328–50. Bibcode:2006OExpr..14.4328H. doi:10.1364/oe.14.004328. PMID 19516585.
  41. ^ Purves, D; et al. (2001). "What Eye Movements Accomplish". Neuroscience (2nd ed.). Sunderland, MA: Sinauer Assocs.
  42. ^ Majaranta, P., Aoki, H., Donegan, M., Hansen, D.W., Hansen, J.P., Hyrskykari, A., Räihä, K.J., Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies, IGI Global, 2011
  43. ^ Nielsen, J.; Pernice, K. (2010). Eyetracking Web Usability. New Riders Publishing. p. 11. ISBN 978-0-321-71407-7. Retrieved 28 October 2013.
  44. ^ Le Meur, O; Baccino, T (2013). "Methods for comparing scanpaths and saliency maps: strengths and weaknesses". Behavior Research Methods. 45 (1).
  45. ^ Aharonson V, Coopoo V, Govender K, Postema M (2020). "Automatic pupil detection and gaze estimation using the vestibulo-ocular reflex in a low-cost eye-tracking setup". SAIEE Africa Research Journal. 111 (3): 120–124. doi:10.23919/SAIEE.2020.9142605.
  46. ^ Einhäuser, W; Schumann, F; Bardins, S; Bartl, K; Böning, G; Schneider, E; König, P (2007). "Human eye-head co-ordination in natural exploration". Network: Computation in Neural Systems. 18 (3): 267–297. doi:10.1080/09548980701671094. PMID 17926195. S2CID 1812177.
  47. ^ Andersen, R. A.; Bracewell, R. M.; Barash, S.; Gnadt, J. W.; Fogassi, L. (1990). "Eye position effects on visual, memory, and saccade-related activity in areas LIP and 7a of macaque". Journal of Neuroscience. 10 (4): 1176–1196. doi:10.1523/JNEUROSCI.10-04-01176.1990. PMC 6570201. PMID 2329374. S2CID 18817768.
  48. ^ Ferhat, Onur; Vilariño, Fernando (2016). "Low Cost Eye Tracking: The Current Panorama". Computational Intelligence and Neuroscience. 2016: 1–14. doi:10.1155/2016/8680541. PMC 4808529. PMID 27034653.
  49. ^ Hunziker 2006. Based on data from: Cohen, A. S. (1983). Informationsaufnahme beim Befahren von Kurven, Psychologie für die Praxis 2/83, Bulletin der Schweizerischen Stiftung für Angewandte Psychologie.
  50. ^ Cohen, A. S. (1983). Informationsaufnahme beim Befahren von Kurven, Psychologie für die Praxis 2/83, Bulletin der Schweizerischen Stiftung für Angewandte Psychologie
  51. ^ Pictures from Hunziker 2006
  52. ^ Grüner, M; Ansorge, U (2017). "Mobile eye tracking during real-world night driving: A selective review of findings and recommendations for future research". Journal of Eye Movement Research. 10 (2). doi:10.16910/JEMR.10.2.1. PMC 7141062. PMID 33828651.
  53. ^ Itoh, Nana; Fukuda, Tadahiko (2002). "Comparative Study of Eye Movements in Extent of Central and Peripheral Vision and Use by Young and Elderly Walkers". Perceptual and Motor Skills. 94 (3_suppl): 1283–1291. doi:10.2466/pms.2002.94.3c.1283. PMID 12186250. S2CID 1058879.
  54. ^ Duchowski, A. T. (2002). "A breadth-first survey of eye-tracking applications". Behavior Research Methods, Instruments, & Computers. 34 (4): 455–470. doi:10.3758/BF03195475. PMID 12564550. S2CID 4361938.
  55. ^ Rogers, Sol. "Seven Reasons Why Eye-tracking Will Fundamentally Change VR". Forbes. Retrieved 16 December 2021.
  56. ^ Lohse, Gerald; Wu, D. J. (1 February 2001). "Eye Movement Patterns on Chinese Yellow Pages Advertising". Electronic Markets. 11 (2): 87–96. doi:10.1080/101967801300197007. S2CID 1064385.
  57. ^ "Eye Tracking Study: The Importance of Using Google Authorship in Search Results"
  58. ^ "3 seconds is enough to screen candidate's profile. Eye tracking research results". Element's Blog - nowości ze świata rekrutacji, HR Tech i Element (in Polish). 21 February 2019. Retrieved 3 April 2021.
  59. ^ Corno, F.; Farinetti, L.; Signorile, I. (August 2002). "A cost-effective solution for eye-gaze assistive technology". Proceedings. IEEE International Conference on Multimedia and Expo. Vol. 2. pp. 433–436. doi:10.1109/ICME.2002.1035632. ISBN 0-7803-7304-9. S2CID 42361339. Retrieved 5 August 2020.
  60. ^ Pinheiro, C.; Naves, E. L.; Pino, P.; Lesson, E.; Andrade, A.O.; Bourhis, G. (July 2011). "Alternative communication systems for people with severe motor disabilities: a survey". BioMedical Engineering OnLine. 10 (1): 31. doi:10.1186/1475-925X-10-31. PMC 3103465. PMID 21507236.
  61. ^ Saunders, M.D.; Smagner, J.P.; Saunders, R.R. (August 2003). "Improving methodological and technological analyses of adaptive switch use of individuals with profound multiple impairments". Behavioral Interventions. 18 (4): 227–243. doi:10.1002/bin.141.
  62. ^ "Cerebral Palsy (CP)". Retrieved 4 August 2020.
  63. ^ Wilkinson, K.M.; Mitchell, T. (March 2014). "Eye tracking research to answer questions about augmentative and alternative communication assessment and intervention". Augmentative and Alternative Communication. 30 (2): 106–119. doi:10.3109/07434618.2014.904435. PMC 4327869. PMID 24758526.
  64. ^ Galante, A.; Menezes, P. (June 2012). "A gaze-based interaction system for people with cerebral palsy". Procedia Technology. 5: 895–902. doi:10.1016/j.protcy.2012.09.099.
  65. ^ BLISCHAK, D.; LOMBARDINO, L.; DYSON, A. (June 2003). "Use of speech-generating devices: In support of natural speech". Augmentative and Alternative Communication. 19 (1): 29–35. doi:10.1080/0743461032000056478. PMID 28443791. S2CID 205581902.
  66. ^ Sharma, V.K.; Murthy, L. R. D.; Singh Saluja, K.; Mollyn, V.; Sharma, G.; Biswas, Pradipta (August 2020). "Webcam controlled robotic arm for persons with SSMI". Technology and Disability. 32 (3): 179–197. arXiv:2005.11994. doi:10.3233/TAD-200264. S2CID 218870304. Retrieved 5 August 2020.
  67. ^ Eid, M.A.; Giakoumidis, N.; El Saddik, A. (July 2016). "A novel eye-gaze-controlled wheelchair system for navigating unknown environments: case study with a person with ALS". IEEE Access. 4: 558–573. Bibcode:2016IEEEA...4..558E. doi:10.1109/ACCESS.2016.2520093. S2CID 28210837.
  68. ^ Jeevithashree, D. V.; Saluja, K.S.; Biswas, Pradipta (December 2019). "A case study of developing gaze-controlled interface for users with severe speech and motor impairment". Technology and Disability. 31 (1–2): 63–76. doi:10.3233/TAD-180206. S2CID 199083245. Retrieved 5 August 2020.
  69. ^ Jones, M.W.; Obregón, M.; Kelly, M.L.; Branigan, H.P. (May 2008). "Elucidating the component processes involved in dyslexic and non-dyslexic reading fluency: An eye-tracking study". Cognition. 109 (3): 389–407. doi:10.1016/j.cognition.2008.10.005. PMID 19019349. S2CID 29389144. Retrieved 5 August 2020.
  70. ^ Calhoun, G. L; Janson (1991). "Eye line-of-sight control compared to manual selection of discrete switches". Armstrong Laboratory Report AL-TR-1991-0015.
  71. ^ Fitts, P.M.; Jones, R.E.; Milton, J.L (1950). "Eye movements of aircraft pilots during instrument-landing approaches". Aeronaut. Eng. Rev. Retrieved 20 July 2020.
  72. ^ Peysakhovich, V.; Lefrançois, O.; Dehais, F.; Causse, M. (2018). "The neuroergonomics of aircraft cockpits: the four stages of eye-tracking integration to enhance flight safety". Safety. 4 (1): 8. doi:10.3390/safety4010008.
  73. ^ a b c de Reus, A.J.C.; Zon, R.; Ouwerkerk, R. (November 2012). "Exploring the use of an eye tracker in a helmet mounted display". National Aerospace Laboratory Technical Report NLR-TP-2012-001.
  74. ^ DV, JeevithaShree; Murthy, L R.D.; Saluja, K. S.; Biswas, P. (2018). "Operating different displays in military fast jets using eye gaze tracker". Journal of Aviation Technology and Engineering. 8 (4). Retrieved 24 July 2020.
  75. ^ Babu, M.; D V, JeevithaShree; Prabhakar, G.; Saluja, K.P.; Pashilkar, A.; Biswas, P. (2019). "Estimating pilots' cognitive load from ocular parameters through simulation and in-flight studies". Journal of Eye Movement Research. 12 (3). doi:10.16910/jemr.12.3.3. PMC 7880144. PMID 33828735. Retrieved 3 August 2020.
  76. ^ Peißl, S.; Wickens, C. D.; Baruah, R. (2018). "Eye-tracking measures in aviation: A selective literature review". The International Journal of Aerospace Psychology. 28 (3–4): 98–112. doi:10.1080/24721840.2018.1514978. S2CID 70016458.
  77. ^ "Visual-Manual NHTSA Driver Distraction Guidelines for In-Vehicle Electronic Devices".
  78. ^ US patent 8928585B2, Mondragon, Christopher K. & Bleacher, Brett, "Eye tracking control of vehicle entertainment systems", issued 2015-01-06, assigned to Thales Avionics Inc 
  79. ^ Poitschke, T.; Laquai, F.; Stamboliev, S.; Rigoll, G. (2011). "Gaze-based interaction on multiple displays in an automotive environment" (PDF). 2011 IEEE International Conference on Systems, Man, and Cybernetics. pp. 543–548. doi:10.1109/ICSMC.2011.6083740. ISBN 978-1-4577-0653-0. ISSN 1062-922X. S2CID 9362329.
  80. ^ Prabhakar, G.; Ramakrishnan, A.; Murthy, L.; Sharma, V.K.; Madan, M.; Deshmukh, S.; Biswas, P. (2020). "Interactive Gaze & Finger controlled HUD for Cars". Journal of Multimodal User Interface. 14: 101–121. doi:10.1007/s12193-019-00316-9. S2CID 208261516.
  81. ^ Marshall, S. (2002). "The Index of Cognitive Activity: Measuring cognitive workload". Proceedings of the IEEE 7th Conference on Human Factors and Power Plants. pp. 7-5–7-9. doi:10.1109/HFPP.2002.1042860. ISBN 0-7803-7450-9. S2CID 44561112.
  82. ^ Duchowski, A. T.; Biele, C.; Niedzielska, A.; Krejtz, K.; Krejtz, I.; Kiefer, P.; Raubal, M.; Giannopoulos, I. (2018). "The Index of Pupillary Activity Measuring Cognitive Load vis-à-vis Task Difficulty with Pupil Oscillation". ACM SIGCHI Conference on Human Factors. doi:10.1145/3173574.3173856. S2CID 5064488.
  83. ^ Prabhakar, G.; Mukhopadhyay, A.; Murthy, L.; Modiksha, M. A. D. A. N.; Biswas, P. (2020). "Cognitive load estimation using Ocular Parameters in Automotive". Transportation Engineering. 2: 100008. doi:10.1016/j.treng.2020.100008.
  84. ^ McGuire, Keegan (8 April 2021). "What The Critics Are Saying About Before Your Eyes". looper.com. Archived from the original on 23 April 2021.
  85. ^ von Au, Caspar (24 April 2021). "Computerspiel "Before Your Eyes" wird mit den Augen gesteuert" [Video game "Before Your Eyes" is controlled with your eyes]. Bayerischer Rundfunk (in German). Archived from the original on 26 April 2021.
  86. ^ Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A. (2017). "How visual cognition influences process model comprehension". Decision Support Systems. C (96): 1–16. doi:10.1016/j.dss.2017.01.005. ISSN 0167-9236.
  87. ^ Sözen, Nergiz; Say, Bilge; Kılıç, Özkan (27 November 2020). "An Experimental Study Towards Investigating the Effect of Working Memory Capacity on Complex Diagram Understandability". TEM Journal. Association for Information Communication Technology Education and Science: 1384–1395. doi:10.18421/tem94-09. ISSN 2217-8333. S2CID 229386117.
  88. ^ Krassanakis, Vassilios; Cybulski, Paweł (14 June 2021). "Eye Tracking Research in Cartography: Looking into the Future". ISPRS International Journal of Geo-Information. 10 (6): 411. Bibcode:2021IJGI...10..411K. doi:10.3390/ijgi10060411. ISSN 2220-9964.
  89. ^ Popelka, Stanislav; Brychtova, Alzbeta (2013). "Eye-tracking Study on Different Perception of 2D and 3D Terrain Visualisation". The Cartographic Journal. 50 (3): 240–246. Bibcode:2013CartJ..50..240P. doi:10.1179/1743277413Y.0000000058. ISSN 0008-7041. S2CID 128975149.
  90. ^ Herman, Lukas; Popelka, Stanislav; Hejlova, Vendula (31 May 2017). "Eye-tracking Analysis of Interactive 3D Geovisualization". Journal of Eye Movement Research. 10 (3). doi:10.16910/jemr.10.3.2. ISSN 1995-8692. PMC 7141050. PMID 33828655.
  91. ^ Ooms, K.; De Maeyer, P.; Fack, V. (22 November 2013). "Study of the attentive behavior of novice and expert map users using eye tracking". Cartography and Geographic Information Science. 41 (1): 37–54. doi:10.1080/15230406.2013.860255. hdl:1854/LU-4252541. ISSN 1523-0406. S2CID 11087520.
  92. ^ Beitlova, Marketa; Popelka, Stanislav; Vozenilek, Vit (19 August 2020). "Differences in Thematic Map Reading by Students and Their Geography Teacher". ISPRS International Journal of Geo-Information. 9 (9): 492. Bibcode:2020IJGI....9..492B. doi:10.3390/ijgi9090492. ISSN 2220-9964.
  93. ^ Burian, Jaroslav; Popelka, Stanislav; Beitlova, Marketa (17 May 2018). "Evaluation of the Cartographical Quality of Urban Plans by Eye-Tracking". ISPRS International Journal of Geo-Information. 7 (5): 192. Bibcode:2018IJGI....7..192B. doi:10.3390/ijgi7050192. ISSN 2220-9964.
  94. ^ Brychtova, Alzbeta; Coltekin, Arzu (30 June 2016). "An Empirical User Study for Measuring the Influence of Colour Distance and Font Size in Map Reading Using Eye Tracking". The Cartographic Journal. 53 (3): 202–212. Bibcode:2016CartJ..53..202B. doi:10.1179/1743277414y.0000000103. ISSN 0008-7041. S2CID 18911777.
  95. ^ Cybulski, Paweł (9 January 2020). "Spatial distance and cartographic background complexity in graduated point symbol map-reading task". Cartography and Geographic Information Science. 47 (3): 244–260. Bibcode:2020CGISc..47..244C. doi:10.1080/15230406.2019.1702102. ISSN 1523-0406. S2CID 213161788.
  96. ^ Manson, Steven M.; Kne, Len; Dyke, Kevin R.; Shannon, Jerry; Eria, Sami (2012). "Using Eye-tracking and Mouse Metrics to Test Usability of Web Mapping Navigation". Cartography and Geographic Information Science. 39 (1): 48–60. Bibcode:2012CGISc..39...48M. doi:10.1559/1523040639148. ISSN 1523-0406. S2CID 131449617.
  97. ^ Popelka, Stanislav; Vondrakova, Alena; Hujnakova, Petra (30 May 2019). "Eye-tracking Evaluation of Weather Web Maps". ISPRS International Journal of Geo-Information. 8 (6): 256. Bibcode:2019IJGI....8..256P. doi:10.3390/ijgi8060256. ISSN 2220-9964.
  98. ^ a b Vojtechovska, Michaela; Popelka, Stanislav (12 August 2023). "GazePlotter – tool for eye movement sequences visualization". Abstracts of the ICA. 6: 264–. Bibcode:2023AbICA...6..264V. doi:10.5194/ica-abs-6-264-2023. ISSN 2570-2106.
  99. ^ Sultan, Minha Noor; Popelka, Stanislav; Strobl, Josef (24 June 2022). "ET2Spatial – software for georeferencing of eye movement data". Earth Science Informatics. 15 (3): 2031–2049. Bibcode:2022EScIn..15.2031S. doi:10.1007/s12145-022-00832-5. ISSN 1865-0473. S2CID 249961269.
  100. ^ Göbel, Fabian; Kiefer, Peter; Raubal, Martin (2 May 2019). "Correction to: FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps". GeoInformatica. 24 (4): 1061–1062. doi:10.1007/s10707-019-00352-3. ISSN 1384-6175. S2CID 155184852.
  101. ^ Dolezalova, Jitka; Popelka, Stanislav (5 August 2016). "ScanGraph: A Novel Scanpath Comparison Method Using Visualisation of Graph Cliques". Journal of Eye Movement Research. 9 (4). doi:10.16910/jemr.9.4.5. ISSN 1995-8692.
  102. ^ Krassanakis, Vassilios; Filippakopoulou, Vassiliki; Nakos, Byron (21 February 2014). "EyeMMV toolbox: An eye movement post-analysis tool based on a two-step spatial dispersion threshold for fixation identification". Journal of Eye Movement Research. 7 (1). doi:10.16910/jemr.7.1.1. ISSN 1995-8692. S2CID 38319871.
  103. ^ Dickson, Ben (19 February 2017). "Unlocking the potential of eye tracking technology". TechCrunch. Retrieved 8 April 2021.
  104. ^ Reddy, Venkateshwar (15 April 2019). "Eye Tracking Technology: Applications & Future Scope". IndustryARC. Retrieved 8 April 2021.
  105. ^ Rogers, Sol (5 February 2019). "Seven Reasons Why Eye-tracking Will Fundamentally Change VR". Forbes. Retrieved 13 May 2020.
  106. ^ Stein, Scott (31 January 2020). "Eye tracking is the next phase for VR, ready or not". CNET. Retrieved 8 April 2021.
  107. ^ Stanley, Jay (6 May 2013). "The Privacy-Invading Potential of Eye Tracking Technology". American Civil Liberties Union. Retrieved 8 April 2021.
  108. ^ Blain, Loz (29 March 2021). "Eye tracking can reveal an unbelievable amount of information about you". New Atlas. Retrieved 8 April 2021.
  109. ^ a b Kröger, Jacob Leon; Lutz, Otto Hans-Martin; Müller, Florian (2020). "What Does Your Gaze Reveal About You? On the Privacy Implications of Eye Tracking". Privacy and Identity Management. Data for Better Living: AI and Privacy. IFIP Advances in Information and Communication Technology. Vol. 576. Cham: Springer International Publishing. pp. 226–241. doi:10.1007/978-3-030-42504-3_15. ISBN 978-3-030-42503-6. ISSN 1868-4238.

References

  • Cornsweet, TN; Crane, HD (1973). "Accurate two-dimensional eye tracker using first and fourth Purkinje images". J Opt Soc Am. 63 (8): 921–8. Bibcode:1973JOSA...63..921C. doi:10.1364/josa.63.000921. PMID 4722578. S2CID 14866408.
  • Cornsweet, TN (1958). "New technique for the measurement of small eye movements". JOSA. 48 (11): 808–811. Bibcode:1958JOSA...48..808C. doi:10.1364/josa.48.000808. PMID 13588456.
  • Hunziker, Hans-Werner (2006). Im Auge des Lesers: foveale und periphere Wahrnehmung – vom Buchstabieren zur Lesefreude [In the eye of the reader: foveal and peripheral perception – from letter recognition to the joy of reading] (in German). Transmedia Stäubli Verlag Zürich. ISBN 978-3-7266-0068-6.
  • Just, MA; Carpenter, PA (1980). "A theory of reading: from eye fixation to comprehension" (PDF). Psychol Rev. 87 (4): 329–354. doi:10.1037/0033-295x.87.4.329. PMID 7413885. S2CID 3793521.
  • Rayner, K (1978). "Eye movements in reading and information processing". Psychological Bulletin. 85 (3): 618–660. CiteSeerX 10.1.1.294.4262. doi:10.1037/0033-2909.85.3.618. PMID 353867.
  • Rayner, K (1998). "Eye movements in reading and information processing: 20 years of research". Psychological Bulletin. 124 (3): 372–422. CiteSeerX 10.1.1.211.3546. doi:10.1037/0033-2909.124.3.372. PMID 9849112.
  • Romano Bergstrom, Jennifer (2014). Eye Tracking in User Experience Design. Morgan Kaufmann. ISBN 978-0-12-408138-3.
  • Bojko, Aga (2013). Eye Tracking the User Experience: A Practical Guide to Research. Rosenfeld Media. ISBN 978-1-933820-10-1.

Commercial eye tracking

  • Pieters, R.; Wedel, M. (2007). "Goal Control of Visual Attention to Advertising: The Yarbus Implication". Journal of Consumer Research. 34 (2): 224–233. CiteSeerX 10.1.1.524.9550. doi:10.1086/519150.
  • Pieters, R.; Wedel, M. (2004). "Attention Capture and Transfer by elements of Advertisements". Journal of Marketing. 68 (2): 36–50. CiteSeerX 10.1.1.115.3006. doi:10.1509/jmkg.68.2.36.27794. S2CID 15259684.

