Name: Journal of Eye Movement Research
Type: Journal
Items: 229 Publications
Compatibility: OpenAIRE 3.0 (OA, funding)
OAI-PMH endpoint: https://bop.unibe.ch/index.php/JEMR/oai
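
The OAI-PMH base URL above can be harvested with any standard OAI-PMH client. The following is a minimal sketch, assuming Python 3.9+ with the third-party requests package: it issues a single ListRecords request for Dublin Core (oai_dc) metadata and prints the record titles. Only the endpoint URL comes from the listing above; resumption-token paging and error handling are left out.

# Minimal harvesting sketch for the OAI-PMH endpoint listed above.
# Assumes Python 3.9+ and the third-party `requests` package; fetches one
# page of ListRecords in Dublin Core format and prints the dc:title values.
import xml.etree.ElementTree as ET
import requests

BASE_URL = "https://bop.unibe.ch/index.php/JEMR/oai"
DC_TITLE = "{http://purl.org/dc/elements/1.1/}title"

def list_record_titles(base_url: str) -> list[str]:
    """Return the titles from the first page of a ListRecords response."""
    response = requests.get(
        base_url,
        params={"verb": "ListRecords", "metadataPrefix": "oai_dc"},
        timeout=30,
    )
    response.raise_for_status()
    root = ET.fromstring(response.content)
    return [el.text for el in root.iter(DC_TITLE) if el.text]

if __name__ == "__main__":
    for title in list_record_titles(BASE_URL):
        print(title)

An Identify request (verb=Identify) against the same URL returns the repository's self-description and is a quick way to check that the endpoint is reachable.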


  • Systematic tendencies in scene viewing

    While many current models of scene perception debate the relative roles of low- and high-level factors in eye guidance, systematic tendencies in how the eyes move may be informative. We consider how each saccade and fixation is influenced by that which preceded or followed it, during free inspection of images of natural scenes. We find evidence to suggest periods of localized scanning separated by ‘global’ relocations to new regions of the scene. We also find evidence to support the existence ...

  • Dynamic programming for re-mapping noisy fixations in translation tasks

    Eyetrackers which allow for free head movements are in many cases imprecise to the extent that reading patterns become heavily distorted. The poor usability and interpretability of these gaze patterns is corroborated by a "naïve" fixation-to-symbol mapping, which often wrongly maps the possibly drifted center of the observed fixation onto the symbol directly below it. In this paper I extend this naïve fixation-to-symbol mapping by introducing background knowledge about the translation task. I... (A minimal sketch of this naïve baseline mapping appears after this list.)

  • Microsaccades under monocular viewing conditions

    Among the eye movements during fixation, the function of small saccades occurring quite commonly at fixation is still unclear. It has been reported that a substantial number of these microsaccades seem to occur in only one of the eyes. The aim of the present study is to investigate microsaccades in monocular stimulation conditions. Although this is an artificial test condition which does not occur in natural vision, this monocular presentation paradigm allows for a critical test of a presumpti...

  • Speed and Accuracy of Gaze Gestures

    We conducted an experiment where participants carried out six gaze gesture tasks. The gaze paths were analyzed to find out the speed and accuracy of the gaze gestures. As a result, the gaze gestures took more time than we anticipated and only the very fastest participants got close to what was expected. There was not much difference in performance times between small and large gaze gestures, because the participants reached significantly faster speed when making large gestures than small gest...

  • Measuring Attention in Second Language Reading Using Eye-tracking: The Case of the Noticing Hypothesis

    Taking Schmidt’s (1990) noticing hypothesis as a point of departure, this study aims to measure attention and learning gains during second language (L2) reading by making use of eye-tracking methodology. Relying on Robinson’s hierarchical memory model (1995, 2003), it is hypothesized that vocabulary learning and attention are closely associated. After a vocabulary pre-test, seventy-five learners of English read a standard text individually while their eye movements were being recorded followed b...
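
The "Dynamic programming for re-mapping noisy fixations in translation tasks" entry above contrasts its approach with a naïve fixation-to-symbol mapping that assigns each (possibly drifted) fixation centre to the symbol directly beneath it. As a point of reference only, here is a minimal sketch of one such naïve mapping (closest symbol centre to the fixation point), assuming Python 3.9+. The Symbol and Fixation structures and the example coordinates are hypothetical illustrations, not taken from the paper, and the paper's dynamic-programming re-mapping itself is not shown.

# Hypothetical sketch of a naïve fixation-to-symbol mapping: each observed
# fixation centre is assigned to the nearest symbol centre, with no correction
# for calibration drift. Data structures are illustrative only.
from dataclasses import dataclass
import math

@dataclass
class Symbol:
    text: str
    x: float  # bounding-box centre, screen coordinates (pixels)
    y: float

@dataclass
class Fixation:
    x: float  # observed (possibly drifted) fixation centre
    y: float
    duration_ms: float

def naive_mapping(fixations: list[Fixation], symbols: list[Symbol]) -> list[Symbol]:
    """Map each fixation to the symbol whose centre is closest (Euclidean distance)."""
    mapped = []
    for f in fixations:
        nearest = min(symbols, key=lambda s: math.hypot(s.x - f.x, s.y - f.y))
        mapped.append(nearest)
    return mapped

# Example: a slightly drifted fixation still maps to the closest symbol,
# even if the reader was actually looking at a neighbouring word.
symbols = [Symbol("eye", 100, 200), Symbol("movement", 180, 200), Symbol("research", 280, 200)]
fixations = [Fixation(175, 215, 240)]
print([s.text for s in naive_mapping(fixations, symbols)])  # -> ['movement']
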
No research data records were found for this data provider.
Latest Documents Timeline (chart)

Document Types (chart)