Research

 
The research of this programme focuses on the interaction between sensory perception and motor control. We study haptic and visual perception, hand and eye movements, and reaching, grasping, and manipulating objects.

 

Current research projects


  • H-Haptics is an STW Perspectief programme that aims to develop a more intuitive haptic force-feedback system.
     
  • THE Hand Embodied is an EU-funded project that studies how human and robotic hands interact with the outside world, in order to improve prosthetics and tele-operation.
     
  • Automatic annotation of Exploratory Procedures (Sander Jansen)
    Active movement of the hand and fingers enables dynamic exploration of objects and materials. A set of stereotyped movement patterns called Exploratory Procedures (EPs) has been linked to the acquisition of knowledge about specific object properties. For example, lateral motion is the optimal EP for acquiring roughness information, while compliance is best estimated by applying pressure.
    At present, analysis of haptic exploratory behavior requires manual annotation of EPs during video observation, a laborious and time-consuming task. Our goal is to construct a system capable of automatic EP annotation by analyzing a set of parameters extracted from the position and orientation of a few markers on the hands and the stimulus.
     

    Figure: Schematic overview of the proposed method. First, the positions and orientations of the markers are gathered. Second, kinematic parameters are computed from these data. Third, intervals of exploratory procedures (EPs) are annotated by comparing the parameters to a set of predefined criteria for each EP.
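The three-step pipeline described above can be sketched in a few lines of code. This is only an illustration: the kinematic parameters (fingertip speed, contact force), the thresholds, and the classification rules below are hypothetical stand-ins for the project's actual predefined criteria.

```python
# Illustrative sketch (not the project's implementation): rule-based
# annotation of Exploratory Procedures from marker kinematics.
# All thresholds and parameter choices here are hypothetical.

def annotate_eps(positions, normal_forces, dt=0.01,
                 speed_threshold=0.05, force_threshold=2.0):
    """Label each frame with a candidate EP.

    positions     -- fingertip marker position per frame, (x, y, z) in metres
    normal_forces -- contact force per frame in newtons
    """
    labels = []
    for i in range(1, len(positions)):
        # Step 2: compute kinematic parameters from the marker data.
        dx, dy, dz = (positions[i][k] - positions[i - 1][k] for k in range(3))
        speed = (dx**2 + dy**2 + dz**2) ** 0.5 / dt
        force = normal_forces[i]
        # Step 3: compare the parameters against per-EP criteria.
        if speed > speed_threshold and force < force_threshold:
            labels.append("lateral motion")   # roughness exploration
        elif force >= force_threshold and speed <= speed_threshold:
            labels.append("pressure")         # compliance exploration
        else:
            labels.append("unclassified")
    return labels

# Example: slow, forceful contact followed by fast, light stroking.
pos = [(0, 0, 0), (0, 0, 0.0001), (0, 0, 0.0002), (0.01, 0, 0), (0.02, 0, 0)]
forces = [3.0, 3.0, 3.0, 0.5, 0.5]
print(annotate_eps(pos, forces))
```

In a full system the same frame-wise labels would then be merged into the annotated intervals shown in the figure.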
     
  • Human motor control in grasping
    Dimitris Voudouris is mainly interested in human motor control, and particularly in the control of grasping movements. In recent years he has examined the selection of grasping points: the positions on an object where people’s digits make contact with it. Recently he started working on the role of eye movements in grasping. Do people look at their grasping points, and, if so, which of the two do they fixate?
     

    The figure shows the fixations made by one participant when grasping a cube. Because fixation strategies vary across participants, Dimitris now aims to find out what the fixations depend on. His research is based on the analysis of arm and body kinematics, as well as of head-free eye movements.
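One elementary step in such an analysis is deciding which of the two grasping points, if either, a given fixation falls on. The sketch below is a hypothetical illustration of that step; the function name, the 2-D coordinates, and the 2 cm tolerance are invented, not taken from the actual analysis pipeline.

```python
# Hypothetical sketch: assign a fixation to the nearest grasping point
# (thumb or index-finger contact), if it is close enough. Units: metres.
import math

def classify_fixation(gaze, thumb_point, index_point, max_dist=0.02):
    """Return which grasp point the gaze falls on, or 'neither'."""
    d_thumb = math.dist(gaze, thumb_point)
    d_index = math.dist(gaze, index_point)
    if min(d_thumb, d_index) > max_dist:
        return "neither"
    return "thumb" if d_thumb < d_index else "index finger"

# Gaze lands 5 mm from the index-finger contact point.
print(classify_fixation((0.10, 0.05), (0.06, 0.05), (0.105, 0.05)))
```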
     
  • Control of goal-directed arm movements (Dinant Kistemaker)
    In order to achieve accurate limb movements, the central nervous system (CNS) must generate appropriate time-varying muscle activation patterns. The goal of this project is to understand how the human motor system achieves this. We study patterns of muscle activation and kinematics during (multi-joint) arm movements over the course of motor learning. In addition, we use mathematical models of the human neuro-musculoskeletal system to predict behavior under the experimental conditions and under various motor control hypotheses. We try to provide answers to questions like:
    • Does the CNS control movements according to the Equilibrium Point Hypothesis?
    • Does the CNS minimize energy consumption in arm movements?
    • Does the CNS select a kinematic path separate from generating muscle activation patterns?


    Figure: Example of experimental setup and schematic drawing of musculoskeletal model of the arm.
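The intuition behind the Equilibrium Point Hypothesis mentioned above can be conveyed with a toy model; this is not the project's musculoskeletal model, and all parameter values are invented. The limb is treated as a damped spring pulled toward a centrally set equilibrium angle, so shifting that angle produces a movement without explicitly programming the time-varying muscle activations.

```python
# Toy illustration of the Equilibrium Point Hypothesis (hypothetical
# parameters, not the project's model): a damped spring drives the joint
# angle toward whatever equilibrium the CNS is assumed to set.

def simulate_eph(eq_angle, theta0=0.0, stiffness=20.0, damping=6.0,
                 dt=0.001, steps=5000):
    """Euler-integrate I*theta'' = -k*(theta - eq) - b*theta'."""
    inertia = 0.1                      # kg*m^2, invented forearm inertia
    theta, omega = theta0, 0.0
    for _ in range(steps):
        torque = -stiffness * (theta - eq_angle) - damping * omega
        omega += (torque / inertia) * dt
        theta += omega * dt
    return theta

# Shifting the equilibrium to 1.0 rad moves the simulated limb there.
final = simulate_eph(eq_angle=1.0)
print(round(final, 2))  # → 1.0
```

Real models of this kind add muscle dynamics and activation patterns, which is where the competing hypotheses listed above make different predictions.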
     
  • From grouping to haptic object perception (Myrthe Plaisier)
    To handle an object, we need to know what we are holding in our hand. When we move, rotate and squeeze what we have in our hand, the brain receives a stream of tactile inputs from locations distributed over the hand. To process this haptic (i.e., touch) information in a useful way, information from the same object needs to be grouped together and segmented from other inputs. It is currently unknown how this crucial grouping step is performed. In this NWO Veni project I aim to map out the principles that govern this process. This predominantly fundamental knowledge could advance the development of technical applications such as tactile displays and robotic hands.
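One candidate grouping cue is spatial proximity: tactile inputs that originate close together are likely to come from the same object. The sketch below illustrates that idea with generic single-linkage grouping of contact points; it is an invented example, not the project's model, and the 3 cm radius is arbitrary.

```python
# Illustrative proximity-based grouping of tactile contact points
# (invented example, not the project's model).
import math

def group_contacts(points, radius=0.03):
    """Cluster 3-D contact points; points within `radius` of any member
    of a group join that group (single linkage)."""
    groups = []
    for p in points:
        # Find every existing group that p is close to, then merge them.
        merged = [g for g in groups
                  if any(math.dist(p, q) <= radius for q in g)]
        new_group = [p] + [q for g in merged for q in g]
        groups = [g for g in groups if g not in merged] + [new_group]
    return groups

# Two fingertips on one object, a third touching something farther away.
contacts = [(0.00, 0.0, 0.0), (0.02, 0.0, 0.0), (0.20, 0.0, 0.0)]
print(len(group_contacts(contacts)))  # → 2 candidate objects
```

The open empirical question is which cues (proximity, common motion, texture continuity, and so on) the brain actually uses, and how they are weighted.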
     
  • Illusions in the brain: a new approach to understanding visual processing (Anouk de Brouwer)
    I have been a PhD student in the labs of Pieter Medendorp and Jeroen Smeets (Faculty of Human Movement Sciences, VU University Amsterdam) since August 2011. I’m interested in how the brain creates the representation of the world around us that we need for perception and for planning and guiding our actions (such as grasping a cup). The main question of my PhD project is whether our brain processes visual information independently for perception (ventral stream) and action (dorsal stream), or uses a common representation for both, a topic that has been hotly debated over the last few decades. I study this problem using motor and perceptual tasks involving visual illusions. By combining behavioural measures (e.g., eye movements, perceptual judgments) with neuroimaging (fMRI), I want to gain more insight into how and where our brain processes visual information.
     
  • Motion sickness and postural stability (Astrid Lubeck)
    In today's society we are frequently confronted with visual motion patterns: when watching a movie or television, playing video games, or operating a simulator. These motion patterns can create a powerful sensation of self-motion. Negative side effects of exposure to such motion include motion sickness, dizziness, and postural instability.
    These side effects can substantially impair performance: motion sickness may force a person to quit simulator training, or lead them to avoid situations that involve exposure to visual motion patterns.
    We believe that most of these side effects can be explained by a theoretical framework in which expectations about sensory input (an internal model) play a role in regulating motion sickness. When actual and expected sensory input do not match, motion sickness and/or postural unsteadiness may occur.
    In summary, this project focuses on the side effects of exposure to visual motion patterns, using a theoretical framework that takes expectations about sensory input into account. The most important questions we want to answer are:
    • What is the temporal relationship between motion sickness and postural stability?
    • What is the causal relationship between motion sickness and postural stability?
    • How do adaptation and re-adaptation to novel (real or virtual) environments occur?
    • To what extent can visually induced pathologies be explained by a wrongly calibrated internal model?
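The internal-model idea above can be illustrated with a minimal toy computation; the accumulation rule, gain, and decay constant below are invented for illustration and are not a fitted model of motion sickness.

```python
# Toy version of the internal-model idea (hypothetical parameters): a
# "sickness" state accumulates when actual sensory input deviates from
# the input the internal model expects, and decays otherwise.

def sickness_level(actual, expected, gain=1.0, decay=0.95):
    """Leaky accumulation of sensory conflict over successive samples."""
    level = 0.0
    for a, e in zip(actual, expected):
        conflict = abs(a - e)
        level = decay * level + gain * conflict
    return level

# Visual self-motion on a screen with no matching vestibular input:
# the mismatch is sustained, so the conflict signal builds up.
watching_screen = sickness_level(actual=[0.0] * 50, expected=[1.0] * 50)
# Self-produced motion: expectation matches input, so nothing accumulates.
walking = sickness_level(actual=[1.0] * 50, expected=[1.0] * 50)
print(watching_screen > walking)  # → True
```

In this picture, adaptation to a novel (real or virtual) environment would correspond to the internal model recalibrating its expectations until the mismatch term shrinks.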