
ESR1: Development of wireless fNIRS for use with infants (WP2)
Objectives: It is the goal of this project to develop a wireless functional near-infrared spectroscopy (fNIRS) device that allows the measurement of infants’ brain activity during natural interactions. The current state of the art in infant fNIRS experiments is to use optical fibres with narrowband lasers on a receiver/transmitter device that is usually quite heavy and thus not suitable for studying infants in dynamic paradigms. AMS has recently developed a wireless, multichannel device for the adult forehead that uses high-quality LEDs instead of narrowband lasers, allowing for lighter and thus portable devices. In this project, we will apply the knowledge gained from our current devices for adults to develop a lightweight, whole-head fNIRS device for studying infants in motion. We will develop hardware solutions to cope with physiological confounds (e.g. changes in heart rate and blood pressure) and environmental confounds (such as movement artefacts, infrared light emitted by motion- and eye-tracking systems, etc.). This requires downscaling the current sensor electronics while maintaining a high-quality signal in areas with less direct skin contact (hair), and integrating reference and motion sensors. The hardware development will be complemented by the creation of sophisticated artefact detection and interpolation techniques. The current state of the art in dealing with artefacts in fNIRS data is to bandpass-filter the signal at low frequencies, thereby discarding signals of non-neurological origin (such as heartbeat and respiration). Here, we will investigate the applicability of more advanced techniques, such as principal or independent component analysis, or auto-regressive and general linear models. In addition, we will facilitate sensor attachment to the scalp by developing algorithms that indicate fNIRS signal quality by quantifying the coupling between optodes and the scalp.
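As an illustration of the kind of signal-quality metric envisaged here, one established approach quantifies optode-scalp coupling by checking whether the cardiac pulsation, which should be visible at both recorded wavelengths of a channel, is strongly correlated across them. A minimal sketch, assuming a two-wavelength channel sampled at 10 Hz; the function name, band limits and example data are illustrative:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def scalp_coupling_index(wl1, wl2, fs, band=(0.5, 2.5)):
    """Correlate the cardiac-band components of the two wavelength
    signals of one fNIRS channel. Values near 1 indicate good
    optode-scalp coupling; values near 0 indicate poor contact."""
    b, a = butter(2, band, btype="band", fs=fs)
    return np.corrcoef(filtfilt(b, a, wl1), filtfilt(b, a, wl2))[0, 1]

# Toy example: a well-coupled channel during cap placement.
fs = 10.0
t = np.arange(0, 30, 1 / fs)
cardiac = np.sin(2 * np.pi * 2.0 * t)            # ~120 bpm infant heartbeat
wl760 = cardiac + 0.1 * np.random.randn(t.size)  # 760 nm raw intensity
wl850 = cardiac + 0.1 * np.random.randn(t.size)  # 850 nm raw intensity
print(scalp_coupling_index(wl760, wl850, fs))    # close to 1.0
```

A channel whose index falls below a chosen cut-off could then be flagged so that the experimenter repositions that optode before the session starts.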

ESR2: Data acquisition algorithms for infant eye-tracking and integration with other techniques (WP2)
Objectives: It is the goal of this project to improve the data acquisition algorithms for unobtrusive infant eye-tracking and its integration with other techniques. Eye-tracking technology plays an important role in studying participants’ information pickup when exploring their environment and in examining interactions between individuals. Studies with infants have been performed before [26], but infant eye-tracking still poses many challenges [27], especially if it is to be implemented in natural, live interactions. The geometry of infants’ heads and eyes differs from that of adults, which makes it important to investigate and optimise algorithms to perform better with infants. The first aim of this project will thus be to optimise eye-tracking algorithms. Different approaches will be verified and analysed on infant data to determine the optimal solution. For comprehensive studies of active infants, researchers also need to use other technologies in addition to eye-tracking. To integrate the data from other systems such as motion capture, EEG, or fNIRS, a user-friendly integration between the involved systems is needed. The second aim of the project is thus to integrate the different technologies, with a focus on correct timing, compatibility with the environments in which they are used, and an analysis tool to evaluate the generated data.
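In practice, the correct-timing requirement reduces to mapping each device’s clock onto a common timeline. A minimal sketch, assuming that synchronization pulses are timestamped by both systems and that clock drift is approximately linear; all names and numbers are hypothetical:

```python
import numpy as np

def fit_clock_mapping(t_device, t_reference):
    """Least-squares linear mapping t_ref ~= a * t_dev + b, estimated
    from sync pulses timestamped by both systems: the slope absorbs
    clock drift, the offset the difference in start times."""
    a, b = np.polyfit(t_device, t_reference, 1)
    return lambda t: a * np.asarray(t) + b

# Sync pulses (s) as seen by the eye-tracker and by the reference
# clock; here the eye-tracker runs 50 ppm fast and starts 5.2 s late.
t_eye = np.array([0.0, 10.0, 20.0, 30.0])
t_ref = np.array([5.2, 15.2005, 25.2010, 35.2015])

to_ref = fit_clock_mapping(t_eye, t_ref)
print(to_ref([12.5]))  # eye-tracker timestamps on the common timeline
```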

ESR3: Tools for infant motion capture (WP2)
Objectives: It is the aim of this project to further develop motion capture hard- and software to allow for unobtrusive, high-quality motion registration with infants in natural situations. Movements are a rich source of information when studying infants in interaction with their physical and social environment. However, with respect to testing developing populations, several problems arise, including a substantial portion of missing data and large variability in the way actions are executed. Moreover, analysing the microstructure of movements requires detailed, laborious analyses of the X, Y, and Z coordinates of movement, along with velocity and acceleration profiles. These analyses generally require a substantial amount of programming skill and knowledge of signal processing and filtering. Few researchers within the field of developmental science have the skills required to analyse this type of data. Developing novel analysis tools that specifically target developmental data would therefore have several important benefits: it would allow more researchers to access and analyse motion data, and it would open up new research directions for studying how infants act on and interact with their physical and social environment.
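To make the kind of analysis meant here concrete, the sketch below derives tangential velocity and acceleration profiles from raw 3-D marker coordinates, first interpolating the short gaps that marker occlusion typically leaves in infant data. The smoothing window, polynomial order and sampling rate are assumptions to be tuned per setup:

```python
import numpy as np
from scipy.signal import savgol_filter

def kinematic_profiles(xyz, fs, window=11, poly=3):
    """Tangential velocity and acceleration from 3-D marker
    coordinates (n_samples x 3). Savitzky-Golay smoothing preserves
    the profile shape while suppressing marker jitter."""
    xyz = np.array(xyz, dtype=float)    # copy; gap-filling is in place
    t = np.arange(len(xyz))
    for d in range(3):                  # fill short occlusion gaps (NaNs)
        col = xyz[:, d]
        nan = np.isnan(col)
        if nan.any():
            col[nan] = np.interp(t[nan], t[~nan], col[~nan])
    smooth = savgol_filter(xyz, window, poly, axis=0)
    vel = np.gradient(smooth, 1 / fs, axis=0)  # per-axis velocity
    speed = np.linalg.norm(vel, axis=1)        # tangential velocity
    acc = np.gradient(speed, 1 / fs)           # tangential acceleration
    return speed, acc

# speed, acc = kinematic_profiles(marker_xyz, fs=200)  # e.g. 200 Hz mocap
```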

ESR4: Development of an ambulatory and wireless system for electro-physiological measurements with infants and young children (WP2)
Objectives: It is the objective of this project to adapt an ambulatory system for the measurement of EEG and other electro-physiological variables (such as EMG) for use with infants and young children in natural interactions. A cap especially suited for use with infants will be developed that is easy and quick to apply and connect (e.g. water-based electrodes, no need for skin scratching). To this end, the existing electrodes will be redesigned and reduced in size to fit in a small infant cap. We will use recent adult shielding technology to prevent mains interference and to reduce sensitivity to movement artefacts. The system will also allow for the registration of muscle activity through EMG. Synchronization between the device and other measurement equipment, such as fNIRS, eye-tracking, and motion-tracking video, will be made possible; the synchronization interface will be integrated in the device. Interfaces and synchronization with PCs, tablets, and other measurement devices will be available. Several software algorithms will be adapted to the special characteristics of infant EEG and EMG analysis. Interfaces with MATLAB will be created to allow convenient data analysis.
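As an illustration of the intended MATLAB interface, recorded data and the synchronized event markers could be written to a .mat file that MATLAB loads directly. Every name, shape and value below is illustrative:

```python
import numpy as np
from scipy.io import savemat

# Hypothetical recording: 32-channel infant EEG at 500 Hz, plus the
# trigger codes received over the integrated synchronization interface.
eeg = np.random.randn(32, 500 * 60)        # channels x samples (1 minute)
events = np.array([[1500, 1], [4200, 2]])  # [sample index, trigger code]

savemat("infant_session.mat", {
    "eeg": eeg,
    "fs": 500.0,
    "events": events,
})
# In MATLAB: load('infant_session.mat'); plot(eeg(1, :));
```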

ESR5: Effect of eye gaze on neural responses in naturalistic settings (WP3)
Objectives: This project will concentrate on EEG and fNIRS data to further our understanding of the links between laboratory results and effects found in real-world situations of social engagement. Specifically, we will assess how infants detect eye contact and gaze cues to locations in space. Thus far, this fundamental aspect of social neuroscience has been assessed under strictly controlled laboratory conditions. This project seeks to extend these results beyond the laboratory and into applied settings. The identification of known versus unknown social partners has an impact on the speed and accuracy of brain mechanisms related to processing social information. Even at this stage, eye gaze plays an important role. Hood and colleagues showed that children’s and adults’ recognition of faces is modulated by eye gaze direction during both the encoding and retrieval stages of their task. Eye contact facilitates gender judgement and incidental recognition memory of faces, and leads to increased activation of cortical areas involved in face processing. The literature has thus far failed to determine the generalisability of results obtained in the laboratory: are these the cognitive mechanisms that are utilised when not in strictly controlled environments? In part, this is due to technical difficulties related to stimulus selection as well as to EEG and fNIRS measurement. Recent developments in wireless fNIRS and wireless EEG now allow the assessment of “real life” stimuli rather than screen-based presentations. Consequently, this project will address the fundamental research and applied factors that need to be examined at this point in time in order to truly understand how cognitive processes are utilised in natural social situations.

ESR6: Interactive brains in uncontrolled environments (WP3)
Objectives: In this project, we will examine how brains interact inside and outside the laboratory. Research into infant cognition has focused almost exclusively on intracognitive processes. But how are these processes dynamically altered by interaction with another person? By linking two EEG systems with two eye-trackers (see Figure 3.1), we will be able to tell when an infant and a social partner are engaging in multiple aspects of social-cognitive processing. This project will push infancy research in new directions by assessing, for the first time, overt infant social behaviours such as initiating protodeclarative pointing in joint attention situations and receptiveness to joint attention cues such as responding to one’s own name, together with the downstream neural correlates of attention, social partner detection, and the maintenance of a partner’s spatial location. This project has the potential to determine how working memory and learning interact in real-life situations. This research can only be conducted at this point in time, as advances in frequency analysis techniques and in the assessment of event-related potentials during infancy now allow for the precise timings required to detect events in live situations. This project is closely linked to the project of ESR5, both conceptually and technically. The advent of tools to measure neural activity in infants (such as fNIRS and frequency analysis of EEG) now allows for the presentation of live stimuli. It is an open question how closely the neural processing of static images is linked to that of live stimuli. The overwhelming majority of social neuroscience research related to gaze and affect is grounded in static images for technical reasons. Both projects are devoted to investigating the processing of social stimuli via more naturalistic means, including moving stimuli.
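A common quantity in such two-brain frequency analyses is the phase-locking value between one channel from each partner. A minimal sketch; the frequency band (here 6-9 Hz, roughly the infant alpha range) and filter settings are assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(x, y, fs, band=(6.0, 9.0)):
    """Phase-locking value between two simultaneously recorded EEG
    channels: 1 = perfectly stable phase relation, 0 = none."""
    b, a = butter(2, band, btype="band", fs=fs)
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# plv = phase_locking_value(infant_channel, partner_channel, fs=250)
```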

ESR7: Infant development of emotional mimicry (WP3)
Objectives: During social encounters, we continuously use multiple sources to recognise and interpret others’ emotional states, intentions and mental states. During their first year of life, infants greatly expand their ability to use emotional cues in social interactions. This project will investigate how infants engaged in natural interactions understand others’ emotional behaviours. Emotional mimicry, the ability to imitate emotional expressions conveyed through the faces, voices or postures of people we are interacting with, is thought to be extremely important for empathic mechanisms. Evidence suggests that it is involved in regulating social interactions and that it consequently varies as a function of the social context and the type of emotional expression shown. Individuals with whom we cooperate seem to be more likely to elicit mimicry than those with whom we are in competition or with whom we disagree. In adults, mimicry of happy expressions (affiliative intent) does not vary as a function of the sociality of the context, while mimicry of angry expressions (non-affiliative intent) is weaker when people are exposed to expressions of unknown others about whom they know nothing. It is thus possible that mimicry is sensitive to situational affiliative intent. Evidence from infants suggests that they react more favourably towards adults who imitate them than towards adults who do not. This project will investigate whether and how being imitated (or not) by another person affects subsequent mimicry responses to dynamic facial emotional expressions (e.g. happiness and fear). While recording infants’ eye movements, a live experimenter-infant interaction will be used to manipulate the degree of imitation of infants’ facial expressions, thus providing positive (imitative) or negative (non-imitative) social cues. Subsequently, infants’ muscular and gaze responses to dynamic facial expressions posed by the same experimenter will be measured jointly. The simultaneous recording of gaze and muscular reactions will allow us to examine emotion processing strategies precisely while emotional expressions are unfolding. Moreover, EEG will be used to investigate how infants’ temperamental differences affect their ability to decode facial emotional expressions. This will be done by using dynamic and static emotional expressions and investigating the neural correlates elicited by the more naturalistic stimuli as compared to static images.

ESR8: Early understanding of prosocial and antisocial interactions (WP3)
Objectives: In our daily life, we are continuously involved in complex social interactions. The first aim of this project is to address the origins and development of prosocial behaviour, defined as any voluntary behaviour performed by an individual to benefit another (e.g. helping, comforting, sharing). Its second aim is to understand when and how infants become capable of processing the affective nature of a social interaction. To date, most studies addressing the early development of prosocial behaviour have used artificial stimuli, such as puppets or even geometrical shapes presented on a monitor, and thus lack ecological validity. Moreover, many issues remain unexplored. How do infants discriminate between prosocial and antisocial actions? Do infants mostly direct their gaze towards the eyes of prosocial actors to attribute a social meaning to the action, or do they evaluate the whole scene? The project will investigate how infants process naturalistic scenes of prosocial or antisocial actions by making use of ecological stimuli: both videos depicting real human beings and live interactions. To monitor which of the actors is looked at, and how, we will use an eye-tracking system while infants view and interact with real prosocial or antisocial actors.
To understand when and how infants are capable of processing the affective nature of a social interaction, we will investigate social interactions characterised by physical contact between individuals. Interpersonal touch is considered a fundamental communicative tool from the very first moments of life and plays a key role in human interactions. Interestingly, interpersonal tactile contacts differ in their affective valence: they can be positive (e.g. holding hands or giving a caress) or negative (e.g. pulling one’s hand away to avoid contact or giving a slap). Interpersonal touch is thus rich in information that reveals the affective nature of an interaction. Studies with adults have shown that emotional stimuli evoke subtle facial responses in the observer which are highly correlated with the affective valence of pictures, words or even sounds. Furthermore, recent findings have shown that sensitivity to pleasant touch emerges early in human development. This project will introduce infants to a real interactive environment and measure, with wireless EMG, their facial responses while they observe interpersonal touch between two actors that is positive, negative or neutral.

ESR9: Infants’ processing of kinematic cues about others’ emotions and intentions in naturalistic interactions (WP4)
Objectives: During social interaction, essential information is conveyed in how an action is performed. That is, the kinematics of an observed action contain cues about the intentions and emotions of the actor. The same action (e.g. closing a door) can be carried out with different kinematics (e.g. closing it carefully, firmly, or slamming it), which conveys crucial information about the actor’s emotions and intentions. Whereas this information is obvious to an adult observer, it is unknown how infants perceive such information and whether they are sensitive to it. Infant researchers have neglected this topic to date, since they have focused on highly controlled stimulus material that did not allow for such subtle cues. This project will investigate infants’ sensitivity to kinematic information about an actor’s mental states during naturalistic interactions. Research on infants’ emotion processing has largely focused on their responses to facial emotional displays, and only very recently have researchers started to examine infants’ perception of emotional information in body postures. This project will study whether and how infants perceive real-life kinematic information about the emotional aspects of actions, as they encounter it frequently in their daily interactions. We will use motion-tracking techniques to measure kinematic adaptations during naturalistic interactions. In addition, we will use fNIRS to investigate infants’ brain responses to mental state information when observing an interaction partner during natural interactions.
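The kinematic cues in question can be operationalised as a handful of per-movement descriptors. A minimal sketch building on velocity profiles of the kind derived in the ESR3 project; the particular descriptors are illustrative candidates, not a fixed set:

```python
import numpy as np

def movement_descriptors(speed, fs):
    """Candidate kinematic cues from the tangential velocity profile
    of one movement (e.g. a door-closing action). All descriptor
    choices are illustrative."""
    dt = 1.0 / fs
    duration = len(speed) * dt
    jerk = np.gradient(np.gradient(speed, dt), dt)
    return {
        "peak_velocity": float(speed.max()),           # forceful actions peak higher
        "rel_time_to_peak": speed.argmax() * dt / duration,  # early peak = abrupt
        "mean_squared_jerk": float(np.mean(jerk**2)),  # rough non-smoothness proxy
    }

# Descriptors could then be compared between, e.g., 'careful' and
# 'angry' demonstrations of the same action.
```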

ESR10: Social learning about novel objects in infancy (WP4)
Objectives: This project will examine how infants learn during natural social interactions. During social interaction, information is exchanged between interaction partners in diverse ways. Previous research has mainly focused on how infants perceive and understand the “what” and “why” of the object-directed actions they observe, by studying, for instance, whether they make predictions about the goal of an observed action or subsequently imitate the observed action on the object. However, crucial action information is also conveyed in how an action is performed. For instance, actors actively adjust their movements when they are demonstrating something to an observer, in particular a young child. To date, there is very little research on how infants perceive these “how” aspects of the actions they observe. We will investigate when and how infants start to use subtle kinematic information in their perception and understanding of object-directed actions. This information is present in the actions infants observe in everyday life, but it has been neglected in studies on action perception.
We will examine kinematic adjustments which are actively used to convey information. As we know from the language domain, adults adjust their communicative efforts when talking to young children. Whereas there is a lot of research on “motherese”, much less attention has been paid to how adult caretakers guide young children’s actions with their own expressive behaviour (“motionese”). In this project, we will study which kinematic cues are picked up by infants of different ages and how they influence infants’ understanding and learning of an action. Motion-tracking techniques will be used to measure kinematic adaptations during natural interactions. We will then also be able to investigate which combination of cues from the adult model and which interaction patterns between the infant and the adult are associated with optimal learning, remembering and later imitation. These novel insights into infants’ naturally occurring social learning will in turn be important for parents and educators working with young children.

ESR11: The early development of eye-hand coordination during interaction with “smart toys” (WP4)
Objectives: It is the objective of this project to investigate the development of eye-hand coordination and object manipulation using motion capture and eye-tracking during real-world exploration. It will seek to map how infants learn to use tools (e.g. spoons, chopsticks) and to manipulate objects. The goal is to better understand how infants build complex motor plans that include both manual control and eye-hand coordination, with a focus on prospective control and goal properties. In other words, the project will examine how infants become able to coordinate their hands and eyes with respect to future local goals as well as overarching goals. In naturalistic situations, we will investigate the microstructure of eye-hand coordination (measured with motion capture devices from QUA and eye-tracking from SMA) and tool use while infants manipulate a series of “smart toys”. These are tools and toys with integrated inertial measurement units that have been developed at the UU babylab.
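As an illustration of what the embedded inertial measurement units afford, the sketch below segments the periods during which a toy is being handled from its accelerometer stream alone. The threshold and gap parameters are assumptions that would need calibration, e.g. against video:

```python
import numpy as np

def manipulation_bouts(acc_xyz, fs, thresh=0.3, min_gap=0.5):
    """Return (onset, offset) times in seconds during which the toy
    is being moved, based on the orientation-independent deviation of
    acceleration magnitude from 1 g (the toy at rest)."""
    mag = np.linalg.norm(acc_xyz, axis=1)    # |acceleration| in g
    idx = np.flatnonzero(np.abs(mag - 1.0) > thresh)
    if idx.size == 0:
        return []
    gap = int(min_gap * fs)                  # merge pauses shorter than min_gap
    bouts, start, prev = [], idx[0], idx[0]
    for i in idx[1:]:
        if i - prev > gap:
            bouts.append((start / fs, prev / fs))
            start = i
        prev = i
    bouts.append((start / fs, prev / fs))
    return bouts
```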

ESR12: Infants’ real world multi-step planning and the development of executive control (WP4)
Objectives: This project will assess infants’ ability to plan their own actions in multiple steps, for example, reaching for a toy and placing it on a surface or using it to achieve a secondary goal. Prior work has demonstrated that the velocity of infants’ initial reach is influenced by the difficulty of grasping the target object. We will test whether infants’ predictive actions can be described by Fitts’s law, which states that the time required to reach for a target object is a function of the ratio between the distance to the target and its size. The aim of the project is to better understand how infants’ multi-step action planning develops during naturalistic interactions with the physical world. We will also assess how these emerging planning abilities relate to later emerging executive control abilities. This knowledge is, in the long run, relevant for early education and might guide intervention programmes for infants with poor executive function skills.
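In its standard form, Fitts’s law reads MT = a + b * log2(2D/W), where D is the distance to the target, W the target’s size, and a and b are empirically fitted constants. A minimal sketch of how the law could be tested against per-reach measurements; all numbers are hypothetical:

```python
import numpy as np

def fit_fitts(distance, width, movement_time):
    """Fit MT = a + b * ID with index of difficulty ID = log2(2D/W);
    returns the intercept a, slope b, and the per-reach ID values."""
    ID = np.log2(2 * np.asarray(distance) / np.asarray(width))
    b, a = np.polyfit(ID, movement_time, 1)
    return a, b, ID

# Hypothetical reaches: if Fitts's law holds, MT grows linearly in ID.
D = np.array([0.10, 0.20, 0.20, 0.30])    # distance to target (m)
W = np.array([0.05, 0.05, 0.025, 0.025])  # target width (m)
MT = np.array([0.55, 0.70, 0.85, 0.95])   # observed movement time (s)
a, b, ID = fit_fitts(D, W, MT)
print(f"MT = {a:.2f} + {b:.2f} * ID")
```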

ESR13: Computational modelling of goal-directed sequential action selection in infants (WP4)
Objectives: Computational approaches have demonstrated how object-appropriate actions can be learned through trial-and-error interaction with an environment, and moreover how both repetition and habituation are important in developing a robust yet broad repertoire of actions. However, a critical outstanding issue concerns how infants and toddlers learn to sequence object-appropriate actions in order to achieve specific outcomes. This project will combine motion capture and computational techniques to explore competing theoretical accounts of the development and control of goal-directed action sequences. The initial phase will involve empirical work in which toddlers interact with a series of objects in an undirected way. The objects will be designed such that non-transparent sequences of actions yield observable outcomes (e.g., moving subpart A to position X allows one to move subpart B to position Y, and when these two actions are performed in sequence, the object makes an interesting noise). Subsequently, the toddlers will be asked to manipulate the objects so as to produce different outcomes. Kinematic data will be recorded in order to determine the nature and quantity of interactions needed to learn action sequences of different complexity. In the second phase of the project, two competing computational accounts of action selection (the simple recurrent network and interactive activation and competition) will be extended to incorporate a mechanism for grouping actions performed over time into functional (i.e., goal-directed) sequences. The models will then be pitted against each other, with parameters derived from the kinematic data (e.g., the number of correct and incorrect grasps, reaches, etc.) used both to construct realistic training regimes for the models and to evaluate the success of each model. This will allow discrimination between two competing models of adult action selection. Specifically, if either model is able to account for the observed empirical effects, this will add credence both to the proposed mechanism of grouping together actions that achieve observable outcomes and to the specific model as a model of human goal-directed action selection (in both toddlers and adults).
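To give a flavour of the first of these accounts, the sketch below implements a stripped-down simple recurrent (Elman) network that learns which action follows which in a short sequence. To keep it brief, only the output weights are trained, echo-state style; a full SRN would also train the input and recurrent weights by backpropagation. All sizes, rates and the action coding are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

class ElmanSRN:
    """Minimal simple recurrent network: the hidden state is fed back
    as context, so the prediction of the next action can depend on
    the actions performed so far."""
    def __init__(self, n_actions, n_hidden=16, lr=0.1):
        self.Wxh = rng.normal(0, 0.5, (n_hidden, n_actions))
        self.Whh = rng.normal(0, 0.5, (n_hidden, n_hidden))
        self.Why = rng.normal(0, 0.5, (n_actions, n_hidden))
        self.h = np.zeros(n_hidden)
        self.lr = lr

    def step(self, x):
        self.h = np.tanh(self.Wxh @ x + self.Whh @ self.h)
        z = self.Why @ self.h
        e = np.exp(z - z.max())
        return e / e.sum()                 # P(next action | history)

    def train_sequence(self, seq, n_actions, epochs=500):
        eye = np.eye(n_actions)
        for _ in range(epochs):
            self.h = np.zeros_like(self.h)  # new trial, blank context
            for cur, nxt in zip(seq[:-1], seq[1:]):
                p = self.step(eye[cur])
                d = p - eye[nxt]            # softmax cross-entropy gradient
                self.Why -= self.lr * np.outer(d, self.h)

# 'Move subpart A, then subpart B, then the outcome' coded as 0, 1, 2.
net = ElmanSRN(n_actions=3)
net.train_sequence([0, 1, 2], n_actions=3)
net.h[:] = 0
print(net.step(np.eye(3)[0]))  # after action 0, action 1 should dominate
```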

ESR14: Development of embodied decision-making during object-sharing play (WP4)
Objectives: Interacting in a real world populated with many distractors involves continuously making decisions about which object to act on, and also inhibiting the desire to interact with other objects. Recent studies with adults have shown that this decision process is revealed in people’s reaching trajectories during choice tasks. The trajectories are distorted by the presence of appealing distractors or social agents, even when the participant explicitly tries to ignore them. This project will use precise kinematic and wireless EMG analyses of reaching movements to reveal the internal processes of decision-making in infants and toddlers involved in sharing games in natural settings. To understand the emergence of action control in these tasks, we will develop and extend existing computational models (specifically, diffusion decision models and leaky integrator models) of two-choice decision-making processes. In particular, we will explore how these cognitive-level decision models can be linked to the motor competition processes involved in the use of antagonistic muscle groups during intentional reaching. This project will inform us about the nature of reach trajectory interference in toddlers in cluttered environments, identify plausible mechanisms of reach decision-making in toddlers, and bridge between the cognitive and neural levels of processing.
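As a starting point for the modelling work, the sketch below simulates the core of a two-choice diffusion decision model: noisy evidence accumulates until one of two boundaries is crossed, jointly producing a choice and a reaction time. All parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ddm(drift, threshold=1.0, noise=1.0, dt=0.001, t_max=3.0):
    """One trial of a two-choice diffusion decision model. Evidence
    x accumulates with the given drift plus Gaussian noise until it
    crosses +threshold (choice 1) or -threshold (choice 0)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < t_max:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return (1 if x > 0 else 0), t

# Weaker drift (e.g. an appealing distractor pulling against the
# target) yields slower and less accurate choices.
for drift in (2.0, 0.5):
    trials = [simulate_ddm(drift) for _ in range(500)]
    acc = np.mean([c for c, _ in trials])
    rt = np.mean([t for _, t in trials])
    print(f"drift={drift}: accuracy={acc:.2f}, mean RT={rt:.2f} s")
```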