
Field of research: Movement-emotion and movement-cognition interactions; effects of movement and sensory-motor interventions on the brain, the body, and behavior.

 

Research goals:

1) To uncover the neurophysiological underpinnings and developmental progression of movement-emotion and movement-cognition interactions in health and disease, by designing and conducting thoughtful, hypothesis-driven behavioral and brain-imaging studies.

2) To translate this knowledge into clinical use by developing novel, motor-based diagnostic and intervention tools for the treatment of various neurological and psychological/psychiatric disorders.

3) To provide the missing evidence-based research that will support and validate diagnostic and intervention tools used in dance-movement therapy, the Anat Baniel Method: NeuroMovement, the Feldenkrais Method, and other somatic therapies.

In accordance with these goals, I pursue two lines of research:

1) Theoretical or “basic” research, characterizing the behavioral effects and uncovering the neurophysiological mechanisms underlying movement-cognition and movement-emotion interactions, in particular emotion recognition from movement, and emotion elicitation and regulation through movement (goals 1 & 3 above).

2) Clinical research, examining (a) the implications of abnormal recognition of emotion from bodily emotional expressions in specific populations, and (b) the effects of novel motor interventions on the brain, cognition, and emotion (goal 2 above).

 

Theoretical Research:

Past projects:

  • Under this line of research, I have demonstrated that motor execution, observation, and imagery of emotional movements can influence affective state (Shafir et al., 2013), and I have identified the motor characteristics associated with each of the emotions happiness, sadness, fear, and anger: which characteristics enhance each emotion when incorporated into motor behavior (Shafir et al., 2016; Tsachor et al., 2019), which are used to express each emotion (in preparation), and which are perceived as expressing that emotion (Melzer et al., 2019). Together with my colleague in this line of research, Rachelle Tsachor, I have also suggested how to use this knowledge in psychotherapy (Shafir, 2016; Tsachor and Shafir, 2017).

  • In addition, in collaboration with Prof. Assaf Schuster from the Computer Science Department at the Technion in Israel, we began developing automatic recognition of these movement characteristics using a Kinect camera (Bernstein et al., 2015a, 2015b), with the aim of turning it into a diagnostic tool based on people’s motor signatures, as expressed in the secondary motor symptoms distinctive to certain disorders (e.g., autism, schizophrenia), as well as into a form of biofeedback for emotion regulation.

  • When Prof. Schuster left to found a start-up in a different area of computer science, we began working with Prof. James Wang from the College of Information Sciences and Technology at Penn State University, whose expertise is automatic recognition of emotions from visual media. With Prof. Wang and his team, we demonstrated that automatic emotion recognition from videos of people performing everyday movements becomes more accurate when, in addition to traditional computer vision techniques, AI techniques are used to identify in the movements the specific motor elements that my previous research had found to be associated with specific emotions. These motor elements then serve as additional intermediate features in the emotion recognition algorithms, and their combination with traditional computer vision techniques significantly increased emotion recognition precision beyond other current automatic methods (Wu et al., 2023). A minimal sketch of this intermediate-feature idea follows this list.
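The sketch below illustrates this intermediate-feature idea only; it is not the pipeline of Wu et al. (2023). Random arrays stand in for the learned visual embedding and the predicted motor-element scores, and the dimensions, element names, and choice of classifier are all illustrative assumptions.

```python
"""Minimal sketch: fusing motor-element scores with a generic visual
embedding as intermediate features for emotion classification.
All data below are random placeholders, not real features."""
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder per-clip representations. In a real system, `visual` would be
# a learned video embedding, and `motor` would be predicted scores for Laban
# motor elements (e.g., jump, rhythmicity, arm-to-upper-body).
n_clips, n_visual, n_motor = 400, 64, 12
visual = rng.normal(size=(n_clips, n_visual))
motor = rng.random(size=(n_clips, n_motor))
labels = rng.integers(0, 4, size=n_clips)  # happiness, sadness, fear, anger

# Key idea: the motor-element scores join the visual embedding as additional
# intermediate features before the final emotion classifier.
features = np.hstack([visual, motor])

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

With random placeholders the accuracy hovers near chance; the point is only where the motor elements enter the pipeline. Simple concatenation is one plausible fusion scheme among several.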

 

Current projects:

  • A collaborative study with Prof. Robyn Cruz from Lesley University and Rachelle Tsachor from the University of Illinois at Chicago, investigating the relationship between personality traits and personal movement patterns as captured by Laban/Bartenieff Movement System (LBMS) movement components.

  • A collaborative study with Dr. Arik Cheshin from the University of Haifa, investigating how incongruence between the emotion expressed in the face and the emotion expressed in the body affects emotion recognition and behavior during negotiation.

  • Together with scientists from Pennsylvania State University and the University of Illinois at Chicago, and several Certified Laban Movement Analysts, all led by Prof. James Wang from Penn State’s College of Information Sciences and Technology, we were awarded $2,000,000 by the NSF (U.S. National Science Foundation) in April 2023 to create a data-sharing infrastructure and a large data set of tens of thousands of video clips of people expressing emotions in everyday life. Each clip will be accompanied by rich annotations, including the emotion expressed by each person, the context of the situation portrayed, and analyses of the people’s movements and facial expressions. The data set will be available to any scientist for developing innovative applications involving automatic emotion recognition, such as social robots, interactive gaming, biofeedback, animation, and mental-health telemedicine through mood/emotional-state monitoring of patients at home.

 

Clinical Research:

Past projects:

  • Emotion recognition from whole-body emotional expressions in older adults with Alzheimer’s disease, and its effect on their caregivers’ burden: a collaboration with Prof. Perla Werner from the University of Haifa (Spitzer et al., 2019).

  • Effects of whole-body vibration stimulation on depression: a collaboration with Richard Dopp, M.D., from the University of Michigan.

 

Future directions: