Research

Behavioral Synchronization and Social Impression

Walking is one of our most natural daily actions. We discovered that walkers use step synchronization as a form of non-verbal social communication: walking side by side, even without verbal communication, is sufficient to alter the social relationship between two strangers. We also found that pairs with a better first impression synchronized their steps more. Beyond social relationship, personal traits matter as well. Female pairs exhibited higher walking synchrony than male pairs in this experiment, and there is an age effect: older participants tended to synchronize with their partners more while walking. Participants with lower autistic tendencies also synchronized better than those with higher autistic tendencies.
Cheng, M., Kato, M., & Tseng, C. H. (2017). Gender and autistic traits modulate implicit motor synchrony. PLOS ONE, 10(5).
Cheng, M., Kato, M., Saunders, J., & Tseng, C. H. (2020). Paired walkers with better first impression synchronize better. PLOS ONE, 15(2), e0227880.
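The step-synchrony measure can be illustrated, in spirit rather than as our exact analysis pipeline, by the peak normalized cross-correlation between two walkers' step time series. The function name, the binary step trains, and the lag window below are all illustrative assumptions:

```python
import numpy as np

def synchrony_index(steps_a, steps_b, max_lag=10):
    """Peak normalized cross-correlation between two step time series,
    searched over a small window of lags (in samples)."""
    a = (steps_a - steps_a.mean()) / steps_a.std()
    b = (steps_b - steps_b.mean()) / steps_b.std()
    n = len(a)
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            r = np.dot(a[lag:], b[:n - lag]) / (n - lag)
        else:
            r = np.dot(a[:n + lag], b[-lag:]) / (n + lag)
        best = max(best, r)
    return best

# Example: two walkers stepping at the same cadence, offset by 2 samples.
t = np.arange(400)
walker1 = (t % 20 == 0).astype(float)
walker2 = ((t - 2) % 20 == 0).astype(float)
print(synchrony_index(walker1, walker2) > 0.9)  # True: near-perfect synchrony
```

Searching over a window of lags, rather than only zero lag, credits pairs that keep a stable phase offset, which is how implicit synchrony typically appears in walking data.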

Brain Synchronization and Team Flow

Team flow happens when team members get “in the zone” together to accomplish a task. Great teams experience this psychological phenomenon, from athletes to musicians to professional work teams. When teamwork reaches the team-flow level, one can observe the team performing in harmony and surpassing its limits. We used EEG to measure brain activity from 10 teams while they played a music video game together. This enabled us to identify team flow’s unique signature: increased beta and gamma brain waves in the middle temporal cortex, a type of brain activity linked to information processing. Teammates also showed more synchronized brain activity during the team-flow state than during regular teamwork.
Shehata, M., Cheng, M., Leung, A., Tsuchiya, N., Wu, D. A., Tseng, C. H., ... & Shimojo, S. (2021). Team flow is a unique brain state associated with enhanced information integration and interbrain synchrony. eNeuro, 8(5).
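A minimal sketch of the band-power computation behind such findings, assuming a single synthetic EEG channel. Real analyses use windowed (e.g., Welch) spectral estimates, artifact rejection, and many channels; the plain-FFT estimator and the signal here are deliberately simplified:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean spectral power of `signal` within [f_lo, f_hi] Hz,
    from a plain (unwindowed) real FFT -- a minimal estimate."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

fs = 256  # sampling rate in Hz
t = np.arange(0, 4, 1.0 / fs)
# Synthetic channel: strong 20 Hz (beta) plus weaker 45 Hz (gamma) component.
eeg = 2.0 * np.sin(2 * np.pi * 20 * t) + 0.5 * np.sin(2 * np.pi * 45 * t)
beta = band_power(eeg, fs, 13, 30)
gamma = band_power(eeg, fs, 30, 80)
print(beta > gamma)  # True: the beta component dominates this signal
```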

Decoding Individual and Team’s Psychological States from Non-verbal Information

Social communication is vital for our well-being and team productivity. The outbreak of COVID-19 made online sociality a significant and central part of our lives: we now use online systems for many kinds of business and leisure activities (e.g., classes, work meetings, social parties). However, for many people online communication has more barriers and is less satisfying than in-person communication. One possible reason is that it is harder to feel “together,” i.e. ittaikan (一体感), with other people online, and harder to form “oneness” within an online group or team. We combine psychophysical behavioral experiments and machine learning techniques to decode the psychological states of individual members (e.g., engaged, help-needed) and of the team (e.g., togetherness, co-presence).
We propose to see MA (間) as the tension network that holds the bars in our schematic together and apart at the same time. The black bars represent different categories of events, temporal moments, spatial locations, or subjectivities. MA keeps the configuration’s integrity by balancing the pushes and pulls in this structure, which generates a holistic experience from holding the differences together.
Wang, G. Y., Nagata, H., Hatori, Y., Sato, Y., Tseng, C. H., & Shioiri, S. (2023). Detecting learners' hint inquiry behaviors in e-learning environment by using facial expressions. In Proceedings of the Tenth ACM Conference on Learning @ Scale (pp. 331-335).
Tseng, C. H., Cheng, M., Matout, H., Fujita, K., Kitamura, Y., Shioiri, S., & Bachrach, A. (2021). MA and togetherness (ittaikan) in the narratives of dancers and spectators: sharing an uncertain space. Japanese Psychological Research, 63(4), 421-433.
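As a toy illustration of the decoding idea, the sketch below assigns a time window of nonverbal features to a psychological state with a nearest-centroid classifier. The features (gaze-on-screen ratio, head-motion energy), the labels, and all numbers are hypothetical and stand in for whatever features and models an actual pipeline would use:

```python
import numpy as np

# Hypothetical feature vectors per time window:
# [gaze-on-screen ratio, head-motion energy]. Labels are illustrative only.
train_X = np.array([[0.90, 0.10], [0.85, 0.20], [0.80, 0.15],   # engaged
                    [0.30, 0.60], [0.25, 0.70], [0.35, 0.55]])  # help-needed
train_y = ["engaged"] * 3 + ["help-needed"] * 3

def nearest_centroid_predict(x, X, y):
    """Assign x to the class whose training centroid is closest (Euclidean)."""
    labels = sorted(set(y))
    centroids = {c: X[[i for i, lbl in enumerate(y) if lbl == c]].mean(axis=0)
                 for c in labels}
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

print(nearest_centroid_predict(np.array([0.88, 0.12]), train_X, train_y))
# -> engaged
```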

Infants’ Sleep and Socio-Cognitive Development

Infants experience rapid growth and prolific brain development, which become critical foundations for later social and cognitive functions. We established the first infant research group in the Tohoku region, drawing on the interdisciplinary research strengths available at Tohoku University. Our core members, from developmental psychology, neurolinguistics, and public health, address questions about infant development and its implications later in life. The ultimate goal is to provide evidence-based suggestions to inform better public health policies.
Our ongoing topics include infant learning and its relationship with infant sleep. We use objective measurements and subjective parental reports to capture preverbal infants’ sleep characteristics.
We examine infants’ social preferences with an eye-tracker, which enables us to measure infants’ gaze patterns.
We attach an actigraphy sleep watch (Micro Motionlogger) to the infant’s wrist or ankle to record time-series data of physical activity levels and ambient light, from which the sleep-wake cycle is derived.
We create audio-visual rule sequences and present them to infants repeatedly until they have learned the rule (indicated by their reduced looking time). The audio-visual relation can be (1) arbitrary, (2) related at the object level, or (3) related at the semantic level (e.g., emotion).
Tsui, Ma, Ho, Chow, and Tseng. (2016) Bimodal emotion congruency is critical to preverbal infants’ abstract rule learning. Developmental Science, 1-12.
Tseng, Chow, Ma, and Ding. (2018). Preverbal infants utilize cross-modal semantic congruency in artificial grammar acquisition. Scientific Reports, 8:12707.
Chow, Ma, and Tseng. (2024). Social and communicative not a pre-requisite: preverbal infants learn an abstract rule only from congruent audio-visual dynamic pitch-height patterns. Journal of Experimental Child Psychology, 248, 106046.
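The “reduced looking time” criterion mentioned above can be sketched as a simple habituation check: learning is inferred once recent looking times fall below a fraction of the initial ones. The window size, the 50% criterion, and the trial data below are illustrative assumptions, not our exact protocol:

```python
def habituated(looking_times, window=3, criterion=0.5):
    """True once the mean looking time over the last `window` trials
    falls below `criterion` x the mean of the first `window` trials."""
    if len(looking_times) < 2 * window:
        return False
    baseline = sum(looking_times[:window]) / window
    recent = sum(looking_times[-window:]) / window
    return recent < criterion * baseline

# Looking times (s) per familiarization trial, declining as the rule is learned.
trials = [12.0, 11.5, 10.8, 8.0, 6.5, 5.0, 4.2]
print(habituated(trials))  # True: 5.23 < 0.5 * 11.43
```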

Human-Robot/Avatar vs. Human-Human Relationships

The increase in online activities is no doubt the most profound behavioral change during the COVID-19 pandemic. In the cyber world, we quickly establish group social identities (e.g., partner, co-worker, boss, client) with minimal information (e.g., an online ID or a photo alone). We discovered a “partner-advantage”: a higher processing priority given to events or stimuli associated with our partners compared to those associated with strangers. We are expanding this work from human-human to human-robot and human-avatar settings in order to characterize the social relationships between human beings and other agents.
Tseng, C. H., Jingling, L., & Cheng, M. (2022). Social affiliation is sufficient to provoke the partner-advantage. Scientific Reports, 12(1), 21293.
Cheng, M., Tseng, C.H. (2019). Saliency at first sight: instant identity referential advantage toward a newly-met partner. Cognitive Research: Principles and Implications, 4(1), 1-18.
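A toy sketch of how a partner-advantage could be quantified from reaction times in a matching task: the advantage is the mean reaction-time saving for partner-associated stimuli over stranger-associated ones. All trial numbers below are illustrative, not data from our studies:

```python
import numpy as np

def mean_rt_advantage(partner_rts, stranger_rts):
    """Partner-advantage as the mean reaction-time saving (ms) for
    partner-associated stimuli relative to stranger-associated ones."""
    return float(np.mean(stranger_rts) - np.mean(partner_rts))

# Illustrative per-trial reaction times (ms) in a shape-label matching task.
partner_rts = [512, 498, 530, 505, 520]
stranger_rts = [575, 560, 590, 568, 582]
print(round(mean_rt_advantage(partner_rts, stranger_rts), 1))  # 62.0
```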

Multi-Sensory Conflicts, Verticality Illusions, and (Dis)comforts

Our ability to tell where the true vertical is (i.e., verticality perception) is critical because it secures our safe interaction with the world (e.g., holding a cup of coffee without spilling it) and supports posture maintenance. This ability is supported by multiple senses: the visual, auditory, tactile, proprioceptive, and vestibular systems. We identified a new illusion in the natural environment where this ability is compromised, namely when body pitch and body motion occur at the same time. We designed experiments to look for the sources of error. The results indicated that the illusion does not arise from a single sensory system but at the sensory-integration stage: when multiple senses conflict, the brain’s attempt to reconcile them produces an illusion.
Such sensory conflict has long been a challenge for comfort in VR, and this series of studies has the potential to help solve the fundamental issue of VR discomfort.
Tseng, Chow, and Spillmann. (2013). Falling skyscrapers: when cross-modal perception of verticality fails. Psychological Science, 24(7), 1341-7.
Tseng, Chow, Spillmann, Oxner, & Sakurai. (2022). Body Pitch Together with Translational Body Motion Biases the Subjective Haptic Vertical. Multisensory Research, 36(1), 1-29.
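A minimal sketch of how a verticality bias can be quantified from subjective-vertical settings: the mean signed error across settings, where a consistent nonzero mean indicates a systematic illusion rather than random noise. The condition names and all data below are illustrative, not experimental results:

```python
import numpy as np

def verticality_bias(settings_deg, true_vertical_deg=0.0):
    """Mean signed error (deg) and sample SD of subjective-vertical settings;
    a consistent nonzero mean indicates a systematic bias (illusion)."""
    errs = np.asarray(settings_deg) - true_vertical_deg
    return float(errs.mean()), float(errs.std(ddof=1))

# Illustrative settings (deg): near zero when upright and stationary,
# systematically biased during combined body pitch and body motion.
upright = [0.5, -0.3, 0.2, -0.4, 0.1]
pitched_moving = [4.8, 5.6, 4.1, 5.3, 4.9]
print(round(verticality_bias(upright)[0], 2))         # 0.02
print(round(verticality_bias(pitched_moving)[0], 2))  # 4.94
```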