Source: Osaka Metropolitan University
How do we understand words? Scientists don’t fully understand what happens when a word pops into your brain. A research group led by Professor Shogo Makioka of the Graduate School of Sustainable System Sciences at Osaka Metropolitan University wanted to test the idea of embodied cognition.
Embodied cognition proposes that people understand words for objects through how they interact with those objects, so the researchers devised a test to observe the semantic processing of words when the ways in which participants could interact with the objects were limited.
Words are expressed in relation to other words; a “cup,” for example, can be described as “a container, made of glass, used for drinking.” However, you can only use a cup if you understand that to drink from a cup of water, you hold it in your hand and bring it to your mouth, or that if you drop the cup, it will crash to the floor.
Without understanding this, it would be difficult to create a robot that can handle a real cup. In artificial intelligence research, this is known as the symbol grounding problem: the problem of mapping symbols onto the real world.
How do humans ground symbols in the real world? Cognitive psychology and cognitive science propose the concept of embodied cognition, in which objects acquire meaning through interactions with the body and the environment.
To test embodied cognition, the researchers conducted experiments to see how participants’ brains responded to words describing objects that can be manipulated with the hands, comparing conditions in which the participants’ hands were free to move with conditions in which they were restrained.
“It was very difficult to establish a method for measuring and analyzing brain activity. The first author, Ms. Sae Onishi, worked persistently to devise a task that allowed us to measure brain activity with sufficient precision,” Professor Makioka explained.
In the experiment, two words such as “cup” and “broom” were presented to the participants on a screen. They were asked to compare the relative sizes of the objects that these words represented and to verbally answer which object was larger, in this case, “broom.”
Comparisons were made between words describing two types of objects: hand-manipulable objects, such as “cup” or “broom,” and non-manipulable objects, such as “building” or “street lamp,” to observe how each type was processed.
During the tests, participants placed their hands on a desk, where they were either free to move or restrained by a transparent acrylic plate. When the two words were presented on the screen, participants had to think about both objects and compare their sizes in order to answer which one was larger, forcing them to process the meaning of each word.
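The article does not name the stimulus-presentation software, so the following Python sketch is only a hypothetical illustration of the trial structure: a word pair appears, the participant names the larger object, and the response time is clocked from word onset. The word lists, the timing code, and the input() stand-in for a voice key are all assumptions, not details from the study.

```python
import time
import random

# Hypothetical word pairs; the study used its own stimulus list,
# which is not given in the article.
MANIPULABLE_PAIRS = [("cup", "broom"), ("pencil", "hammer")]
NON_MANIPULABLE_PAIRS = [("building", "street lamp"), ("windmill", "fountain")]

def run_trial(word_a, word_b):
    """Present one word pair and time the verbal size judgment.

    A real experiment would detect speech onset with a voice key;
    input() is only a stand-in here.
    """
    onset = time.perf_counter()                 # stimulus onset
    print(f"Which is larger: {word_a} or {word_b}?")
    answer = input("> ")                        # stand-in for the spoken answer
    rt = time.perf_counter() - onset            # response time in seconds
    return answer, rt

def run_block(hand_condition):
    """One block per hand condition ('free' or 'restrained')."""
    trials = [(pair, "manipulable") for pair in MANIPULABLE_PAIRS] + \
             [(pair, "non-manipulable") for pair in NON_MANIPULABLE_PAIRS]
    random.shuffle(trials)
    results = []
    for (word_a, word_b), object_type in trials:
        answer, rt = run_trial(word_a, word_b)
        results.append({"condition": hand_condition, "object_type": object_type,
                        "pair": (word_a, word_b), "answer": answer, "rt": rt})
    return results

if __name__ == "__main__":
    data = run_block("free") + run_block("restrained")
    for row in data:
        print(row)
```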
Brain activity was measured with functional near-infrared spectroscopy (fNIRS), which has the advantage of taking measurements without imposing further physical constraints.
The measurements focused on the intraparietal sulcus and the inferior parietal lobule (supramarginal gyrus and angular gyrus) of the left hemisphere, which are responsible for tool-related semantic processing.
Verbal response speed was measured to determine how quickly the participant responded after the words appeared on the screen.
The results showed that left-hemisphere activity in response to hand-manipulable objects was significantly reduced by the hand restraint. Verbal response times were also affected by the restraint.
These results indicate that restricting hand movement affects the processing of object meaning, supporting the idea of embodied cognition. They also suggest that embodied cognition could help artificial intelligence learn the meanings of objects.
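The article does not describe the statistical analysis itself; as a minimal sketch of the kind of within-participant comparison the results imply (free vs. restrained hands), one could run a paired test over per-participant means. The simulated data and the choice of a paired t-test below are illustrative assumptions, not the paper’s actual method.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant mean responses to hand-manipulable words;
# real values would come from the fNIRS recordings and voice-onset times.
rng = np.random.default_rng(0)
activation_free = rng.normal(0.20, 0.05, size=20)               # e.g., oxy-Hb change
activation_restrained = activation_free - rng.normal(0.05, 0.02, size=20)

# Paired (within-participant) comparison of the two hand conditions.
t_stat, p_value = stats.ttest_rel(activation_free, activation_restrained)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```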
About this cognitive research news
Author: Yoshiko Tani
Source: Osaka Metropolitan University
Contact: Yoshiko Tani – Osaka Metropolitan University
Image: The image is credited to Makioka, Osaka Metropolitan University
Original Research: Open access
“Hand Restraint Reduces Brain Activity and Affects Speed of Verbal Responses in Semantic Tasks” by Sae Onishi et al. Scientific Reports
Summary
Hand restraint reduces brain activity and affects the speed of verbal responses in semantic tasks
According to embodied cognition theory, semantic processing is closely related to body movements. For example, restricting hand movements inhibits memory for objects that can be manipulated with the hands. However, whether body restraint reduces semantic-related brain activity has not been confirmed.
We measured the effect of hand restraint on semantic processing in the parietal lobe using functional near-infrared spectroscopy.
A pair of words representing the names of hand-manipulable (e.g., cup or pencil) or non-manipulable (e.g., windmill or fountain) objects was presented, and participants were asked to identify which object was bigger.
We analyzed reaction time (RT) in the judgment task and activation of the left intraparietal sulcus (LIPS) and left inferior parietal lobule (LIPL), including the supramarginal gyrus and angular gyrus. We found that restricting hand movement suppressed brain activity in the LIPS in response to hand-manipulable objects and affected RT in the size judgment task.
These results indicate that bodily restraint reduces the activity of brain regions involved in semantic processing. Hand restraint may inhibit motor simulation, which, in turn, would inhibit body-related semantic processing.