A03 Cognition and sociality: Understanding Neural Computation for Double Articulation Analysis — Bridging Sensory-motor Information and Natural Language in the Human Brain
The human brain can analyze a two-layer hierarchical structure embedded in speech signals, called the double articulation structure; that is, a speech signal is segmented into words and phonemes in a hierarchical manner.
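The two-layer structure described above can be illustrated with a minimal toy sketch. All names and boundary indices below are hypothetical, chosen only to show how lower-layer (phoneme-like) units are chunked into higher-layer (word-like) units; a real double articulation analyzer would infer such boundaries from continuous signals.

```python
# Toy illustration of the double articulation structure:
# a signal is first discretized into phoneme-like units (lower layer),
# which are then chunked into word-like units (upper layer).
# The phoneme sequence and boundary indices are illustrative, not real data.

phonemes = ["h", "e", "l", "o", "w", "o", "r", "l", "d"]  # lower layer
word_boundaries = [4, 9]  # hypothetical end indices found by a segmenter

def chunk(units, boundaries):
    """Group lower-layer units into higher-layer chunks at the given boundaries."""
    words, start = [], 0
    for end in boundaries:
        words.append("".join(units[start:end]))
        start = end
    return words

words = chunk(phonemes, word_boundaries)
print(words)  # ['helo', 'world']
```

In an actual analyzer, both the word boundaries and the phoneme inventory would be latent variables estimated jointly from sensorimotor data rather than given in advance.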
However, the computational process of double articulation analysis in the human brain has not yet been revealed in neuroscience. Likewise, in artificial intelligence and developmental robotics, no robot has yet been developed that can automatically learn language using a double articulation analyzer through human-robot and sensorimotor real-world interaction.
In this research project, we seek to clarify the neural computation underlying the double articulation analyzer and dynamic category formation. This project aims to contribute to:
- Understanding the neural mechanism that supports human language acquisition based on sensorimotor information,
- Developing next-generation communication robots,
- Developing AI systems that can estimate others' intentions,
- Understanding diseases related to language, planning, and sensorimotor behaviors, including schizophrenia, and
- Investigating theories bridging deep learning and Bayesian nonparametrics.
Our main research goals are as follows.
- (1) Research on a computational model for the double articulation analyzer and dynamic categorization.
- (2) Research on the neural mechanism of double articulation analysis and dynamic categorization.
- (3) Developing a robot that can automatically acquire language and motor behaviors from sensorimotor experiences.