CBS Brain Science Seminar Series (CBSS)
Dr. Terry Sejnowski
Date/Time
Tuesday, January 16, 2024, 10:00-11:30 JST (Monday, January 15, 17:00-18:30 PST)
Abstract
Brain Transformers: How Cortical Waves Implement Temporal Context
The astonishing capabilities of transformer networks such as ChatGPT and other Large Language Models (LLMs) have captured the world’s attention. The key computational mechanism underlying their performance relies on transforming a complete input sequence – for example, the words in a sentence – into a long “encoding vector” that makes relationships among the features of the input sequence easier to represent. Encoding input sequences in this parallel manner allows transformers to learn long-range temporal dependencies in naturalistic sequences. We suggest that waves of neural activity traveling over topographic maps in sensory cortex could implement a similar encoding principle. By encapsulating the recent history of sensory input in a single spatial pattern at each moment in time, cortical waves may enable sensory cortex to extract temporal context from sequences of sensory input – the same computational principle used in transformers.
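
As a loose illustration of this encoding idea (a minimal sketch of ours, not the speaker's model; the map size, decay rate, and propagation rule are all assumptions), the short Python snippet below shows how a decaying wave traveling across a one-dimensional topographic map turns a temporal input sequence into a single spatial pattern that can be read out at one moment in time:

# Minimal illustrative sketch: a decaying "traveling wave" over a 1-D map.
# At each time step, activity propagates one map position and attenuates,
# while the newest input enters at position 0. The map's instantaneous
# spatial pattern therefore encodes the recent temporal history of the input.
import numpy as np

n_units = 32    # number of positions on the map (assumed)
decay = 0.9     # attenuation per propagation step (assumed)

def step(wave, x):
    """Advance the wave one time step."""
    shifted = np.roll(wave, 1) * decay   # activity travels one position
    shifted[0] = x                       # new input enters at the map edge
    return shifted                       # (the oldest sample, wrapped to
                                         #  index 0 by roll, is overwritten)

wave = np.zeros(n_units)
for x in [1.0, 0.5, 0.25, 0.0, 0.75]:   # a short input sequence
    wave = step(wave, x)

# The whole recent sequence is now present in one spatial pattern:
# the most recent input sits at index 0, older inputs lie deeper into
# the map, attenuated by the decay.
print(np.round(wave[:8], 3))
# ≈ [0.75  0.    0.203 0.365 0.656 0.    0.    0.  ]

Unlike a transformer, which learns its encoding from data, this toy wave is a fixed delay line; it only illustrates the shared principle that a temporal sequence can be made available all at once as a single vector.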
For inquiries
cbss[at]ml.riken.jp