https://neurosciencenews.com/contextual-association-18882/

Brain Mechanism That Automatically Links Objects in Our Minds Identified

Artificial Intelligence · Deep Learning · Featured · Machine Learning · Neuroscience · Open Neuroscience Articles · July 8, 2021

Summary: Combining machine learning with neuroimaging data, researchers identified a brain region that appears to govern contextual associations.

Source: Johns Hopkins University

When people see a toothbrush, a car, a tree — any individual object — their brain automatically associates it with other things it naturally occurs with, allowing humans to build context for their surroundings and set expectations for the world.

Using machine learning and brain imaging, researchers measured the extent of this “co-occurrence” phenomenon and identified the brain region involved.

The findings appear in Nature Communications.

“When we see a refrigerator, we think we’re just looking at a refrigerator, but in our mind, we’re also calling up all the other things in a kitchen that we associate with a refrigerator,” said corresponding author Mick Bonner, a Johns Hopkins University cognitive scientist. “This is the first time anyone has quantified this and identified the brain region where it happens.”

In a two-part study, Bonner and co-author Russell Epstein, a psychology professor at the University of Pennsylvania, used a database of thousands of scenic photos with every object labeled. There were pictures of household scenes, city life, and nature, and each picture carried labels for every mug, car, tree, and so on.

To quantify object co-occurrences, or how often certain objects appeared with others, they created a statistical model and algorithm that estimated the likelihood of seeing a pen given that you saw a keyboard, or of seeing a boat given that you saw a dishwasher.
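The kind of statistic described here can be sketched as a conditional probability estimated from labeled scenes. The toy data and function below are illustrative only, not the study's actual model:

```python
def cooccurrence_prob(scenes, a, b):
    """Estimate P(object b present | object a present) from labeled scenes.

    Each scene is the set of object labels annotated in one photo.
    """
    scenes_with_a = [s for s in scenes if a in s]
    if not scenes_with_a:
        return 0.0
    return sum(1 for s in scenes_with_a if b in s) / len(scenes_with_a)

# Toy labeled scenes (illustrative only).
scenes = [
    {"keyboard", "pen", "mug", "monitor"},
    {"keyboard", "pen", "desk"},
    {"boat", "water", "sky"},
    {"dishwasher", "mug", "refrigerator"},
]

print(cooccurrence_prob(scenes, "keyboard", "pen"))     # pens co-occur with keyboards here
print(cooccurrence_prob(scenes, "dishwasher", "boat"))  # boats never appear with dishwashers
```

On a real scene database, statistics like these would be computed over thousands of images and many object categories rather than a handful of hand-written sets.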

With these contextual associations quantified, the researchers next attempted to map the brain region that handles the links.

Objects often paired together, according to the study, displayed in a heat map-style image. Credit: Johns Hopkins University


While subjects were having their brain activity monitored with functional magnetic resonance imaging, or fMRI, the team showed them pictures of individual objects and looked for evidence of a region whose responses tracked this co-occurrence information. The spot they identified was a region in the visual cortex commonly associated with the processing of spatial scenes.
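One common way to test whether a region's responses “track” co-occurrence information is representational similarity analysis: correlate the pairwise similarity of the region's fMRI response patterns to single objects with the objects' co-occurrence similarity. The sketch below assumes that analysis style and invented array shapes; it is not the study's exact pipeline:

```python
import numpy as np

def rsa_correlation(responses, cooccurrence_sim):
    """Correlate neural pattern similarity with co-occurrence similarity.

    responses: (n_objects, n_voxels) array, one fMRI pattern per object.
    cooccurrence_sim: (n_objects, n_objects) object co-occurrence similarity.
    Returns the Pearson r between the two similarity matrices, computed
    over their upper triangles (diagonal excluded).
    """
    neural_sim = np.corrcoef(responses)            # object-by-object pattern similarity
    iu = np.triu_indices(len(responses), k=1)      # unique pairs only
    return np.corrcoef(neural_sim[iu], cooccurrence_sim[iu])[0, 1]
```

A region whose responses encode contextual associations would show a high correlation: objects that co-occur in the world would evoke similar activity patterns there.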

“When you look at a plane, this region signals sky and clouds and all the other things,” Bonner said. “This region of the brain long thought to process the spatial environment is also coding information about what things go together in the world.”

Researchers have long known that people are slower to recognize objects out of context. The team believes this is the first large-scale experiment to quantify the associations between objects in the visual environment, as well as the first insight into how this visual context is represented in the brain.

“We show in a fine-grained way that the brain actually seems to represent this rich statistical information,” Bonner said.

About this neuroscience research news

Source: Johns Hopkins University
Contact: Jill Rosen – Johns Hopkins University
Image: The image is credited to Johns Hopkins University

Original Research: Open access.
“Object representations in the human brain reflect the co-occurrence statistics of vision and language” by Michael F. Bonner & Russell A. Epstein. Nature Communications
