ISLAMABAD: New research shows that when we hear stories, brain patterns appear that transcend culture and language. There may be a universal code that underlies making sense of narratives.

Telling and listening to stories is a pastime that spans all cultures. From crime novels to bedtime stories and from ancient legends to spicy romances, humanity loves a good book.

We are all very used to the idea of stories, but the processes at work in the brain are more complex than they seem.

Following a narrative and understanding the story's meaning and themes, as well as the interaction of causes and effects across time, involves challenging cognitive gymnastics. But of course, our brains make it seem effortless.

Neuroscience has made headway in finding out which brain regions help us to understand smaller chunks of language - words and sentences, that is - but we still have a lot to learn about how the brain understands a narrative. Following a story involves a steady accumulation of meaning.

Recently, a group of researchers from the University of Southern California (USC) in Los Angeles designed a study to investigate the networks involved in understanding stories. Their findings are published in the journal Human Brain Mapping.

More specifically, they wanted to understand whether the same story, told in different languages, would activate similar brain regions in native speakers of those languages.

Further to this, they planned to see whether they could work out which specific story a participant was reading by analyzing their brain activity alone, which is no mean feat.

The team was led by Morteza Dehghani, of the Brain and Creativity Institute at USC. Using software developed by the USC Institute for Creative Technologies, the team sifted through 20 million blog posts containing personal stories.

They narrowed this wealth of stories down to just 40, all of which covered personal topics such as going through a divorce or telling a lie. These stories were then condensed to paragraphs of around 150 words each, and the English versions were translated into Mandarin Chinese and Farsi by translators.

In total, 90 participants of American, Chinese, and Iranian descent read the stories while their brains were scanned using functional MRI (fMRI).

The USC team used cutting-edge machine learning and text-analysis techniques, including an analysis involving 44 billion classifications, to "reverse engineer" the data from the scans.

In this way, they were able to determine which story an individual was reading, in any of the three languages, purely from the brain activity that they were measuring. In other words, the researchers were reading the participants' minds as they read the stories.
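The core idea of decoding which story a participant was reading can be illustrated with a toy sketch. The code below is not the study's actual pipeline (the paper's analysis is far more elaborate); it simply simulates a "template" brain-activity pattern per story and matches a new, noisy scan to the closest template by correlation. All names, sizes, and noise levels here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 40 stories, each with a characteristic pattern
# across 500 voxels. These are synthetic stand-ins, not real fMRI data.
n_stories, n_voxels = 40, 500
templates = rng.standard_normal((n_stories, n_voxels))

def decode_story(scan, templates):
    """Return the index of the story whose template best matches the scan.

    Matching is by Pearson correlation across voxels - a deliberately
    simplified stand-in for the study's machine-learning analysis.
    """
    scan_c = scan - scan.mean()
    temps_c = templates - templates.mean(axis=1, keepdims=True)
    corr = temps_c @ scan_c / (
        np.linalg.norm(temps_c, axis=1) * np.linalg.norm(scan_c)
    )
    return int(np.argmax(corr))

# Simulate a noisy scan of someone reading story 7 and decode it.
true_story = 7
scan = templates[true_story] + 0.5 * rng.standard_normal(n_voxels)
print(decode_story(scan, templates))
```

With this noise level the correct story is recovered, because the noisy scan correlates far more strongly with its own template than with any of the other 39.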

The distinctive patterns created in the readers' brains were measured in an area called the default mode network. This network links a number of interconnected regions, including the medial prefrontal cortex, inferior parietal lobe, posterior cingulate cortex, hippocampal formation, and lateral temporal cortex.

Historically, the default mode network was assumed to be "a sort of autopilot" function for the brain when it is at rest and not engaged in focused thinking. Over recent years, however, studies have shown that this may not be the case.

Previous findings suggest that the default mode network is active not only when the mind appears to be at rest but also when we are making sense of narratives, retrieving autobiographical memories, and thinking about the past, present, and future and our relationships with others.

Corresponding study author Jonas Kaplan said, "One of the biggest mysteries of neuroscience is how we create meaning out of the world. Stories are deep-rooted in the core of our nature and help us create this meaning."

This study and others like it bring us one step closer to understanding how we achieve this complex feat so quickly and seamlessly.