By Monique Tsang
Learning a new language can be a lot of hard work – especially for the brain. But it could be easier than you think.
If you have ever tried to speak a foreign language, chances are you have had an experience like this: you wanted to say something in your newly acquired language, but after much struggle, what came out of your mouth was a jumble. You just couldn’t get your head around it.
Yet for many bilingual speakers it all seems so easy. They can constantly switch, and listen to other people switch, from one language to another without ever mixing or confusing the two. The bilingual brain is like an ultra-efficient postal sorting office, and its workers – the brain signals – rarely make mistakes when it comes to sorting the post.
But if you think the brain’s postal sorters are a perfectly cooperative team, then think again. The reality is that if you’re a bilingual speaker, there’s a constant battle in your head. “You might think that when you’re speaking it should be under your own volition – you should be in charge,” says Professor Judith Kroll, a cognitive psychologist at Penn State University who is studying how bilinguals understand and produce words in both languages. “There appears to be an effect of the language not in use regardless of what the bilingual intends to do.”
Research by Kroll and numerous other scientists suggests that both languages are active in the bilingual brain even when the speaker is using just one of them. Research on bilinguals has been based on observations of how quickly they react to tasks given in different languages and on the electrical brain activity triggered by a thought or a visual cue. Unlike monolinguals’ brains, bilinguals’ brains have to work harder. “Bilinguals have to learn to actively control and suppress their dominant language to enable them to use their second language,” she adds.
Language processing primarily occurs in two areas of the brain: Broca’s area and Wernicke’s area. Broca’s area is in the inferior frontal gyrus of the brain’s left hemisphere, roughly near the temple, and is associated with speech production and the processing of grammar. Wernicke’s area is located in the superior temporal gyrus of the cerebral cortex, near the ear, and is associated with language comprehension. Brain imaging techniques have shown increased brain signals in both areas during speech and language activities. These areas are responsible for language and speech regardless of the number of languages a person speaks.
Studies of speech rehabilitation provide some of the evidence for language interaction in the bilingual brain. Professor Swathi Kiran, from the College of Health & Rehabilitation Sciences at Sargent College, Boston University, uses brain imaging to study what happens when stroke patients lose their speech abilities, a condition known as aphasia. Her research is helping individuals living with aphasia to regain their language abilities. Kiran has found that, remarkably, training bilingual patients in their less proficient language can help them regain both languages. For example, for a person with aphasia who speaks English and Spanish, with Spanish the weaker language, training in Spanish improves not just Spanish but also English. “This is because every time you hear or name a word in Spanish, the individual has to think about what it meant in English, so both languages improve,” says Kiran.
The constant juggling act between languages may set us up for potential language mix-ups, but a recent study suggests this may not always be the case. “The inherent characteristics of the words – how they sound – provides enough information to distinguish which language a word belongs to,” says Professor Michael Vitevitch, a cognitive psychologist from the University of Kansas. He compared the sounds of nearly 20,000 words from Spanish and English, and found that the vast majority – more than 95% – of the words in one language sound dissimilar to those in the other.
His study suggests that words in one language do not necessarily ‘invade’ those in the other. “So you don’t have to worry too much about Spanish creeping into an English conversation,” he says. If this is true, then the postal sorters in our brain may not have to work so hard after all. A package labelled with a colour sticker – here the sound of words – may help the worker to quickly identify its destination.
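Vitevitch’s comparison can be illustrated in miniature. The sketch below is not his actual method or data – he worked with phonological transcriptions of real lexicons – but it uses edit distance over the spellings of a handful of sample words as a crude stand-in for sound similarity, and counts how many words in one list have a “close neighbour” in the other:

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# Tiny illustrative word lists (spellings stand in for phonological forms).
english = ["cat", "dog", "house", "table", "sun"]
spanish = ["gato", "perro", "casa", "mesa", "sol"]

# Count English words with at least one "close" Spanish neighbour
# (edit distance <= 1, a crude proxy for sounding similar).
similar = sum(
    any(edit_distance(e, s) <= 1 for s in spanish) for e in english
)
print(f"{similar} of {len(english)} English words have a close Spanish neighbour")
# → 0 of 5 English words have a close Spanish neighbour
```

Even in this toy sample, no English word comes within one edit of a Spanish word – a miniature version of the pattern Vitevitch found at the scale of whole lexicons.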
A study published last year seems to support Vitevitch’s claim. Using brain imaging techniques, Jan-Rouke Kuipers and Guillaume Thierry at Bangor University in the UK studied whether bilinguals processed languages differently because they were more exposed to different types of sounds from an early age. Comparing bilingual speakers of English and Welsh with monolingual speakers of English, they found that the bilinguals’ brains detected language change as early as 200 milliseconds – or one-fifth of a second – whereas monolinguals’ brains took at least 150 milliseconds more. In other words, the bilingual brain doesn’t need to have processed an entire word to be able to detect that the language has changed.
The first 200 milliseconds is how long it typically takes the brain to detect an external stimulus such as an image or a word, and this response is often associated with attention. The response, called the P2 or P200, is evident in heightened electrical signals detected in Broca’s area. It is among the least well understood electrical signals in the brain.
Kuipers admits there is as yet no definitive explanation as to why bilinguals detect language change more quickly than monolinguals. The bilinguals’ faster brain signal response could suggest that when they detect the basic distinguishing sounds of the other language this captures their brains’ attention, says Kuipers. However, it could also be the case that, regardless of attention, the incoming speech signal activates word representations in the language not in use. “Words typically activate their lexical representation around 200 milliseconds after onset. The switch in lexicon could be reflected in the increased P2 response,” he says.
Bilinguals could also distinguish languages using another trick. In the same study, Kuipers and Thierry found that after a language switch bilinguals displayed increased P600 activity, a brain signal that peaks at 600 milliseconds and is elicited by hearing or reading anomalous speech. “This suggests that they consciously re-evaluated the input,” says Kuipers. This conscious monitoring for language change could allow the bilingual to weigh an incoming word against the expectation built from the context of what was being spoken.
Not surprisingly, if two languages sound very different – as in the case of English and Spanish – then telling them apart is easier than with more similar pairs such as Dutch–English or Spanish–Italian. And the sound of words is not the only factor. “You also have interesting cases like French–English where there is a very small overlap for spoken language, but a very large overlap for written language,” says Professor Paul Meara, a linguist at Swansea University who has spent decades studying how people acquire second languages.
New ways of thinking
However little two languages may overlap, bilinguals are still living with that constant battle in their heads. Kuipers says that Vitevitch’s research on Spanish and English sounds “does not take away that bilinguals activate words in both of their languages when only exposed to one.” This is evident in the processing of cognates (words in different languages descended from a common root) and false friends (words in different languages that look or sound similar but mean different things).
Vitevitch acknowledges existing studies on language interaction, but he claims they are limited if viewed on their own. “What studies like that don’t give you a feel for is how many or few points of contact there are between languages,” he says. His study could change the way we think about language processing. “There are points of contact between languages, but my work suggests there are not as many as one might think,” he says. “I think my study will challenge my colleagues to re-examine their assumptions and look for alternative possibilities.”
One such possibility is to picture bilingual processing as something that occurs over a simple network structure, he says. As an analogy, the two languages are two spider webs and their words are the nodes. A spider can get from a point in one web to a point in the other as long as there are just one or two strands connecting the two webs. The webs do not have to be completely entangled.
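The spider-web picture can be sketched as a simple graph. In the toy example below, two small word networks are joined by a single assumed cognate-style link, and a breadth-first search shows that one connecting strand is enough to travel from a word in one web to a word in the other (all node names are illustrative, not from any study):

```python
from collections import deque

# Two toy word "webs" as adjacency lists; node names are illustrative.
english_web = {"animal": ["cat", "dog"], "cat": ["animal"], "dog": ["animal"]}
spanish_web = {"animal_es": ["gato", "perro"], "gato": ["animal_es"], "perro": ["animal_es"]}

# Merge the webs, then add a single "strand" connecting them --
# here an assumed link between the animal/animal_es pair.
graph = {**english_web, **spanish_web}
graph["animal"] = graph["animal"] + ["animal_es"]
graph["animal_es"] = graph["animal_es"] + ["animal"]

def path(graph, start, goal):
    """Breadth-first search; returns a shortest path or None."""
    queue, seen = deque([[start]]), {start}
    while queue:
        p = queue.popleft()
        if p[-1] == goal:
            return p
        for nxt in graph.get(p[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(p + [nxt])
    return None

print(path(graph, "cat", "perro"))
# → ['cat', 'animal', 'animal_es', 'perro']
```

The two webs stay almost entirely separate, yet the spider still has a route across: a handful of points of contact suffices, which is the intuition behind Vitevitch’s claim that the languages need not be fully entangled.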
Such a network structure was first proposed by Meara as a way to think about how words in a given language relate to one another. Most linguists agree that a person’s vocabulary is a structured, interconnected network of words. Meara says this interlocking network “might have some implications for the way we think about language loss in Alzheimer’s patients,” where the loss of one word may affect the recall of others.
However our brains learn, lose, and sort languages, being bilingual can only be good for you. Learning and regularly using a second language can delay the onset of Alzheimer’s in the elderly and improve children’s multitasking skills, according to research by Professor Ellen Bialystok, a psychologist at York University, and colleagues in Canada. Professor Judith Kroll has also found that bilinguals are better at multitasking. Scientists believe this linguistic juggling act effectively becomes mental training, improving bilinguals’ cognitive skills.
And if language mix-up in the brain is not as bad as we once thought, then there’s one less excuse not to try learning a new lingo. Practice makes perfect. “The more you speak a language, the easier it is for the brain. The less you use it, the harder the brain has to work,” says Professor Swathi Kiran. Still not convinced? Well, you could always use Google Translate.