Humanised mouse brain shows how speech likely evolved (Wired UK)


Geneticists have narrowed down the root of human language
evolution by splicing a human gene into mice, enhancing the
animals' learning abilities. In the first study to investigate
the cognitive effects of partially humanising another species'
brain, MIT researchers found that mice carrying the human gene
variant learned maze paths faster than their normal
counterparts.

The improvement, which saw hybrid mice navigating novel routes
four days faster than wild-type ones, is thought to be due to the
human gene facilitating the switch between declarative learning (“take the third path on the right
when looking at this wall”) and procedural learning (“turn left once your
feet are on carpet”). In mice, these two types of learning can
interfere with one another, and in a maze where landmarks and floor
texture were always changing, this proved to their detriment.

The human gene is identical to its counterparts in other mammals
apart from a substitution of just two amino acid building blocks
in its structure. This substitution is thought to underlie the
advanced language ability observed only in humans: other animals
can understand one another, but they lack the ability to generate
language. The gene itself, the transcription factor forkhead box
P2 (FOXP2), is the only gene linked to speech and language known
to have been positively selected for during human evolution.

FOXP2 aids the switch between the two types of learning thought
necessary for reinforcing routines until they become
subconscious, much as we learn language. To learn the names and
sounds we use in speech, we repeat a word while looking at the
object it denotes, until eventually the sight of the object alone
is enough to call its name to mind. FOXP2 supports this process
by delivering a dose of dopamine to the area of the brain
associated with routine formation, while switching off neurons
elsewhere.

“This really is an important brick in the wall saying that the
form of the gene that allowed us to speak may have something to do
with a special kind of learning,” said Ann Graybiel, a neuroscientist at MIT. “Which takes us
from having to make conscious associations in order to act to a
nearly automatic-pilot way of acting based on the cues around
us.”


19 September 2014 | 9:20 am – Source: wired.co.uk

