
Cortical representation of syllables in continuous speech


The continuous speech signal is characterized by prominent temporal modulations of its overall amplitude, reflecting its syllabic structure. In my postdoctoral work at UCSF, I probe how the speech envelope is encoded in speech cortex using intracranial recordings from neurosurgical patients. We discovered that neural populations in speech cortex represent the speech amplitude envelope by encoding rapid increases in the envelope (acoustic edges). This representation reflects the rate of amplitude change, cueing the timing and stress of syllables (Oganian & Chang, Sci Adv, 2019).

This piece on NPR contains a comprehensive summary of this work.

Foreign language use, emotions, and decision making


Self-related optimism bias during foreign language use. Optimism bias, i.e., the overestimation of the probability of positive vs. negative events, is larger for self-related than for other-related events in the native language (L1) and in a foreign language at high proficiency (HP), but not at low proficiency (LP) levels (adapted from Oganian, Heekeren & Korn, 2018).

Recent developments in bilingualism research focus on the cognitive effects of foreign language use beyond its immediate linguistic aspects. It has been postulated that foreign language use reduces emotionality, which in turn leads to a rationalization of complex decisions. In a series of projects, I was able to dissociate the effects of foreign language use on decision making from its effects on affective processing.

Specifically, switching languages rationalizes decision making under risk via an increase in cognitive load (Oganian et al., 2016, J Exp Psychol: LMC; Korn, Heekeren & Oganian, 2018, Q J Exp Psychol). In contrast, use of a low-proficiency foreign language increases emotional distance and alleviates self-related biases (Oganian et al., 2018, Q J Exp Psychol).

Language decisions in bilingualism


The visual word form area (left-hemispheric, green on top) contains information regarding the perceived language of a word-like letter string (e.g., 'mift'; Oganian et al., 2015, J Cogn Neurosci).

Bilinguals identify the language of an input based on prelexical and lexical statistics.

Bilingual individuals frequently encounter switches between languages in spoken and in written communication (e.g., informal chat messages). In same-script bilingual visual word recognition (i.e., reading), a central question is at what stage the language of a word is detected. If this happens early on, it might constrain and guide word recognition. In my PhD, I used linguistic corpus analyses, behavioral experiments, and functional brain imaging to demonstrate that the similarity of an input to each of a bilingual's languages is represented at sub-lexical and lexical levels, including in the visual word form area (Oganian et al., 2015, J Cogn Neurosci; see illustration).

Contrary to previous hypotheses in the field, we found that this information is independent of successful lexical access (Oganian et al., 2015, Biling: Lang Cogn) and that all levels of word processing are shaped by the structure of the native language (Oganian et al., 2016, Front Psychol).
