Person:
Ahveninen, Jyrki

Last Name: Ahveninen

First Name: Jyrki

Name: Ahveninen, Jyrki

Search Results

Now showing 1 - 8 of 8
  • Publication
    Evidence for distinct human auditory cortex regions for sound location versus identity processing
    (2014) Ahveninen, Jyrki; Huang, Samantha; Nummenmaa, Aapo; Belliveau, John William; Hung, An-Yi; Jääskeläinen, Iiro P.; Rauschecker, Josef P.; Rossi, Stephanie; Tiitinen, Hannu; Raij, Tommi
    Neurophysiological animal models suggest that anterior auditory cortex (AC) areas process sound-identity information, whereas posterior ACs specialize in sound location processing. In humans, inconsistent neuroimaging results and insufficient causal evidence have challenged the existence of such parallel AC organization. Here we transiently inhibit bilateral anterior or posterior AC areas using MRI-guided paired-pulse transcranial magnetic stimulation (TMS) while subjects listen to Reference/Probe sound pairs and perform either sound location or identity discrimination tasks. The targeting of TMS pulses, delivered 55–145 ms after Probes, is confirmed with individual-level cortical electric-field estimates. Our data show that TMS to posterior AC regions delays reaction times (RT) significantly more during sound location than identity discrimination, whereas TMS to anterior AC regions delays RTs significantly more during sound identity than location discrimination. This double dissociation provides direct causal support for parallel processing of sound identity features in anterior AC and sound location in posterior AC.
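    In analysis terms, the double dissociation reported above corresponds to a TMS-site × task interaction on reaction times. The sketch below shows one generic way such an interaction contrast could be tested per subject; the RT arrays, subject count, and effect sizes are hypothetical placeholders, not the study's own data or statistical pipeline.
    ```python
    # Minimal sketch (hypothetical data, not the study's analysis): testing a
    # TMS-site x task double dissociation on reaction times via a per-subject
    # interaction contrast and a one-sample t-test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 16  # hypothetical number of subjects
    rt_post_loc = 520 + rng.normal(0, 30, n)  # posterior-AC TMS, location task (ms)
    rt_post_id  = 495 + rng.normal(0, 30, n)  # posterior-AC TMS, identity task
    rt_ant_loc  = 500 + rng.normal(0, 30, n)  # anterior-AC TMS, location task
    rt_ant_id   = 525 + rng.normal(0, 30, n)  # anterior-AC TMS, identity task

    # Interaction: (posterior: location - identity) - (anterior: location - identity).
    # A reliably positive contrast is the double-dissociation pattern described above.
    interaction = (rt_post_loc - rt_post_id) - (rt_ant_loc - rt_ant_id)
    t, p = stats.ttest_1samp(interaction, 0.0)
    print(f"interaction contrast: mean = {interaction.mean():.1f} ms, t = {t:.2f}, p = {p:.3g}")
    ```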
  • Publication
    Auditory-Cortex Short-Term Plasticity Induced by Selective Attention
    (Hindawi Publishing Corporation, 2014) Jääskeläinen, Iiro P.; Ahveninen, Jyrki
    The ability to concentrate on relevant sounds in the acoustic environment is crucial for everyday function and communication. Converging lines of evidence suggest that transient functional changes in auditory-cortex neurons, “short-term plasticity”, might explain this fundamental function. Under conditions of strongly focused attention, enhanced processing of attended sounds can take place at very early latencies (~50 ms from sound onset) in primary auditory cortex and possibly at even earlier latencies in subcortical structures. More robust selective-attention short-term plasticity is manifested as modulation of responses peaking at ~100 ms from sound onset in functionally specialized nonprimary auditory-cortical areas, by way of stimulus-specific reshaping of neuronal receptive fields that supports filtering of selectively attended sound features from task-irrelevant ones. Such effects have been shown to emerge within seconds of shifting the attentional focus. There are findings suggesting that the reshaping of neuronal receptive fields is even stronger at longer auditory-cortex response latencies (~300 ms from sound onset). These longer-latency short-term plasticity effects seem to build up more gradually, within tens of seconds after shifting the focus of attention. Importantly, some of the auditory-cortical short-term plasticity effects observed during selective attention predict enhancements in behaviorally measured sound discrimination performance.
  • Publication
    Increasing fMRI Sampling Rate Improves Granger Causality Estimates
    (Public Library of Science, 2014) Lin, Fa-Hsuan; Ahveninen, Jyrki; Raij, Tommi; Witzel, Thomas; Chu, Ying-Hua; Jääskeläinen, Iiro P.; Tsai, Kevin Wen-Kai; Kuo, Wen-Jui; Belliveau, John W.
    Estimation of causal interactions between brain areas is necessary for elucidating large-scale functional brain networks underlying behavior and cognition. Granger causality analysis of time series data can quantitatively estimate directional information flow between brain regions. Here, we show that such estimates are significantly improved when the temporal sampling rate of functional magnetic resonance imaging (fMRI) is increased 20-fold. Specifically, healthy volunteers performed a simple visuomotor task during blood oxygenation level-dependent (BOLD) contrast-based whole-head inverse imaging (InI). Granger causality analysis based on raw InI BOLD data sampled at 100-ms resolution detected the expected causal relations, whereas when the data were downsampled to the temporal resolution of 2 s typically used in echo-planar fMRI, the causality could not be detected. An additional control analysis, in which we sinc-interpolated additional data points to the downsampled time series at 0.1-s intervals, confirmed that the improvements achieved with the real InI data were not explainable by the increased time-series length alone. We therefore conclude that the high temporal resolution of InI improves the Granger causality connectivity analysis of the human brain.
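    To illustrate the core point of the abstract above, the sketch below simulates a simple directed x → y dependency and computes a basic Geweke-style Granger measure at the original sampling rate and after 20-fold downsampling; the VAR coefficients, noise levels, and model order are arbitrary assumptions, and this is not the paper's InI analysis pipeline.
    ```python
    # Minimal sketch (simulated data): how coarse temporal sampling can obscure a
    # directed x -> y influence in a simple Granger-causality estimate.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 4000
    x = np.zeros(n)
    y = np.zeros(n)
    for t in range(1, n):  # y depends on the past of x with a one-sample lag
        x[t] = 0.5 * x[t - 1] + rng.normal()
        y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

    def granger_xy(x, y, p=2):
        """Geweke-style x -> y measure: log variance ratio of restricted vs. full AR(p) models."""
        T = len(y)
        Y = y[p:]
        lags_y = np.column_stack([y[p - k:T - k] for k in range(1, p + 1)])
        lags_x = np.column_stack([x[p - k:T - k] for k in range(1, p + 1)])
        ones = np.ones((T - p, 1))

        def rss(design):
            beta, *_ = np.linalg.lstsq(design, Y, rcond=None)
            resid = Y - design @ beta
            return resid @ resid

        restricted = rss(np.hstack([ones, lags_y]))    # past of y only
        full = rss(np.hstack([ones, lags_y, lags_x]))  # past of y and past of x
        return float(np.log(restricted / full))

    print("full sampling rate:", round(granger_xy(x, y), 3))
    print("20x downsampled   :", round(granger_xy(x[::20], y[::20]), 3))
    ```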
  • Publication
    Auditory Conflict Resolution Correlates with Medial–Lateral Frontal Theta/Alpha Phase Synchrony
    (Public Library of Science, 2014) Huang, Samantha; Rossi, Stephanie; Hamalainen, Matti; Ahveninen, Jyrki
    When multiple persons speak simultaneously, it may be difficult for the listener to direct attention to correct sound objects among conflicting ones. This could occur, for example, in an emergency situation in which one hears conflicting instructions and the loudest, instead of the wisest, voice prevails. Here, we used cortically constrained oscillatory MEG/EEG estimates to examine how different brain regions, including caudal anterior cingulate (cACC) and dorsolateral prefrontal cortices (DLPFC), work together to resolve these kinds of auditory conflicts. During an auditory flanker interference task, subjects were presented with sound patterns consisting of three different voices, from three different directions (45° left, straight ahead, 45° right), sounding out either the letters “A” or “O”. They were asked to discriminate which sound was presented centrally and ignore the flanking distracters that were phonetically either congruent (50%) or incongruent (50%) with the target. Our cortical MEG/EEG oscillatory estimates demonstrated a direct relationship between performance and brain activity, showing that efficient conflict resolution, as measured with reduced conflict-induced RT lags, is predicted by theta/alpha phase coupling between cACC and right lateral frontal cortex regions intersecting the right frontal eye fields (FEF) and DLPFC, as well as by increased pre-stimulus gamma (60–110 Hz) power in the left inferior frontal cortex. Notably, cACC connectivity patterns that correlated with behavioral conflict-resolution measures were found during both the pre-stimulus and the pre-response periods. Our data provide evidence that, instead of being only transiently activated upon conflict detection, cACC is involved in sustained engagement of attentional resources required for effective sound object selection performance.
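    One common way to quantify the kind of theta/alpha phase coupling described above is the phase-locking value (PLV) between two band-limited region time courses. The sketch below computes a PLV on simulated placeholder signals standing in for, e.g., cACC and lateral frontal time courses; the sampling rate, band edges, and trial counts are assumptions, not parameters from the study.
    ```python
    # Minimal sketch (simulated data): across-trial phase-locking value between two
    # signals in a combined theta/alpha band (4-13 Hz).
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    fs = 500.0  # assumed sampling rate (Hz)
    rng = np.random.default_rng(2)
    n_trials, n_samples = 60, 1000
    t = np.arange(n_samples) / fs
    jitter = rng.normal(0, 0.3, (n_trials, 1))  # small trial-to-trial phase jitter
    sig_a = np.sin(2 * np.pi * 8 * t) + 0.5 * rng.normal(size=(n_trials, n_samples))
    sig_b = np.sin(2 * np.pi * 8 * t + jitter) + 0.5 * rng.normal(size=(n_trials, n_samples))

    # Band-pass filter, then extract instantaneous phase via the Hilbert transform.
    b, a = butter(4, [4.0, 13.0], btype="bandpass", fs=fs)
    phase_a = np.angle(hilbert(filtfilt(b, a, sig_a, axis=-1), axis=-1))
    phase_b = np.angle(hilbert(filtfilt(b, a, sig_b, axis=-1), axis=-1))

    # PLV across trials at each time point: length of the mean phase-difference vector.
    plv = np.abs(np.mean(np.exp(1j * (phase_a - phase_b)), axis=0))
    print("mean theta/alpha PLV over the epoch:", round(float(plv.mean()), 3))
    ```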
  • Publication
    Neural mechanisms supporting evaluation of others’ errors in real-life like conditions
    (Nature Publishing Group, 2016) Jääskeläinen, Iiro P.; Halme, Hanna-Leena; Agam, Yigal; Glerean, Enrico; Lahnakoski, Juha M; Sams, Mikko; Tapani, Karoliina; Ahveninen, Jyrki; Manoach, Dara
    The ability to evaluate others’ errors makes it possible to learn from their mistakes without the need for first-hand trial-and-error experiences. Here, we compared functional magnetic resonance imaging activation to self-committed errors during a computer game with activation to a variety of errors committed by others during movie clips (e.g., figure skaters falling down and persons behaving inappropriately). While viewing errors by others, there was activation in lateral and medial temporal lobe structures, posterior cingulate cortex, precuneus, and medial prefrontal cortex, possibly reflecting the simulation and storage, for future use, of alternative action sequences that could have led to successful behaviors. During both self- and other-committed errors, activation was seen in the striatum, temporoparietal junction, and inferior frontal gyrus. These areas may be components of a generic error processing mechanism. The ecological validity of the stimuli seemed to matter, since we largely failed to see activations when subjects observed errors by another player in the computer game, as opposed to observing errors in the rich real-life like human behaviors depicted in the movie clips.
  • Publication
    Brain Networks of Novelty-Driven Involuntary and Cued Voluntary Auditory Attention Shifting
    (Public Library of Science, 2012) Huang, Samantha; Belliveau, John William; Tengshe, Chinmayi; Ahveninen, Jyrki
    In everyday life, we need a capacity to flexibly shift attention between alternative sound sources. However, relatively little work has been done to elucidate the mechanisms of attention shifting in the auditory domain. Here, we used a mixed event-related/sparse-sampling fMRI approach to investigate this essential cognitive function. In each 10-sec trial, subjects were instructed to wait for an auditory “cue” signaling the location where a subsequent “target” sound was likely to be presented. The target was occasionally replaced by an unexpected “novel” sound in the uncued ear, to trigger involuntary attention shifting. To maximize the attention effects, cues, targets, and novels were embedded within dichotic 800-Hz vs. 1500-Hz pure-tone “standard” trains. The sound of clustered fMRI acquisition (starting at t = 7.82 sec) served as a controlled trial-end signal. Our approach revealed notable activation differences between the conditions. Cued voluntary attention shifting activated the superior intraparietal sulcus (IPS), whereas novelty-triggered involuntary orienting activated the inferior IPS and certain subareas of the precuneus. Clearly more widespread activations were observed during voluntary than involuntary orienting in the premotor cortex, including the frontal eye fields. Moreover, we found evidence for a frontoinsular-cingular attentional control network, consisting of the anterior insula, inferior frontal cortex, and medial frontal cortices, which were activated during both target discrimination and voluntary attention shifting. Finally, novels and targets activated much wider areas of superior temporal auditory cortices than shifting cues.
  • Publication
    Dissociable Influences of Auditory Object vs. Spatial Attention on Visual System Oscillatory Activity
    (Public Library of Science, 2012) Ahveninen, Jyrki; Jääskeläinen, Iiro P.; Belliveau, John William; Hämäläinen, Matti; Lin, Fa-Hsuan; Raij, Tommi
    Given that both auditory and visual systems have anatomically separate object identification (“what”) and spatial (“where”) pathways, it is of interest whether attention-driven cross-sensory modulations occur separately within these feature domains. Here, we investigated how auditory “what” vs. “where” attention tasks modulate activity in visual pathways using cortically constrained source estimates of magnetoencephalographic (MEG) oscillatory activity. In the absence of visual stimuli or tasks, subjects were presented with a sequence of auditory-stimulus pairs and instructed to selectively attend to phonetic (“what”) vs. spatial (“where”) aspects of these sounds, or to listen passively. To investigate sustained modulatory effects, oscillatory power was estimated from time periods between sound-pair presentations. In comparison to attention to sound locations, phonetic auditory attention was associated with stronger alpha (7–13 Hz) power in several visual areas (primary visual cortex; lingual, fusiform, and inferior temporal gyri; lateral occipital cortex), as well as in higher-order visual/multisensory areas including lateral/medial parietal and retrosplenial cortices. Region-of-interest (ROI) analyses of dynamic changes, from which the sustained effects had been removed, suggested further power increases during Attend Phoneme vs. Location, centered at the alpha range 400–600 ms after the onset of the second sound of each stimulus pair. These results suggest distinct modulations of visual system oscillatory activity during auditory attention to sound object identity (“what”) vs. sound location (“where”). The alpha modulations could be interpreted to reflect enhanced crossmodal inhibition of feature-specific visual pathways and adjacent audiovisual association areas during “what” vs. “where” auditory attention.
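    The sustained effect described above amounts to comparing band-limited (7–13 Hz) alpha power between attention conditions. The sketch below estimates such alpha power with Welch's method on simulated placeholder epochs; the sampling rate, epoch counts, and amplitudes are assumptions, not values from the study, and this is not the cortically constrained source-estimation pipeline itself.
    ```python
    # Minimal sketch (simulated data): per-epoch 7-13 Hz alpha power for two
    # attention conditions, estimated with Welch's method.
    import numpy as np
    from scipy.signal import welch

    fs = 600.0  # assumed sampling rate (Hz)
    rng = np.random.default_rng(3)
    n_epochs, n_samples = 80, 1200
    t = np.arange(n_samples) / fs

    def simulate(alpha_amp):
        """Epochs with a 10-Hz component of given amplitude plus white noise."""
        return alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(size=(n_epochs, n_samples))

    def alpha_power(epochs):
        f, psd = welch(epochs, fs=fs, nperseg=512, axis=-1)
        band = (f >= 7) & (f <= 13)
        return psd[:, band].mean(axis=-1)  # mean alpha-band PSD per epoch

    attend_phoneme = alpha_power(simulate(alpha_amp=1.0))   # hypothesized stronger alpha
    attend_location = alpha_power(simulate(alpha_amp=0.6))
    print("alpha power ratio (phoneme / location):",
          round(float(attend_phoneme.mean() / attend_location.mean()), 2))
    ```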
  • Publication
    Sparsity enables estimation of both subcortical and cortical activity from MEG and EEG
    (National Academy of Sciences, 2017) Krishnaswamy, Pavitra; Obregon-Henao, Gabriel; Ahveninen, Jyrki; Khan, Sheraz; Babadi, Behtash; Iglesias, Juan Eugenio; Hamalainen, Matti; Purdon, Patrick
    Subcortical structures play a critical role in brain function. However, options for assessing electrophysiological activity in these structures are limited. Electromagnetic fields generated by neuronal activity in subcortical structures can be recorded noninvasively, using magnetoencephalography (MEG) and electroencephalography (EEG). However, these subcortical signals are much weaker than those generated by cortical activity. In addition, we show here that it is difficult to resolve subcortical sources because distributed cortical activity can explain the MEG and EEG patterns generated by deep sources. We then demonstrate that if the cortical activity is spatially sparse, both cortical and subcortical sources can be resolved with M/EEG. Building on this insight, we develop a hierarchical sparse inverse solution for M/EEG. We assess the performance of this algorithm on realistic simulations and auditory evoked response data, and show that thalamic and brainstem sources can be correctly estimated in the presence of cortical activity. Our work provides alternative perspectives and tools for characterizing electrophysiological activity in subcortical structures in the human brain.
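    The key ingredient described above is a sparsity-promoting inverse solution. The sketch below illustrates the generic idea with an L1-regularized least-squares inverse solved by ISTA on a random lead field; it is not the authors' hierarchical subcortical/cortical algorithm, and all dimensions and parameters are arbitrary placeholders.
    ```python
    # Minimal sketch (simulated forward model): sparsity-promoting M/EEG inverse,
    #   min_x  0.5 * ||y - L x||^2 + lam * ||x||_1,
    # solved with iterative shrinkage-thresholding (ISTA).
    import numpy as np

    rng = np.random.default_rng(4)
    n_sensors, n_sources = 64, 500
    L = rng.normal(size=(n_sensors, n_sources)) / np.sqrt(n_sensors)  # stand-in lead field

    true_idx = rng.choice(n_sources, size=5, replace=False)  # a few sparse active sources
    x_true = np.zeros(n_sources)
    x_true[true_idx] = rng.normal(0, 5, 5)
    y = L @ x_true + 0.05 * rng.normal(size=n_sensors)  # noisy sensor measurements

    def ista(L, y, lam, n_iter=500):
        """ISTA for the L1-regularized least-squares inverse problem."""
        step = 1.0 / np.linalg.norm(L, 2) ** 2  # 1 / Lipschitz constant of the gradient
        x = np.zeros(L.shape[1])
        for _ in range(n_iter):
            z = x - step * (L.T @ (L @ x - y))                         # gradient step
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold
        return x

    x_hat = ista(L, y, lam=1.0)
    top5 = np.argsort(np.abs(x_hat))[-5:]
    print("true source indices      :", sorted(true_idx.tolist()))
    print("largest estimated sources:", sorted(top5.tolist()))
    ```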