Person:
Huang, Raymond


Search Results (2)
  • Publication
    Defining language networks from resting-state fMRI for surgical planning-a feasibility study
    (Wiley-Blackwell, 2013) Tie, Yanmei; Rigolo, Laura; Norton, Isaiah Hakim; Huang, Raymond; Wu, Wentao; Orringer, Daniel; Mukundan, Srinivasan; Golby, Alexandra
    Presurgical language mapping for patients with lesions close to language areas is critical to neurosurgical decision-making for preservation of language function. As a clinical noninvasive imaging technique, functional MRI (fMRI) is used to identify language areas by measuring blood-oxygen-level-dependent (BOLD) signal change while patients perform carefully timed language vs. control tasks. This task-based fMRI critically depends on task performance, excluding many patients who have difficulty performing language tasks due to neurologic deficits. On the basis of the recent discovery of resting-state fMRI (rs-fMRI), we propose a “task-free” paradigm that acquires fMRI data while patients are simply at rest. This paradigm is less demanding for patients to perform and easier for technologists to administer. We investigated the feasibility of this approach in right-handed healthy control subjects. First, group independent component analysis (ICA) was applied to the training group (14 subjects) to identify group-level language components based on expert rating results. Then, four empirically and structurally defined language network templates were assessed for their ability to identify language components from individuals' ICA output of the testing group (18 subjects) based on spatial similarity analysis. Results suggest that it is feasible to extract language activations from rs-fMRI at the individual subject level, and two empirically defined templates (one focusing on frontal language areas and one incorporating both frontal and temporal language areas) demonstrated the best performance. We propose a semi-automated language component identification procedure and discuss practical concerns and suggestions for this approach to be used in clinical fMRI language mapping.
  • Publication
    Radiographic prediction of meningioma grade by semantic and radiomic features
    (Public Library of Science, 2017) Coroller, Thibaud; Bi, Wenya; Huynh, Elizabeth; Abedalthagafi, Malak; Aizer, Ayal A.; Greenwald, Noah; Parmar, Chintan; Narayan, Vivek; Wu, Winona; Miranda de Moura, Samuel; Gupta, Saksham; Beroukhim, Rameen; Wen, Patrick Y.; Al-Mefty, Ossama; Dunn, Ian; Santagata, Sandro; Alexander, Brian; Huang, Raymond; Aerts, Hugo
    Objectives: The clinical management of meningioma is guided by tumor grade and biological behavior. Currently, the assessment of tumor grade follows surgical resection and histopathologic review. Reliable techniques for pre-operative determination of tumor grade may enhance clinical decision-making.
    Methods: A total of 175 meningioma patients (103 low-grade and 72 high-grade) with pre-operative contrast-enhanced T1-MRI were included. Fifteen radiomic (quantitative) and ten semantic (qualitative) features were applied to quantify the imaging phenotype. Area under the curve (AUC) and odds ratios (OR) were computed with multiple-hypothesis correction. Random-forest classifiers were developed and validated on an independent dataset (n = 44).
    Results: Twelve radiographic features (eight radiomic and four semantic) were significantly associated with meningioma grade. High-grade tumors exhibited necrosis/hemorrhage (ORsem = 6.6, AUCrad = 0.62–0.68), intratumoral heterogeneity (ORsem = 7.9, AUCrad = 0.65), non-spherical shape (AUCrad = 0.61), and larger volumes (AUCrad = 0.69) compared to low-grade tumors. Radiomic and semantic classifiers could significantly predict meningioma grade (AUCsem = 0.76 and AUCrad = 0.78). Furthermore, combining them increased the classification power (AUCradio = 0.86). Clinical variables alone did not effectively predict tumor grade (AUCclin = 0.65), nor did they show complementary value with imaging data (AUCcomb = 0.84).
    Conclusions: We found a strong association between imaging features of meningioma and histopathologic grade, with ready application to clinical management. Combining qualitative and quantitative radiographic features significantly improved classification power.
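
The first abstract's semi-automated identification step — matching each subject's ICA component maps against a language-network template by spatial similarity — can be sketched roughly as follows. This is a toy illustration, not the authors' code: the arrays, grid size, and the choice of Pearson correlation as the similarity metric are hypothetical stand-ins.

```python
import numpy as np

def spatial_similarity(component_map, template_mask):
    """Pearson correlation between a component's voxel values and a
    binary template mask, computed over all voxels (toy metric)."""
    c = component_map.ravel().astype(float)
    t = template_mask.ravel().astype(float)
    c = (c - c.mean()) / c.std()
    t = (t - t.mean()) / t.std()
    return float(np.mean(c * t))

def pick_language_component(components, template_mask):
    """Return the index of the component most similar to the template,
    plus the full list of similarity scores."""
    scores = [spatial_similarity(m, template_mask) for m in components]
    return int(np.argmax(scores)), scores

# Toy data: three random "component maps" on a small 2-D grid, plus one
# planted component that overlaps the template strongly.
rng = np.random.default_rng(0)
template = np.zeros((10, 10))
template[2:5, 2:5] = 1
comps = [rng.normal(size=(10, 10)) for _ in range(3)]
comps.append(template + 0.1 * rng.normal(size=(10, 10)))
best, scores = pick_language_component(comps, template)
print(best)  # index of the planted component should score highest
```

In practice the maps would be 3-D volumes in a common anatomical space, and the study compared four different templates; the ranking-by-similarity logic is the part this sketch illustrates.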
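
The second abstract's classification step — a random-forest classifier trained on tabular imaging features and scored by AUC on a held-out set — can be sketched as below. This is an illustrative pipeline on synthetic data, not the study's actual features or results; the feature counts and cohort sizes are borrowed from the abstract only to shape the toy arrays.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 175                                # cohort size from the abstract; data is fake
X = rng.normal(size=(n, 25))           # 15 radiomic + 10 semantic features
w = rng.normal(size=25)
y = (X @ w + rng.normal(size=n) > 0).astype(int)   # synthetic "tumor grade" label

# Hold out 44 cases, mirroring the independent validation set size.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=44, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(round(auc, 2))
```

Reporting AUC on an independent set, as the study does, guards against the optimistic bias of evaluating on training data; combining feature families (here they would simply be concatenated columns) is what lifted the study's AUC to 0.86.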