- Hearing Impairment and Communication
- Hand Gesture Recognition Systems
- Neurobiology of Language and Bilingualism
- Action Observation and Synchronization
- Language, Metaphor, and Cognition
- Tactile and Sensory Interactions
- Phonetics and Phonology Research
- Multisensory Perception and Integration
- Reading and Literacy Development
- Hemispheric Asymmetry in Neuroscience
- Neural Dynamics and Brain Function
- EEG and Brain-Computer Interfaces
- Linguistics, Language Diversity, and Identity
- Syntax, Semantics, Linguistic Variation
- Linguistics and Terminology Studies
- Hearing Loss and Rehabilitation
- Language, Discourse, Communication Strategies
- Linguistic Variation and Morphology
- Neural and Behavioral Psychology Studies
- Functional Brain Connectivity Studies
- Advanced MRI Techniques and Applications
- Neuroscience and Music Perception
- Motor Control and Adaptation
- Neural Networks and Applications
- Language Development and Disorders
University of California, Davis
2015-2024
Google (United States)
2022
Cognitive Research (United States)
2021
Center for Applied Linguistics
2021
University of Washington
2002-2016
University of California System
2013
Seattle University
1998-2007
University of Oregon
1998
Weizmann Institute of Science
1998
National Institutes of Health
1998
Cerebral organization during sentence processing in English and in American Sign Language (ASL) was characterized by employing functional magnetic resonance imaging (fMRI) at 4 T. Effects of deafness, age of language acquisition, and bilingualism were assessed by comparing results from (i) normally hearing, monolingual, native speakers of English, (ii) congenitally, genetically deaf, native signers of ASL who learned English late and through the visual modality, and (iii) hearing bilinguals who were native signers of ASL and speakers of English. All groups, when processing their native language, whether English or ASL,...
We compared normally hearing individuals and congenitally deaf individuals as they monitored moving stimuli either in the periphery or the center of the visual field. When participants attended to the peripheral visual field, greater recruitment (as measured by functional magnetic resonance imaging) of the motion-selective area MT/MST was observed in deaf than in hearing individuals, whereas the two groups were comparable when attending to the central visual field. This finding indicates an enhancement of attention to peripheral visual space in deaf individuals. Structural equation modeling was used to further...
Abstract In this study, changes in blood oxygenation and volume were monitored while monolingual right-handed subjects read English sentences. Our results confirm the role of the left peri-sylvian cortex in language processing. Interestingly, individual subject analyses reveal a pattern of activation characterized by several small, limited patches rather than a few large, anatomically well-circumscribed centers. Although activation varied between subjects, it was left-lateralized and included the classical language areas, including Broca's area and Wernicke's...
Two experiments are reported which investigate lexical recognition in American Sign Language (ASL). Exp. 1 examined the identification of monomorphemic signs and investigated how the manipulation of phonological parameters affected sign identification. Overall, sign identification was much faster than what has been found for spoken language. The phonetic structure of ASL (the simultaneous availability of Handshape and Location information) and the phonotactics of the ASL lexicon are argued to account for this difference. Exp. 2 compared the time course...
The study of phonological structure and patterns across languages is seen by contemporary phonologists as a way of gaining insight into language as a cognitive system. Traditionally, phonologists have focused on spoken languages. More recently, we have observed growing interest in the grammatical systems underlying the signed languages of the deaf. This development provides the field of phonology with a natural laboratory for investigating language universals. As phonological systems, in part, reflect the modality in which they are expressed, this comparison permits us to separate those...
Abstract The studies reported here investigate deaf and hearing subjects' ratings of American Sign Language (ASL) signs to assess whether linguistic experience shapes judgements of sign similarity. The sign language stimuli were constructed in such a way as to vary the formational similarity along well-accepted 'phonological' parameters. Deaf subjects were native and nonnative users of ASL, whereas hearing subjects were sign-naïve. In Study 1, subjects were asked to choose, from a set of signs that shared different combinations of two parameters with a target (movement +...
Abstract This cortical stimulation mapping study investigates the neural representation of action and object naming. Data from 13 neurosurgical subjects undergoing awake mapping are presented. Our findings indicate clear evidence of differential disruption of noun and verb naming in the context of this task. At the individual level, evidence was found for punctate regions of perisylvian cortex subserving this function. Across subjects, however, the location of these sites varied. This finding may help explain discrepancies between lesion and functional...
In humans, the two cerebral hemispheres of the brain are functionally specialized, with the left hemisphere predominantly mediating language skills. The basis of this lateralization has been proposed to be the differential localization of the linguistic, motoric, or symbolic properties of language. To distinguish among these possibilities, spoken language, signed language, and nonlinguistic gesture have been compared in deaf and hearing individuals. This analysis, plus additional clinical findings, supports a linguistic specialization.
Studies of written and spoken language suggest that nonidentical brain networks support semantic and syntactic processing. Event-related potential (ERP) studies of these languages show that semantic anomalies elicit a posterior bilateral N400, whereas syntactic anomalies elicit a left anterior negativity, followed by a broadly distributed late positivity. The present study assessed whether these ERP indicators index the activity of systems specific for processing aural-oral language, or whether they reflect neural systems underlying any natural language, including sign language....
For deaf users of American Sign Language (ASL), facial behaviors function in two distinct ways: to convey affect (as with spoken languages) and to mark certain specific grammatical structures (e.g., relative clauses), thus subserving distinctly linguistic functions in ways that are unique to signed languages. The existence of these functionally different classes of facial behavior raises questions concerning the neural control of language and nonlanguage facial functions. Examining patterns of hemispheric mediation for differential facial expressions, versus...
When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the world. This language-vision interaction can provide insights into children's developing efficiency in comprehension. But how does this interaction unfold when the linguistic signal and the visual world are both processed via the same channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13...
During fMRI, dyslexic and control boys completed auditory language tasks (judging whether pairs of real and/or pseudo words rhymed or were real words) in 30 s 'on' conditions alternating with an 'off' condition (judging whether pairs of tones were the same). During phonological judgment, dyslexics had more activity than controls in the right and left inferior temporal gyrus and precentral gyrus. During lexical judgment, dyslexics were less active in bilateral middle frontal and orbital cortex. Individual dyslexics reliably differed from controls in the insula. Dyslexic children thus differ in brain activation during processing of skills that...
Functional magnetic resonance imaging (fMRI) was used to compare the cerebral organization during sentence processing in English and in American Sign Language (ASL). Classical language areas within the left hemisphere were recruited by both English in native speakers and ASL in native signers. This suggests a bias of the left hemisphere to process natural languages independently of the modality through which language is perceived. Furthermore, in contrast to English, ASL strongly recruited right-hemisphere structures. This was true irrespective of whether the signers were deaf or hearing. Thus, the specific requirements of the language also...
Abstract Unlike spoken languages, sign languages of the deaf make use of two primary articulators, the right and left hands, to produce signs. This situation has no obvious parallel in spoken languages, in which speech articulation is carried out by symmetrical unitary midline vocal structures. This arrangement affords a unique opportunity to examine the robustness of the linguistic systems that underlie language production in the face of contrasting articulatory demands, and to chart the differential effects of handedness for highly skilled movements. Positron emission...