50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues its long tradition of innovation and enhancement.

Learn more about the history of ERIC (PDF).

Showing 1 to 15 of 294 results
Peer reviewed
Direct link
Tye-Murray, Nancy; Hale, Sandra; Spehar, Brent; Myerson, Joel; Sommers, Mitchell S. – Journal of Speech, Language, and Hearing Research, 2014
Purpose: The study addressed three research questions: Does lipreading improve between the ages of 7 and 14 years? Does hearing loss affect the development of lipreading? How do individual differences in lipreading relate to other abilities? Method: Forty children with normal hearing (NH) and 24 with hearing loss (HL) were tested using 4…
Descriptors: Hearing Impairments, Deafness, Comparative Analysis, Children
Peer reviewed
Direct link
Cleland, Joanne; Mccron, Caitlin; Scobbie, James M. – Clinical Linguistics & Phonetics, 2013
Speakers possess a natural capacity for lip reading; analogous to this, there may be an intuitive ability to "tongue-read." Although the ability of untrained participants to perceive aspects of the speech signal has been explored for some visual representations of the vocal tract (e.g. talking heads), it is not yet known to what extent there is a…
Descriptors: Speech, Comparative Analysis, Adults, Undergraduate Students
Peer reviewed
Direct link
Kyle, Fiona E.; Campbell, Ruth; Mohammed, Tara; Coleman, Mike; MacSweeney, Mairead – Journal of Speech, Language, and Hearing Research, 2013
Purpose: In this article, the authors describe the development of a new instrument, the Test of Child Speechreading (ToCS), which was specifically designed for use with deaf and hearing children. Speechreading is a skill that is required for deaf children to access the language of the hearing community. ToCS is a deaf-friendly, computer-based test…
Descriptors: Test Construction, Deafness, Lipreading, Computer Assisted Testing
Peer reviewed
Direct link
Meronen, Auli; Tiippana, Kaisa; Westerholm, Jari; Ahonen, Timo – Journal of Speech, Language, and Hearing Research, 2013
Purpose: The effect of the signal-to-noise ratio (SNR) on the perception of audiovisual speech in children with and without developmental language disorder (DLD) was investigated by varying the noise level and the sound intensity of acoustic speech. The main hypotheses were that the McGurk effect (in which incongruent visual speech alters the…
Descriptors: Auditory Perception, Visual Perception, Speech, Children
Troiano, Claire A. – PEPNet 2, 2010
An oral transliterator provides communication access to a person who is deaf or hard of hearing and who uses speechreading and speaking as a means of communicating. The oral transliterator, positioned in front of the speechreader, inaudibly repeats the spoken message, making it as speechreadable as possible. This is called Expressive Oral…
Descriptors: Deafness, Partial Hearing, Lipreading, Deaf Interpreting
PEPNet 2, 2009
Individuals who are deaf or hard of hearing are just like other students except they do not hear as well. They come in all shapes and sizes and call themselves by many names such as: deaf, hard of hearing, or hearing impaired. Just remember the student is a person first--and should be treated the same as anyone else. The biggest issue a residence…
Descriptors: Sign Language, Dormitories, Deafness, Hearing Impairments
Peer reviewed
Direct link
Woll, Bencie – Deafness and Education International, 2012
Although speechreading has always served an important role in the communication of deaf people, educational interest in speechreading has decreased in recent decades. This paper reviews speechreading in terms of speech processing, neural activity and literacy, and suggests that it has an important role in intervention programmes for all deaf…
Descriptors: Deafness, Assistive Technology, Brain, Lipreading
Peer reviewed
Download full text (PDF on ERIC)
Mather, Susan M.; Clark, M. Diane – Odyssey: New Directions in Deaf Education, 2012
One of the ongoing challenges teachers of students who are deaf or hard of hearing face is managing the visual split attention implicit in multimedia learning. When a teacher presents various types of visual information at the same time, visual learners have no choice but to divide their attention among those materials and the teacher and…
Descriptors: Partial Hearing, Deafness, Attention, Learning Strategies
Peer reviewed
Direct link
Palmer, Terry D.; Ramsey, Ashley K. – Cognition, 2012
The function of consciousness was explored in two contexts of audio-visual speech, cross-modal visual attention guidance and McGurk cross-modal integration. Experiments 1, 2, and 3 utilized a novel cueing paradigm in which two different flash suppressed lip-streams cooccured with speech sounds matching one of these streams. A visual target was…
Descriptors: Attention, Probability, Cues, Lipreading
Peer reviewed
Direct link
Picou, Erin M.; Ricketts, Todd A.; Hornsby, Benjamin W. Y. – Journal of Speech, Language, and Hearing Research, 2011
Purpose: To investigate the effect of visual cues on listening effort as well as whether predictive variables such as working memory capacity (WMC) and lipreading ability affect the magnitude of listening effort. Method: Twenty participants with normal hearing were tested using a paired-associates recall task in 2 conditions (quiet and noise) and…
Descriptors: Cues, Listening, Short Term Memory, Lipreading
Peer reviewed
Direct link
Capek, Cheryl M.; Woll, Bencie; MacSweeney, Mairead; Waters, Dafydd; McGuire, Philip K.; David, Anthony S.; Brammer, Michael J.; Campbell, Ruth – Brain and Language, 2010
Studies of spoken and signed language processing reliably show involvement of the posterior superior temporal cortex. This region is also reliably activated by observation of meaningless oral and manual actions. In this study we directly compared the extent to which activation in posterior superior temporal cortex is modulated by linguistic…
Descriptors: Sign Language, Deafness, Language Processing, Language Enrichment
Peer reviewed
Direct link
Hessler, Dorte; Jonkers, Roel; Bastiaanse, Roelien – Clinical Linguistics & Phonetics, 2010
Individuals with aphasia have more problems detecting small differences between speech sounds than larger ones. This paper reports how phonemic processing is impaired and how this is influenced by speechreading. A non-word discrimination task was carried out with "audiovisual", "auditory only" and "visual only" stimulus display. Subjects had to…
Descriptors: Articulation (Speech), Phonetics, Aphasia, Task Analysis
Peer reviewed
Direct link
Kyle, Fiona E.; Harris, Margaret – Journal of Experimental Child Psychology, 2010
The development of reading ability in a group of deaf children was followed over a 3-year period. A total of 29 deaf children (7-8 years of age at the first assessment) participated in the study, and every 12 months they were given a battery of literacy, cognitive, and language tasks. Earlier vocabulary and speechreading skills predicted…
Descriptors: Phonology, Reading Achievement, Deafness, Phonological Awareness
Peer reviewed
Direct link
Vroomen, Jean; Baart, Martijn – Language and Speech, 2009
Listeners hearing an ambiguous speech sound flexibly adjust their phonetic categories in accordance with lipread information telling what the phoneme should be (recalibration). Here, we tested the stability of lipread-induced recalibration over time. Listeners were exposed to an ambiguous sound halfway between /t/ and /p/ that was dubbed onto a…
Descriptors: Phonetics, Lipreading, Phonemes, Classification
Peer reviewed
Direct link
Davies, Rebecca; Kidd, Evan; Lander, Karen – International Journal of Language & Communication Disorders, 2009
Background: Previous research has found that newborn infants can match phonetic information in the lips and voice from as young as ten weeks old. There is evidence that access to visual speech is necessary for normal speech development. Although we have an understanding of this early sensitivity, very little research has investigated older…
Descriptors: Feedback (Response), Research Needs, Phonology, Preschool Children