Sentence Context

The experts below are selected from a list of 360 experts worldwide, ranked by the ideXlab platform.

Marta Kutas - One of the best experts on this subject based on the ideXlab platform.

  • The phonemic restoration effect reveals pre-N400 effect of supportive Sentence Context in speech perception
    Brain Research, 2010
    Co-Authors: David M Groppe, Marvin Choi, Tiffany Huang, Joseph Schilz, Ben Topkins, Thomas P Urbach, Marta Kutas
    Abstract:

    The phonemic restoration effect refers to the tendency for people to hallucinate a phoneme replaced by a non-speech sound (e.g., a tone) in a word. This illusion can be influenced by preceding sentential Context providing information about the likelihood of the missing phoneme. The saliency of the illusion suggests that supportive Context can affect relatively low (phonemic or lower) levels of speech processing. Indeed, a previous event-related brain potential (ERP) investigation of the phonemic restoration effect found that the processing of coughs replacing high- versus low-probability phonemes in sentential words differed as early as the auditory N1 (120-180 ms post-stimulus); this result, however, was confounded by physical differences between the high- and low-probability speech stimuli, and thus could have been caused by factors such as habituation rather than by supportive Context. We conducted a similar ERP experiment that avoided this confound by using the same auditory stimuli preceded by text that made critical phonemes more or less probable. We too found the robust N400 effect of phoneme/word probability, but did not observe the early N1 effect. We did, however, observe a left posterior effect of phoneme/word probability around 192-224 ms, clear evidence of a relatively early effect of supportive Sentence Context in speech comprehension distinct from the N400.

  • Right hemisphere sensitivity to word- and Sentence-level Context: Evidence from event-related brain potentials
    Journal of Experimental Psychology: Learning, Memory, and Cognition, 2005
    Co-Authors: Seana Coulson, Kara D. Federmeier, Cyma Van Petten, Marta Kutas
    Abstract:

    Researchers using lateralized stimuli have suggested that the left hemisphere is sensitive to Sentence-level Context, whereas the right hemisphere (RH) primarily processes word-level meaning. The authors investigated this message-blind RH model by measuring associative priming with event-related brain potentials (ERPs). For word pairs in isolation, associated words elicited more positive ERPs than unassociated words with similar magnitudes and onset latencies in both visual fields. Embedded in Sentences, these same pairs showed large sentential Context effects in both fields. Small effects of association were observed, confined to incongruous Sentences after right visual hemifield presentation but present for both congruous and incongruous Sentences after left visual hemifield presentation. Results do not support the message-blind RH model but do suggest hemispheric asymmetries in the use of word and Sentence Context during real-time processing.

  • The impact of semantic memory organization and Sentence Context information on spoken language processing by younger and older adults: An ERP study
    Psychophysiology, 2002
    Co-Authors: Kara D. Federmeier, Devon B. Mclennan, Esmeralda De Ochoa, Marta Kutas
    Abstract:

    To examine changes in semantic memory organization and use during aging, we recorded event-related potentials as younger and older adults listened to Sentences ending with the expected word, an unexpected word from the same semantic category, or an unexpected word from a different category. Half of the Contexts were highly constraining. In both groups, expected words elicited less negativity 300–500 ms (N400) than unexpected ones, and unexpected words elicited smaller N400s when these were categorically related. Whereas younger adults showed the greatest N400 reduction to unexpected but related words in high-constraint Contexts, older adults showed the opposite tendency. Thus, unlike younger adults, older adults as a group do not seem to be using Context predictively. Older adults with higher verbal fluency and larger vocabularies, however, showed the younger response pattern, suggesting resource availability may offset certain age-related changes.

  • Meaning and modality: Influences of Context, semantic memory organization, and perceptual predictability on picture processing
    Journal of Experimental Psychology: Learning, Memory, and Cognition, 2001
    Co-Authors: Kara D. Federmeier, Marta Kutas
    Abstract:

    Using event-related potentials (ERPs), the authors investigated the influences of Sentence Context, semantic memory organization, and perceptual predictability on picture processing. Participants read pairs of highly or weakly constraining Sentences that ended with (a) the expected item, (b) an unexpected item from the expected semantic category, or (c) an unexpected item from an unexpected category. Pictures were unfamiliar in Experiment 1 but preexposed in Experiment 2. ERPs to pictures reflected both Contextual fit and memory organization, as do ERPs to words in the same Contexts (K. D. Federmeier & M. Kutas, 1999). However, different response patterns were observed to pictures than to words. Some of these arose from perceptual predictability differences, whereas others seem to reflect true modality-based differences in semantic feature activation. Although words and pictures may share semantic memory, the authors' results show that semantic processing is not amodal.

  • In the company of other words: Electrophysiological evidence for single-word and Sentence Context effects
    Language and Cognitive Processes, 1993
    Co-Authors: Marta Kutas
    Abstract:

    The qualitative and quantitative similarities between lexical and Sentence-level Context effects were assessed by means of scalp-recorded electrophysiological measures. Event-related brain potentials (ERPs) were recorded to the second of a pair of words in a delayed letter search task and to the final words of a series of Sentences presented one word at a time, and read for meaning and subsequent recognition. The critical words in both Context conditions varied in the degree to which they were semantically or associatively related to the preceding Context. In both cases, the ERPs to the critical words were associated with N400 components whose amplitude varied with expectancy and association. Neither the latency nor scalp distribution of the early phase of these two N400 effects differed as a function of Context; the effects differed only in amplitude, with the word-level effect being smaller. Thus, as indexed by the N400 effect, there appears to be a remarkable qualitative similarity between the...

Robert J. Hartsuiker - One of the best experts on this subject based on the ideXlab platform.

  • The influence of Sentence Context and accented speech on lexical access in second-language auditory word recognition
    Bilingualism: Language and Cognition, 2013
    Co-Authors: Evelyne Lagrou, Robert J. Hartsuiker, Wouter Duyck
    Abstract:

    Until now, research on bilingual auditory word recognition has been scarce, and although most studies agree that lexical access is language-nonselective, there is less consensus with respect to the influence of potentially constraining factors. The present study investigated the influence of three possible constraints. We tested whether language nonselectivity is restricted by (a) a Sentence Context in a second language (L2), (b) the semantic constraint of the Sentence, and (c) the native language of the speaker. Dutch–English bilinguals completed an English auditory lexical decision task on the last word of low- and high-constraining Sentences. Sentences were pronounced by a native Dutch speaker with English as the L2, or by a native English speaker with Dutch as the L2. Interlingual homophones (e.g., lief “sweet” – leaf /liːf/) were always recognized more slowly than control words. The semantic constraint of the Sentence and the native accent of the speaker modulated, but did not eliminate interlingual homophone effects. These results are discussed within language-nonselective models of lexical access in bilingual auditory word recognition.

  • Bilingual word recognition in a Sentence Context
    Frontiers in Psychology, 2012
    Co-Authors: Eva Van Assche, Wouter Duyck, Robert J. Hartsuiker
    Abstract:

    This article provides an overview of bilingualism research on visual word recognition in isolation and in Sentence Context. Many studies investigating the processing of words out-of-Context have shown that lexical representations from both languages are activated when reading in one language (language-nonselective lexical access). A newly developed research line asks whether language-nonselective access generalizes to word recognition in Sentence Contexts, providing a language cue and/or semantic constraint information for upcoming words. Recent studies suggest that the language of the preceding words is insufficient to restrict lexical access to words of the target language, even when reading in the native language. Eyetracking studies revealing the time course of word activation further showed that semantic constraint does not restrict language-nonselective access at early reading stages, but there is evidence that it has a relatively late effect. The theoretical implications for theories of bilingual word recognition are discussed in light of the Bilingual Interactive Activation + model (Dijkstra & Van Heuven, 2002).

  • Does bilingualism change native-language reading? Cognate effects in a Sentence Context
    Psychological Science, 2009
    Co-Authors: Eva Van Assche, Robert J. Hartsuiker, Wouter Duyck, Kevin Diependaele
    Abstract:

    Becoming a bilingual can change a person's cognitive functioning and language processing in a number of ways. This study focused on how knowledge of a second language influences how people read Sentences written in their native language. We used the cognate-facilitation effect as a marker of cross-lingual activations in both languages. Cognates (e.g., Dutch-English schip [ship]) and controls were presented in a Sentence Context, and eye movements were monitored. Results showed faster reading times for cognates than for controls. Thus, this study shows that one of people's most automated skills, reading in one's native language, is changed by the knowledge of a second language.

  • Visual word recognition by bilinguals in a Sentence Context: Evidence for nonselective lexical access
    Journal of Experimental Psychology: Learning, Memory, and Cognition, 2007
    Co-Authors: Wouter Duyck, Eva Van Assche, Denis Drieghe, Robert J. Hartsuiker
    Abstract:

    Recent research on bilingualism has shown that lexical access in visual word recognition by bilinguals is not selective with respect to language. In the present study, the authors investigated language-independent lexical access in bilinguals reading Sentences, which constitutes a strong unilingual linguistic Context. In the first experiment, Dutch-English bilinguals performing a 2nd language (L2) lexical decision task were faster to recognize identical and nonidentical cognate words (e.g., banaan-banana) presented in isolation than control words. A second experiment replicated this effect when the same set of cognates was presented as the final words of low-constraint Sentences. In a third experiment that used eyetracking, the authors showed that early target reading time measures also yield cognate facilitation but only for identical cognates. These results suggest that a Sentence Context may influence, but does not nullify, cross-lingual lexical interactions during early visual word recognition by bilinguals.

Wouter Duyck - One of the best experts on this subject based on the ideXlab platform.

  • The influence of Sentence Context and accented speech on lexical access in second-language auditory word recognition
    Bilingualism: Language and Cognition, 2013
    Co-Authors: Evelyne Lagrou, Robert J. Hartsuiker, Wouter Duyck
    Abstract:

    Until now, research on bilingual auditory word recognition has been scarce, and although most studies agree that lexical access is language-nonselective, there is less consensus with respect to the influence of potentially constraining factors. The present study investigated the influence of three possible constraints. We tested whether language nonselectivity is restricted by (a) a Sentence Context in a second language (L2), (b) the semantic constraint of the Sentence, and (c) the native language of the speaker. Dutch–English bilinguals completed an English auditory lexical decision task on the last word of low- and high-constraining Sentences. Sentences were pronounced by a native Dutch speaker with English as the L2, or by a native English speaker with Dutch as the L2. Interlingual homophones (e.g., lief “sweet” – leaf /liːf/) were always recognized more slowly than control words. The semantic constraint of the Sentence and the native accent of the speaker modulated, but did not eliminate interlingual homophone effects. These results are discussed within language-nonselective models of lexical access in bilingual auditory word recognition.

  • Bilingual word recognition in a Sentence Context
    Frontiers in Psychology, 2012
    Co-Authors: Eva Van Assche, Wouter Duyck, Robert J. Hartsuiker
    Abstract:

    This article provides an overview of bilingualism research on visual word recognition in isolation and in Sentence Context. Many studies investigating the processing of words out-of-Context have shown that lexical representations from both languages are activated when reading in one language (language-nonselective lexical access). A newly developed research line asks whether language-nonselective access generalizes to word recognition in Sentence Contexts, providing a language cue and/or semantic constraint information for upcoming words. Recent studies suggest that the language of the preceding words is insufficient to restrict lexical access to words of the target language, even when reading in the native language. Eyetracking studies revealing the time course of word activation further showed that semantic constraint does not restrict language-nonselective access at early reading stages, but there is evidence that it has a relatively late effect. The theoretical implications for theories of bilingual word recognition are discussed in light of the Bilingual Interactive Activation + model (Dijkstra & Van Heuven, 2002).

  • Does bilingualism change native-language reading? Cognate effects in a Sentence Context
    Psychological Science, 2009
    Co-Authors: Eva Van Assche, Robert J. Hartsuiker, Wouter Duyck, Kevin Diependaele
    Abstract:

    Becoming a bilingual can change a person's cognitive functioning and language processing in a number of ways. This study focused on how knowledge of a second language influences how people read Sentences written in their native language. We used the cognate-facilitation effect as a marker of cross-lingual activations in both languages. Cognates (e.g., Dutch-English schip [ship]) and controls were presented in a Sentence Context, and eye movements were monitored. Results showed faster reading times for cognates than for controls. Thus, this study shows that one of people's most automated skills, reading in one's native language, is changed by the knowledge of a second language.

  • Visual word recognition by bilinguals in a Sentence Context: Evidence for nonselective lexical access
    Journal of Experimental Psychology: Learning, Memory, and Cognition, 2007
    Co-Authors: Wouter Duyck, Eva Van Assche, Denis Drieghe, Robert J. Hartsuiker
    Abstract:

    Recent research on bilingualism has shown that lexical access in visual word recognition by bilinguals is not selective with respect to language. In the present study, the authors investigated language-independent lexical access in bilinguals reading Sentences, which constitutes a strong unilingual linguistic Context. In the first experiment, Dutch-English bilinguals performing a 2nd language (L2) lexical decision task were faster to recognize identical and nonidentical cognate words (e.g., banaan-banana) presented in isolation than control words. A second experiment replicated this effect when the same set of cognates was presented as the final words of low-constraint Sentences. In a third experiment that used eyetracking, the authors showed that early target reading time measures also yield cognate facilitation but only for identical cognates. These results suggest that a Sentence Context may influence, but does not nullify, cross-lingual lexical interactions during early visual word recognition by bilinguals.

Kara D. Federmeier - One of the best experts on this subject based on the ideXlab platform.

  • Finding the right word: Hemispheric asymmetries in the use of Sentence Context information
    Neuropsychologia, 2007
    Co-Authors: Edward W Wlotko, Kara D. Federmeier
    Abstract:

    The cerebral hemispheres have been shown to be differentially sensitive to Sentence-level information; in particular, it has been suggested that only the left hemisphere (LH) makes predictions about upcoming items, whereas the right (RH) processes words in a more integrative fashion. The current study used event-related potentials to jointly examine the effects of expectancy and sentential constraint on word processing. Expected and unexpected but plausible words matched for Contextual fit were inserted into strongly and weakly constraining Sentence frames and presented to the left and right visual fields (LVF and RVF). Consistent with the prediction/integration view, the P2 was sensitive to constraint: words in strongly constraining Contexts elicited larger P2s than those in less predictive Contexts, for RVF/LH presentation only. N400 responses for both VFs departed from the typical pattern of amplitudes graded by cloze probability. Expected endings in strongly and weakly constraining Contexts were facilitated to a similar degree with RVF/LH presentation, and expected endings in weakly constraining Contexts were not facilitated compared to unexpected endings in those Contexts for LVF/RH presentation. These data suggest that responses seen for central presentation reflect contributions from both hemispheres. Finally, a late positivity, larger for unexpected endings in strongly constraining Contexts, observed for these stimuli with central presentation was not seen here for either VF. Thus, some phenomena observed with central presentation may be an emergent property of mechanisms that require interhemispheric cooperation. These data highlight the importance of understanding hemispheric asymmetries and their implications for normal language processing.

  • Right hemisphere sensitivity to word- and Sentence-level Context: Evidence from event-related brain potentials
    Journal of Experimental Psychology: Learning, Memory, and Cognition, 2005
    Co-Authors: Seana Coulson, Kara D. Federmeier, Cyma Van Petten, Marta Kutas
    Abstract:

    Researchers using lateralized stimuli have suggested that the left hemisphere is sensitive to Sentence-level Context, whereas the right hemisphere (RH) primarily processes word-level meaning. The authors investigated this message-blind RH model by measuring associative priming with event-related brain potentials (ERPs). For word pairs in isolation, associated words elicited more positive ERPs than unassociated words with similar magnitudes and onset latencies in both visual fields. Embedded in Sentences, these same pairs showed large sentential Context effects in both fields. Small effects of association were observed, confined to incongruous Sentences after right visual hemifield presentation but present for both congruous and incongruous Sentences after left visual hemifield presentation. Results do not support the message-blind RH model but do suggest hemispheric asymmetries in the use of word and Sentence Context during real-time processing.

  • The impact of semantic memory organization and Sentence Context information on spoken language processing by younger and older adults: An ERP study
    Psychophysiology, 2002
    Co-Authors: Kara D. Federmeier, Devon B. Mclennan, Esmeralda De Ochoa, Marta Kutas
    Abstract:

    To examine changes in semantic memory organization and use during aging, we recorded event-related potentials as younger and older adults listened to Sentences ending with the expected word, an unexpected word from the same semantic category, or an unexpected word from a different category. Half of the Contexts were highly constraining. In both groups, expected words elicited less negativity 300–500 ms (N400) than unexpected ones, and unexpected words elicited smaller N400s when these were categorically related. Whereas younger adults showed the greatest N400 reduction to unexpected but related words in high-constraint Contexts, older adults showed the opposite tendency. Thus, unlike younger adults, older adults as a group do not seem to be using Context predictively. Older adults with higher verbal fluency and larger vocabularies, however, showed the younger response pattern, suggesting resource availability may offset certain age-related changes.

  • Meaning and modality: Influences of Context, semantic memory organization, and perceptual predictability on picture processing
    Journal of Experimental Psychology: Learning, Memory, and Cognition, 2001
    Co-Authors: Kara D. Federmeier, Marta Kutas
    Abstract:

    Using event-related potentials (ERPs), the authors investigated the influences of Sentence Context, semantic memory organization, and perceptual predictability on picture processing. Participants read pairs of highly or weakly constraining Sentences that ended with (a) the expected item, (b) an unexpected item from the expected semantic category, or (c) an unexpected item from an unexpected category. Pictures were unfamiliar in Experiment 1 but preexposed in Experiment 2. ERPs to pictures reflected both Contextual fit and memory organization, as do ERPs to words in the same Contexts (K. D. Federmeier & M. Kutas, 1999). However, different response patterns were observed to pictures than to words. Some of these arose from perceptual predictability differences, whereas others seem to reflect true modality-based differences in semantic feature activation. Although words and pictures may share semantic memory, the authors' results show that semantic processing is not amodal.

Falk Huettig - One of the best experts on this subject based on the ideXlab platform.

  • Encouraging prediction during production facilitates subsequent comprehension: Evidence from interleaved object naming in Sentence Context and Sentence reading
    Quarterly Journal of Experimental Psychology, 2016
    Co-Authors: Florian Hintz, Antje S Meyer, Falk Huettig
    Abstract:

    Many studies have shown that a supportive Context facilitates language comprehension. A currently influential view is that language production may support prediction in language comprehension. Experimental evidence for this, however, is relatively sparse. Here we explored whether encouraging prediction in a language production task encourages the use of predictive Contexts in an interleaved comprehension task. In Experiment 1a, participants listened to the first part of a Sentence and provided the final word by naming aloud a picture. The picture name was predictable or not predictable from the Sentence Context. Pictures were named faster when they could be predicted than when this was not the case. In Experiment 1b the same Sentences, augmented by a final spill-over region, were presented in a self-paced reading task. No difference in reading times for predictive versus non-predictive Sentences was found. In Experiment 2, reading and naming trials were intermixed. In the naming task, the advantage for predictable picture names was replicated. More importantly, reading times for the spill-over region were now considerably faster for predictive than for non-predictive Sentences. We conjecture that these findings fit best with the notion that prediction in the service of language production encourages the use of predictive Contexts in comprehension. Further research is required to identify the exact mechanisms by which production exerts its influence on comprehension.

  • Limits to cross-modal semantic and object shape priming in Sentence Context
    The 20th Architectures and Mechanisms for Language Processing Conference (AMLAP 2014), 2014
    Co-Authors: Joost Rommers, Falk Huettig
    Abstract:

    Many studies have documented semantic priming effects from both words and pictures on word targets, but the literature on object shape priming in language processing is less well developed. Priming is typically observed with isolated words as targets. Some studies have shown that in Sentence Contexts, priming is not an automatic consequence of speech processing (Norris et al., Cognitive Psychology 2006). In addition, priming tasks tend to involve meta-linguistic judgments. In the present study we focused on cross-modal influences, which may occur when listening to spoken Sentences while being situated in a visual environment. We tested effects of picture and written word primes on processing of target words embedded in Sentences. The primes were related to the targets in meaning or object shape. We investigated whether these aspects automatically prime spoken-word processing even in Sentence Contexts and without a judgment task.