6 found
  1. Visible Cohesion: A Comparison of Reference Tracking in Sign, Speech, and Co‐Speech Gesture. Pamela Perniss & Asli Özyürek - 2015 - Topics in Cognitive Science 7 (1):36-60.
    Establishing and maintaining reference is a crucial part of discourse. In spoken languages, differential linguistic devices mark referents occurring in different referential contexts, that is, introduction, maintenance, and re-introduction contexts. Speakers using gestures as well as users of sign languages have also been shown to mark referents differentially depending on the referential context. This article investigates the modality-specific contribution of the visual modality in marking referential context by providing a direct comparison between sign language and co-speech gesture with speech in (...)
  2. Why We Should Study Multimodal Language. Pamela Perniss - 2018 - Frontiers in Psychology 9:342098.
  3. The Influence of the Visual Modality on Language Structure and Conventionalization: Insights From Sign Language and Gesture. Pamela Perniss, Asli Özyürek & Gary Morgan - 2015 - Topics in Cognitive Science 7 (1):2-11.
    For humans, the ability to communicate and use language is instantiated not only in the vocal modality but also in the visual modality. The main examples of this are sign languages and gestures. Sign languages, the natural languages of Deaf communities, use systematic and conventionalized movements of the hands, face, and body for linguistic expression. Co-speech gestures, though non-linguistic, are produced in tight semantic and temporal integration with speech and constitute an integral part of language together with speech. The articles (...)
  4. Making Sense of the Hands and Mouth: The Role of “Secondary” Cues to Meaning in British Sign Language and English. Pamela Perniss, David Vinson & Gabriella Vigliocco - 2020 - Cognitive Science 44 (7):e12868.
    Successful face‐to‐face communication involves multiple channels, notably hand gestures in addition to speech for spoken language, and mouth patterns in addition to manual signs for sign language. In four experiments, we assess the extent to which comprehenders of British Sign Language (BSL) and English rely, respectively, on cues from the hands and the mouth in accessing meaning. We created congruent and incongruent combinations of BSL manual signs and mouthings and English speech and gesture by video manipulation and asked participants to (...)
  5. Event representations in signed languages. Asli Özyürek & Pamela Perniss - 2011 - In Jürgen Bohnemeyer & Eric Pederson (eds.), Event representation in language and cognition. New York: Cambridge University Press.
  6. Comprehending Sentences With the Body: Action Compatibility in British Sign Language? David Vinson, Pamela Perniss, Neil Fox & Gabriella Vigliocco - 2017 - Cognitive Science 41 (S6):1377-1404.
    Previous studies show that reading sentences about actions leads to specific motor activity associated with actually performing those actions. We investigate how sign language input may modulate motor activation, using British Sign Language sentences, some of which explicitly encode direction of motion, versus written English, where motion is only implied. We find no evidence of action simulation in BSL comprehension, but we find effects of action simulation in comprehension of written English sentences by deaf native BSL signers. These results provide (...)