Honors Thesis -- Access Restricted On-Campus Only
Bachelor of Arts (BA)
Language and categorization are intrinsically linked. Infants must be able to notice similarities across objects and build categories defined by those similarities in order to understand nouns. Additionally, it has been demonstrated that the presence of language can help direct infants' attention to similarities across objects, aiding them in the task of forming categories (Waxman & Markow, 1995; Fulkerson & Waxman, 2007; Ferry et al., 2010; Ferry et al., 2013; Novack et al., 2021). In young infants, this link is broader still. During the first few months of life, it is not just language that acts as a categorization aid; environmental stimuli that resemble language in specific ways can also support categorization (Ferry et al., 2013). While much research has investigated this phenomenon in the mode of oral language and auditory stimuli, less is known in the domain of manual language and visual stimuli. The current work aims to help close that gap, presenting two studies that expand upon what we know about the relationship between sign language and categorization. The first study examines the salience, for adults, of facial expressions and manual movements, revealing that facial expression is highly salient, more so than manual movement; this finding suggests that grammatical facial expression might be a key cue that helps infants distinguish sign language from other visual stimuli. The second study directly investigates the effect of the presence of grammatical facial expression on infants' ability to form categories.
Eubanks, Nicole, "Sign Language and Constructing Categories: The Cognitive Link" (2023). Undergraduate Honors Theses. William & Mary. Paper 2068.