Sign Language Project Wins People's Choice Award

ASL-LEX offers linguists, psychologists and language learners a tool to explore the properties of sign language more deeply.

Wednesday, March 29, 2017
ASL-LEX: A Visualization of American Sign Language (Credit: Naomi Caselli, Zed Sevcikova Sehyr, Ariel Cohen-Goldberg, Ben Tanen, Karen Emmorey)
Somewhere between 500,000 and two million people—exact numbers are hard to come by—use American Sign Language (ASL) in the United States and Canada, and sign language is an active area of investigation for psychologists and neuroscientists hoping to learn more about the underpinnings of language in the brain. Learners and researchers of any language benefit from linguistic resources that delve deep into its formal and syntactic features, but such a resource for ASL has never existed—until now.

A collaboration between researchers at San Diego State University, Boston University and Tufts University has created ASL-LEX, the first comprehensive lexical database for ASL. The project recently received the People’s Choice award in the Interactive category of the 15th annual National Science Foundation (NSF) Visualization Challenge, also known as the “Vizzies.”

When you visit the ASL-LEX website, you can look up a word and be directed to a short video that shows the corresponding ASL sign. But it’s more than just a translation website. A sidebar gives technical information about each sign: its frequency of use among signers; a score for its iconicity, or how much the sign resembles the action or object it represents; and a host of phonological and syntactic details. The site also offers a visual map of sign clusters, grouping together signs that share similar phonological properties, such as hand shape and movement.
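As a rough illustration of what such an entry contains, the sketch below models one sign record in Python. The field names, rating scales, and values here are assumptions made for illustration; they are not ASL-LEX's actual schema.

```python
# Illustrative sketch only: field names, scales, and values are
# hypothetical, not ASL-LEX's actual schema.
from dataclasses import dataclass

@dataclass
class SignEntry:
    gloss: str        # English label used to look up the sign
    video_url: str    # link to the short demonstration video
    frequency: float  # frequency-of-use rating reported by signers
    iconicity: float  # how much the sign resembles its referent
    handshape: str    # phonological property: dominant handshape
    movement: str     # phonological property: movement type
    location: str     # phonological property: place of articulation

# A made-up example entry.
book = SignEntry(
    gloss="BOOK",
    video_url="https://example.org/videos/book.mp4",
    frequency=6.2,
    iconicity=5.8,
    handshape="open-B",
    movement="rotation",
    location="neutral space",
)

# Signs sharing phonological properties such as handshape and movement
# could be grouped into the clusters the site's visual map displays.
def similar(a: SignEntry, b: SignEntry) -> bool:
    return a.handshape == b.handshape and a.movement == b.movement
```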

These details are incredibly valuable to linguists hoping to learn how signs relate to one another, and to neuroscientists hoping to understand how the brain processes a visual-manual language differently from auditory-vocal languages, said Karen Emmorey, one of the project’s leaders and a cognitive neuroscientist at SDSU. Her research focuses on what sign language can tell us about cognition more broadly. The site is also a resource for ASL learners who want to quickly find signs for words. So far, the project has data on about 1,000 signs, but a recent NSF grant will expand that to around 2,500 signs in the coming years.

“Almost everything we know about human language comes from the study of spoken languages,” said Emmorey, director of the SDSU Laboratory for Language and Cognitive Neuroscience. “However, only by studying sign languages is it possible to discover which linguistic rules and constraints are universal to all human languages and which depend on the modality of a language.”