MS in Computer Science (M.S.C.S.)
Degree Granting Department
Computer Science and Engineering
John Licato, Ph.D.
Sriram Chellappan, Ph.D.
Tempestt Neal, Ph.D.
Age of Acquisition, Deep Learning, Language Models, Natural Language Processing, Psychometrics
Deep learning has the potential to help solve numerous problems in cognitive science and education by providing a way to model the cognitive profiles of individual people. If this were possible, it would allow us to design targeted tests and suggest specific remediation based on each individual's needs. Conversely, employing techniques from psychology can give us insight into the underlying skillsets neural networks have acquired during training, addressing the interpretability concern. This thesis explores these ideas in the context of transformer language models, which have achieved state-of-the-art results on virtually every natural language processing (NLP) task. First, we study the ability of transformers to model Age of Acquisition, an important variable in predicting word processing in humans. We then examine the broader challenge of using transformers to model human responses to a test of linguistic competencies, this time employing measures from psychometrics as part of our evaluation. Compared to simpler models, we find that transformers achieve superior results in both cases, suggesting they are better suited for modeling psychological properties. The use of psychometric measures also allows us to study which linguistic skills transformers fail to learn, suggesting where future architectures can improve.
Scholar Commons Citation
Laverghetta, Antonio Jr., "Exploring the Use of Neural Transformers for Psycholinguistics" (2021). Graduate Theses and Dissertations.