My primary research interests are in language processing and learning. I study language through both computational models and psycholinguistic and neuroimaging studies.

In my early work I was interested in speech perception: what mechanisms make it possible for humans to perceive complex acoustic input with such apparent ease? With Jay McClelland, I developed the TRACE model. TRACE is a neural network that takes either simulated or real speech as input, and exhibits a number of phenomena characteristic of human perception.

A continuing focus that grew out of the work on TRACE was the need for new computational approaches to processing inputs that arrive incrementally, in sequence. This is of course a hallmark of natural language, but time series arise in a broad range of domains (not surprising, given that time is an underlying dimension of almost all activity). This led me to explore recurrent neural networks, and to the introduction in 1990 of the Simple Recurrent Network (or SRN) and of prediction as a training task. Prediction is a hard taskmaster, particularly when sequential behavior is governed by regularities that are not superficially apparent. Success at the task, when success is measured against novel, untrained inputs, requires the network to discover the underlying generators that gave rise to the observed sequences. In the three decades since the SRN and the prediction task were introduced as computational models, numerous behavioral and neuroimaging studies have demonstrated that prediction is a core skill humans use to learn patterned behavior, and that it also serves as a guide for those behaviors.

In more recent work, I have studied both sentence-level and discourse-level language phenomena. In research with Ken McRae, Marta Kutas, Mary Hare, and others, I have tried to understand expectancy generation in sentence processing. This work has pushed us toward an appreciation of the role of 'higher-level' knowledge, including event representations, which is currently the primary focus of my research. The work combines behavioral measures, computer simulations, and large-scale norming data.

Current projects include the above, as well as developmental studies of language learning in the early years of life. This collaborative work, with Arielle Borovsky, Julia Evans, Erica Ellis, Katie Travis, Matt Leonard, Eric Halgren, and others, uses behavioral measures (including eye tracking) and neuroimaging (including ERPs, MRI, and EEG). A major focus here is word learning and the ability to integrate incrementally presented information in order to generate expectancies about upcoming words.

The Publications link goes to a page listing downloadable PDFs of recent papers. For other papers, please email me using the Contact link. Selected videos of talks are found on the Videos page.