Kanishka Misra

Assistant Professor of Linguistics and Harrington Fellow at UT-Austin

KOALAB

UT Austin CompLing Group

UT Austin NLP

I am an Assistant Professor in the Linguistics Department at UT Austin, and a Harrington Faculty Fellow for 2025-26. I am a member of the Computational Linguistics Research Group and the wider NLP Research Community at UT. I also maintain a courtesy appointment at the Toyota Technological Institute at Chicago.

My research program lies at the intersection of Cognitive Science, Linguistics, and Artificial Intelligence. I am primarily interested in characterizing the statistical mechanisms that underlie the acquisition and generalization of complex linguistic phenomena and conceptual meaning. To this end, I: (1) develop methods to evaluate and analyze AI models from the perspective of semantic cognition; and (2) use AI models as simulated learners to test and generate novel hypotheses about language acquisition and generalization. My research has been recognized with awards at EACL 2023, ACL 2023, and EMNLP 2024!

I am no longer recruiting, but watch this space for more updates!

Previously, I was a Research Assistant Professor at the Toyota Technological Institute at Chicago, a philanthropically endowed academic computer science institute on the University of Chicago campus. Before that, I was a postdoctoral fellow in the Linguistics department at UT Austin, working with Dr. Kyle Mahowald, and earlier a PhD student at Purdue University, where I worked on Natural Language Understanding with Dr. Julia Taylor Rayz at the AKRaNLU Lab. I also worked closely with Dr. Allyson Ettinger and her lab at UChicago.

I am the author of minicons, a Python library that facilitates large-scale behavioral analyses of transformer language models.

My email is kmisra [at] utexas [dot] edu.

Representative Papers

Some upcoming talks:

Recent News

Recent Posts

Introducing minicons: Running large scale behavioral analyses on transformer language models

In this post, I showcase my new Python library, which implements simple computations to facilitate large-scale evaluation of transformer language models.

Contact