My PhD research is in models of child language acquisition.
Learning what words mean and how to use them goes far beyond learning that an "apple" is a tasty fruit.
The lexicon of any language is full of abstract structure, and it's that structure that makes languages so productive.
Suppose I said to you, "This new computer is a pain. It has a crazy process called glummoxing, and it took me an hour and a half just to glummox a paper to my advisor."
Sure, I just made up the word "glummox," but you can tell it has something to do with sending things.
The structure in the verb's arguments tells you something about what it means.
How much of that structure can we discover from the way people naturally speak?
I use probabilistic topic models to show that many aspects of verb argument structure, alternation patterns, and verb classes can be learned from the statistical patterns in the language children naturally hear.
Beyond modeling child language acquisition, this work has applications in building detailed lexical resources automatically from messy, sparse corpus data.
These kinds of resources are extremely valuable for interpreting text in new domains and new genres, particularly the kind of unstructured information found on the web.
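To make the topic-model idea concrete, here is a toy sketch (my illustration, not the actual model from the research): treat each verb as a "document" whose "words" are the syntactic frames it occurs with, then run a small collapsed Gibbs sampler for LDA so that verbs with similar argument-structure patterns gravitate toward the same latent "verb class" topic. The verbs, frame labels, and counts below are all invented for illustration.

```python
import random
from collections import Counter

# Hypothetical verb-frame observations (invented labels, not real corpus data).
# "send"/"give" show the dative alternation; "break"/"open" the
# causative/inchoative alternation.
corpus = {
    "send":  ["NP_V_NP_PP_to"] * 6 + ["NP_V_NP_NP"] * 4,
    "give":  ["NP_V_NP_PP_to"] * 5 + ["NP_V_NP_NP"] * 5,
    "break": ["NP_V_NP"] * 6 + ["NP_V"] * 4,
    "open":  ["NP_V_NP"] * 5 + ["NP_V"] * 5,
}

def lda_gibbs(corpus, n_topics=2, alpha=0.1, beta=0.1, iters=300, seed=0):
    """Collapsed Gibbs sampling for LDA over verb 'documents' of frame tokens."""
    rng = random.Random(seed)
    vocab = sorted({f for fs in corpus.values() for f in fs})
    V = len(vocab)
    # Random initial topic assignment for every frame token of every verb.
    z = {v: [rng.randrange(n_topics) for _ in fs] for v, fs in corpus.items()}
    ndk = {v: Counter(z[v]) for v in corpus}       # verb -> topic counts
    nkw = [Counter() for _ in range(n_topics)]     # topic -> frame counts
    nk = [0] * n_topics                            # topic -> total tokens
    for v, fs in corpus.items():
        for i, f in enumerate(fs):
            nkw[z[v][i]][f] += 1
            nk[z[v][i]] += 1
    for _ in range(iters):
        for v, fs in corpus.items():
            for i, f in enumerate(fs):
                k = z[v][i]
                # Remove this token's counts, then resample its topic from
                # the full conditional: P(z=t) ∝ (n_dt+α)(n_tw+β)/(n_t+Vβ).
                ndk[v][k] -= 1; nkw[k][f] -= 1; nk[k] -= 1
                weights = [(ndk[v][t] + alpha) * (nkw[t][f] + beta) / (nk[t] + V * beta)
                           for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[v][i] = k
                ndk[v][k] += 1; nkw[k][f] += 1; nk[k] += 1
    # Report each verb's dominant topic as its inferred "class".
    return {v: ndk[v].most_common(1)[0][0] for v in corpus}

classes = lda_gibbs(corpus)
print(classes)
```

With frame distributions as cleanly separated as these, the two alternation groups tend to settle into distinct latent classes; the fixed seed just makes the toy run reproducible. Real child-directed speech is far noisier and sparser, which is exactly where the probabilistic framing earns its keep.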
- NVIDIA - Senior Manager, Applied Research (2023 - Present)
- NVIDIA - Research Manager (2022 - 2023)
- NVIDIA - Senior Deep Learning Applied Scientist (2020 - 2022)
- NexJ Health Inc. - Chief Technology Officer (2017 - 2020)
- NexJ Health Inc. - Senior Developer (2016 - 2017)
- Nuance Communications - Principal NLP Research Engineer, Clinical Language Understanding (2014 - 2016)
- Nuance Communications - Manager, NLP Research (2013 - 2014)
- Nuance Communications - Research Engineer, Natural Language Understanding (2011 - 2013)
- University of Toronto - PhD Candidate, Computational Linguistics (2006 - 2011)
- University of Waterloo - Undergraduate Research Assistant, Computational Neuroscience (2005)
- Environment Canada - Ice Model Research Assistant, Canadian Ice Service (2002 - 2004)