Michael Hahn
Stanford University
H-index: 13
United States
Top articles of Michael Hahn
Title | Journal | Author(s) | Publication Date |
---|---|---|---|
Why are Sensitive Functions Hard for Transformers? | arXiv preprint arXiv:2402.09963 | Michael Hahn, Mark Rofin | 2024/2/15 |
A unifying theory explains seemingly contradictory biases in perceptual estimation | Nature Neuroscience | Michael Hahn, Xue-Xin Wei | 2024/2/15 |
A theory of emergent in-context learning as implicit structure induction | arXiv preprint arXiv:2303.07971 | Michael Hahn, Navin Goyal | 2023/3/14 |
How do syntactic statistics and semantic plausibility modulate local coherence effects? | Proceedings of the Annual Meeting of the Cognitive Science Society | Hailin Hao, Michael Hahn, Elsi Kaiser | 2023 |
A Cross-Linguistic Pressure for Uniform Information Density in Word Order | Transactions of the Association for Computational Linguistics | Thomas Hikaru Clark, Clara Meister, Tiago Pimentel, Michael Hahn, Ryan Cotterell | 2023/8/15 |
Computational and Communicative Efficiency in Language | | Michael Hermann Hahn | 2022 |
Modeling fixation behavior in reading with character-level neural attention | Proceedings of the Annual Meeting of the Cognitive Science Society | Songpeng Yan, Michael Hahn, Frank Keller | 2022 |
Explaining patterns of fusion in morphological paradigms using the memory–surprisal tradeoff | Proceedings of the Annual Meeting of the Cognitive Science Society | Neil Rathi, Michael Hahn, Richard Futrell | 2022 |
A resource-rational model of human processing of recursive linguistic structure | Proceedings of the National Academy of Sciences | Michael Hahn, Richard Futrell, Roger Levy, Edward Gibson | 2022/10/25 |
Crosslinguistic word order variation reflects evolutionary pressures of dependency and information locality | Proceedings of the National Academy of Sciences | Michael Hahn, Yang Xu | 2022/6/14 |
Information theory as a bridge between language function and language form | | Richard Futrell, Michael Hahn | 2022/5/11 |
Sensitivity as a complexity measure for sequence classification tasks | Transactions of the Association for Computational Linguistics | Michael Hahn, Dan Jurafsky, Richard Futrell | 2021/8/18 |
Modeling word and morpheme order in natural language as an efficient trade-off of memory and surprisal | Psychological Review | Michael Hahn, Judith Degen, Richard Futrell | 2021/7 |
An information-theoretic characterization of morphological fusion | | Neil Rathi, Michael Hahn, Richard Futrell | 2021/1 |
Morpheme ordering across languages reflects optimization for processing efficiency | Open Mind | Michael Hahn, Rebecca Mathew, Judith Degen | 2021/12/9 |
Supplementary Information: Morpheme Ordering across Languages Reflects Optimization for Processing Efficiency | | Michael Hahn, Rebecca Mathew, Judith Degen | 2021/11/18 |
Crosslinguistic Word Orders Enable an Efficient Tradeoff of Memory and Surprisal | Society for Computation in Linguistics | Michael Hahn, Richard Futrell | 2020/1/1 |
Theoretical limitations of self-attention in neural sequence models | Transactions of the Association for Computational Linguistics | Michael Hahn | 2020/1/1 |
RNNs can generate bounded hierarchical languages with optimal memory | arXiv preprint arXiv:2010.07515 | John Hewitt, Michael Hahn, Surya Ganguli, Percy Liang, Christopher D Manning | 2020/10/15 |
Supplementary Information for: Modeling word and morpheme order in natural language as an efficient tradeoff of memory and surprisal | | Michael Hahn, Judith Degen, Richard Futrell | 2020/9/15 |