Tianyu Gao

Princeton University

H-index: 18

United States

About Tianyu Gao

Tianyu Gao, a distinguished researcher at Princeton University, specializes in Natural Language Processing. He has an h-index of 18, both overall and since 2020.

His recent articles reflect a diverse array of research interests and contributions to the field:

Long-Context Language Modeling with Parallel Context Encoding

Improving Language Understanding from Screenshots

Fine-tuning language models with just forward passes

Evaluating large language models at evaluating instruction following

Sheared LLaMA: Accelerating language model pre-training via structured pruning

MoQA: Benchmarking Multi-Type Open-Domain Question Answering

Enabling large language models to generate text with citations

Text sentence processing method and apparatus, computer device, and storage medium

Tianyu Gao Information

University: Princeton University

Position: ___

Citations (all): 5321

Citations (since 2020): 5319

Cited By: 223

h-index (all): 18

h-index (since 2020): 18

i10-index (all): 20

i10-index (since 2020): 20

Tianyu Gao Skills & Research Interests

Natural Language Processing

Top articles of Tianyu Gao

Long-Context Language Modeling with Parallel Context Encoding

arXiv preprint arXiv:2402.16617

2024/2/26

Improving Language Understanding from Screenshots

arXiv preprint arXiv:2402.14073

2024/2/21

Fine-tuning language models with just forward passes

Advances in Neural Information Processing Systems

2024/2/13

Evaluating large language models at evaluating instruction following

arXiv preprint arXiv:2310.07641

2023/10/11

Sheared LLaMA: Accelerating language model pre-training via structured pruning

arXiv preprint arXiv:2310.06694

2023/10/10

MoQA: Benchmarking Multi-Type Open-Domain Question Answering

2023/7

Enabling large language models to generate text with citations

arXiv preprint arXiv:2305.14627

2023/5/24

Text sentence processing method and apparatus, computer device, and storage medium

2023/3/30

What In-Context Learning “Learns” In-Context: Disentangling Task Recognition and Task Learning

2023

Recovering private text in federated learning of language models

NeurIPS 2022

2022/12/6

Automatic label sequence generation for prompting sequence-to-sequence models

arXiv preprint arXiv:2209.09401

2022/9/20

Should you mask 15% in masked language modeling?

EACL 2023

2023

Ditch the gold standard: Re-evaluating conversational question answering

arXiv preprint arXiv:2112.08812

2021/12/16

Manual evaluation matters: reviewing test protocols of distantly supervised relation extraction

arXiv preprint arXiv:2105.09543

2021/5/20

SimCSE: Simple contrastive learning of sentence embeddings

arXiv preprint arXiv:2104.08821

2021/4/18

KEPLER: A unified model for knowledge embedding and pre-trained language representation

Transactions of the Association for Computational Linguistics

2021/2/1

Making pre-trained language models better few-shot learners

arXiv preprint arXiv:2012.15723

2020/12/31

Meta-information guided meta-learning for few-shot relation classification

2020/12

Few-shot relation extraction via bayesian meta-learning on relation graphs

ICML'20

2020/7/5

Learning from context or names? an empirical study on neural relation extraction

arXiv preprint arXiv:2010.01923

2020/10/5
