Zhilin Yang

Carnegie Mellon University

H-index: 28

North America-United States

About Zhilin Yang

Zhilin Yang is a distinguished researcher at Carnegie Mellon University specializing in Deep Learning, Machine Learning, and Natural Language Processing, with an h-index of 28 overall and 27 since 2020.

His recent articles reflect a diverse array of research interests and contributions to the field:

CodeGeeX: A pre-trained model for code generation with multilingual benchmarking on HumanEval-X

GPT understands, too

Computationally efficient expressive output layers for neural networks

Not all tasks are born equal: Understanding zero-shot generalization

Learning to Detect Noisy Labels Using Model-Based Features

Compositional task representations for large language models

A universal discriminator for zero-shot generalization

Zero-label prompt selection

Zhilin Yang Information

University: Carnegie Mellon University
Position: ___
Citations (all): 22006
Citations (since 2020): 20717
Cited by: 7320
h-index (all): 28
h-index (since 2020): 27
i10-index (all): 34
i10-index (since 2020): 33
University Profile Page: Carnegie Mellon University
Google Scholar: View Google Scholar Profile

Zhilin Yang Skills & Research Interests

Deep Learning

Machine Learning

Natural Language Processing

Top articles of Zhilin Yang

CodeGeeX: A pre-trained model for code generation with multilingual benchmarking on HumanEval-X
Authors: Qinkai Zheng, Xiao Xia, Xu Zou, Yuxiao Dong, Shan Wang, ...
Publication date: 2023/8/6

GPT understands, too
Journal: AI Open
Authors: Xiao Liu, Yanan Zheng, Zhengxiao Du, Ming Ding, Yujie Qian, ...
Publication date: 2023/8/26

Computationally efficient expressive output layers for neural networks
Publication date: 2022/10/25

Not all tasks are born equal: Understanding zero-shot generalization
Authors: Jing Zhou, Zongyu Lin, Yanan Zheng, Jian Li, Zhilin Yang
Publication date: 2022/9/29

Learning to Detect Noisy Labels Using Model-Based Features
Journal: arXiv preprint arXiv:2212.13767
Authors: Zhihao Wang, Zongyu Lin, Peiqi Liu, Guidong Zheng, Junjie Wen, ...
Publication date: 2022/12/28

Compositional task representations for large language models
Authors: Nan Shao, Zefan Cai, Chonghua Liao, Yanan Zheng, Zhilin Yang
Publication date: 2022/9/29

A universal discriminator for zero-shot generalization
Journal: arXiv preprint arXiv:2211.08099
Authors: Haike Xu, Zongyu Lin, Jing Zhou, Yanan Zheng, Zhilin Yang
Publication date: 2022/11/15

Zero-label prompt selection
Journal: arXiv preprint arXiv:2211.04668
Authors: Chonghua Liao, Yanan Zheng, Zhilin Yang
Publication date: 2022/11/9

NLP from scratch without large-scale pretraining: A simple and efficient framework
Authors: Xingcheng Yao, Yanan Zheng, Xiaocong Yang, Zhilin Yang
Publication date: 2022/6/28

ZeroPrompt: Scaling prompt-based pretraining to 1,000 tasks improves zero-shot generalization
Journal: arXiv preprint arXiv:2201.06910
Authors: Hanwei Xu, Yujun Chen, Yulun Du, Nan Shao, Yanggang Wang, ...
Publication date: 2022/1/18

GPS: Genetic prompt search for efficient few-shot learning
Journal: arXiv preprint arXiv:2210.17041
Authors: Hanwei Xu, Yujun Chen, Yulun Du, Nan Shao, Yanggang Wang, ...
Publication date: 2022/10/31

Distribution matching for rationalization
Journal: Proceedings of the AAAI Conference on Artificial Intelligence
Authors: Yongfeng Huang, Yujun Chen, Yulun Du, Zhilin Yang
Publication date: 2021/5/18

P-tuning v2: Prompt tuning can be comparable to fine-tuning universally across scales and tasks
Journal: arXiv preprint arXiv:2110.07602
Authors: Xiao Liu, Kaixuan Ji, Yicheng Fu, Weng Lam Tam, Zhengxiao Du, ...
Publication date: 2021/10/14

FastMoE: A fast mixture-of-expert training system
Journal: arXiv preprint arXiv:2103.13262
Authors: Jiaao He, Jiezhong Qiu, Aohan Zeng, Zhilin Yang, Jidong Zhai, ...
Publication date: 2021/3/24

FewNLU: Benchmarking state-of-the-art methods for few-shot natural language understanding
Authors: Yanan Zheng, Jing Zhou, Yujie Qian, Ming Ding, Jian Li, ...
Publication date: 2021/9/27

GLM: General language model pretraining with autoregressive blank infilling
Authors: Zhengxiao Du, Yujie Qian, Xiao Liu, Ming Ding, Jiezhong Qiu, ...
Publication date: 2022

The International Workshop on Pretraining: Algorithms, Architectures, and Applications (Pretrain@KDD 2021)
Authors: Ming Ding, Yuxiao Dong, Xiao Liu, Jiezhong Qiu, Jie Tang, ...
Publication date: 2021/8/14

WuDaoCorpora: A super large-scale Chinese corpora for pre-training language models
Journal: AI Open
Authors: Sha Yuan, Hanyu Zhao, Zhengxiao Du, Ming Ding, Xiao Liu, ...
Publication date: 2021/1/1

Controllable generation from pre-trained language models via inverse prompting
Journal: arXiv preprint arXiv:2103.10685
Authors: Xu Zou, Da Yin, Qingyang Zhong, Ming Ding, Zhilin Yang, ...
Publication date: 2021/3/19

FlipDA: Effective and robust data augmentation for few-shot learning
Journal: arXiv preprint arXiv:2108.06332
Authors: Jing Zhou, Yanan Zheng, Jie Tang, Jian Li, Zhilin Yang
Publication date: 2021/8/13

Co-Authors

Yoshua Bengio, Université de Montréal (H-index: 227)
Christopher D Manning, Stanford University (H-index: 158)
Yann LeCun, New York University (H-index: 145)
Ruslan Salakhutdinov, Carnegie Mellon University (H-index: 115)
Tang Jie, Tsinghua University (H-index: 97)
Junbo Zhao, New York University (H-index: 19)
