Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 61751201 and 61672162), the Shanghai Municipal Science and Technology Major Project (Grant No. 2018SHZDZX01), and ZJLab.
Abstract: Recently, the emergence of pre-trained models (PTMs) has brought natural language processing (NLP) into a new era. In this survey, we provide a comprehensive review of PTMs for NLP. We first briefly introduce language representation learning and its research progress. Then we systematically categorize existing PTMs according to a taxonomy built from four perspectives. Next, we describe how to adapt the knowledge of PTMs to downstream tasks. Finally, we outline some potential directions for future PTM research. This survey is intended to be a hands-on guide for understanding, using, and developing PTMs for various NLP tasks.
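The adaptation step mentioned in the abstract is most commonly realized as fine-tuning: load a pre-trained checkpoint, attach a task-specific head, and continue training on labeled downstream data. The following is a minimal sketch of that recipe, assuming the Hugging Face transformers library with a PyTorch backend; the checkpoint name, task, and hyperparameters are illustrative and not taken from the survey itself.

```python
# Minimal fine-tuning sketch: adapt a pre-trained encoder to a
# downstream classification task (illustrative, not the survey's code).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # adds a randomly initialized task head

batch = tokenizer(["an example downstream sentence"], return_tensors="pt")
labels = torch.tensor([1])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss = model(**batch, labels=labels).loss  # cross-entropy from the task head
loss.backward()   # gradients flow into both the head and the pre-trained body
optimizer.step()
```

In practice this loop runs over a full labeled dataset for a few epochs; the survey also discusses lighter-weight alternatives such as freezing the pre-trained encoder and training only the head.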
Funding: Supported by the National Natural Science Foundation of China (Project Number 61521002) and the Joint NSFC–DFG Research Program (Project Number 61761136018).
Abstract: The irregular domain and lack of ordering make it challenging to design deep neural networks for point cloud processing. This paper presents a novel framework named Point Cloud Transformer (PCT) for point cloud learning. PCT is based on the Transformer, which has achieved great success in natural language processing and shows great potential in image processing. It is inherently permutation invariant when processing a sequence of points, making it well suited to point cloud learning. To better capture local context within the point cloud, we enhance the input embedding with the support of farthest point sampling and nearest-neighbor search. Extensive experiments demonstrate that PCT achieves state-of-the-art performance on shape classification, part segmentation, semantic segmentation, and normal estimation tasks.
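The local-context step named in the abstract pairs farthest point sampling (to pick well-spread anchor points) with nearest-neighbor search (to gather a local patch around each anchor). Below is a minimal NumPy sketch of that neighbor-grouping idea; it is an illustrative re-implementation under assumed array shapes and sample counts, not the authors' released code.

```python
# Neighbor grouping for local context: farthest point sampling + kNN search
# (an illustrative sketch, not PCT's official implementation).
import numpy as np

def farthest_point_sampling(points, n_samples):
    """Greedily pick n_samples indices, each farthest from those chosen so far."""
    n = points.shape[0]
    chosen = [np.random.randint(n)]
    dist = np.full(n, np.inf)
    for _ in range(n_samples - 1):
        # update each point's distance to its nearest already-chosen anchor
        dist = np.minimum(dist, np.linalg.norm(points - points[chosen[-1]], axis=1))
        chosen.append(int(np.argmax(dist)))
    return np.array(chosen)

def knn_groups(points, anchor_idx, k):
    """For each anchor, return indices of its k nearest points in the cloud."""
    d = np.linalg.norm(points[None, :, :] - points[anchor_idx][:, None, :], axis=-1)
    return np.argsort(d, axis=1)[:, :k]

pts = np.random.rand(1024, 3)                 # a toy point cloud
anchors = farthest_point_sampling(pts, 128)   # 128 well-spread anchors
patches = knn_groups(pts, anchors, k=32)      # 128 local patches of 32 points
```

Each local patch can then be embedded and fed to the Transformer; because self-attention has no notion of input order, the resulting features are unaffected by permutations of the points, which is the invariance property the abstract highlights.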