
A Survey of LLM Datasets: From Autoregressive Model to AI Chatbot

Abstract: Since OpenAI opened access to ChatGPT, large language models (LLMs) have become an increasingly popular topic, attracting researchers' attention across many domains. However, public researchers encounter difficulties when developing LLMs, given that most LLMs are produced by industry and their training details are typically unrevealed. Since datasets are an essential component of LLM development, this paper presents a holistic survey of the training datasets used in both the pre-training and fine-tuning processes. The paper first summarizes 16 pre-training datasets and 16 fine-tuning datasets used in state-of-the-art LLMs. Second, based on the properties of the pre-training and fine-tuning processes, it comments on pre-training datasets in terms of quality, quantity, and their relation to models, and on fine-tuning datasets in terms of quality, quantity, and concerns. The study then critically identifies the problems and research trends in current LLM datasets. It helps public researchers train and investigate LLMs through visual cases and provides useful comments to the research community regarding data development. To the best of our knowledge, this paper is the first to summarize and discuss the datasets used in both autoregressive and chat LLMs. The survey offers insights and suggestions to researchers and LLM developers as they build their models, and contributes to LLM research by pointing out the existing problems of LLM studies from the perspective of data.
Authors: Fei Du, Xin-Jian Ma, Jing-Ru Yang, Yi Liu, Chao-Ran Luo, Xue-Bin Wang, Hai-Ou Jiang, Xiang Jing (National Key Laboratory of Data Space Technology and System, Beijing 100195, China; Advanced Institute of Big Data, Beijing 100195, China; Fu Foundation School of Engineering and Applied Science, Columbia University, NY 10027, U.S.A.; School of Software and Microelectronics, Peking University, Beijing 100091, China)
Source: Journal of Computer Science & Technology (SCIE, EI, CSCD), 2024, No. 3, pp. 542-566 (25 pages)