HugNLP: A Unified and Comprehensive Library for Natural Language Processing

  • Jianing Wang
  • Nuo Chen
  • Qiushi Sun
  • Wenkang Huang
  • Chengyu Wang
  • Ming Gao
Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

10 Scopus citations

Abstract

In this paper, we introduce HugNLP, a unified and comprehensive library for natural language processing (NLP) built on the prevalent Hugging Face Transformers backend. It is designed to let NLP researchers easily utilize off-the-shelf algorithms and develop novel methods with user-defined models and tasks in real-world scenarios. HugNLP consists of a hierarchical structure of models, processors, and applications that unifies the learning process of pre-trained language models (PLMs) across different NLP tasks. Additionally, we present several featured NLP applications to show the effectiveness of HugNLP, such as knowledge-enhanced PLMs, universal information extraction, low-resource mining, and code understanding and generation. The source code will be released on GitHub (https://github.com/HugAILab/HugNLP).
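The models/processors/applications layering described in the abstract can be sketched as a minimal hypothetical pipeline. The class names and methods below are illustrative stand-ins, not HugNLP's actual API: a processor turns raw text into features, a model consumes those features, and an application wires the two into a task-level interface.

```python
# Hypothetical sketch of a models/processors/applications layering.
# Class names are illustrative and do NOT reflect HugNLP's real API.

class Processor:
    """Turns raw text into model-ready features (here: toy token ids)."""
    def __init__(self):
        self.vocab = {}

    def encode(self, text):
        # Assign each new whitespace token the next free integer id.
        return [self.vocab.setdefault(tok, len(self.vocab))
                for tok in text.lower().split()]

class Model:
    """Stand-in for a pre-trained language model producing a task label."""
    def predict(self, ids):
        # Trivial placeholder decision rule, just to make the sketch runnable.
        return "positive" if len(ids) % 2 == 0 else "negative"

class Application:
    """Unifies a processor and a model behind one task-level entry point."""
    def __init__(self, processor, model):
        self.processor = processor
        self.model = model

    def run(self, text):
        return self.model.predict(self.processor.encode(text))

app = Application(Processor(), Model())
print(app.run("hello world"))  # two tokens -> "positive"
```

The point of the layering is that applications depend only on the processor and model interfaces, so a new task or a new PLM can be swapped in without touching the other layers.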

Original language: English
Title of host publication: CIKM 2023 - Proceedings of the 32nd ACM International Conference on Information and Knowledge Management
Publisher: Association for Computing Machinery
Pages: 5111-5116
Number of pages: 6
ISBN (Electronic): 9798400701245
DOIs
State: Published - 21 Oct 2023
Event: 32nd ACM International Conference on Information and Knowledge Management, CIKM 2023 - Birmingham, United Kingdom
Duration: 21 Oct 2023 - 25 Oct 2023

Publication series

Name: International Conference on Information and Knowledge Management, Proceedings

Conference

Conference: 32nd ACM International Conference on Information and Knowledge Management, CIKM 2023
Country/Territory: United Kingdom
City: Birmingham
Period: 21/10/23 - 25/10/23

Keywords

  • Deep Learning Framework
  • Natural Language Processing
  • Pre-trained Language Models
