ECNU at SemEval-2018 Task 1: Emotion Intensity Prediction Using Effective Features and Machine Learning Models

Huimin Xu, Man Lan, Yuanbin Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Scopus citations

Abstract

In this paper we describe the systems we submitted to SemEval-2018 Task 1 "Affect in Tweets" (Mohammad et al., 2018). We participated in all subtasks for English tweets, including emotion intensity classification and quantification as well as valence intensity classification and quantification. In our systems, we extracted four types of features (linguistic, sentiment lexicon, emotion lexicon, and domain-specific features), fed them to different regressors, and finally combined the models into an ensemble for better performance. The officially released results showed that our system can be further extended.
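The abstract describes a feature-then-regressor pipeline with an averaging ensemble on top. The following minimal sketch illustrates that general shape only; it assumes scikit-learn, and the TF-IDF features and the particular regressors are illustrative placeholders rather than the ECNU system's actual feature sets or models.

```python
# Minimal sketch (not the authors' code) of the pipeline in the abstract:
# concatenate feature groups, train several regressors, average predictions.
# Assumes scikit-learn; features and models here are illustrative only.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVR
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor

# Toy data: tweets with gold emotion-intensity scores in [0, 1].
tweets = ["I am so happy today!", "This is terrifying", "meh, whatever"]
scores = np.array([0.9, 0.8, 0.3])

# Placeholder for the "linguistic" feature group; lexicon-based and
# domain-specific features would be extracted and concatenated similarly.
vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(tweets).toarray()

# Several regressors trained on the same feature matrix.
regressors = [
    SVR(kernel="rbf"),
    GradientBoostingRegressor(random_state=0),
    RandomForestRegressor(random_state=0),
]
for reg in regressors:
    reg.fit(X, scores)

# Simple ensemble: average the individual model predictions.
X_test = vectorizer.transform(["what a wonderful surprise"]).toarray()
prediction = np.mean([reg.predict(X_test) for reg in regressors], axis=0)
print(prediction)
```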

Original language: English
Title of host publication: NAACL HLT 2018 - International Workshop on Semantic Evaluation, SemEval 2018 - Proceedings of the 12th Workshop
Editors: Marianna Apidianaki, Saif M. Mohammad, Jonathan May, Ekaterina Shutova, Steven Bethard, Marine Carpuat
Publisher: Association for Computational Linguistics (ACL)
Pages: 231-235
Number of pages: 5
ISBN (Electronic): 9781948087209
DOIs
State: Published - 2018
Event: 12th International Workshop on Semantic Evaluation, SemEval 2018, co-located with the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics - New Orleans, United States
Duration: 5 Jun 2018 - 6 Jun 2018

Publication series

Name: NAACL HLT 2018 - International Workshop on Semantic Evaluation, SemEval 2018 - Proceedings of the 12th Workshop

Conference

Conference: 12th International Workshop on Semantic Evaluation, SemEval 2018, co-located with the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics
Country/Territory: United States
City: New Orleans
Period: 5/06/18 - 6/06/18
