Learning sentiment-inherent word embedding for word-level and sentence-level sentiment analysis

Zhihua Zhang, Man Lan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

15 Scopus citations

Abstract

Vector-based word representations have made great progress on many Natural Language Processing tasks. However, because they lack sentiment information, traditional word vectors are insufficient for sentiment analysis tasks. To capture sentiment information, we extended the Continuous Skip-gram model (Skip-gram) and present two sentiment word embedding models that integrate sentiment information into semantic word representations. Experimental results show that the sentiment word embeddings learned by the two models capture both sentiment and semantic information. Moreover, the proposed sentiment word embedding models outperform traditional word vectors on both Chinese and English corpora.
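The abstract only states that Skip-gram is extended with sentiment information, without giving the exact formulation. Below is a minimal sketch, not the authors' model, of one plausible way to combine Skip-gram with negative sampling and a word-level sentiment objective in a joint loss. The sentiment classification head, the lexicon-derived sentiment labels, and the trade-off weight `alpha` are assumptions for illustration.

```python
# Sketch: Skip-gram with negative sampling plus an assumed word-level
# sentiment term, combined into one training loss. Not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SentimentSkipGram(nn.Module):
    def __init__(self, vocab_size, dim=100, num_sentiments=2):
        super().__init__()
        self.in_embed = nn.Embedding(vocab_size, dim)    # center-word vectors
        self.out_embed = nn.Embedding(vocab_size, dim)   # context-word vectors
        # Hypothetical linear head predicting a word's sentiment polarity.
        self.sentiment_head = nn.Linear(dim, num_sentiments)

    def forward(self, center, context, negatives, sentiment_label, alpha=0.5):
        v = self.in_embed(center)                        # (B, dim)
        u_pos = self.out_embed(context)                  # (B, dim)
        u_neg = self.out_embed(negatives)                # (B, K, dim)

        # Standard Skip-gram with negative sampling (semantic part).
        pos_score = (v * u_pos).sum(-1)                              # (B,)
        neg_score = torch.bmm(u_neg, v.unsqueeze(-1)).squeeze(-1)    # (B, K)
        semantic_loss = -(F.logsigmoid(pos_score).mean()
                          + F.logsigmoid(-neg_score).mean())

        # Assumed sentiment part: predict the center word's polarity
        # (e.g. from a sentiment lexicon) from its embedding.
        sentiment_loss = F.cross_entropy(self.sentiment_head(v), sentiment_label)

        # Weighted combination; alpha is an assumed trade-off hyperparameter.
        return alpha * semantic_loss + (1 - alpha) * sentiment_loss
```

Training on (center, context, negative-sample) triples drawn from a corpus, with sentiment labels attached to lexicon words, would then push embeddings to encode both co-occurrence and polarity, which is the general idea the abstract describes.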

Original language: English
Title of host publication: Proceedings of 2015 International Conference on Asian Language Processing, IALP 2015
Editors: Bin Ma, Min Zhang, Yanfeng Lu, Minghui Dong, Wenliang Chen
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 94-97
Number of pages: 4
ISBN (Electronic): 9781467395953
State: Published - 12 Apr 2016
Event: International Conference on Asian Language Processing, IALP 2015 - Suzhou, China
Duration: 24 Oct 2015 – 25 Oct 2015

Publication series

Name: Proceedings of 2015 International Conference on Asian Language Processing, IALP 2015

Conference

Conference: International Conference on Asian Language Processing, IALP 2015
Country/Territory: China
City: Suzhou
Period: 24/10/15 – 25/10/15
