
Examining the Presence of Gender Bias in Customer Reviews Using Word Embedding

2019-02-01
A. Mishra, H. Mishra, S. Rathee

Abstract

Humans have entered the age of algorithms. Each minute, algorithms shape countless preferences, from suggesting a product to suggesting a potential life partner. In the marketplace, algorithms are trained to learn consumer preferences from customer reviews, because user-generated reviews are considered the voice of the customer and a valuable source of information for firms. Insights mined from reviews play an indispensable role in several business activities, including product recommendation, targeted advertising, promotions, and segmentation. In this research, we ask whether reviews might hold stereotypic gender bias that algorithms learn and propagate. Utilizing data from millions of observations and a word embedding approach, GloVe, we show that algorithms designed to learn from human language output also learn gender bias. We also examine why such biases occur: whether the bias is caused by a negative bias against females or a positive bias toward males. We examine the impact of gender bias in reviews on choice and conclude with policy implications for female consumers, especially when they are unaware of the bias, and with the ethical implications for firms.
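As a rough illustration of the kind of measurement the abstract describes (not the paper's actual procedure), the sketch below scores a few review adjectives against a gender direction in pre-trained GloVe vectors. The file path, the word lists, and the simple "he" minus "she" gender direction are illustrative assumptions.

    # Minimal sketch (not the authors' code): gender association of review
    # adjectives using pre-trained GloVe vectors loaded from a local text file.
    import numpy as np

    def load_glove(path):
        """Load GloVe vectors from a whitespace-separated text file."""
        vectors = {}
        with open(path, encoding="utf-8") as f:
            for line in f:
                parts = line.rstrip().split(" ")
                vectors[parts[0]] = np.array(parts[1:], dtype=np.float32)
        return vectors

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    glove = load_glove("glove.6B.300d.txt")  # hypothetical local copy

    # Gender direction from a single male/female anchor pair (an assumption;
    # the paper may use a richer set of anchor words).
    gender_dir = glove["he"] - glove["she"]

    # Example adjectives that might appear in customer reviews.
    for word in ["reliable", "emotional", "durable", "delicate"]:
        score = cosine(glove[word], gender_dir)
        # Positive scores lean toward "he", negative toward "she".
        print(f"{word:>10s}: {score:+.3f}")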

URL

http://arxiv.org/abs/1902.00496

PDF

http://arxiv.org/pdf/1902.00496

