
On Measuring Gender Bias in Translation of Gender-neutral Pronouns

2019-05-28
Won Ik Cho, Ji Won Kim, Seok Min Kim, Nam Soo Kim

Abstract

Ethics concerning social bias has recently raised striking issues in natural language processing. For gender-related topics in particular, the need for systems that reduce model bias has grown in areas such as image captioning, content recommendation, and automated employment. However, the detection and evaluation of gender bias in machine translation systems have not yet been thoroughly investigated, since the task is cross-lingual and difficult to define. In this paper, we propose a scheme for constructing a test set that evaluates the gender bias of a machine translation system, using Korean, a language with gender-neutral pronouns. Three word/phrase sets are first constructed, each incorporating positive/negative expressions or occupations; all the terms are gender-independent, or at least not severely biased toward one side. Additional sentence lists are then constructed concerning the formality of the pronouns and the politeness of the sentences. With the generated sentence set of 4,236 sentences in total, we evaluate gender bias in conventional machine translation systems using the proposed measure, termed here the translation gender bias index (TGBI). The corpus and the evaluation code are available online.
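The abstract does not spell out how TGBI is computed. A minimal sketch of one plausible scoring scheme is shown below, assuming that for each test-sentence set the translations are classified as female, male, or gender-neutral, that balance between female and male outputs is rewarded via a geometric mean, and that neutral outputs count as unbiased; the function names and the exact formula here are assumptions for illustration, not the paper's definitive method.

```python
from math import sqrt

def set_score(n_female: int, n_male: int, n_neutral: int) -> float:
    """Score one sentence set from the gender distribution of its translations.

    sqrt(p_f * p_m) peaks at 0.5 when female and male translations are
    balanced; translations that stay gender-neutral (p_n) add directly,
    so an all-neutral set scores 1.0 and a fully one-sided set scores 0.0.
    (Hypothetical formula: the abstract does not define TGBI explicitly.)
    """
    total = n_female + n_male + n_neutral
    p_f, p_m, p_n = n_female / total, n_male / total, n_neutral / total
    return sqrt(p_f * p_m) + p_n

def tgbi(counts_per_set) -> float:
    """Average the per-set scores into a single index in [0, 1]."""
    return sum(set_score(*c) for c in counts_per_set) / len(counts_per_set)

one_sided = tgbi([(100, 0, 0)])            # every output forced to "she"
mixed = tgbi([(50, 50, 0), (0, 0, 100)])   # one balanced set, one all-neutral
```

Under this sketch, a system that always picks one gendered pronoun scores 0, while a system that preserves gender neutrality or splits evenly scores higher, which matches the abstract's framing of neutrality as the unbiased outcome.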

URL

https://arxiv.org/abs/1905.11684

PDF

https://arxiv.org/pdf/1905.11684

