JoLIE 18:3/2025

 


 

 

 

GENDER RESOLUTION IN GOOGLE TRANSLATE AND CHATGPT: A DESCRIPTIVE CROSS-LINGUISTIC ANALYSIS

 

 

Ana-Maria Oprea

1 Decembrie 1918 University of Alba Iulia, Romania

 

 

 

Abstract

 

This study presents a descriptive cross-linguistic analysis of gender resolution patterns in two widely used machine translation systems, Google Translate and ChatGPT. Using a controlled challenge-set methodology, the analysis examines how each system assigns grammatical gender when translating gender-ambiguous source sentences into five typologically diverse target languages.

The test set comprises 175 constructed sentences designed to probe linguistic environments where gender resolution is known to vary, including occupational nouns, pronoun ambiguity, adjective-based descriptions, grammatical agreement, epicene nouns, prestige-related terms, and translations from a gender-neutral source language. Translation outputs were manually coded and analysed for distributional patterns across systems and languages.

The results document a recurrent tendency toward masculine default forms in gender-ambiguous contexts across both systems, with variation depending on target language and linguistic parameter. While ChatGPT more frequently provides alternative gendered renderings, its primary outputs show distributional patterns comparable to those observed in Google Translate. Cross-linguistic comparison suggests that typological features influence how gender is resolved but do not eliminate default patterns.

This study is descriptive in scope and reports observed translation outputs under specific testing conditions. Findings are limited to the tested sentence sets, systems, and time of evaluation, and they do not support causal explanations or statistically generalisable claims about underlying system design.

 

Key words: Gender bias; Machine translation; Translation studies; Cross-linguistic analysis; Controlled evaluation.

 

 

References

 

Argamon, S., Koppel, M., Fine, J., & Shimoni, A. R. (2003). Gender, genre, and writing style. Text & Talk, 23(3), 321–346. https://doi.org/10.1515/text.2003.014

 

Bab.la. (n.d.). Gender bias. In Bab.la dictionary. Retrieved May 11, 2024, from https://en.bab.la/dictionary/english/gender-bias

 

Baker, M. (2006). Translation and conflict: A narrative account. Routledge. https://doi.org/10.4324/9780203099919

 

Baker, P. (2014). Using corpora to analyse gender. Bloomsbury.

 

Bentivogli, L., Savoldi, B., Negri, M., Di Gangi, M. A., Cattoni, R., & Turchi, M. (2020). Gender in danger? Evaluating speech translation technology on the MuST-SHE corpus. In D. Jurafsky, J. Chai, N. Schluter, & J. Tetreault (Eds.), Proceedings of the 58th annual meeting of the Association for Computational Linguistics (pp. 6923–6933). Association for Computational Linguistics. https://aclanthology.org/2020.acl-main.619.pdf

 

Butler, J. (1999). Gender trouble: Feminism and the subversion of identity. Routledge.

 

Comrie, B. (1999). Grammatical gender systems: A linguist’s assessment. Journal of Psycholinguistic Research, 28, 457–466. https://doi.org/10.1023/A:1023212225540

 

Corbett, G. G. (Ed.). (2014). The expression of gender. De Gruyter Mouton.

 

Costa-Jussà, M. R., & de Jorge, A. (2020). Fine-tuning neural machine translation on gender-balanced datasets. In M. R. Costa-Jussà, C. Hardmeier, W. Radford, & K. Webster (Eds.), Proceedings of the Second Workshop on Gender Bias in Natural Language Processing (pp. 26–34). Association for Computational Linguistics. https://aclanthology.org/2020.gebnlp-1.3

 

Dastin, J. (2018, October 11). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. https://www.reuters.com/article/world/insight-amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK0AG/

 

Ghosh, S., & Caliskan, A. (2023). ChatGPT perpetuates gender bias in machine translation and ignores non-gendered pronouns: Findings across Bengali and five other low-resource languages. In F. Rossi, S. Das, J. Davis, K. Firth-Butterfield, & A. John (Eds.), Proceedings of the 2023 AAAI/ACM conference on AI, ethics, and society (AIES ’23) (pp. 901–912). Association for Computing Machinery. https://doi.org/10.1145/3600211.3604672

 

Gygax, P. M., Elmiger, D., Zufferey, S., Garnham, A., Sczesny, S., von Stockhausen, L., Braun, F., & Oakhill, J. (2019). A language index of grammatical gender dimensions to study the impact of grammatical gender on the way we perceive women and men. Frontiers in Psychology, 10, Article 1604. https://doi.org/10.3389/fpsyg.2019.01604

 

Hellinger, M., & Bussmann, H. (Eds.). (2001). Gender across languages: The linguistic representation of women and men (Vol. 1). John Benjamins. https://doi.org/10.1075/impact.9

 

Holmes, J. (1990). Hedges and boosters in women’s and men’s speech. Language & Communication, 10(3), 185–205. https://doi.org/10.1016/0271-5309(90)90002-S

 

Isabelle, P., Cherry, C., & Foster, G. (2017). A challenge set approach to evaluating machine translation. In M. Palmer, R. Hwa, & S. Riedel (Eds.), Proceedings of the 2017 conference on empirical methods in natural language processing (pp. 2486–2496). Association for Computational Linguistics. https://doi.org/10.18653/v1/D17-1263

 

Jakobson, R. (1972). Verbal communication. Scientific American, 227(3), 72–81.

 

Johnson, M. (2020, April 22). A scalable approach to reducing gender bias in Google Translate. Google Research. https://research.google/blog/a-scalable-approach-to-reducing-gender-bias-in-google-translate/

 

Kayser-Bril, N. (2020, September 17). Female historians and male nurses do not exist, Google Translate tells its European users. AlgorithmWatch. https://algorithmwatch.org/en/google-translate-gender-bias

 

Kramer, R. (2014). Gender in Amharic: A morphosyntactic approach to natural and grammatical gender. Language Sciences, 43, 102–115. https://doi.org/10.1016/j.langsci.2013.10.004

 

Lakoff, R. (1975). Language and women’s place. Harper & Row.

 

Nedlund, E. (2019, November 12). Apple Card is accused of gender bias. Here’s how that can happen. CNN Business. https://edition.cnn.com/2019/11/12/business/apple-card-gender-bias/index.html

 

Popescu, T. (2025). Research in applied linguistics and language education: Design, methods, and analysis. Presa Universitară Clujeană. https://doi.org/10.29302/ResearchApplLing_LangEduc.popescu.t

 

Prates, M. O. R., Avelar, P. H., & Lamb, L. C. (2020). Assessing gender bias in machine translation: A case study with Google Translate. Neural Computing and Applications, 32, 6363–6381. https://doi.org/10.1007/s00521-019-04144-6

 

Robert F. Kennedy Human Rights. (2023). How women shaped the Universal Declaration of Human Rights. https://rfkhumanrights.org/our-voices/how-women-shaped-the-universal-declaration-of-human-rights-2/

 

Savoldi, B., Gaido, M., Bentivogli, L., Negri, M., & Turchi, M. (2021). Gender bias in machine translation. Transactions of the Association for Computational Linguistics, 9, 845–874. https://doi.org/10.1162/tacl_a_00401

 

Savoldi, B., Gaido, M., Bentivogli, L., Negri, M., & Turchi, M. (2022). Under the morphosyntactic lens: A multifaceted evaluation of gender bias in speech translation. In S. Mureșan, P. Nakov, & A. Villavicencio (Eds.), Proceedings of the 60th annual meeting of the Association for Computational Linguistics (pp. 1807–1824). Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.acl-long.127

 

Stanovsky, G., Smith, N. A., & Zettlemoyer, L. (2019). Evaluating gender bias in machine translation. In A. Korhonen, D. Traum, & L. Màrquez (Eds.), Proceedings of the 57th annual meeting of the Association for Computational Linguistics (pp. 1679–1684). Association for Computational Linguistics. https://doi.org/10.18653/v1/P19-1164

 

United Nations. (2019). Women who shaped the Universal Declaration of Human Rights. https://www.un.org/en/observances/human-rights-day/women-who-shaped-the-universal-declaration

 

Vanmassenhove, E., Hardmeier, C., & Way, A. (2018). Getting gender right in neural machine translation. In E. Riloff, D. Chiang, J. Hockenmaier, & J. Tsujii (Eds.), Proceedings of the 2018 conference on empirical methods in Natural Language Processing (pp. 3003–3008). Association for Computational Linguistics. https://doi.org/10.18653/v1/D18-1334

 

 

How to cite this article: Oprea, A.-M. (2025). Gender resolution in Google Translate and ChatGPT: A descriptive cross-linguistic analysis. Journal of Linguistic and Intercultural Education – JoLIE, 18(3), 107–130. https://doi.org/10.29302/jolie.2025.18.3.6

 

For details on subscription, go to: http://jolie.uab.ro/index.php?pagina=-&id=19&l=en