Artificial Intelligence is widely prevalent in day-to-day technology and is expected to become even more pervasive. While AI learns from and mimics its human trainers, it learns in a far shorter time than a human does, and solely from data and models; as a result, AI can amplify certain human behaviors very quickly.
Of the many such issues, this collection focuses on gender bias. Traditionally, gender bias was most visible in occupational descriptions. As AI use expands, gender bias can surface across information areas such as social and news feeds, in more critical areas such as medical-care decisions, and in product-design decisions such as airbag safety.
As AI becomes pervasive, and biases gain the potential to disrupt the use cases AI serves, it becomes crucial for developers to identify and rectify any biases in their models. While the area is vast and significant work is ongoing, kandi has shortlisted libraries that help you recognize and resolve gender bias.
Stand-alone Use Cases
If you are looking at simpler stand-alone use cases to experiment with, gender-bias by gender-bias, catalyst-bias-correct by EskaleraInc, bias-detector by intuit, catalyst-slack-service by willowtreeapps, and Gender-Bias-Visualization by GesaJo can help you with use cases ranging from bias detection to visualization to NLP auto-correct plugins.
gender-bias by gender-bias
Reading for gender bias
catalyst-bias-correct by EskaleraInc
A Slack Application that identifies and corrects unconscious gender bias in messages and conversations
Java 27 Version: Current License: Permissive (Apache-2.0)
bias-detector by intuit
Python 32 Version: 0.0.12 License: Permissive (MIT)
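The core idea behind a detector of this kind can be sketched in a few lines: infer a protected attribute (here via a hypothetical name-to-gender lookup) and compare the model's error rates across the resulting groups. This is a minimal illustration of the technique, not bias-detector's actual API; the lookup table and sample data are fabricated.

```python
# Sketch of gender-bias detection in model predictions: compare error
# rates across gender groups inferred from first names. The name-to-gender
# lookup and the sample data below are hypothetical, for illustration only.

NAME_TO_GENDER = {"alice": "F", "mary": "F", "bob": "M", "john": "M"}

def group_error_rates(first_names, y_true, y_pred):
    """Return the per-gender error rate of a binary classifier."""
    errors, counts = {}, {}
    for name, truth, pred in zip(first_names, y_true, y_pred):
        group = NAME_TO_GENDER.get(name.lower(), "unknown")
        counts[group] = counts.get(group, 0) + 1
        errors[group] = errors.get(group, 0) + (truth != pred)
    return {g: errors[g] / counts[g] for g in counts}

rates = group_error_rates(
    ["Alice", "Mary", "Bob", "John"],
    [1, 0, 1, 0],   # ground truth
    [0, 0, 1, 0],   # predictions: the single error falls on a female-named row
)
print(rates)  # {'F': 0.5, 'M': 0.0}
```

A large gap between the per-group error rates is the signal such a tool raises for further investigation.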
catalyst-slack-service by willowtreeapps
Unconscious gender bias has been fueling the gender gap for far too long. We’re releasing the #BiasCorrect code in hopes that coders around the world will adapt it for whatever chat-based platforms they use in order to give more people access to this tool for change.
Java 12 Version: Current License: Permissive (Apache-2.0)
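The #BiasCorrect approach — flag gender-coded trait words in a chat message and suggest a neutral alternative — can be sketched with a simple lookup and substitution. The word list here is a small hypothetical sample, not the project's actual data, and the sketch is in Python rather than the project's Java.

```python
import re

# Hypothetical sample of gender-coded trait words mapped to neutral
# suggestions (illustrative only, not the #BiasCorrect word list).
BIAS_SUGGESTIONS = {
    "bossy": "assertive",
    "emotional": "passionate",
    "abrasive": "direct",
}

def bias_correct(message):
    """Return the message with flagged words replaced by suggestions."""
    def swap(match):
        return BIAS_SUGGESTIONS[match.group(0).lower()]
    pattern = re.compile(
        r"\b(" + "|".join(BIAS_SUGGESTIONS) + r")\b", re.IGNORECASE
    )
    return pattern.sub(swap, message)

print(bias_correct("She is bossy in meetings."))
# -> She is assertive in meetings.
```

A chat-platform plugin would typically surface the suggestion to the sender rather than rewrite the message silently.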
Gender-Bias-Visualization by GesaJo
A website to visualize gender bias in language models
HTML 10 Version: Current License: No License
Coreference Resolution focused on Gender Bias
If you are looking for coreference resolution focused on gender bias, try corefBias by uclanlp or winogender-schemas by rudinger.
corefBias by uclanlp
To analyze and remove gender bias in coreference resolution systems
CSS 51 Version: Current License: Permissive (MIT)
winogender-schemas by rudinger
Data for evaluating gender bias in coreference resolution systems.
Python 20 Version: Current License: Permissive (MIT)
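A winogender-style schema pairs the same sentence with different pronouns, so you can test whether a coreference system's resolution flips with pronoun gender. A minimal sketch of generating such pairs (the sentence template is illustrative, not taken from the actual dataset):

```python
# Build minimal winogender-style sentence variants: identical context,
# pronoun varied, to probe whether a coreference system's resolution
# depends on pronoun gender. The template below is illustrative only.
TEMPLATE = "The nurse notified the patient that {pronoun} would be on duty."

def schema_variants(pronouns=("he", "she", "they")):
    return [TEMPLATE.format(pronoun=p) for p in pronouns]

for sentence in schema_variants():
    print(sentence)
```

An unbiased system should resolve the pronoun to the same referent in every variant; a gender-dependent flip is the bias signal the dataset is designed to expose.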
Review of Models and Unsupervised Bias Validation
For an in-depth review of models, data, labels, and domains, and for unsupervised bias validation, try CausalMediationAnalysis by sebastianGehrmann, Balanced-Datasets-Are-Not-Enough by uvavision, unsupervised_gender_bias by anjalief, or GeBNLP2019 by alfredomg.
CausalMediationAnalysis by sebastianGehrmann
Code for the paper "Causal Mediation Analysis for Interpreting Neural NLP: The Case of Gender Bias"
Python 35 Version: Current License: Permissive (MIT)
Balanced-Datasets-Are-Not-Enough by uvavision
[ICCV 2019] Balanced Datasets Are Not Enough: Estimating and Mitigating Gender Bias in Deep Image Representations
Python 19 Version: Current License: No License
unsupervised_gender_bias by anjalief
Code for https://arxiv.org/pdf/2004.08361.pdf
Python 5 Version: Current License: No License
GeBNLP2019 by alfredomg
Accompanying code and data to the paper "Measuring Gender Bias in Word Embeddings across Domains and Discovering New Gender Bias Word Categories" by Kaytlin Chaloner and Alfredo Maldonado
Python 5 Version: Current License: No License
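Work like the paper above typically measures gender bias in word embeddings by comparing how strongly a word associates with gendered anchor words ("he" vs. "she") via cosine similarity. A toy sketch of that measure with hand-made 2-D vectors (the vectors are fabricated for illustration; real analyses use trained embeddings such as word2vec or GloVe):

```python
import math

# Toy 2-D "embeddings", fabricated so that "engineer" sits near "he"
# and "nurse" near "she". For illustration only.
VECS = {
    "he":       (1.0, 0.1),
    "she":      (0.1, 1.0),
    "nurse":    (0.2, 0.9),
    "engineer": (0.9, 0.2),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def gender_direction_bias(word):
    """Positive -> closer to 'he'; negative -> closer to 'she'."""
    return cosine(VECS[word], VECS["he"]) - cosine(VECS[word], VECS["she"])

print(gender_direction_bias("engineer") > 0)  # True: leans toward "he"
print(gender_direction_bias("nurse") < 0)     # True: leans toward "she"
```

Running this measure per domain (and clustering the most-skewed words) is, roughly, how new gender-biased word categories are discovered.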
Audit Models
A popular audit tool for your models is fairml by adebayoj.
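FairML's perturbation-based idea — vary one input feature while holding the others fixed, and measure how much the model's output depends on it — can be sketched without the library. This is the general technique, not fairml's actual API; the toy model and data are hypothetical.

```python
# Sketch of a perturbation-style model audit: set a single feature
# (e.g. an inferred-gender flag) to a fixed value and measure how often
# the model's output changes. The toy model below is deliberately biased,
# for illustration only.

def toy_model(features):
    """A biased scoring rule: the 'gender' feature shifts the score."""
    return features["experience"] * 2 + (1 if features["gender"] == "M" else 0)

def audit_feature(model, rows, feature, alt_value):
    """Fraction of rows whose output changes when `feature` is overridden."""
    changed = 0
    for row in rows:
        perturbed = dict(row, **{feature: alt_value})
        if model(perturbed) != model(row):
            changed += 1
    return changed / len(rows)

rows = [
    {"experience": 3, "gender": "F"},
    {"experience": 5, "gender": "M"},
    {"experience": 2, "gender": "F"},
]
print(audit_feature(toy_model, rows, "gender", "M"))  # 2 of 3 rows change
```

A dependence score near zero suggests the model ignores the feature; a high score, as here, flags it for scrutiny — the signal an audit tool like fairml surfaces per feature.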