Build a Bias Detector Application

Bias is prevalent in every aspect of our lives. Our brains are hardwired to categorize what we encounter in order to make sense of the complicated world around us. However, biases can lead us to form prejudices against others, allowing egregious inequalities to grow between demographics. Bias comes in many forms, and biased word choice in writing is one of them. Implicit bias in letters of recommendation or evaluations negatively affects individuals at every stage of their careers. In this challenge, we invite you to build a solution that detects bias in writing such as letters of recommendation and job descriptions, with respect to gender and race, to promote equity. You can choose any topic of your choice. A sample solution kit is provided below to jumpstart your Bias Detector application. To use this kit to build your own solution, scroll down to the sections Kit Deployment Instructions and Instruction to Run.

Complexity: Simple

The sample solution kit helps to detect gender bias.

Training and Certification - How to build a Bias Detector application

Watch this self-guided tutorial on how to use a pretrained gender bias detector model, wordlist files containing gender-specific words, and an input file in which bias is to be detected, to build your own Bias Detector application. Completed the training? Apply for your Participation Certificate and Achievement Certificate now! Tag us on social media with a screenshot or video of your working application for a chance to be featured as an Open Source Champion and get a verified badge.

Kit Deployment Instructions

For Windows OS, download, extract, and double-click the kit installer file to install the kit. Note: be sure to extract the zip file before running it. The installation may take 2 to 10 minutes depending on bandwidth.

1. When prompted during installation of the kit, press Y to launch the app automatically, then run the notebook cell by cell by clicking a cell and clicking the Run button below the menu bar.
2. To run the app manually, press N when prompted and locate the zip file Bias_Detector.zip.
3. Extract the zip file and navigate to the directory gender-bias-master.
4. Open a command prompt in the extracted directory gender-bias-master and run the command jupyter notebook.

For other operating systems:

1. Click here to install Python.
2. Click here to download the repository.
3. Extract the zip file and navigate to the directory gender-bias-master.
4. Open a terminal in the extracted directory gender-bias-master.
5. Install dependencies by executing the command pip install -r requirements.txt.
6. Run the command jupyter notebook.
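For the non-Windows path, the numbered steps above amount to the following terminal session (the directory name comes from the instructions; adjust the path to wherever you extracted the zip, and note this assumes Python 3 and pip are already installed):

```shell
# Change into the extracted repository directory
cd gender-bias-master

# Install the kit's dependencies listed in requirements.txt
pip install -r requirements.txt

# Launch the notebook server; open gender-bias.ipynb from the browser window
jupyter notebook
```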

Development Environment

VSCode and Jupyter Notebook are used for development and debugging. Jupyter Notebook is a web-based interactive environment often used for experimentation, whereas VSCode provides developers with a typical IDE experience. Jupyter Notebook is used for this kit's development.

Text Mining

Libraries in this group are used for the analysis and processing of unstructured natural language. The data is not used in its original form; it must go through a processing pipeline to become suitable for machine learning techniques and algorithms.
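As a minimal sketch of such a pipeline, the snippet below tokenizes raw text and matches it against small gender-coded wordlists. The word sets here are illustrative placeholders; the kit ships fuller lists in its Wordlist files.

```python
import re

# Illustrative mini wordlists; the kit's Wordlist files are much larger.
FEMALE_CODED = {"warm", "supportive", "nurturing"}
MALE_CODED = {"leader", "assertive", "dominant"}

def detect_biased_terms(text):
    """Tokenize the text and flag any words found in the coded wordlists."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return {
        "female_coded": sorted(set(tokens) & FEMALE_CODED),
        "male_coded": sorted(set(tokens) & MALE_CODED),
    }

flags = detect_biased_terms("She is a warm and supportive leader.")
print(flags)  # {'female_coded': ['supportive', 'warm'], 'male_coded': ['leader']}
```

A real detector would also handle stemming and context, but lowercasing plus set intersection is enough to show how a wordlist-based flag works.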

Kit Solution Source

The entire solution is available as a package to download from the source code repository. Please add your kit solution or prototype source repository in this section.

Instruction to Run

Follow the instructions below to run the solution.

1. Locate and open the gender-bias.ipynb notebook from the Jupyter Notebook browser window.
2. Execute the cells in the notebook by selecting Cell --> Run All from the menu bar.

To run it with your own text:

1. Open the letterofRecW file from the location data/input relative to the gender-bias.ipynb location.
2. Update the text in the letterofRecW file.
3. Execute the cells in the notebook by selecting Cell --> Run All from the menu bar.
4. The output is stored in the file gender-biased-words.txt in the location data/output.

The output text is in JSON format with the following fields:

name - the detector name, e.g. "Terms biased towards women"
summary - a summary of the detected bias
flags - the detected bias words, e.g. "leader"

You can additionally create your own detectors for race and a dictionary dataset, as well as other enhancements, for an additional score.
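The output record layout described above can be sketched in Python. The field names follow the documented format; the field values here are illustrative, not taken from a real run.

```python
import json

# One output record in the documented shape (values are illustrative).
record = {
    "name": "Terms biased towards women",
    "summary": "Flagged terms were found in the letter",
    "flags": ["leader"],
}

# The notebook writes such records to data/output/gender-biased-words.txt;
# reading them back is ordinary JSON parsing.
parsed = json.loads(json.dumps(record))
print(parsed["name"])   # Terms biased towards women
print(parsed["flags"])  # ['leader']
```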

Troubleshooting

1. While running the batch file, if you encounter a Windows protection alert, select More info --> Run anyway.
2. During the kit installation, if you encounter a Windows security alert, click Allow.

Support

For any support, you can direct message us at #help-with-kandi-kits
© 2022 Open Weaver Inc.