For this assignment, I decided to feed the HBS mission statement to craiyon.ai. I wanted to see how a computer assessed the following:
“We educate leaders who make a difference in the world”
Going in with few expectations for how this would turn out, I was truly horrified to see the following images after a few minutes of processing:
Once the initial shock of this AI’s horrifying attempt at recreating human features subsided, a sadness set in as I noticed that every image seemed to possess male features. Is the computer saying it thinks the mission statement applies only to men? To me, this was indicative of the inherent bias built into data by systemic and historic discrimination. For most of human history, education and leadership were out of the question for women. HBS only began admitting women in 1962 (nearly 100 years after MIT, Stanford, and Cornell), and even then it admitted only 8 women to the program.
Joy Buolamwini’s research at the MIT Media Lab uncovered deep algorithmic racial and gender bias in commercial AI used by Microsoft and IBM. The fact is that, while we are making strides socially, we are up against thousands of years of bias, and training models with that in mind is critical to developing AI that is inclusive of all.
I know this exceeds the word limit, but I think it’s an important point that warrants noting. Many in this class will go on to lead AI organizations, and I implore you all to consider how we can avoid building bias into our computational systems.