My Experience with Facial Identification and Its Bias Against Women

By Boluwatife Aminu ‘27

High school me, unfortunately, was not the most punctual person. While my school did ultimately let me graduate (and escape Delaware, apologies Biden), I routinely refused to pass up the opportunity to grab Dunkin’ with my sister right before school started. After discovering the technology of ordering ahead, we cut our time down considerably. But what truly saved us was that my 14-year-old sister could authorize purchases with her face on my iPhone. This was the same phone that was configured to accept, and only accept, my 17-year-old face. Several software updates later, this was still the case. Unbeknownst to me, I happened to fit a checkbox for people whose facial identification (Face ID) doesn’t always work: female.

At the time, however, I wasn’t even familiar with this facial identification terminology. The me of that moment was just ecstatic to upgrade from my 4-year-old iPhone 6S with its nearly defunct home button to my purple iPhone 11. I decided to debut my new phone by showing up to school with my hair done in two puffs that each sported two purple barrettes, my new pair of purple Crocs, and my purple iPhone 11 in its purple case, all of which I had received that winter break.

Shortly after winter break, my uncle suggested I watch the documentary “Coded Bias” on Netflix. After watching it, I could better articulate what I was experiencing with my new phone. For all the shimmer of the iPhone 11 name, it seemed Apple would never dissociate me from my iPhone 6S, because my new phone could not tell me, its owner, apart from my younger sister.

To delve deeper into this, I must first define facial recognition and facial identification, and the difference between them. My definitions come from Dr. Joy Buolamwini’s memoir “Unmasking AI” (of course I became one of her many fans following the documentary). Facial identification refers to matching your face when you look directly into the camera to verify your identity, whereas facial recognition deals with your face as a whole and can work even when you are not staring into the camera. On another note, I appreciate Apple recognizing the difference between a living, breathing version of myself and a picture.

Yet the problem with Face ID and many of these computing systems is that they are built on datasets that reflect the biases of the people who created them. In Dr. Buolamwini’s paper (the one that largely inspired the documentary), she noted that women and darker-skinned individuals are more likely to be misidentified by facial analysis systems than their lighter-skinned and male counterparts. Heavy on the male counterparts, because when you look into Silicon Valley, the majority of the employees there are “tech bros.” If there were a significant number of women, we’d have “tech girlies,” but we’re barely even on the brink of “finance girlies.”

But I must say, this reckoning is still relatively new to the field of CS. The documentary only came out four years ago, yet these harms have been perceptible for quite some time. We tend to take a “code now, think about ethics later” approach to adopting technology. But at least we’re talking about it now.

That’s why I’m grateful for communities such as Stanford’s Women in CS. As we prepare for leading careers in CS, my hope is that we contribute to creating systems more equitable than the ones we found. And I do not doubt our ability to accomplish that.

Note: I hope this did not require a bibliography, as MLA was never my favorite…

--


Stanford Women in Computer Science

Stanford Women in Computer Science is a student organization that aims to promote and support the growing community of women in CS and technology.