Artificial intelligence is at the top of my mind right now. We have a Power Breakfast on the topic on Feb. 28, and much like when you get a new car and see it everywhere, any and all AI stories are catching my attention. 

But they’re also catching the attention of our readers. Teva Dawson, formerly of the Des Moines Area Metropolitan Planning Organization and now the co-owner along with Mat Greiner of Group Creative Services, reached out recently to share two interesting articles about the potential for bias in AI. 

Around the same time Dawson received the invite to our Power Breakfast, she coincidentally received an invite to a workshop in New York titled “What does AI need from you?” It aimed to create dialogue around the intersection of AI and race, gender and aging. 

Said Dawson via email: “I would never have thought of race and gender implications with AI — yet why wouldn’t it be present? Bias is infused into our culture so why wouldn’t it be infused into the programming that is shaping our future-selves?” 

One of the articles she passed along was from ProPublica, titled “Machine Bias.” It explored bias in software used across the country to predict future criminals. The problem? It’s biased against blacks. Here’s the article: http://bit.ly/1XMKh5R. 

She also passed along an NPR podcast — http://n.pr/2Cq6oI3 — that touched on gender bias in robots. For example, Dawson noted from the podcast that robots designed to serve and do things like turn on the lights are often given female names, while robots designed to be smart and solve problems are often given male names. 

A big thank you to Dawson for bringing this aspect of AI to my attention, and I hope to spend some time on that topic at our event on Feb. 28.