Another concern with AI, though one I don't hear about nearly as often, is bias.
I want to start by sharing an example:

If you see the bias, I applaud you. If you don’t, you’re either old or a man.
The bias here is that the doctor was written as a man and the nurse as a woman. This dynamic exists IRL, of course, but there are plenty of female doctors and male nurses. The “higher” position goes to the man, while the “lower” position goes to the woman.
It’s amusing how ChatGPT pairs a man and a woman in the story—it instantly feels romantic! It even prompts me to decide if I want to steer it in that direction. But here’s the kicker: my original prompt didn’t suggest romance at all!
I wanted to experiment a little more, so I prompted ChatGPT with two stereotypical masculine careers:
It still gave me a man and a woman. I suppose plumbing was considered the “manlier” job. Also, what’s the plumber’s name? Javi. Do you want to guess the origin of that name?
So the plumber is Hispanic…
Next, I tried it with two stereotypically feminine careers:
I didn’t realize that ChatGPT is quite the romantic.
At this point, I’m sick of it telling me the same story with different names and careers. Moving on!
I wanted to try a different prompt. I’ll put the two responses side by side:




I’m glad to know that ChatGPT is aware of gender inequalities in the workplace. I’m also glad that some of the same reasons appear on both lists, but look at where they’re ranked. For men, becoming a parent is listed wayyyyy at the bottom. For women, it’s smack dab in the middle. The men’s reasoning is also incredibly simplified, with phrases like “becoming a parent” and “needing time to recover or care for someone else,” while the women’s reasons are phrased as: “Childcare or eldercare responsibilities: She may need to stay home due to lack of affordable or accessible care.” Also, notice how, for men, one of the reasons they may quit is “moving to a new city or country,” yet the closest we get to that on the women’s list is “she might move because of a partner’s job.”
Then some items appear on the men’s list but not on the women’s. For example, retirement. Retirement doesn’t exist for women, FYI.
Keep in mind that AI isn’t inherently sexist, but it learns from the data it’s fed, and that data is filled with generations of bias. When a tool like ChatGPT reflects gender stereotypes, it’s not trying to be offensive; it’s holding up a mirror to the world we’ve built. But that doesn’t mean we should shrug and accept it. Recognizing these patterns, whether it’s always pairing men with authority and women with support roles or turning every mixed-gender story into a rom-com, is the first step toward demanding better. If we want AI to help build a more equitable future, we have to question what it’s learning and challenge the defaults it presents. Bias doesn’t disappear just because it’s wrapped in code.
So how do we adjust? We craft clear, specific prompts, and those are techniques we can learn in the classroom.
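To make that concrete, here’s a minimal sketch of what “specific” looks like in practice, assuming the OpenAI Python SDK and a placeholder model name (both are my illustration, not something from the examples above): the same request twice, once with the defaults left wide open and once with the details spelled out.

```python
# A minimal sketch of vague vs. specific prompting, using the OpenAI Python SDK.
# The model name and the exact prompt wording are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Vague prompt: leaves names, genders, and tone entirely up to the model's defaults.
vague = "Write a short story about a doctor and a nurse."

# Specific prompt: spells out the details we care about, so the model can't
# fall back on stereotyped pairings or an unrequested romance.
specific = (
    "Write a short story about a doctor and a nurse. "
    "The doctor is a woman and the nurse is a man. "
    "They are colleagues; keep the story strictly professional, no romance."
)

for prompt in (vague, specific):
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name; swap in whatever you actually use
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)
    print("---")
```

The point isn’t the library; it’s that the second prompt leaves far less room for the model to reach for its defaults.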