Many dental professionals are drawn to a career in education. Some of the benefits are obvious: You get to give back to the profession by passing on your knowledge; you gain prestige from your participation in an academic program; and you can depend on a stable (though modest!) income. Beyond those, a career in dental education brings other, less expected rewards.
As a high school senior, I had an opportunity to interview for a collegiate scholarship, during which I discussed my aspirations for a career in dentistry with an all-male panel of judges. I remember being asked, “Why don’t you want to be a dental hygienist or an assistant? Aren’t those the typical roles in dentistry for a female?”
I was taken aback. I'm sure it wasn't their intention to instill self-doubt in a woman pursuing a career in a male-dominated field. Still, I couldn't help but feel as if I were being steered toward a different career based strictly on traditional gender roles.