In my previous post, I alluded not-so-subtly to my preference for female doctors. Writing about it like that made me wonder if that’s something I should challenge a bit.
The first doctors I ever remember going to as a child were all men, smiling men with long Italian names and cheerful demeanors that weren’t even thrown off by my (violently loud) hatred of throat cultures. They set an image in my mind of doctors as pleasant, caring, competent healers.
Later in my childhood, this image suffered a bad blow when I went for a couple of years to a different male pediatrician who always had cold hands and was in a perpetual rush. After college, I chose a random (male) doctor from my provider's list of primary care docs. He didn't help repair my image of male doctors: his favorite thing was to tell me that he was "underwhelmed" by issues that were, granted, not life-threatening, but were nevertheless really bothering me (acne, shoulder and back pain, etc.).
Now, I insist on only women doctors. But I wonder if I’m investing too much meaning in the doctor’s gender, thinking that a woman doctor is more likely to understand a woman’s body, or is naturally more compassionate. An insensitive doctor is an insensitive person, after all, and that type comes in both genders.
I’m currently contemplating whether to search for an integrative medicine specialist, someone who is an MD but who is also open to complementary and alternative therapies. There’s a well-recommended integrative doctor right in my town, but he’s…a he. Should I give him a call?