Cultivating a Culture of Sickness

Healthcare has become the buzzword for what used to be the field of medicine. While medicine implies the healing of dis-ease, healthcare, in its raw meaning, sounds as if it were about taking care of our bodies so we will not need to resort to medicine. And yet it has become exactly the opposite: an institution based not on promoting healthy humans but almost solely on profit. It has become an oxymoron. Healthcare in the United States is now Big Business, shrouded in media marketing and frequent corporate mergers.

Last week, I heard a radio commercial from a large healthcare system tell its audience: “We haven’t seen you in a while. Don’t forget, our urgent care offices are open seven days a week, and we can help when you or your family get sick from…” fill in the blank. It went on to list multiple common problems, some of which, like a cold, have no clear treatment options, while promising the convenience of being in and out in less than an hour. I happen to know tha
