I’m not just talking about ‘medicine’ but anything and everything related to health care (nursing / medicine / paramedicine / etc.). This website has mentioned the inaccuracies portrayed on the big screen and on TV before, everything from Nurse Jackie to the medical TV dramas. We as health care professionals easily, and quite often, recognize the difference between reality and ‘reality TV’.
What about the laymen? What about the general population? What about our patients and their families? How do they know the difference between what is real and what is Hollywood’s reality?
The greater question being: is Hollywood medicine health care’s enemy? Or our ally?
Some would argue that Hollywood is the enemy, making the real professional’s job much more difficult. Not only do we have to educate and direct our patients properly, we also have to explain why what they saw on TV or in the theater is just not how it happens. We have to continually battle the misinformed hype with scientifically sound, evidence-based practice. Check out what Dr. Val Jones has to say about the infamous Dr. Mehmet Oz.
Others would argue that Hollywood is helping bridge the gap between the uninformed and the now curiously inquisitive. Having patients (and families) take an interest in their health, while a jagged step, is still a step in the right direction. The media is at least generating the interest and creating the conversations that the health care industry is trying so desperately to establish. This side would also give patients the benefit of the doubt: the patient population is better informed now than ever.
We all know that there is more to the story when it comes to Hollywood and its messages about health and health maintenance. Even the most well-respected professionals have to pay their ‘piper’. Hollywood is all about the almighty dollar, and the best information it can give is the information that makes a profit. It’s not a new concept. Unfortunately, physicians have been under this microscope before. Do you remember a time when physicians were being questioned about their prescriptive practices? Were their choices motivated by profit? Were they prescribing certain medications because of the kickbacks they would receive? (Yes, I know many changes have been put in place to deter and hopefully prevent this.)
Are these reality TV shows motivated to ‘sell’ you something? Or are they being genuine about their message of health?
One thing is for sure: Hollywood, its reality TV shows, and its big-picture films aren’t going anywhere. Whether the health care community likes it or not, agrees with Hollywood’s methods or not, or believes Hollywood has the patient’s best interest in mind is still a matter of opinion. The question is: do we want to put up our dukes and fight the good fight? Or should we join forces to try to knock down the walls of misinformation and profoundly change the ‘reality’ of health?
Which option benefits the patient more?
Things that make you go hmm.