What about the laymen? What about the general population? What about our patients and their families? How do they know the difference between what is real and what is Hollywood's version of reality?
The greater question being: is Hollywood medicine health care's enemy, or our ally?
Some would argue that Hollywood is the enemy. It is making the real professional's job much more difficult. Not only do we have to educate and direct our patients properly, we also have to defend and explain why what they saw on TV or in the theater is simply not how it happens. We have to continually battle the misinformed hype with scientifically sound, evidence-based practice. Check out what Dr. Val Jones has to say about the infamous Dr. Mehmet Oz.
Others would argue that Hollywood is helping bridge the gap between the uninformed and the newly curious. Having patients (and their families) take an interest in their health, while a jagged step, is still a step in the right direction. The media is at least generating the interest and creating the conversations that the health care industry is trying so desperately to establish. This side would also give patients the benefit of the doubt, arguing that the patient population is better informed now than ever.
We all know that there is more to the story when it comes to Hollywood and its messages about health and health maintenance. Even the most well-respected professionals will have to pay their 'piper'. Hollywood is all about the almighty dollar, and the best information it can give is the information that turns a profit. It's not a new concept. Unfortunately, physicians have been under this microscope before. Do you remember a time when physicians were being questioned about their prescriptive practices? Were their choices motivated by profit? Were they prescribing certain medications because of the kickbacks they would receive? (Yes, I know many changes have been put in place to deter and hopefully prevent this.)
Are these reality TV shows motivated to 'sell' you something? Or are they genuine about their message of health?
One thing is for sure: Hollywood, with its reality TV shows and its big-screen films, isn't going anywhere. Whether the health care community likes it or not, agrees with its methods or not, or believes Hollywood has the patient's best interest in mind remains a matter of opinion. The question is: do we want to put up our dukes and fight the good fight? Or should we join forces to try and knock down the walls of misinformation and profoundly change the 'reality' of health?
Which option benefits the patient more?
Things that make you go hmm.