Thursday, February 18, 2016

Medical news reporting


This is such bullshit. Let me just glance over the fact that these studies hardly ever establish causality. What they establish is a partial correlation, which is only as good as the control variables used. In a nutshell: people who drink a lot of soda also tend not to exercise. If the study does not account for the exercise aspect, the elevated rate of heart failure will be attributed entirely to drinking soda, even though lack of exercise deserves part of the blame. This is the so-called “omitted variable bias”, and biomedical studies are plagued with it. The only way to eliminate it, and the only kind of evidence that should be accepted as definitive, is a randomized controlled trial.

But that is the scientists’ wrongdoing, and what I want to complain about here is how their findings – whether true or not – are twisted and misinterpreted by the press. To see it, let us look at the three news articles mentioned above and the corresponding research articles (which can be found here, here, and here). Let us start with the 33% stroke figure from the first article. Reading the research article, it seems that out of around 600,000 people, roughly 1,700 had a stroke, which gives an average incidence of less than 0.3%. That is, if you work over 55 hours a week, your chance of getting a stroke is about 0.4% instead of the roughly 0.3% you would face working just under 40 hours a week. Your chance of getting a stroke increases by 0.1 percentage points, not by the 33% the article implies.
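The arithmetic can be checked directly. A quick sketch, using my rough reading of the study's counts (approximations, not the study's exact figures):

```python
# Rough numbers from my reading of the working-hours study (approximate).
participants = 600_000
strokes = 1_700

baseline_risk = strokes / participants          # average incidence, just under 0.3%
relative_risk = 1.33                            # the reported "33% increase"
elevated_risk = baseline_risk * relative_risk   # risk for the long-hours group

absolute_increase = elevated_risk - baseline_risk
print(f"baseline: {baseline_risk:.2%}, elevated: {elevated_risk:.2%}, "
      f"absolute increase: {absolute_increase:.2%}")
```

The headline number (33%) describes the ratio between the two risks; the number that matters to a reader is the difference between them, roughly a tenth of a percentage point.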

Second story: supposedly, drinking at least two sugary beverages a day increases your chances of heart failure by 23%. From the source article, we see that around 42,000 men took part in the study and around 4,100 of them had heart failure, giving an average incidence of 9.8%. Drinking a lot of coke raises the incidence to about 12%. Therefore, your chance of getting heart failure rises by 2.2 percentage points, not by the implied scary 23%.

Finally, let us consider the claim that light drinking increases the chances of breast cancer. According to the study, out of around 88,000 women, roughly 19,000 developed cancer. This tells us that an average woman in this study had a 21.6% probability of developing cancer, while an average drinking woman had a 24.4% probability. Thus, the probability increases by 2.8 percentage points, which does not seem like much compared to the overall risk of developing cancer. (My calculations in all three examples are approximate and based on a number of simplifying assumptions, yet they should illustrate the problem.)

So what is going on? What is the source of all these discrepancies? The numbers reported by medical journals are relative risks (and the statements about them are technically correct, but may be misleading to an untrained reader). That is, they tell you how many times more likely a person engaged in a “risky” behavior is to contract a disease than a person who is not. So it may well be true that a person who works long hours, a person who drinks a lot of soda, and a person who drinks a little alcohol every day are, respectively, 1.33, 1.23, and 1.13 times more likely to contract the disease in question than a person who does not. But the probability for the person who abstains is usually small to begin with, so even doubling it often does not raise your overall chance of contracting the disease enough to worry about.
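Translating a relative risk into an absolute increase is a one-liner. A sketch applying it to my rough baseline incidences from the three studies (my approximations, not the studies’ exact figures):

```python
def absolute_increase(baseline, relative_risk):
    """Percentage-point increase implied by a relative risk over a baseline incidence."""
    return baseline * (relative_risk - 1)

# (approximate baseline incidence, reported relative risk)
examples = {
    "long working hours / stroke":    (0.003, 1.33),
    "sugary drinks / heart failure":  (0.098, 1.23),
    "light drinking / breast cancer": (0.216, 1.13),
}

for name, (baseline, rr) in examples.items():
    print(f"{name}: +{absolute_increase(baseline, rr):.1%} absolute risk "
          f"(headline figure: +{rr - 1:.0%} relative)")
```

The same headline-sized relative risk translates into wildly different absolute increases depending on how common the disease is in the first place.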

A related problem has to do with labeling substances as carcinogenic. Usually, anything that increases the probability of cancer at all gets labeled carcinogenic. As a result, a substance that raises your chance of contracting cancer from 15.2% to 15.3% (that is, by 0.1 percentage points) is treated by the media frenzy the same way as a substance that raises it from 7.5% to 38.1% (that is, by 30.6 percentage points). People easily swallow it: to a raging mom dead set on protecting her baby from all harm, anything labeled a carcinogen is pure evil, no matter how strong a carcinogen it is and whether it also has some unrelated health-enhancing benefits.

But that is not all. Even if the numbers were reported correctly, they would still be poor guidance for making decisions, because virtually everything has its advantages and disadvantages. Drinking a glass of wine a day may increase your chances of developing cancer but may also decrease your chances of getting a stroke. Quitting smoking may save you from lung cancer but may also cause you to eat more, making you obese and increasing your chances of heart failure. And so on. The only way to make a good decision is to perform a complete cost-benefit analysis with respect to an ultimate variable of interest. That is, a study is most useful to a person deciding how to stay healthy if it reports whether doing something increases or decreases expected healthy lifespan (and by how much). Only then do you know the full potential impact of the decision on your health and can judge whether changing your habits is worth it. Unfortunately, studies reporting effects on healthy lifespan are hard to come by.
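To sketch what such a cost-benefit analysis might look like, here is a toy calculation with entirely made-up effect sizes (none of these numbers come from any study; they exist purely to illustrate the idea of netting out opposing effects):

```python
# Hypothetical effects of a daily glass of wine on expected healthy lifespan,
# in years. All numbers are invented for illustration only.
effects = {
    "increased cancer risk":         -0.4,
    "decreased stroke risk":         +0.3,
    "decreased heart disease risk":  +0.2,
}

net_effect = sum(effects.values())
print(f"net expected change in healthy lifespan: {net_effect:+.1f} years")
```

The decision should rest on the net figure in the final line, not on any single row, which is exactly the information a study reporting only one disease's incidence cannot give you.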

The focus on incidence and the disregard for life expectancy is the source of another public misunderstanding of the current state of affairs: the notion of a cancer epidemic. It sounds so scary: cancer is much more prevalent now than it was decades ago and is projected to be even more common in the future. Surely the world is becoming a worse place, at least in this one respect, right? Bullshit. Cancer is becoming ever more prevalent because we are getting better and better at fighting other diseases. The average lifespan keeps increasing because we successively eliminate other causes of death, which leaves cancer (along with a few other culprits) to kill the people who did not die of anything else. To sum up: the cancer epidemic is a positive sign of the overall progress of medical science.