endometriosis research made accessible to all


Bad News

Hello readers! Welcome to the first post of the official Endometriosis Update website. Fair warning: it’s a long article this week, so if you want to tackle it all in one go, settle in! After presenting at the Endometriosis Research Now conference last month, I got some great feedback and ideas from the people who attended my talks. One popular suggestion was to write a post building on the critical appraisal of scientific evidence I’ve written about before; specifically, to analyse pieces of endometriosis reporting and take them apart bit by bit, showing exactly how to identify and counter poor writing on the subject. So, that’s what I’m going to do!

I’m going to take examples of bad endometriosis reporting and use them to build a list of points to look for when reading stories on endometriosis. I’ll be using screenshots of the actual articles rather than quoting or linking directly to them, because linking to bad articles seems at odds with the purpose of this blog, even if it is for educational purposes.

To start with, then, perhaps the most important point to check when reading an article on endometriosis, and one that you can check early on in most articles:

Have they got the description of endometriosis right?

A very simple thing to do, or so it would seem. When writing an article on endometriosis it is pertinent to include a description of the condition, which almost every article does. What every article does not do is give an accurate description of endometriosis. This is an easy thing to look up; the World Endometriosis Society and the World Endometriosis Research Foundation even produced a document called ‘Facts about Endometriosis’, written specifically to “aid journalists, medical writers, bloggers etc …”. Their correct description is as follows:

Endometriosis is a condition in which tissue similar to the lining inside the uterus (called “the endometrium”), is found outside the uterus, where it induces a chronic inflammatory reaction that may result in scar tissue. It is primarily found on the pelvic peritoneum, on the ovaries, in the recto-vaginal septum, on the bladder and bowel

There, a nice and easy description for anyone writing about endometriosis to use verbatim, or to write a variation thereof. It would surely be difficult to get it wrong with such easy access to accurate information, right? Right? A lot of writers do get it right, but there’s a very important detail that needs to be included: “endometriosis is a condition in which tissue similar to the lining of the uterus…”, with specific emphasis on the key word ‘similar’. Endometriotic tissue is not the same as the normal endometrium; they are two different tissue types that look similar and share some properties, but are distinct in many, many ways. If an article cannot get this simple bit of information correct at the beginning, can you trust that they have the rest of their facts straight? Below are some examples of article writers falling at the first hurdle.

All of these descriptions almost get it right, yet miss that key feature, which may seem trivial, but is essential when accuracy is important. Some may argue that it’s only a minor detail and easy to get wrong, but equally it is also extremely easy to get right.

The truth is we don’t know for sure what causes endometriosis. All we know is that tissue similar to, but also distinct from, that which normally lines the inside of the uterus (the endometrium) is found elsewhere in the pelvic region; how it gets there remains very much up for debate. So when they say things like “cells normally found in the womb attach themselves to other parts of the pelvic area”, as in the second example above, they are making statements without the evidence to back them up.

Is it from a Trustworthy Source?

Take a look at the infographic on this page. Don’t worry too much about the ‘is the science content compelling’ axis running from top to bottom, what’s really important is the axis running left to right i.e. ‘is the media outlet’s science coverage driven mostly by evidence?’

As you can see from that infographic, the most evidence-based (and therefore trustworthy) sources are scientific journals and specialist publications; news and media sites occupy a middle ground ranging from good to poor; and personal websites and/or blogs sit in the ‘pure garbage’ category. This is a perfect example of what I call ‘The River of Misinterpretation’: the more steps there are between the original scientific research and the article you’re reading, the less trustworthy it becomes. Or to put it analogously, the further you stray from the source, the more diluted the initial product becomes. Below is an overview of the river of misinterpretation.

This places readers in a bit of a bind though; accurate information is sometimes hard to get hold of and not always written in a way that is easily understandable for everyone, while inaccurate information is widely available and easy to access. There aren’t many ways to combat the availability of misinformation in today’s society, where we are exposed to a constant stream of input from our phones, computers, TVs etc. The only way to get around this problem is to have our own internal filters, made from rational scepticism. Obviously not all blogs, websites and social media posts are going to be unreliable or packed with misinformation (I like to think the one you are reading now is pretty reliable), but extra caution is warranted when reading anything on the internet; as my old pal Abraham Lincoln says:

Does it Reference its Source Material?

If someone were to approach you on the street and claim they could turn lead into gold, you wouldn’t be unreasonable to ask for some proof rather than just take their word for it. Similarly, in science you are expected to back up any statements you make, either by presenting your own data to prove them, or by referencing rigorously checked data that has been published by someone else. Again, none of that sounds unreasonable to me. Yet in numerous media articles and blog posts I’ll often see someone claiming ‘a study shows’ or ‘researchers say’ without providing a link to that study, expecting the reader to trust that the writers have interpreted the study correctly, or even that the study exists at all.

With this blog I try to link to every bit of scientific research that I talk about. This is because I want my readers to be able to go and check for themselves that what I’m telling them is true and make judgements for themselves about the research I’m discussing. It also gives anyone the opportunity to point out if I’ve made a mistake or misunderstood something in the source material.

Does it use Sensationalism or Fear Tactics?

Perhaps the most insidious and vile tactics used by media outlets to get views and generate money are sensationalism and fear tactics. Sensationalism means using wording to make a story sound more exciting than it is, usually by blowing it out of proportion, making grand, unfounded statements, or misrepresenting the findings of a study to make it sound more attention-grabbing. In health stories these often take the form of titles like ‘THIS raises cancer risk’ or ‘THIS may cure cancer’. More often than not the enigmatic ‘THIS’ isn’t revealed in the title, or is only hinted at if it is a commonly used product or foodstuff. This particular trick is designed to make you click the article and get transferred to a different page, one that is full of adverts which the site gets paid for. You won’t have to look far to find an example of fearmongering. I went onto the ‘health’ section of a particular media outlet and found this as the second story on that page.

What are the authors and editors of pieces like this trying to achieve, if not to terrify people into reading their articles? They play on our innate concerns for our health, or the health of loved ones, making us believe that the knowledge contained in the article will help our survival, when it actually won’t. This is not the wording of those who are trying to help you; it is the language of the exploitative and manipulative, who wish to profit from such fearmongering. For example, on that same webpage, from the top of the article to the bottom, there were seven adverts, twelve clickbait links (e.g. YOU WON’T BELIEVE WHAT THIS CELEBRITY LOOKS LIKE NOW), and thirty-four links to other articles on that site; the actual article itself was only around 660 words long.

Some might argue that these types of articles are just raising awareness of real medical issues people face. I’ve no problem with that; it is the manner in which it is done that disgusts me. The title of the article above could just as easily have been written as ‘Charity advises people with asthma to take care when selecting what foods to eat over Christmas’. I actually went to the Asthma UK charity website named in the article to look at the advice they gave and see if it was as doom-laden as this journalistic piece; here is what they had to say about Christmas trees:

Real Christmas trees host mould too, so if that triggers your asthma choose a fake one. However, artificial Christmas trees carry their own triggers – sometimes mould grows when they’re kept in storage, and they can get dusty. Try wiping the plastic leaves and any decorations with a damp cloth

And on Christmas food:

“This time of year is full of different treats which could cause asthma symptoms. Mulled wine, pigs in blankets and other processed foods contain sulphites which lots of people with asthma are sensitive to. Food allergies are even more serious – so if you have a serious allergy, e.g. from peanuts, follow your allergy and asthma action plan

This calm and sensible advice from a real organisation, which is actually dedicated to helping people, is quite at odds with the ‘Christmas can be deadly!’ rhetoric of the news media. The lesson to take away from this is to always ignore fear tactics and check any information you are concerned about with a trusted source (like a charity or educational organisation).

Let’s take a look at an example of this as it relates to women with endometriosis; here are the titles of two particular articles.

Wow, pretty scary right? Also very over the top and attention-grabbing, but is any of it right? The second article gave a link to the original study (see here), so I was able to access it and check whether any of what they were claiming was true. First of all, can you spot what’s wrong with the bullet points in the first article’s headline? Yep, they got the description of endo wrong straight away; not a good sign. The headline is also factually incorrect: the study did not find that two or more servings a week increased the risk of endo by more than 50%, it found that two or more servings of red meat A DAY increased the risk of laparoscopically diagnosed endometriosis by 56% compared to women eating one or fewer servings of red meat a week. And while the original article does include evidence that consuming red meat from steroid-treated animals can alter estrogen levels, nowhere does it imply this ‘leads to endometriosis’, which is something the news site has added of its own volition.

At least the second article was correct about the reported measure of meat per day, but it claims red meat doubles the risk of endometriosis and then states it raises the risk by 56%. Is a fifty-six percent increase the same as doubling? No. If you have 100 women with endometriosis and increase that number by 56%, you would then have 156 women, not 200. They are also factually incorrect in stating, in the subheading, that “women who eat meat twice a day are 56% more likely to develop endometriosis”, because the original study found no link between endometriosis and poultry, fish or shellfish; the link was with red meat specifically. I’m not defending eating two or more portions of red meat a day, if nothing else it increases the risk of colorectal cancer, but I would like to see the results of studies linking its consumption to endometriosis reported correctly.
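The difference between ‘up 56%’ and ‘doubled’ is easy to check for yourself. Here is the arithmetic as a quick Python sketch (the baseline of 100 women is purely illustrative):

```python
# A 56% increase multiplies the baseline by 1.56, not by 2
baseline = 100                 # illustrative number of women
relative_increase = 0.56       # the 56% increase reported by the study

after_increase = round(baseline * (1 + relative_increase))
print(after_increase)          # 156, not 200

doubled = baseline * 2
print(doubled)                 # 200 is what a genuinely 'doubled' risk would mean
```

A 56% increase and a doubling (a 100% increase) are not even close; the headline writer conflated the two.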

Another example of fear tactics is the ‘horror story’. These are articles that focus solely on the most extreme or worst cases of a particular condition or disease and target women’s fears. I don’t have a problem with women telling their stories; with a condition like endometriosis, examples of women struggling to get diagnosed and treated in the face of ignorance and disbelief are sadly far too common, and need to be told to educate and inform. But if an outlet is taking the suffering of others, sensationalising it, and using it to instil fear in people for profit, I have a problem with that.

Information regarding a condition like endometriosis needs to be treated with care and sensitivity; there are many aspects of the disease, like infertility and pain, that can be distressing for some people to talk about. A young girl experiencing pelvic pain, or a woman trying to conceive, does not need a media outlet taking the experiences of others and turning them into horror-story pieces, sensationalised to scare people into clicking links or buying papers/magazines.

Like I said, endometriosis is a condition around which genuine suffering gravitates, but these experiences should be used to educate others, not terrify them.

I should say that I don’t think many of the writers of these articles are bad people; I don’t believe they are intentionally trying to do harm or spread misinformation. Rather, there is simply a lack of fact-checking and expert knowledge on endometriosis, in journalistic circles and the wider world alike. Unfortunately there aren’t enough experts to properly vet all the information that gets put before the public.

Are there Inconsistencies and/or Contradictions?

Science is a complicated subject; it can take someone an entire lifetime to become an expert in just one very narrow field, and even then mistakes get made, “to err is human” after all. So it’s no wonder that errors creep in when non-scientists try to interpret scientific data. This doesn’t always have a direct impact on people’s lives, if, say, a journalist writing an astronomy piece misinterprets new satellite data on the composition of Jupiter’s gas clouds. However, when it comes to reporting on issues that affect health and wellbeing, particularly for people who suffer chronic illnesses, accuracy and minimising errors are paramount.

Nevertheless, a dead giveaway that something is amiss in an article is the presence of inconsistencies or contradictions. Let’s look at an example of this.

Have they Falsely Extrapolated Results or Confused Correlation with Causation?

To extrapolate, in the scientific sense, means to take data we already have and use it to predict how a trend will behave in the future. For example, if we look at house prices in the UK over time, we see that they have, on average, increased since the 1950s, so we can say with some confidence that they will continue to do so. That is extrapolation: using current data to predict future data.
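As a sketch of what extrapolation means in practice, here is a hand-rolled least-squares fit in Python, projecting a made-up house-price series one year past the data (the prices are invented purely for illustration; real forecasting is far more involved):

```python
# Made-up average house prices (in £1,000s) over five consecutive years
years  = [2014, 2015, 2016, 2017, 2018]
prices = [180, 190, 198, 210, 222]

# Ordinary least-squares slope and intercept, computed by hand
n = len(years)
mean_x = sum(years) / n
mean_y = sum(prices) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, prices))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

# Extrapolate: follow the fitted trend one year beyond the observed data
predicted_2019 = slope * 2019 + intercept
print(round(predicted_2019))   # the trend line's guess for 2019
```

The fit only describes the data it was given; carrying the line forward is exactly the leap of faith the word ‘extrapolation’ names, which is why it goes wrong when the two things being linked aren’t actually connected.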

The trouble comes when extrapolation is used to link things that are not connected, and this happens a lot when people interpret scientific data. One of the top examples of this, in medical and biological science at least, is extrapolating results obtained from in vitro and in vivo studies to humans. I’ve given a longer description of in vitro and in vivo before, but for simplicity’s sake: in vitro experiments are those performed in a lab, usually on cells or tissues grown in said lab, while in vivo experiments are performed in non-human animals.

These types of experiments are essential for our basic understanding of disease mechanisms and treatment, but they cannot be extrapolated to how the disease would behave in the complex system that is the human body. For example, there was a lot of hype a few years ago about turmeric as a potent anti-inflammatory agent, I found posts like this one:

OK, so what exactly do these studies say? Well, first of all, these studies aren’t looking at turmeric itself, as so many articles claimed; the articles themselves go on to elaborate:

They are in fact all in vitro or in vivo studies, looking at specific effects of curcumin (a compound that makes up only around 3% of turmeric), such as its anti-inflammatory properties, in cells grown in a lab or in rodent models of endometriosis. Many of these studies found encouraging results in terms of ways curcumin could be beneficial for endometriosis symptoms, but those results are still not translatable to humans. Let me give you an example of why. In one study, rats were given a dose of curcumin (150 mg per kg of body weight) equivalent to roughly 10 g of curcumin for the average woman in the UK. And because curcumin is only a small fraction of turmeric, you would need to eat over 300 g of raw turmeric to match the dose of curcumin the rats received.
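The dose comparison is simple enough to check yourself. A rough Python sketch (the 67 kg body weight is my assumption for an ‘average’ UK woman, and scaling a per-kg rat dose directly to a human glosses over real interspecies dose conversion):

```python
rat_dose_mg_per_kg = 150     # curcumin dose given to the rats in the study
body_weight_kg = 67          # assumed average UK adult female body weight
curcumin_fraction = 0.03     # curcumin is only ~3% of turmeric by weight

# Naively scale the rat dose up to a human by body weight
human_dose_g = rat_dose_mg_per_kg * body_weight_kg / 1000
print(human_dose_g)          # roughly 10 g of pure curcumin

# How much raw turmeric would contain that much curcumin?
turmeric_needed_g = human_dose_g / curcumin_fraction
print(turmeric_needed_g)     # well over 300 g of raw turmeric
```

That is an implausible amount of turmeric to eat in a day, which is why lab doses cannot simply be read across as dietary advice.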

It’s a similar story for many in vitro and in vivo studies: they normally use doses of the chemical of interest way beyond what could be administered to humans (or gained through the diet), just to investigate the properties of that chemical. These types of experiments may, one day, inform the development of new treatments, but they cannot be used as treatment advice on their own.

The next error is assuming correlation equals causation. If two separate things vary in concert with one another, they are said to be correlated. For example, if you were to take a cup of water and measure the amount of ice in it while changing and measuring the temperature of the water, you would see two relationships: as the temperature went down, the amount of ice would increase; as the temperature went up, the amount of ice would decrease. The two factors are correlated because when one changes, so does the other. In this example we know that temperature caused the change in the amount of ice, so the relationship was both correlated and causal. But just because two other sets of data follow the same pattern doesn’t mean one caused the other. There are some great examples of this at Spurious Correlations; one of my favourites is the one below.

The age of Miss America and the number of murders by steam follow a very similar pattern, which means the correlation is very high. Does this mean one causes the other? No, of course not!
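You can see how easily a high correlation arises between unrelated quantities with a few lines of Python. The numbers below are invented for illustration (not the actual Miss America/steam-murders data); the only thing the two series share is a downward drift, yet the Pearson correlation coefficient comes out close to 1:

```python
from math import sqrt

# Two invented, causally unrelated series that both happen to drift downwards
age_of_miss_america = [22, 24, 23, 21, 20, 19, 19, 18]
murders_by_steam    = [8, 9, 8, 7, 6, 5, 5, 4]

# Pearson correlation coefficient, computed by hand
n = len(age_of_miss_america)
mx = sum(age_of_miss_america) / n
my = sum(murders_by_steam) / n
cov = sum((x - mx) * (y - my)
          for x, y in zip(age_of_miss_america, murders_by_steam))
sx = sqrt(sum((x - mx) ** 2 for x in age_of_miss_america))
sy = sqrt(sum((y - my) ** 2 for y in murders_by_steam))
r = cov / (sx * sy)
print(round(r, 2))  # close to 1.0: strongly correlated, yet no causation at all
```

A correlation coefficient measures only how closely two patterns track each other; it says nothing about whether one drives the other.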

We see examples of this kind of mistake when it comes to linking endometriosis to other lifestyle and environmental factors. As we have seen with the example of the red meat study above, results of these studies can be misinterpreted. In the original red meat study the authors suggest such high consumption of red meat may be contributing to endometriosis symptom severity, but not causing the disease itself.

This is something you may encounter a lot in articles on endometriosis and diet, with titles like ‘this food linked to endometriosis risk’ or ‘this food reduces endometriosis risk’. These types of articles may be reporting a genuine correlation between endometriosis risk and certain foods, but if they imply a causal link between the two, is that causal link justified?

Despite how long this post has been, it still doesn’t cover everything we need to avoid fake news in our society. Hopefully, though, it is a useful primer for anyone who reads a story on endometriosis in the future. For more information on spotting fake news, check out the link below.


Images sourced from pexels.com



Dr Matthew Rosser

I have over 15 years’ experience researching endometriosis, endometrial cancer and fibroids. During this time I have noticed that whilst research is regularly published on endometriosis, very little is reported accurately to the public in mainstream media. This blog aims to educate and inform anyone who wishes to learn more about the science and research into endometriosis.



© Copyright 2019 - By Endometriosis Update
