By Liam Mannix – Daily Life – Brisbane Times
Much of what we do is based on what science says is good for us. Take, for example, the ways we ease our sore muscles after a workout.
Published and widely publicised studies have shown that aches and pains can be eased with a cold-water bath, a dose of turmeric or a session on a foam roller.
People turn to these treatments because they are supported by a scientific method which, we are led to believe, relies on rigorous statistical analysis of masses of data to prove a point.
But in some branches of science, that does not appear to have been the case lately.
Some leading statisticians are now asking serious questions about the accuracy of dozens of studies carried out by Australian sports scientists using a controversial method that, the statisticians say, is unreliable, deeply flawed and “moves the goalposts”.
All experiments on humans involve an element of randomness.
Say you are testing whether a foam roller helps soothe someone’s aching muscles.
At the end of the experiment, five of your subjects say their muscles really do hurt less. This could be because of the foam roller – but it could also be random chance.
To avoid this problem, most scientists try to study as many people as possible, to work out the variability and average out any effects due to chance.
But sports science tends to study very small groups of people.
That makes it very difficult to tell if the results are real – or just random chance.
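Just how easily chance can mimic a real effect in a small group can be sketched with a simple simulation. This is an illustration only, with made-up soreness scores (mean 50, standard deviation 10) and a hypothetical eight-person trial in which the treatment does nothing at all:

```python
import random
import statistics

random.seed(1)

def null_trial(n=8):
    """Simulate a trial where the treatment has no effect:
    both groups are drawn from the same soreness distribution."""
    control = [random.gauss(50, 10) for _ in range(n)]
    treated = [random.gauss(50, 10) for _ in range(n)]
    # A negative difference looks like the treatment reduced soreness.
    return statistics.mean(treated) - statistics.mean(control)

# Run 1000 such do-nothing trials and count how often chance alone
# produces an apparent improvement of 5 points or more.
diffs = [null_trial() for _ in range(1000)]
big_by_chance = sum(d <= -5 for d in diffs) / len(diffs)
print(f"Trials showing a '5-point improvement' by pure chance: {big_by_chance:.0%}")
```

Even though the treatment in the simulation does nothing, a sizeable fraction of eight-person trials will still show what looks like a meaningful improvement, purely through luck of the draw.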
To address this, a pair of researchers developed a statistical method called magnitude-based inference (MBI) in 2006.
In traditional statistics, findings are considered statistically significant only when there is a 5 per cent (or smaller) chance of them being a “false positive”.
The MBI method allows results to have up to a 50 per cent chance of being a false positive and lets scientists make big claims based on studies of just a few people – usually about 10.
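MBI’s machinery is more involved than a simple cutoff, but the practical effect of tolerating a higher false-positive rate can be sketched directly. Under a true null hypothesis (no real effect), p-values are uniformly distributed, so the threshold a scientist uses to declare a finding directly sets how often pure chance gets flagged as a result. A minimal illustration, with simulated null p-values:

```python
import random

random.seed(2)

# Under a true null hypothesis, p-values are uniformly distributed
# between 0 and 1, so the declaration threshold *is* the
# false-positive rate.
null_p_values = [random.random() for _ in range(10_000)]

strict = sum(p < 0.05 for p in null_p_values) / len(null_p_values)
loose = sum(p < 0.50 for p in null_p_values) / len(null_p_values)

print(f"Null results flagged at the traditional 5% threshold: {strict:.1%}")
print(f"Null results flagged at a 50% threshold:              {loose:.1%}")
```

Roughly one in twenty null results clears the traditional bar; about half clear the looser one. That is the core of the statisticians’ objection: relaxing the tolerated false-positive rate makes “findings” far easier to produce.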
The method quickly became popular with sports scientists across the world and there have now been more than 230 papers published which make use of it.
Scientists from at least six Australian universities, plus the Australian Institute of Sport, have led or co-authored dozens of sports-science papers that used MBI over the past decade.
The claim that foam rollers could help with sore muscles, made by a team in 2015 that included a Charles Sturt University researcher, was based on a study of just eight people.
The AIS tested curcumin – a component of turmeric – on just 17 men playing social football and basketball, but was still able to report, in 2015, the first empirical evidence that it too helped with muscle soreness.
And in 2019 a University of Technology Sydney researcher co-authored a paper that tested cold-water baths on just 11 Brazilian rugby players. The baths, too, helped with sore muscles, the team found.
In such small studies, “any classical statistical test would pretty much never see significant results,” says Dr Stephen Woodcock, a sports statistician at the University of Technology Sydney.
“This is where MBI comes into its own – it suggests significant results much more readily,” he says.
“The sad thing is that such ‘findings’ are seldom replicated and soon pass into the minds of coaching consultants as ‘scientifically proven’ when the evidence is super weak.
“I was amazed when I started working in sports science that there was this whole subfield of stats that no mathematician or statistician had ever heard of,” says Dr Woodcock, who calls MBI a “statistics cult”.
“It’s basically moving the goalposts,” he says.
In 2012, Dr Emma Knight – now a senior biostatistician at the University of Adelaide – was hired as a statistician by the AIS.
This is where she first came across MBI. She believes as many as half the institute’s researchers were relying on it in some studies. The institute disputes this, saying it was only four researchers.
Concerned the method might be flawed, Dr Knight convinced the AIS to hire the Australian National University’s Professor Alan Welsh – one of Australia’s most-respected statisticians – to work with her on a review of MBI.
That review, Professor Welsh says, was damning: MBI did not work when compared with accepted principles of statistics.
“They are claiming to have found effects that very likely are not real,” says Professor Welsh.
“It’s increasing the chances of finding an effect that is not there,” says Dr Knight.
In the wake of the review, the AIS phased out MBI in 2014.
Papers using the technique, which are almost exclusively confined to sports science, have been banned by top journal Medicine and Science in Sports and Exercise.
“If I was ever to peer review a paper using magnitude-based inference then I would reject it and tell the authors to completely redo their analysis,” says Professor Adrian Barnett, president of the Statistical Society of Australia.
“It’s strange that a method that has been clearly shown to be flawed has remained in use in sports science research.”
The University of Western Australia and James Cook University both told The Age and The Sydney Morning Herald that they were phasing out the technique, while the University of Technology Sydney declined to comment.
But Charles Sturt University and Victoria University both defended MBI as scientifically valid.
Professor Francesco E. Marino, Charles Sturt’s interim head of the School of Dentistry & Health Sciences, said MBI was used by researchers all around the world.
“Within the sports science discipline, researchers typically try and find improvements that might allow performance to improve by the smallest margin in elite athletes. The difference between elite athletes is minuscule, so sometimes MBI is used to describe a potential worthwhile change.”
Professor Andrew Stewart, Victoria University’s dean of sport and exercise science, said the developers of MBI were working with Associate Professor Kristin Sainani, a statistician based at Stanford University in California, to “co-author a joint paper on the accepted and appropriate use of MBI”.
“As a result, existing publications using MBI are scientifically sound.”
Dr Sainani, by her own account a critic of MBI, called the university’s statement “a distortion of reality”.
“I do not agree that existing publications using MBI are scientifically sound. Many of these papers contain over-inflated conclusions that are not supported by the underlying data.”
Liam Mannix is The Age and Sydney Morning Herald’s science reporter.