The Student Room Group

Is 90% of scientific research a bit crap?

Theodore Sturgeon of Sturgeon's law fame said that 90% of everything is crap.

This Guardian article highlights some of the problems with scientific research, such as the finding that between a quarter and a third of all published research remains uncited.

Does this need to change or are we carrying on fine? What's your opinion?
Reply 1
No, it's closer to 99%.

1. Due to natural progress over time there are so many niche fields and scientists don't see the bigger picture. 100 years ago a physicist knew the entire scope of physics, well they invented it. Now a physicist knows some specific subset of another subset of another. So it all loses purpose and direction. The only purpose now is to publish publish publish to bring in the money.

2. 100 years ago the equipment was hands-on, you would build it yourself in fact and gather data manually. That gives you a massive insight. Now the equipment is automated and processes the data for you. It leads to people publishing data, rather than thinking of deeper experiments designed to dig further and explain that data. So you see a result of, say, 1 eV... it's published the next day. No rhyme or reason. Just a pretty graph. Before, you would think about it some more, and build a new experiment to get to some fundamental truth.

3. 100 years ago, is when physics was invented, as well as engineering and biology and medicine. So the students back then were getting taught by the guys who discovered the **** for themselves. Students now are getting taught by the guy who heard it from the guy who heard it. So it loses the understanding and genius part, and becomes just a transfer of second hand information. Over time the intelligence and ability of researchers diminishes.

I could go on. They need to publish less quantity and more quality.
(edited 9 years ago)
Definitely a lot of ****ty studies out there. It's rather pointless having access to all that research on PubMed and such, when much of it turns out to be unusable in practice (eg. in the medical field) because it turns out to be ****ty quality upon scrutiny.
Original post by Armadillo
Theodore Sturgeon of Sturgeon's law fame said that 90% of everything is crap.


I don't know.

What are your criteria to distinguish between:

That's mind-blowingly awesome
That's quite good
That's not bad
That's a bit crap
That is so unbelievably poor, I need to go and lie down in a darkened room


Have you published your data and conclusions?

Has anybody else repeated your study?

Were their conclusions consistent with your own? If not, was their research published in a fancier journal? Because if so, then they were right and you aren't.
Reply 4
Original post by deech
No, it's closer to 99%.

1. Due to natural progress over time there are so many niche fields and scientists don't see the bigger picture. 100 years ago a physicist knew the entire scope of physics, well they invented it. Now a physicist knows some specific subset of another subset of another. So it all loses purpose and direction. The only purpose now is to publish publish publish to bring in the money.

2. 100 years ago the equipment was hands-on, you would build it yourself in fact and gather data manually. That gives you a massive insight. Now the equipment is automated and processes the data for you. It leads to people publishing data, rather than thinking of deeper experiments designed to dig further and explain that data. So you see a result of, say, 1 eV... it's published the next day. No rhyme or reason. Just a pretty graph. Before, you would think about it some more, and build a new experiment to get to some fundamental truth.

3. 100 years ago, is when physics was invented, as well as engineering and biology and medicine. So the students back then were getting taught by the guys who discovered the **** for themselves. Students now are getting taught by the guy who heard it from the guy who heard it. So it loses the understanding and genius part, and becomes just a transfer of second hand information. Over time the intelligence and ability of researchers diminishes.

I could go on. They need to publish less quantity and more quality.


Some interesting points here. I agree that as knowledge gets further and further away from its source it gets muddled, confused and diluted! That's quite an interesting theory in itself! :smile:
I've read quite a few articles about this and it is quite disheartening, but not at all surprising. Nowadays in academia your worth as an academic is measured by your research output - "research or die". In a perverted way, it's in a lot of people's interest to keep on spewing out research of questionable quality and use, to keep on citing (or not citing) research of questionable quality and use, to not maintain rigorous research auditing (both during research and after, during peer review for publication) and so on. In this way, researchers keep their jobs and departments keep their funding.

And great point above about the narrow focus of research. To be fair, that is an unavoidable consequence of the development of the sciences. But we should do more to encourage research projects that cut across narrow research silos. Again, this is against departments' interests, as collaboration projects are seen to dilute the credit and attention accorded to the individual contributors (and thus departments/universities).

On the bright side some people are taking this issue seriously and are actually acting upon it: http://med.stanford.edu/metrics/

The Meta-Research Innovation Center at Stanford (METRICS) is a research-to-action center whose purpose is to advance excellence in scientific research. Our center aims to undertake rigorous evaluation of research practices and find ways to optimize the reproducibility and efficiency of scientific investigations. We aim to apply research methods to study how research is done, how it can be done better, and how to effectively promote and incentivize the use of best scientific practices.
(edited 9 years ago)
I will say that although a lot of research may never become famous or cited by thousands, if it is of good quality and reliability, and if it furthers human knowledge in some way, then it should be considered a worthy contribution to the scientific canon.
Reply 7
Research is like most other areas of human endeavour: some of it is good, some bad, and most is in between.
Research has become a nationalised industry. Since research isn't really profitable, that means there's more of it, but it also means that the incentives for producing research are detached from the real goal. You need to publish to show that you can do some work; a lot of these papers are not really research at all, they're better viewed as published exam papers, or as resumes. This is the case for junior researchers trying to get permanent jobs, but it's also the case for senior researchers trying to get government funding. Papers are published to give individuals chips in a bureaucratic game.
Original post by deech
100 years ago, is when physics was invented, as well as engineering and biology and medicine.


1914 was quite a year wasn't it?
