The End of Science as a Useful Tool

B
14 min read · May 20, 2024
Photo by Hal Gatewood on Unsplash

Despite the rapid roll-out of seemingly ever newer and shinier technologies, something seems to be amiss with science. The symptoms, however — a radical fall in new discoveries, the rise of flat-earth theories, burgeoning bureaucracies, the politicization of results — are just that: indicators, not the root cause. But why is that so? What is really going on?

Sabine Hossenfelder, physicist and an excellent science communicator, has come out with three truly remarkable videos recently. At first glance — to the casual observer at least — there seems to be nothing special about them, and nothing to connect them: just random rants from a former career scientist. (Even Sabine herself doesn’t make the connection between these topics.) But as usual, there is much more here than meets the eye.

My dream died, and now I’m here

Why flat earthers scare me

Scientific Progress is Slowing Down. But Why?

Before we delve into the matter, let me make a very important distinction right at the start. Technology is not science. It is the application of scientific discoveries. Science is the systematic study of the structure and behavior of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained. So when your colleague at work, your uncle or your drinking buddy waves their newest smartphone as a sign of “unstoppable progress”, what they are actually demonstrating is not new scientific discovery, but the industrial-scale refinement of technologies based on a set of already established principles. Let me explain.

As most of you already know, I have been working for large, multinational industrial companies for the last eighteen years of my life. I have first-hand experience with how products (especially electronic ones) are developed, tested, sourced, then manufactured. I took classes (both at university and as part of my training curriculum at work) on innovation management, procurement, manufacturing technologies and so on. Throughout these classes, though, my teachers highlighted one sure warning sign of a lack of progress: burgeoning version numbers.

Generally, it’s OK to have generation two or three of a product, offering incremental (or even radical) performance improvements over the original version, but by the time you roll out version four or five you should definitely be working on the next big thing. And not just theorizing about it, but actually designing, testing, and preparing to industrialize it. If you reach version ten, with still no idea what to do next… well, that means your company has turned into a mausoleum, and no longer functions as an innovative institution. And when the entire industry you’re in keeps hitting the repeat button with every product release, then you know that you are in trouble. A big one.

The first smartphone, from 1994: the IBM Simon sitting in its charging case. Source

Let’s take cell phones for example. In essence they’re all based on the scientific discoveries of radio waves, semiconducting materials, and the electrochemical processes needed to build batteries. Phones are applied science, or as we like to call it: technology. The existence of radio waves, for example, was proved by Heinrich Hertz in the late 1880s. The technology of building ever smaller radios has advanced steadily since then, until they got small enough to slide into a pocket. So the next time someone waves a smartphone as a sign of scientific progress, kindly remind them that they are holding the result of a roughly 140-year-old discovery in their hands.

This is the point where the topic of burgeoning version numbers comes into the picture. What is the latest iPhone model? Number fifteen? Oh, and that’s not even counting that the first smartphone was not designed by Apple. That title actually goes to the IBM Simon, dating back to 1994. That means, if you celebrate your thirtieth birthday this year, then you are as old as the smartphone. Now, do you know when the first lithium-ion battery was developed, and by whom? Well, not as recently as you would expect:

In the early 1970s, Exxon scientists predicted that global oil production would peak in the year 2000 and then fall into a steady decline. Company researchers were encouraged to look for oil substitutes, pursuing any manner of energy that didn’t involve petroleum. Whittingham, a young British chemist, joined the quest at Exxon Research and Engineering in New Jersey in the fall of 1972. By Christmas, he had developed a battery with a titanium-disulfide cathode and a liquid electrolyte that used lithium ions.

Yes, if you are 52 this year, you are as old as the Li-ion battery. Sure, many refinements have been made since then. These products got lighter, faster, cheaper and more widely available. But the very fact that we haven’t started to roll out anything new, only version 234 of the same technologies, should be at least concerning. And it doesn’t stop with smartphones: modern-looking jet planes have roamed the skies since the sixties, and they haven’t gotten much faster or more comfortable since then, only somewhat cheaper to operate. In fact, the Concorde managed to reach 2,179 km (1,354 miles) per hour as early as 1969, the same year the US sent people to the Moon, and just 22 years after the first manned supersonic flight. With the retirement of supersonic passenger flights, one could say, we are actually progressing backwards.

If you had been knocked unconscious during the first battles of WWI in 1914, and had awoken 55 years later in 1969, you could have rightly said that the world had completely changed. Steam locomotives were replaced by diesel, then fully electric trains, while flimsy wood-and-fabric planes powered by loud, weak combustion engines were superseded by all-metal subsonic and supersonic airplanes of all kinds. Nuclear reactors, supercomputers, new surgical techniques, antibiotics etc. — things no one in 1914 had any idea could even exist — had by then all become practical realities.

Now, had something similar happened to your head in the year of the Moon landing, and had you awoken today… Well, you would see roughly the same-looking airplanes, and even the same muscle cars from the very same brands. You would be eating at the very same fast food restaurants, driving on the very same highways and crossing the very same bridges as you did in 1969, before your accident. ‘OK, we’ve got smartphones, e-mails, GMO corn, and now AI, but aren’t these all just incremental steps over already existing technologies? Apart from gene sequencing, have we made any new scientific discoveries, like splitting the atom…?’ — you might ask. And rightly so.

Quantum theory was proposed by Planck in 1900, relativity by Einstein in 1905. Superconductivity was discovered by Onnes in 1911. The first electron microscope was built in 1931, followed shortly by Anderson’s discovery of the positron, the first antimatter particle, in 1932. Nuclear fission was explained by Meitner and Frisch in 1939. 1947 saw the invention of the transistor by Shockley, Bardeen, and Brattain, and the first laser was built by Maiman in 1960. And the list goes on. Now compare that to what we have discovered in the past two decades… Call me amazed (not).

Based on the pace of scientific progress up to the 1960s, we ought to be driving flying cars powered by anti-gravitational engines by now. On Mars. But we don’t. So next time someone brings up the topic of fusion as something novel, something which could be developed anytime soon, kindly remind them that the first patent for a fusion reactor was filed in 1946, and that by 1951 fusion research had already begun in earnest… That is seventy-three years ago. And how far are we from developing warp drives, anti-gravity engines or spaceships capable of carrying thousands of people?

Photo by Collab Media on Unsplash

When people point to the unstoppable progress of science and technology, they tend to think of it as a straight line ‘from the caves to the stars’. Throughout human history, however, we see a series of arcs, each with its own ascendant and, well, descendant phase. The Chinese, the Greeks, the Romans, the Maya and many other cultures before and after them all developed their own science and technologies, and rose to the top of their arcs of progress… only to experience a fall and a subsequent dark age thereafter. And most ironically, they developed their most sophisticated technologies right before their demise.

This time is no different. Civilizations all follow the same lifecycle of rise, prosper and fall, as described by Tainter in his 1988 book ‘The Collapse of Complex Societies’. He and his colleagues have also shown that the same process, driven by diminishing returns, is at work in our civilization as well. Especially when it comes to science. As Deborah Strumsky, José Lobo and Joseph A. Tainter wrote in their 2010 study, ‘Complexity and the productivity of innovation’:

Our investments in science have been producing diminishing returns for some time (Machlup, 1962, p. 172, 173). To sustain the scientific enterprise we have employed increasing shares of wealth and trained personnel (de Solla Price, 1963; Rescher, 1978, 1980). There has been discussion for several years of doubling the budget of the U.S. National Science foundation. Allocating increasing shares of resources to science means that we can allocate comparatively smaller shares to other sectors, such as infrastructure, health care, or consumption. This is a trend that clearly cannot continue forever, and perhaps not even for many more decades. Derek de Solla Price suggested that growth in science could continue for less than another century. As of this writing, that prediction was made 47 years ago (de Solla Price, 1963). Within a few decades, our results suggest, we will have to find new ways to generate material prosperity and solve societal problems.

Is AI going to be that “new way” then? Well, AI builds on existing human knowledge and recombines it to burp up something “original”. So, if you were looking for a tool to write “scientific” papers at a thousand pages per minute, you’ve got a winner. On the other hand, if you wanted to work out the principles needed to build warp drives, then you’re out of luck: BS-to-BS conversion will not help you out. AI will certainly generate material prosperity for a select few, but not for society as a whole. It will always be much easier to dream up digital tools and sell them to investors than to solve real-world problems.

And now back to Sabine’s video on the lack of scientific progress. Although Tainter and his studies are not mentioned, she brings up quite a lot of other research and still reaches the same conclusion: scientific progress is in decline. She then asks the question ‘Why is that?’, and puts forward three major hypotheses:

  1. “No Problem: deny that a problem exists and insist everything is going just fine.”
  2. “Nothing Left: there’s just nothing left to discover (this is the death rattle of science, and the disease is fatal)”
  3. “Paper Treadmill: the current way of organizing scientific research impedes progress by rewarding productivity over usefulness.”

Those who have been paying attention so far can already see a fourth answer, missing entirely from public discourse. What if our mental capabilities are also prone to hitting diminishing returns? What if, no matter how much thinking we put into solving the next big mystery, we are simply at our limits already… In fact, I would argue, and many teachers would certainly agree, that we are already well past our peak in mental capability as a culture. As astrophysicist, teacher and fellow blogger Tom Murphy wrote in his essay ‘Reasoning with Robots’:

I don’t blame the students. I blame the fact that they have been trained to be robots, not thinkers. The market system has worked to make textbooks, lesson plans, and our educational system in general ever more pleasant for the customers, who are always right — by dint of the all-powerful money they hold. It turns out that boxes containing math recipes that anyone can follow go down well, receiving up-votes and becoming ubiquitous by demand. It feeds on itself: without having learned better “street” skills, the students are unprepared to be thrown into a more rigorous pedagogical experience, so the deficiencies perpetuate, resulting in watered-down classes at every level (and increased anxiety among students, who sense their tenuous grasp). My physics elders tend to be noticeably better educated in physics than I am, and I am better educated than present-day physics graduate students. This is not just a matter of accumulated experience: I was in better shape at the same stage, and my elders were in better shape than I was.

Einstein, Planck, Rutherford and a great many other thinkers worked on extremely complex math and physics problems entirely in their heads, equipped with a pen and paper at best. How many scientists could do that today? Remember, there were no computers aiding their work, nor any textbooks on how to solve the mysteries of the Universe: they were the ones who had to invent the formulas now appearing in print… Computers, ever more efficient textbooks and predigested information have just made us dumber. We have lost many thinking skills during the past decades, and have thus become unable to teach them to the next generation. And this is the point where Sabine’s other video, about flat earthers, ties in:

“To me flat earthers are the first symptom of a much bigger problem. They show us what happens when people realize they can’t understand modern science, don’t trust scientists and therefore throw out even the most basic scientific knowledge on the rationale of skepticism.” […] “Unfortunately scientific knowledge isn’t heritable. We’re born without knowing anything of modern science. We have to learn everything and the more sophisticated science becomes, the more learning that requires. Sometime in the middle of the 20th century or so we reached a point where modern science just stopped being comprehensible to anyone without a special degree.”

The quality of education, together with our capability to think through and solve complex matters, is heading in the wrong direction, while science itself becomes ever more sophisticated by the day. Is it any wonder, then, that more and more scientists are needed to produce fewer and fewer breakthroughs? I’m not saying there was absolutely no progress made in recent decades, but the rate of discovery is clearly pointing towards a decline. Based on the data presented here and in Sabine’s videos, a good case can be made that we are indeed at our limits when it comes to science and progress. And our prospects are not a tad rosier either.

Is it any wonder, then, that scientific research has been slowly converting into a ‘paper mill’? Just like any other business past its innovative phase (Apple?), science has become a ‘cash cow’: feasting on grants and churning out publications no one reads or takes seriously. (Watch Sabine’s own account: it’s both heartbreaking and revelatory in so many ways.) Now add AI to the mixture — cobbling together papers at an even faster rate — and the only thing reaching warp speed will be the enshittification of the whole enterprise, turning science into an exercise in futility.

What we see here is perfectly normal, and has been repeated in various ways throughout history. Pushed beyond a certain point, human rationality reaches its limits, putting an end to a civilization’s age of reason and bringing back superstitious beliefs. The symptoms, which many mistakenly believe to be the root cause, are everywhere. A dearth of breakthroughs in the most critical areas (notably energy). Declining research productivity. Diminishing returns on scientific investment. Degree mills producing ever lower-quality scientists, writing ever more papers for their institutions’ profit. Science becoming a cash cow and losing its credibility. People becoming unable to understand (let alone apply) even its basic principles… It should come as no surprise, then, that magical thinking, tribalism and cargo cults have taken over even the highest echelons of power.

Meanwhile, true scientific results have become unpalatable, with all their blathering about limits, climate chaos and all the rest — pointing towards an inevitable end to this version of a global civilization. Fossil fuels have provided the surplus energy needed to run so many things, including science, while still feeding so many of us. With these indispensable energy sources turning net energy negative, and still no viable, scalable and cheap replacement in sight, the future of science becomes questionable.

Energy is the economy. No (surplus) energy, no economy, and no frivolous activities either. Science has shown us how to use the massive bout of surplus energy from fossil fuels most effectively, and how to pillage and plunder the planet more efficiently. Lacking the free energy needed to power the technology it gave us, science too will become useless. Its discoveries will be forgotten in the centuries ahead, as there will simply be no means to utilize them — people will be able to grow potatoes just fine without understanding black holes and gravitational waves. Yes, it would be useful to know how to make fertilizer or pesticides, but lacking natural gas and oil (the prime sources of the chemicals needed to make these agricultural inputs), this knowledge too will be forgotten.

But then, as Tom Murphy asked: what’s the point? I don’t think there was ever a point in pursuing science. We did it because we could. We had the curiosity, the surplus energy, and the mindset that set us above Nature. We did not, however, evolve to decode every secret of the Universe. Much to our frustration, the world remained a largely unreasonable place, with only so many parts of it yielding to our primate logic and simple measurements. The better part of it, however, continued to appear wholly irrational to us — and remained reliably beyond our capacity to grasp.

Contrary to modern beliefs, this world, Earth, the solar system, the Universe, has no need whatsoever for a reason to exist, or for anyone to decode how it operates. It worked perfectly fine without us ‘conscious’ human beings, and will work just fine when we are gone. We are no masters of this impossibly complex system, and never were anything but integral parts of it. Parts which play an important role, but are by no means indispensable. A hard pill to swallow indeed — no wonder so many retreat into denial instead, and wait for the next big scientific breakthrough… something which might never come.

Science has enabled our species to overshoot the natural carrying capacity of the planet, and made us believe that we are above all other living beings. That we are the masters of this Universe. Giving up that dream will be unbearably hard for many (especially for those in power), but that doesn’t necessarily mean that life will lose its meaning. There will be — in fact, there already are — so many other things to live for besides plundering the planet and getting rich. Friends, family, community. Or just living together with the animals of the forest. Dancing, singing, playing a flute, telling stories around a campfire, cooking, gardening, arts and crafts have always been, and will remain, perfectly possible without science and modernity. The biggest psychological, or I dare say eschatological, challenge will be to find this new meaning in the decades ahead, even as science and technology slowly break down around us.

Until next time,

B

Thank you for reading The Honest Sorcerer. If you would like to support my work, please subscribe for free and consider leaving a tip. Every donation helps, no matter how small. Thank you in advance!


A critic of modern times - offering ideas for honest contemplation. Also on Substack: https://thehonestsorcerer.substack.com/