Yeah, this. And this IMHO is one of the reasons science is in crisis today: in many ways it is still being conducted according to late-19th- and early-20th-century models. Scientists are not generally rewarded for doing good science (e.g. replicating previous results or engaging in pedagogy), but rather for making Big Discoveries, because that's what all the great scientists did before. Trouble is, as time goes by there are fewer and fewer Big Discoveries to be made, and making them gets more and more expensive, while at the same time more and more people are graduating with STEM Ph.D.s. So you have more and more people chasing fewer and fewer opportunities for career rewards. Another perverse consequence of this dynamic is the pressure to publish any old crap: Big Discoveries are presented in papers, and so it is assumed that papers will lead to Big Discoveries, despite the obvious logical fallacy. Often it becomes a full-fledged cargo cult [1]. (I once made a very successful career by publishing a lot of crap, so I'm in a position to know.)
Getting this right is a Really Hard Problem because there are entrenched interests for whom this poses an existential threat. There is a lot of power that comes from being an arbiter of truth in a society.
(I think one of the great unsung heroes of humanity is Jimmy Wales. He could have cashed in bigly on Wikipedia, but he chose not to, and as a result Wikipedia is one of the greatest repositories of objective knowledge ever assembled. And it's free. It's a freakin' miracle. Arxiv is also a big step in the right direction IMHO.)
To give my thoughts about the term 'Big Discoveries': I think it mainly comes from the Enlightenment's focus on discovering absolute, simple laws of nature, and analytically using those laws to obtain a greater understanding of the world. But we are now discovering that most of the phenomena we empirically observe (whether in chemistry, biology, or society) can't be deduced straightforwardly from those laws, and conversely that there is a limit to how far we can deduce absolute laws from empirical data, due to the complex nature of large dynamic systems (interplay between molecules, between cells, between humans). The grand problem of Complexity still haunts us to this day.
I think that in the near future there will be a mode shift in how we think about science and technology in general. Simulation will start to replace analytic reasoning (since logical deduction inside our heads has reached its limits in analyzing complex systems), and science will become more and more indistinguishable from engineering (the question 'what is possible?' will begin to replace 'why is this possible?'). I'm both terrified and excited about this new era.
It's not from a single book; I've come to this conclusion personally after reading about various topics in both science and philosophy, and after thinking about some current trends in science and engineering. I can't really give you a simple answer; I'm still studying and trying to figure this out.
But to give you a bunch of unorganized links if you want to follow a similar line of thought:
- Seeing how theoretical physics has been left behind in progress for many decades in favor of fields like chemistry, biology, and the earth sciences, all of which apply the already-discovered laws of physics and are increasingly trying to arrive at conclusions through computer simulation of those laws.
- Thinking about the relationship between computer graphics (which brings the virtual into the real) and computer vision (which brings the real into the virtual), and the interplay between the two.
- Continental philosophy. (I'm currently reading Bataille's The Accursed Share and it's giving lots of good insights about 'societal' systems and the general economy. Deleuze & Guattari also talk a lot about cybernetics, although their books are notoriously hard to decipher and I've only read second-hand explanations of them. A few writings from Nick Land (preferably something before his breakdown) seem quite illuminating. Marx also seems to have some surprising proto-insights about the cybernetics of capitalism; it's an area of research I might delve into later.)
I think science is in a crisis today for the same reason society is: the underlying ethics of a scientific realism has yet to be established. That is to say, science, or the advancement of organized knowledge, and the corresponding cultural realities are still oriented around anthropomorphic biases that were formed during the advent of global culture and embodied by the mystery religions. This made sense when the planet was a conquest, but not so much as a management strategy. Instead the ethics of science and culture need to be organized around ecological ethics: the reality of planetary stewardship is the challenge of the Anthropocene. This is a problem for scientific culture, because it's easy to build complexity on existing paradigms, but much harder to reorganize one's fundamental belief system. An example is being able to describe the surface of a black hole while people starve in the street. The problem of modern scientific progress is a problem of ethical reorganization, and once that is done (if it succeeds) we will again be able to build systems of thought that appear as new and foundational as the classical ones.
This is all exactly correct. What too often is left out of these discussions is the labor pool and its incentives. Too many cooks in the kitchen, because there are incentives to keep adding more cooks, as though increasing the researcher labor supply were a goal in itself, rather than what it truly is: an obstacle to progress. Eternal September comes to academia too, a side effect of scale.
Curiously, I think Jimmy Wales's contribution was not that store of knowledge, but the community which keeps it alive. He has curated a sphere of unusual individuals working for free.
Otherwise somebody would likely have just cloned or mirrored Wikipedia by now.
I think that argument, which I hear a lot, is a rather poor excuse. It may seem like we're far along in our development now, but people in the future will probably look back at our time and make the same argument. One big problem is that 'hindsight is always 20/20': great solutions may seem very simple and almost obvious in hindsight, while we ignore the fact that making the leap to get there was very hard at the time.
This trend presupposes an ever-growing accumulation of knowledge and a view of mathematics as a collection of discoveries of "facts". I think most people nowadays hold such a view and it is intimately connected with the idea of progress. Wittgenstein himself was quite suspicious of such an idea, however:
"This book is written for such men as are in sympathy with its spirit. This spirit is different from the one which informs the vast stream of European and American civilization in which all of us stand. That spirit expresses itself in an onwards movement, in building ever larger and more complicated structures; the other in striving after clarity and perspicuity in no matter what structure. The first tries to grasp the world by way of its periphery -- in its variety; the second at its centre -- in its essence. And so the first adds one construction to another, moving on and up, as it were, from one stage to the next, while the other remains where it is and what it tries to grasp is always the same."
- Wittgenstein, "Philosophical Remarks", Preface
Will the current trend continue, so that future discoveries are more and more intricate but also more and more specialised and thus perhaps less interesting? Or will there be a paradigm shift, so that we lose interest in many of our "fundamental" results and new low-hanging fruit will be waiting to be "discovered"? Wittgenstein was more sympathetic to the latter view and often countered the view that mathematical results are discovered with the idea that mathematical results are invented (which does not make him a naive constructivist, however).