My apologies for not posting a political thread, but I found this interesting.
Theory in particle physics: Theological speculation versus practical knowledge
Burton Richter, Physics Today, October 2006, page 8

To me, some of what passes for the most advanced theory in particle physics these days is not really science. When I found myself on a panel recently with three distinguished theorists, I could not resist the opportunity to discuss what I see as major problems in the philosophy behind theory, which seems to have gone off into a kind of metaphysical wonderland. Simply put, much of what currently passes as the most advanced theory looks to be more theological speculation, the development of models with no testable consequences, than it is the development of practical knowledge, the development of models with testable and falsifiable consequences (Karl Popper's definition of science).

You don't need to be a practicing theorist to discuss what physics means, what it has been doing, and what it should be doing. When I began graduate school, I tried both theory and experiment and found experiment to be more fun. I also concluded that first-rate experimenters must understand theory, for if they do not they can only be technicians for the theorists. Although that will probably get their proposals past funding agencies and program committees, they won't be much help in advancing the understanding of how the universe works, which is the goal of all of us.

I like to think that progress in physics comes from changing "why" questions into "how" questions. Why is the sky blue? For thousands of years, the answer was that it was an innate property of "sky" or that the gods made it so. Now we know that the sky is blue because of the mechanism that preferentially scatters short-wavelength light.

In the 1950s we struggled with an ever-increasing number of meson and baryon resonances—all apparently elementary particles by the standards of the day. Then Murray Gell-Mann and George Zweig produced the quark model, which swept away the plethora of particles and replaced them with a simple underlying structure. That structure encompassed all that we had found, and it predicted things not yet seen. They were seen, and the quark model became practical knowledge. Why there were so many states was replaced with how they came to be.

A timelier example might be inflation. It is only slightly older than string theory and, when created, was theological speculation, as is often the case with new ideas until someone devises a test. Inflation was attractive because if it were true it would, among other things, solve the problem of the smallness of the temperature fluctuations of the cosmic microwave background radiation. Inflation was not testable at first, but later a test was devised that predicted the size and position of the high angular harmonic peaks in the cosmic microwave background radiation. When those were found, inflation moved from being theological speculation to a kind of intermediate state in which all that is missing to make it practical knowledge is a mathematically sound microscopic realization.

The general trend of the path to understanding has been reductionist. We explain our world in terms of a generally decreasing number of assumptions, equations, and constants, although sometimes things have gotten more complicated before they became simpler. Aristotle would have recognized only what he called the property of heaviness and we call gravity. As more was learned, new forces had to be added—first magnetic, then electric. Then we realized that the magnetic and electric forces were really one electromagnetic force. The discovery of radioactivity and the nucleus required the addition of the weak and strong interactions. Grand unified theories have pulled the number back down again. Still, the general direction is always toward the reductionist—understanding complexity in terms of an underlying simplicity.

The last big advance in model building came a bit more than 30 years ago with the birth of the standard model. From the very beginning it, like all its predecessors, was an approximation that was expected to be superseded by a better one that would encompass new phenomena beyond the standard model's energy range of validity. Experiment has found things that are not accounted for in it—neutrino masses and mixing and dark matter, for example. However, the back-and-forth between experiment and theory that led to the standard model ended around 1980. Although many new directions were hypothesized, none turned out to have predicted consequences in the region accessible to experiments. That brings us to where we are today, looking for something new and playing with what appear to me to be empty concepts like naturalness, the anthropic principle, and the landscape.

More at: http://www.physicstoday.org/vol-59/iss-10/p8.html
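The blue-sky mechanism Richter alludes to ("preferentially scatters short-wavelength light") can be made quantitative with a one-line calculation: Rayleigh scattering intensity scales as 1/λ⁴, so shorter wavelengths scatter far more strongly. A minimal sketch, with typical wavelength values chosen for illustration (they are not from the article):

```python
# Rayleigh scattering intensity scales as 1/lambda^4, so blue light is
# scattered much more strongly than red -- the "how" behind the blue sky.
# The wavelengths below are representative choices, not from the article.

blue_nm = 450.0  # typical blue wavelength in nanometers
red_nm = 700.0   # typical red wavelength in nanometers

# Ratio of blue scattering to red scattering: (lambda_red / lambda_blue)^4
ratio = (red_nm / blue_nm) ** 4
print(f"Blue light scatters roughly {ratio:.1f}x more strongly than red")
```

The factor of roughly six is what turns a "why" (an innate property of sky) into a "how" (a specific wavelength dependence of the scattering cross section).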