Fellow Scientific American blogger John Horgan is at it again. This time he is heralding the end of fundamental physics based on the increasing time lag between Nobel Prizes awarded for fundamental discoveries. There's actually a grain of truth in his analysis; for instance, the prizes awarded for quantum mechanics in rapid succession in the twenties and thirties tell us how fast that field was growing, a scenario that's unlikely to repeat itself.
The analysis is also a little deceptive.
To see why, let's imagine the Nobel Prize being established much earlier, in 1700 instead of 1900. Isaac Newton would surely have won the prize for working out the laws of motion and gravitation in his monumental Principia, published in 1687. But what then? There were certainly great scientists around Newton's time and through the 18th century, such as Hooke, Huygens, Boyle and Cavendish, and many of them might rightly have received the prize. Perhaps Coulomb would have received it for his law of electrostatics, formulated in 1785, and Benjamin Franklin might even have received it for demonstrating that lightning is a form of electricity. The prize might also have been awarded to Count Rumford (Benjamin Thompson) for his very important demonstration of the relationship between mechanical work and heat.