Is modern physics worth the cost?
August 3, 2008 1:11 AM

What useful technologies have been developed or are in development due to advances of knowledge in modern physics?

Reading all these recent news stories about the Large Hadron Collider has got me wondering if there is a purpose, besides knowledge for knowledge's sake, in building such a gigantic and incredibly expensive machine. I understand there will be scientific experiments done with the machine to answer questions regarding Higgs bosons, baryons, supersymmetry, etc.; but will this knowledge actually be put to use in improving human life through the development of useful technology?

I'm wondering if there are any examples of cutting-edge physics research generating knowledge that has led to developments in commonly used, or at least useful, technology of today. A similar question was asked about Einstein's contributions to science and technology, but I'm more interested in work done in the last couple of decades.

I understand that the "point" of science isn't merely to develop useful tools to helps humans through life, and I would usually wholeheartedly support advancing branches of science that might appear to have no practical use. That said, with the LHC costing nearly 10 billion dollars, this seems to be something that should be carefully considered and thoroughly debated.
posted by wigglin to Science & Nature (16 answers total) 6 users marked this as a favorite
Our knowledge of semiconductors is a very practical application of quantum physics. So you can wipe out everything you own with a transistor in it. Lasers are also a result of our understanding of quantum physics.
posted by Jimbob at 1:31 AM on August 3, 2008

Response by poster: Jimbob:

The transistor was invented in 1947 and the first LASER in 1960. If possible, I'd like to have some more recent examples than those.

In particular I'm interested in the things associated with the really strange stuff like, say, string theory, tetraquarks, hadrons, Higgs bosons and so on. I honestly know very little about the groundbreaking physics work being done today, but I do know that a lot of man-hours and money are being poured into it.
posted by wigglin at 1:53 AM on August 3, 2008

Fair call. You might want to read this, though.
posted by Jimbob at 2:27 AM on August 3, 2008

Best answer: no. sorry. there is plenty of cutting-edge physics research that is leading to, or has recently led to, the development of new technology (see for instance GMR, which was first observed only twenty years ago, and is why you now have terabyte hard drives). but it's not coming from cosmology or high-energy physics.

most people in that community rationalize the cost of what they're doing by going on nova or the discovery channel and saying "it's fundamental! it's cool! it's the UNIVERSE, man!" or by making the (in my opinion, really weak) argument that there is a potential for as-yet-unknown spinoff technologies.

the classic example given of this latter rationale is synchrotron radiation, which is actually kind of an irritating limitation from the point of view of a particle physicist. but it turns out a synchrotron is a really good source of x-rays, and the discovery has really been something of a happy accident. synch light has been a huge new tool in materials science and biochemistry over the last 10-15 years, and new uses for it are being developed all the time.

but a lot of the stuff you mention is, at this point, mainly theoretical. i mean, hadrons aren't - you are made of hadrons! but you're talking about things that have been barely detected if at all, and are so unstable and elusive that it costs many billions of dollars just to study them. don't expect tetraquark technology anytime soon unless there's an utterly revolutionary development about to come at us out of left field.
posted by sergeant sandwich at 2:44 AM on August 3, 2008

How about giant magnetoresistance (1988), without which we wouldn't have the high-capacity hard disks that we do.
posted by le morte de bea arthur at 3:02 AM on August 3, 2008

Best answer: Giant magnetoresistance was discovered in 1988, commercialized by about 1991, today ubiquitous. I remember hearing it said that this was the shortest discovery-to-market time for a natural phenomenon ever in history. So that sets a time scale for you: the transistor and laser are perfectly sensible examples, having only become ubiquitous in the past couple decades, and I was going to mention them, too.

A more incremental, but commercially more important, example is the high-powered diode laser. Using a $10k power supply that fits on a handcart, you can produce 100 W or more of laser light. These were actually first developed to strip paint from airplanes, but have started a real industry in manipulating clouds of atoms. I'm using such a setup to make polarized helium-3 for a nuclear physics experiment. My collaborators who provided that hardware are also working on making polarized helium and xenon for doing NMR imaging of the lungs.

High-temperature superconductivity (where "warm" means "liquid nitrogen") has made MRIs affordable. Fast "functional" MRIs are a terribly exciting tool in neuroscience.

I would call PET scanning, where you're injected with an antimatter-emitting sugar and the pattern of annihilations is used for imaging, an application of modern physics.

There are now beginning to be accelerators whose primary mission is materials science and engineering. The Advanced Photon Source at Argonne and the Advanced Light Source at Berkeley make x-rays. The Spallation Neutron Source at Oak Ridge makes neutrons, which are in many ways complementary to x-rays. The SNS is interesting to compare to the LHC. It began operating last year (three or four of the 25 instruments are open to experiments), cost $1.5B, and finished on time and on budget. The heart of the SNS is a pretty classy proton accelerator, built using a lot of the expertise (that is, engineers) freed up when the Superconducting Supercollider was canceled in the early 1990s.

And of course there's the old saw that HTTP and HTML were invented at CERN to make data sharing easier. Collider experiments have for most of the century produced more plain old data than any other source, and have pushed the envelope of a lot of commercial systems. I was talking a couple of years ago to the data analysis coordinator for one of the experiments at RHIC, and mentioned reviewing (and approving) a grant to design and build the first robot for changing IDE disks (compare).

So: fundamental discoveries, maybe not so many; subtle, complicated, or indirect contributions --- like my colleagues turning my polarized helium into a medical imager --- lots.
posted by fantabulous timewaster at 3:12 AM on August 3, 2008 [4 favorites]

Best answer: I think the first thing to say is that we've never been made worse off by gaining a better understanding of something. You might not think that research into underwater basket weaving or the reproductive cycle of the giraffe is worthy, but it certainly doesn't make us any worse off.

Now, onto advances from high-energy (i.e. particle) physics:

Probably the headline "good for people" technology is the advent of proton therapy and other particle therapies. Treatment of cancer is often done using electromagnetic waves that deposit energy all along their path through the body, damaging healthy tissue as they go. A proton beam instead deposits most of its energy at one well-defined depth, i.e. at the tumour.
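The depth-dose behaviour described above (the "Bragg peak") falls out of even a crude stopping-power model. Here's a toy sketch, assuming only that a proton's energy loss per unit depth scales roughly as 1/E; the constants are purely illustrative, not clinical values:

```python
# Toy Bragg-peak sketch: in a crude Bethe-like model, a proton's
# stopping power scales roughly as 1/E, so the slower it gets, the
# faster it dumps energy -- and most of the dose lands near the end
# of the track. All numbers here are made up for illustration.
E = 150.0     # initial proton energy, MeV (hypothetical)
k = 100.0     # arbitrary stopping constant, MeV^2/cm
dx = 0.01     # depth step, cm

depth, deposits = [], []
x = 0.0
while E > 1.0:
    dE = min(k / E * dx, E)   # energy lost in this slice grows as E falls
    E -= dE
    depth.append(x)
    deposits.append(dE)
    x += dx

peak_depth = depth[deposits.index(max(deposits))]
print(round(peak_depth, 1))   # dose peaks right at the end of the range
```

The deposit per slice increases monotonically as the proton slows, which is exactly why the tumour (at end of range) takes the hit while the tissue in front of it is relatively spared.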

Production of medical isotopes is very important, and better techniques that don't use materials like enriched uranium have come out of work in particle physics. PET scans require you to be injected with a special sugar (FDG) that shoots out anti-electrons (positrons); this FDG was first synthesised at Brookhaven National Laboratory, which is a particle physics research centre.

In terms of computing, the amount of data produced in high-energy physics requires new methods of storage and analysis (the LHC GRID being a good example of this) and this distributed technology is used to investigate things like protein folding in cancer treatment; the first application of CERN's GRID technology was MammoGrid. Of course we all know that the Web was created by Tim Berners-Lee whilst working at CERN in order to share scientific data.

sergeant sandwich is quite right talking about synchrotron radiation. Just recently synchrotron radiation induced X-ray fluorescence spectroscopy was used to discover a "new old" Van Gogh painting. The research was carried out at DESY, a particle physics research centre (like CERN) in Germany.

All sorts of little advances in things like vacuum technology, crystal production (particularly lead tungstate), magnet manufacture, precision machining and manufacture, etc. have come out of work at CERN and other particle physics projects but it's difficult to identify the benefits these have had.

On a more "fluffy" note: particle physics research requires huge international collaborations and helps foster a community spirit in science.
posted by alby at 3:58 AM on August 3, 2008

The transistor was invented in 1947 and the first LASER in 1960. If possible, I'd like to have some more recent examples than those.

One thing that hasn't been directly addressed yet is the fundamental difference between things like cutting-edge research in physics and getting new technologies to the general public. The former (e.g. the basics of semiconductors) is also known as "basic sciences," which is like a first stop 5, 10, or more years before something awesome (e.g. transistors) can hit "engineering" and eventually your desktop. You probably won't know what's so exciting about Bose-Einstein condensates until you retire. Or maybe only your grandkids+ will know.

For example, back to the electronics: the rectifier effect was discovered sometime in the late 1800s (187something), but the rectifier itself wasn't invented until the 1930s.
posted by whatzit at 4:30 AM on August 3, 2008

the new, distributed internet developed to process the LHC data will probably end up changing human society so much that the internet we have now will be seen historically as an interesting prequel.
posted by lastobelus at 5:37 AM on August 3, 2008 [1 favorite]

Maybe not as recent as you're thinking of, but one thing that came to mind was that the clocks in GPS satellites need to be adjusted very slightly to compensate for relativistic effects from their speed and the weaker gravity they're in. The system couldn't have worked if we didn't understand relativity.

I think this is a good example of the way knowing more detail about how the world works helps us to go out and actively design new technologies, rather than specific discoveries leading to new ideas.
The better we understand the universe, the more directed and accurate our work on practical things can be, because we can make better predictions about our designs, and understand what's happened when things fail.
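The GPS correction mentioned above is a nice back-of-the-envelope calculation. A sketch, using rounded orbit figures (not from this thread): the satellite's speed makes its clock run slow (special relativity), while its height in Earth's gravity well makes it run fast (general relativity), and the net effect is a few tens of microseconds per day:

```python
# Back-of-the-envelope GPS clock correction. Orbit values are rounded
# textbook figures, assumed for illustration.
GM = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8        # speed of light, m/s
R_earth = 6.371e6       # mean Earth radius, m
r_gps = 2.6560e7        # GPS orbital radius, m (~20,200 km altitude)

v = (GM / r_gps) ** 0.5                      # circular orbital speed, ~3.9 km/s

# Special relativity: the moving clock runs slow.
sr_per_day = -(v**2 / (2 * c**2)) * 86400

# General relativity: the clock higher in the potential runs fast.
gr_per_day = (GM / c**2) * (1 / R_earth - 1 / r_gps) * 86400

net_us = (sr_per_day + gr_per_day) * 1e6     # net offset, microseconds/day
print(round(net_us, 1))                       # roughly +38 microseconds/day
```

Thirty-eight microseconds of clock error per day translates to kilometres of position error, which is why the satellite clocks are deliberately detuned before launch.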
posted by lucidium at 5:48 AM on August 3, 2008 [1 favorite]

Superconducting Quantum Interference Devices (SQUIDs) are used in magnetometers and biomagnetic imaging (e.g. magnetoencephalography) to measure extremely small magnetic fields. The first SQUID was invented in 1964, after Josephson junctions were discovered in 1962. Josephson junctions consist of two superconductors separated by a thin insulator. Amazingly enough, current flows across this gap. With the discovery of high-temperature superconductivity in 1986 (high-temperature in this context being above 30 K), SQUIDs might become easier to produce, as cooling can be done with liquid nitrogen rather than liquid helium.
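The junction physics behind SQUIDs has a strikingly simple quantitative core: a DC voltage V across a Josephson junction drives an AC supercurrent at frequency f = 2eV/h. A minimal sketch of that relation (this is just the junction formula with CODATA constants, not a model of a full SQUID):

```python
# Josephson frequency-voltage relation: f = 2eV/h. A microvolt of bias
# maps to hundreds of megahertz, which is why Josephson junctions
# underpin the modern voltage standard. Constants are CODATA values.
e = 1.602176634e-19    # elementary charge, C
h = 6.62607015e-34     # Planck constant, J*s

K_J = 2 * e / h        # Josephson constant, ~4.836e14 Hz per volt

V = 1e-6               # a one-microvolt bias (illustrative)
f = K_J * V            # oscillation frequency, Hz
print(round(f / 1e6, 1))   # ~483.6 MHz
```

Because frequency can be measured absurdly precisely, this relation turns voltage metrology into frequency metrology, which is part of why the 1962 discovery mattered far beyond the lab.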
posted by peacheater at 9:16 AM on August 3, 2008 [1 favorite]

> The transistor was invented in 1947

Well, sort of. In practice the transistor is re-invented continuously. I seriously doubt that all of the physics underlying 45nm silicon processes was well understood in 1947. And without the relentless march in silicon technology, the modern world would look nothing like it does today.
posted by madmethods at 9:59 AM on August 3, 2008 [2 favorites]

- Theory leads to experimental science.

- Experimental science leads to advancements in technology.

- Everyone benefits.

- It is ultimately nice to try and see why and how the Universe works (why bother building a telescope? Going to the moon? Send rovers to Mars?)

- the LHC costs about the same as two months of the Iraq war, so why the hell not?
posted by _dario at 10:21 AM on August 3, 2008 [1 favorite]

This thread on SA (really), previously mentioned on the Blue has a lot more answers and insight about what the LHC is, how it works, and why Physics is awesome.
posted by _dario at 10:47 AM on August 3, 2008 [1 favorite]

How about the memristor? Its existence was theorized in 1971, but one wasn't actually built until this year.
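The defining property of a memristor is that its resistance depends on how much charge has flowed through it. A toy sketch in the spirit of the linear-drift model (all parameter values here are invented for illustration, not the HP device's actual numbers):

```python
# Toy memristor sketch (hypothetical parameters, loosely in the spirit
# of the 2008 linear-dopant-drift model): resistance depends on the
# total charge that has passed through the device -- it "remembers".
R_ON, R_OFF = 100.0, 16000.0   # ohms: fully doped / fully undoped
Q_D = 1e-4                     # charge (C) needed to switch fully

def memristance(q):
    """Resistance as a function of net charge passed (clamped to [0, Q_D])."""
    x = min(max(q / Q_D, 0.0), 1.0)     # doped fraction
    return R_ON * x + R_OFF * (1.0 - x)

# Drive with a constant 1 V bias and watch the resistance fall
# as charge accumulates.
q, dt, v = 0.0, 1e-6, 1.0
history = []
for _ in range(200_000):
    r = memristance(q)
    history.append(r)
    i = v / r        # Ohm's law at this instant
    q += i * dt      # the state variable: integrated current
```

Cut the power at any point and `q` (hence the resistance) stays put, which is what makes memristors interesting as non-volatile memory.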
posted by kindall at 6:21 PM on August 3, 2008 [1 favorite]

Here's a more "market-based" benefit of modern physics research:

Pushing the limits of materials science has benefited more than particle physics. The Tevatron paved the way for today's largest superconductor application: magnetic resonance imaging (MRI). Engineers calculated that Fermilab at one point had purchased 95% of the niobium-titanium the world had ever produced. Robert Marsh, the head of a major alloy supplier, once said that "every program in superconductivity that there is today owes itself in some measure to the fact that Fermilab built the Tevatron and it worked." The market for superconductors currently stands around $3.5 billion a year. As ever, Wilson was on the frontier of something big.


Thank modern physics research for the prevalence of inexpensive MRI (and for sparking the growth of that $3.5 billion/year industry).
posted by funkbrain at 12:26 AM on August 4, 2008

This thread is closed to new comments.