
Can particle physics at the LHC have a practical purpose?

by Martin Mon Feb 25th, 2008 at 07:48:13 PM EST

The LHC, the Large Hadron Collider, is the next collider particle physics project to be switched on. The multi-billion-euro machine has already served practical purposes in various ways, e.g. through the development of huge superconducting magnets, the effort to create a computing grid, and research on silicon pixel detectors, among many other things. The data analysis will help young physicists learn statistics in a well-suited environment. CERN, the laboratory where the LHC is built, has a unique history in any case. During the Cold War, meetings on neutral ground took place there, and politically persecuted physicists could be freed when their former work at CERN brought them publicity. The most important thing CERN does, however, is prove the strength of the European model as a reliable partner in international scientific projects.

But can the physics there do anything useful?

Going back to my origins... - Diary rescue by Migeru


My answer is a very careful yes.

Research in formerly uninvestigated energy regimes is always bound up with some uncertainty about what can be found. So maybe there is something useful we simply have not yet thought of.
But one application one can think of is actually very old. If there were a sufficiently long-lived, heavy, negatively charged particle with quantum numbers different from those of the proton, one could use this particle as a catalyst for fusion.

The main difficulty in fusion is the high energy needed to overcome the potential barrier arising from the Coulomb repulsion of two positively charged nuclei. On a larger scale this repulsion is shielded by the electrons, but the distance between the average electron orbit and the nucleus is still very large by nuclear physics standards. A heavier particle would orbit closer to the nucleus and shield the Coulomb repulsion down to shorter distances.
One can think of the Heisenberg uncertainty principle, which forbids a particle from having a definite momentum and position at the same time. A heavier particle has the same momentum at a much lower velocity, so in a classical picture it can have a much smaller orbit (position fuzziness) for a given momentum.

Of the known particles, the one that could do the job best is the muon, the roughly 200-times-heavier relative of the electron. Some have thought about using it, but it is still science fiction (though not pure fantasy). The main difficulty is that it lives only about 2 microseconds.
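To put numbers on the "smaller orbit" argument, here is a back-of-the-envelope sketch in Python. All constants are rough textbook values, and the reduced-mass correction (the muon is not light compared to the proton) is included:

```python
# Sketch: why a heavier orbiting particle sits closer to the nucleus.
# The Bohr radius scales inversely with the (reduced) mass of the
# orbiting particle, so a muon orbits roughly 200x closer.

A0 = 5.29e-11   # Bohr radius of ordinary hydrogen, m
M_E = 0.511     # electron mass, MeV/c^2
M_MU = 105.7    # muon mass, MeV/c^2
M_P = 938.3     # proton mass, MeV/c^2

def bohr_radius(m_orbiter):
    """Bohr radius for a particle of mass m_orbiter (MeV) bound to a proton."""
    reduced = m_orbiter * M_P / (m_orbiter + M_P)
    return A0 * M_E / reduced

a_e = bohr_radius(M_E)    # ~5.3e-11 m (electron, barely corrected)
a_mu = bohr_radius(M_MU)  # ~2.8e-13 m, about 186x smaller
print(f"electron: {a_e:.2e} m, muon: {a_mu:.2e} m, ratio {a_e/a_mu:.0f}")
```

The muonic orbit comes out nearly 200 times smaller, which is exactly the shielding effect the catalysis idea relies on.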

Some theories in circulation today contain a particle which could be better suited. It would be much heavier still (~200,000 times the electron mass) and could have lifetimes even on the level of seconds (not longer than 10 seconds or so, because otherwise we would see effects in cosmology from the early universe).

If such a particle were found, practical use could one day be achieved.

Not to derail your diary, but isn't the LHC the thing with all the Doomsday mythology attached to it? We flip the switch and unintentionally create a mini black hole that eventually eats up the Earth? That and a couple of other scenarios, if I recall.

Well plenty of physicists at ET so what better place to ask: is it simple urban myth, or is there actually some theoretical Doomsday risk attached to using the LHC?

by wing26 on Thu Feb 21st, 2008 at 02:55:17 AM EST
there is a theoretical doomsday risk attached to letting off a fart... yet millions of uncaring creatures do it every day on this planet.

Indeed, flipping on LHC has somewhat higher risk. Yet it remains very, very marginal. And Peak Oil + Credit Crunch have us doomed anyway, so don't worry, be happy.

Pierre

by Pierre on Thu Feb 21st, 2008 at 05:56:48 AM EST
[ Parent ]
Pierre, I am sitting very tight until you tell us what that fart risk is.

A butterfly's wingbeat's-worth of GHG that tips the Global Warming balance and precipitates us all into burning hell? The sudden opening of a human-race-inhaling wormhole to the back end of the Galaxy Where No One Wants To Live? Or just the standard complaints about people who eat beans?

by afew (afew(a in a circle)eurotrib_dot_com) on Thu Feb 21st, 2008 at 02:18:45 PM EST
[ Parent ]
We can not tolerate any kind of increased risks created by man, no matter if they pale compared to the risks in nature!

</snark>

Yes, I know I'm that predictable.

Peak oil is not an energy crisis. It is a liquid fuel crisis.

by Starvid on Thu Feb 21st, 2008 at 05:10:33 PM EST
[ Parent ]
Natural risks are part of the situation we are born to, to which we should react as appropriately as possible. Man-made risks are taken on our responsibility, and decisions regarding them should be carefully weighed.

That is, if you really want to make a moral separation of the two.

by afew (afew(a in a circle)eurotrib_dot_com) on Fri Feb 22nd, 2008 at 02:39:54 AM EST
[ Parent ]
That's funny, but then it's not actually true, is it. The ultimate cause of farting is a microbiological process that mainly involves methanogens, I think. No one ever claims that by farting they are going to be able to create new subatomic particles or examine conditions close to those that existed at the start of the universe. So no, I don't see how there is even a theoretical risk of Doomsday associated with that particular sort of Big Bang (pardon the pun).

Just to clear things up, I didn't ask because I am fretting about it, but because I wanted to know whether or not the theoretical risk existed, or whether it was just urban myth. By 'theoretical', I mean a non-zero risk consistent with current theoretical understanding.

And you say yes, there is. So that clears that up, I think.

by wing26 on Thu Feb 21st, 2008 at 07:59:48 PM EST
[ Parent ]
... in the higher part of the earth's atmosphere, when cosmic rays, solar winds and so on collide with atmospheric molecules. They can be orders of magnitude more violent than what will ever happen at LHC.
The difference at LHC is that there are huge detectors around the very precise spot where it's happening.


A 'centrist' is someone who's neither on the left, nor on the left.
by nicta (nico&#65312;altiva&#8228;fr) on Thu Feb 21st, 2008 at 12:00:49 PM EST
[ Parent ]
This time I am seriously interested in a description (for laymen) of these collisions. What happens?
by afew (afew(a in a circle)eurotrib_dot_com) on Thu Feb 21st, 2008 at 02:22:41 PM EST
[ Parent ]
More on the cosmic ray part or the LHC part or both?

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 02:28:30 PM EST
[ Parent ]
I interpreted the question as asking for an explanation of how elementary particle collisions lead to cascades of other particles.

Why this is possible is simple: it is possible to convert the kinetic energy of the collision into particles by means of Einstein's E=mc^2.

How it happens is a little harder to explain for laymen, but I suppose we could get away with Feynman diagrams.
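As a rough illustration of the energy-to-mass conversion, here is a sketch with approximate constants (the LHC design figure is two 7 TeV proton beams, so 14 TeV in the centre of mass):

```python
# How much matter can the collision energy "buy" via E = m c^2?
# Energies expressed in eV; all values approximate.

E_CM = 14e12          # LHC centre-of-mass energy, eV
M_PROTON = 938.272e6  # proton rest energy, eV
M_TOP = 173e9         # top quark rest energy, eV (heaviest known particle)

print(f"~{E_CM / M_PROTON:.0f} proton rest masses")   # roughly 15000
print(f"~{E_CM / M_TOP:.0f} top-quark rest masses")   # roughly 80
```

In practice only a fraction of the beam energy goes into any one parton collision, but this gives the order of magnitude of what is available.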

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Thu Feb 21st, 2008 at 04:04:01 PM EST
[ Parent ]
Martin:

I was replying to nicta, and to his comment about collisions at the fringes of the atmosphere. What is the interaction between rays and molecules? What are solar winds and how do they interact with molecules? Do these collisions give rise to phenomena observable at the (for want, in my case, of the correct term) "macro" level?

But a description in layman's terms of the LHC project would also be welcome.

It occurs to me you may not have ET discussion threads displayed as "nested", meaning that you can see on screen which comment is replying to which. Is that the case?

by afew (afew(a in a circle)eurotrib_dot_com) on Fri Feb 22nd, 2008 at 02:25:59 AM EST
[ Parent ]
Cosmic rays consist of high-energy photons (gamma rays, normally from outside the solar system or even extragalactic), electrons and protons (normally from the solar wind, except for the very energetic ones).

These photons, electrons and protons collide with particles in the atmosphere and produce cascades of other particles by converting the kinetic energy of the incoming ray into the products. The cascades happen because the first products are energetic enough to, themselves, cause similar collisions further down the atmosphere. The cascades are seen by flying or land-based detectors.
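The cascade multiplication can be caricatured with a Heitler-style toy model, in which every particle above some critical energy splits into two particles of half its energy at each generation. This is only a sketch, not a real shower simulation:

```python
# Toy Heitler-style cascade: one energetic primary becomes a shower
# because each product above a critical energy splits again.

def cascade_size(e0, e_crit):
    """Number of halving generations until e_crit, and final particle count."""
    n = 0
    e = e0
    while e > e_crit:
        e /= 2.0
        n += 1
    return n, 2 ** n

# 1 PeV primary, 100 MeV critical energy (illustrative values)
gens, particles = cascade_size(1e15, 1e8)
print(gens, particles)  # 24 generations, ~17 million particles
```

Real air showers also lose particles to absorption and decay, so this overcounts, but it shows why a single cosmic ray can light up detectors spread over square kilometres.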

At the LHC you have two high-energy proton beams colliding head-on. There is a cascade but it is a lot cleaner because there is nothing else for the products to collide with, so they just fly off and decay and the collection of decay products is seen by the detector array.

The LHC is a proton-proton collider. When a cosmic-ray proton hits a hydrogen atom in a water molecule you have essentially the same proton-proton collision.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 04:04:43 AM EST
[ Parent ]
I think afew will learn everything from the film linked by Pierre, but the solar wind is really not the most important source in the energy regions that are interesting for comparison with the LHC. The solar wind reaches less than 1 GeV.

The supernova-accelerated ones are the relevant ones for comparison with the LHC.

The really high-energy ones, e.g. those observed at Auger, most likely have their origin in active galactic nuclei.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 09:12:50 AM EST
[ Parent ]
Are you familiar with the kinematic argument that the highest energy cosmic ray protons that have been observed shouldn't actually be able to reach Earth and how that provides evidence of corrections to special relativity at high energies?

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 03:00:20 PM EST
[ Parent ]
A bit, we have a group of people working at Auger who have new (and unfortunately a bit boring) results.

The highest-energy cosmic rays should interact with the cosmic microwave background radiation (CMBR): if the energy of the proton is above about 6×10^19 eV (in the rest frame of the CMBR), proton and photon can combine into a Delta+ resonance. This is called the GZK cutoff.

The mean free path of a proton at this energy, i.e. the average distance it flies without interacting, is about 160 million light years.
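For the curious, the threshold can be estimated in a few lines. For a head-on collision the invariant-mass condition s = m_p^2 + 4 E_p E_gamma >= m_Delta^2 gives E_p = (m_Delta^2 - m_p^2)/(4 E_gamma); averaging over the CMBR photon spectrum and collision angles brings the effective cutoff down to the quoted ~6×10^19 eV. A sketch with rough constants:

```python
# Head-on GZK threshold estimate for p + gamma_CMB -> Delta(1232).
# All energies in eV; this is the most favourable (head-on) geometry,
# so it comes out a bit above the quoted effective cutoff of ~6e19 eV.

M_P = 938.272e6   # proton rest energy, eV
M_DELTA = 1232e6  # Delta(1232) rest energy, eV
E_GAMMA = 6.3e-4  # typical CMB photon energy (~2.7 kT at 2.7 K), eV

e_threshold = (M_DELTA**2 - M_P**2) / (4 * E_GAMMA)
print(f"head-on threshold: {e_threshold:.1e} eV")  # a few times 1e20
```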

A Japanese experiment (AGASA) claimed to have found some of these very energetic particles, which led to speculation despite the large systematic uncertainties.

Auger (south, in Argentina) recently reported (perhaps preliminarily) to have found some as well, but
a) there is a reduction in the flux of these very energetic particles
b) they were able to track the origin of many of the particles (at these high energies the intergalactic magnetic fields don't bend the trajectory too much) back to known active galactic nuclei (AGN). AGNs are likely huge black holes (as we likely have a smaller one in our own galactic centre). In their accretion discs, jets many light years long are accelerated, producing high-energy particles of all types.

The surprise about their existence, though, only exposes a gap in our understanding of AGNs, not in special relativity, and no super-heavy Big Bang relics, which would have been the most interesting outcome for particle physics.
Of course this is not yet completely settled, as Auger has only been in operation for a short time and some events might have a different origin, but I wouldn't bet against it.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 03:52:07 PM EST
[ Parent ]
Thanks for the update: if there is an AGN in the right direction within a distance of the order of 160 Mlyr then there's no evidence of violations of the GZK cutoff.

Which is rather a pity, because GZK violation was one of very few experimental results not explainable by the standard model of particle physics and the standard cosmological model (another one being the rotation curves of galaxies).

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 05:00:28 PM EST
[ Parent ]
LHC - it's like banging rocks together. Only the rocks are really, really small.
by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Feb 22nd, 2008 at 08:44:02 AM EST
[ Parent ]
Oh, great, that's better, thanks. But you forgot to say how many rocks would fit into a football field?
by afew (afew(a in a circle)eurotrib_dot_com) on Fri Feb 22nd, 2008 at 09:03:31 AM EST
[ Parent ]
That was interesting. I thought I'd look up 'diameter of an electron' and found sizes between 10^-13 m and 10^-18 m.

The 'classical' radius is about 2.8×10^-15 m.

Although all that really tells you is that you can't say how big an electron is, because unlike a rock it doesn't have hard edges. Instead it's defined by how much it bounces when you bang it with something else.

Curiously, football pitches also seem to suffer from quantum indeterminacy - very bad news for science journalists.

The official FIFA dimensions are '90-120m' long and '45-90m' wide. Goal dimensions are defined as 7.32m x 2.44m, which presumably requires precision laser range-finding to make sure they're accurate to the nearest mm.

So - it's unlikely we'll ever know exactly how many electrons fit into a football pitch.

A bit unfortunate, that, but there it is.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Feb 22nd, 2008 at 09:36:32 AM EST
[ Parent ]
I have to live with uncertainty. It's killing me.
by afew (afew(a in a circle)eurotrib_dot_com) on Fri Feb 22nd, 2008 at 10:15:00 AM EST
[ Parent ]
You can say that the charge of the electron is contained in a sphere of radius less than 10^(-18) m, and that the electron is assumed to be a point in the current theory.

However, at the LHC protons are used, not electrons.
Protons have a diameter of about 10^(-15) m. Of course it is not really the protons but the quarks, and at the LHC mainly even the gluons, that collide.

For the non-experts, the following picture is an illustration of a proton from the Oxford HERA group's website.

The dots are quarks: in a proton, the number of quarks minus antiquarks nets out to three. The springs represent gluons; the lower the momentum fraction you probe, the more gluons you find. These are the "force particles" of the strong interaction, which also holds nuclei together, and they have the interesting property that they themselves carry strong charge.

The best-known "force particle" is the photon, the carrier of the electromagnetic interaction. Others are the graviton (not yet directly observed, but who doubts it exists) and the W and Z bosons, the carriers of the weak interaction, which are the only known force particles with mass.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 10:17:12 AM EST
[ Parent ]
The radius of a proton is a more meaningful quantity than the radius of an electron, because as far as we know the electron is pointlike (no internal degrees of freedom) but a quark is a composite particle containing three quarks and so in principle one could assign it a radius just like the hydrogen atom - a composite particle of a proton and an electron can be assigned a radius by studying the electron wavefunction for its ground state.

Then again, the "radius" of a wavefunction, be it the electron orbital in an atom or the wf of a quark in the proton, is only meaningful as an order of magnitude anyway.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 03:22:25 PM EST
[ Parent ]
You are a theory guy, right? ;-)
The 10^(-18) m bound is experimentally tested; that it is a point is what the theory says.

But more importantly, you have a typo:
"but a quark is a composite particle containing three quarks" should be: a proton is a composite particle...

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 04:02:56 PM EST
[ Parent ]
In a previous life I was a theory guy, yes, though I took my graduate Standard Model courses from a pretty good phenomenologist.

I suppose I could find it in the particle data booklet, but can you outline the experiment that tests the electron radius?

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 04:51:20 PM EST
[ Parent ]
Well, how do we know that the proton has a radius of about a femtometre? When we hit it in a scattering experiment with a charged particle whose Compton wavelength is of the order of 1 fm, we start to see a form factor, i.e. the probe no longer sees the full charge.
LEP II had about 100 GeV per electron, hence a Compton wavelength of about 10^(-17) m, and there was not the slightest sign of substructure: we still see absolutely the full charge, and one would likely expect to see the first small signs of a form factor already at a wavelength somewhat bigger than the electron size.
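The scales quoted here follow from lambda = h c / E with h c of roughly 1240 MeV·fm, a quick sketch:

```python
# Probe resolution ~ Compton wavelength: lambda = h*c / E.
# h*c ~ 1240 MeV*fm; results in femtometres (1 fm = 1e-15 m).

HC = 1239.84  # h*c in MeV*fm

def wavelength_fm(energy_mev):
    """Compton wavelength (fm) of a probe with the given energy (MeV)."""
    return HC / energy_mev

lam_1gev = wavelength_fm(1e3)    # ~1.2 fm: starts resolving the proton
lam_lep2 = wavelength_fm(100e3)  # ~0.012 fm ~ 1.2e-17 m: LEP II electrons
print(lam_1gev, lam_lep2)
```

So a ~1 GeV probe is just right for the proton's femtometre structure, while LEP II energies probe two orders of magnitude deeper without finding any electron substructure.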

I don't know exactly which measurement provides the best bound, it could be a precision measurement in the B sector or whatever, but that's pretty much it.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 05:17:09 PM EST
[ Parent ]
There's more than one LEP? that's scary ;-)

Any idiot can face a crisis - it's day to day living that wears you out.
by ceebs (ceebs (at) eurotrib (dot) com) on Fri Feb 22nd, 2008 at 05:32:24 PM EST
[ Parent ]
The electron is, as far as we can tell, a "point" particle, meaning that it has no internal degrees of freedom: all its degrees of freedom have to do with its relation with the surrounding space(-time).

The classical radius of the electron is the radius of a sphere such that the energy of the electric field outside it matches the observed rest mass of the electron. If you assume a "classical" electron is truly pointlike, you get an infinite energy for its electric field.

The funny thing about quantum field theory is that, since the electron is pointlike (see first paragraph), you need to "renormalize" the self-interaction of the electron (i.e., the interaction of the electron with its own electric field), and renormalization methods involve a "cutoff" (effectively, a mass or radius cutoff - see second paragraph). You then get "running coupling coefficients", which means that the "bare mass" and the "bare charge" of the electron vary with the "cutoff".
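The running can be illustrated with the one-loop QED formula for the fine-structure constant. This sketch includes only the electron loop, so it undershoots the measured alpha(M_Z) of about 1/128, which gets contributions from all charged fermions:

```python
# One-loop QED running of alpha from the electron loop alone:
#   alpha(Q) = alpha0 / (1 - (alpha0 / 3*pi) * ln(Q^2 / m_e^2))
# Illustrative only: real running includes all charged fermions.
import math

ALPHA0 = 1 / 137.036  # fine-structure constant at low energy
M_E = 0.511e-3        # electron mass, GeV
M_Z = 91.19           # Z boson mass, GeV

def alpha(q):
    log = math.log(q**2 / M_E**2)
    return ALPHA0 / (1 - ALPHA0 / (3 * math.pi) * log)

print(f"1/alpha(M_Z) ~ {1 / alpha(M_Z):.1f}")  # ~134.5 from the electron loop only
```

The effective charge grows as you probe shorter distances, because you penetrate the screening cloud of virtual pairs around the pointlike electron.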

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 03:18:33 PM EST
[ Parent ]
by Pierre on Thu Feb 21st, 2008 at 04:31:39 PM EST
[ Parent ]
I watched the beginning and it looks excellent. I'll watch all of it when I get a moment!
by afew (afew(a in a circle)eurotrib_dot_com) on Fri Feb 22nd, 2008 at 03:00:40 AM EST
[ Parent ]
Yes, but something usually said in reply to this is that those interactions would involve any by-products moving away from the earth at near the speed of light. The products of the LHC will, apparently, remain more or less where they are.

Now before you explode and say 'total gobbledygook!', please understand that I am only a layman and that the argument is not actually mine. I do not even know if it makes any sense. That is why I ask the people here.

I would like to see someone debunk that particular objection, because any time I see anyone 'debate' these things it goes like this:

'The LHC could destroy the world!'

'No it couldn't, interactions like that happen all the time from cosmic rays and nothing has gone wrong so far, has it.'

'But the cases aren't the same, because LHC products can be expected to remain in the vicinity of the Earth, unlike those produced by cosmic rays!'

[silence]

You can see why it is frustrating for me. I cannot follow physics, but I can follow an argument, and there is usually something missing in this one, i.e. the step that says 'Bollocks, because ...'

So can someone here provide that? Thanks.

by wing26 on Thu Feb 21st, 2008 at 08:10:41 PM EST
[ Parent ]
Actually the fallout of cosmic ray impacts goes all the way through the earth (whatever is not absorbed quickly in the atmosphere and crust). The LHC will generate more carefully calibrated events (angle, energy band), and in its particular band it will generate in a few years more events than cosmic rays do in centuries. But cosmic rays generate every year a few events that are several orders of magnitude larger than anything the LHC can produce.

Pierre
by Pierre on Fri Feb 22nd, 2008 at 05:58:27 AM EST
[ Parent ]
When you say they "aren't the same" (let's assume it is true), you actually mean that the probability of a similar reaction happening at any given time is really low.
But space is big, the universe is old, and if there were a non-trivial probability of a cataclysmic occurrence happening when the LHC is switched on, it would already have happened, and it would have been spotted, somewhere.
Now that I think of it, supernovæ might actually be alien-built Large Hadron Colliders ...

A 'centrist' is someone who's neither on the left, nor on the left.
by nicta (nico&#65312;altiva&#8228;fr) on Sun Feb 24th, 2008 at 12:59:03 PM EST
[ Parent ]
I think you're talking about concerns that a quark gluon plasma could recreate the conditions near the big bang to the extent of producing a primordial black hole which would then proceed to swallow the earth whole.

There is no experimental evidence of primordial black holes and the theoretical evidence is tenuous (mostly based on "semiclassical" reasoning). In addition, a quark-gluon plasma has already been produced in heavy ion collisions in other experimental facilities and we're still here.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Thu Feb 21st, 2008 at 04:18:16 PM EST
[ Parent ]
Actually I thought nicta's comment was already enough on that.

Although I'm not sure the solar wind really reaches LHC energies, cosmic rays can have up to 10^7 times the energy of the LHC, as reported by the Auger collaboration.
So whatever is created at the LHC has already been created by cosmic rays.
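One caveat when comparing lab-frame energies directly: a cosmic ray hits a proton essentially at rest, while the LHC collides two beams head-on. The invariant comparison uses the fixed-target equivalent energy E = s/(2 m_p). A sketch with rough numbers (the 3×10^20 eV figure for Auger is the order of magnitude of the highest reported events):

```python
# Fixed-target equivalent energy of a collider: E_equiv = s / (2 m_p),
# with s = E_cm^2 for a symmetric head-on collision. Energies in eV.

M_P = 938.272e6   # proton rest energy, eV
E_CM_LHC = 14e12  # LHC centre-of-mass energy, eV
E_AUGER = 3e20    # order of the highest cosmic-ray energies seen by Auger, eV

e_equiv = E_CM_LHC**2 / (2 * M_P)  # ~1e17 eV
print(f"LHC fixed-target equivalent: {e_equiv:.1e} eV")
print(f"Auger events exceed that by ~{E_AUGER / e_equiv:.0f}x")
```

Even on this invariant comparison, the most energetic cosmic rays still beat the LHC by a factor of thousands, so the safety argument stands.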

If it is created, and that would likely require rather large extra dimensions, Hawking radiation will likely destroy it in about 10^(-40) seconds. The nice thing about it is that (a) it would immediately prove really fancy new physics, and (b) it is very easy to detect, as a lot of muons (heavy electrons) come out of it.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 05:26:54 PM EST
[ Parent ]
Martin:

Although I'm not sure the solar wind really reaches LHC energies, cosmic rays can have up to 10^7 times the energy of the LHC, as reported by the Auger collaboration.
So whatever is created at the LHC has already been created by cosmic rays.

You're forgetting the luminosity... though cosmic rays have been at it for geological time...

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Thu Feb 21st, 2008 at 05:29:41 PM EST
[ Parent ]
And if it were a supersucking black hole, it could suck up Jupiter or the Sun or any neighbouring star as well. So the effective area we have available to observe this is really big.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 05:43:14 PM EST
[ Parent ]
And (sorry, me again) another objection is that 'Hawking radiation' has never actually been observed and might not exist, so that whatever is created might not be destroyed in such a short time frame. Is that true or not?
by wing26 on Thu Feb 21st, 2008 at 08:13:27 PM EST
[ Parent ]
Hawking radiation is another "semiclassical" result. IMHO Hawking radiation is much more solid than primordial black holes, theoretically speaking.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 03:47:36 AM EST
[ Parent ]
So to sum up:
There is no theoretical model which predicts the world being sucked up.
Any black hole going through the earth would leave some holes as well, even if it did not suck up the earth, so the cosmic ray argument is completely valid: we would have observed the beasts.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 09:01:44 AM EST
[ Parent ]
A very interesting perspective, but I'm not sure I agree with the premise implied by the title.

Applications are good and all, but research into basic science should be motivated by the question 'is this scientifically interesting?' rather than the question 'is this going to be useful?' Overwhelmingly, history shows that successful scientific theories eventually find practical applications, so I say let the engineers worry about that.

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Thu Feb 21st, 2008 at 06:52:50 AM EST
Agreed. Why can't satisfying human curiosity be an end of its own? Furthering our understanding of "how things work" sounds plenty useful to me.

"The basis of optimism is sheer terror" - Oscar Wilde
by NordicStorm (m<-at->sturmbaum.net) on Thu Feb 21st, 2008 at 07:19:00 AM EST
[ Parent ]
Even if curiosity is a good goal completely separated from application, something I personally do not believe, it is still worth wondering whether the LHC and high-cost particle physics give the most important knowledge per dollar spent.

I remember reading a book by E. O. Wilson, in which he mentioned that a total inventory of world biodiversity, including all organisms in the rainforests, would cost on the order of 300 million dollars, an amount incredibly high by ecologists' standards. But the LHC spends several times such a sum per year.

It seems to me the LHC and fundamental physics in general still receive their funding relatively easily as a leftover from the high days of fundamental physics around the middle of the previous century, when the strangest avenues of fundamental research would quickly result in applications.

So, funding for large-scale physics is partially a thank-you to the field that led to electronics and nuclear power, and partially a hope for more such gifts.

by GreatZamfir on Thu Feb 21st, 2008 at 09:07:01 AM EST
[ Parent ]
Even if curiosity is a good goal completely separated from application, something I personally do not believe, it is still worth wondering whether the LHC and high-cost particle physics give the most important knowledge per dollar spent.

Indisputably so, but that is an entirely different line of argument. Which is not, IMO, served very well by conflating it with a narrowly technological cost/benefit analysis.

There is indeed a case to be made that high energy physics and space exploration aren't sufficiently scientifically interesting to warrant the budgets they have, and if that is the case, then evaluating the applications makes sense, because then you need to sell it directly to industry.

But I will maintain that that's a different kind of analysis.

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Thu Feb 21st, 2008 at 10:53:24 AM EST
[ Parent ]
High energy physics is close enough to dying of its own success: an essentially correct theory of matter, at the regimes that can be probed experimentally in the foreseeable future, has been known since the 1970s. So the LHC should have been built simply because it can be, and just in case.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Thu Feb 21st, 2008 at 04:13:10 PM EST
[ Parent ]
still receive their funding relatively easily as a leftover from the high days of fundamental physics around the middle of the previous century

I don't think so. Receiving funding isn't easy, but quite a fight. The reason it is still financed is the result of excellent lobbying and PR.
Some colleagues here are working on an experiment which is to go to the ISS on a space shuttle. When, after the accidents, it was uncertain whether the experiment would get a flight, students were sent to the US Congress to convince representatives and senators directly.
Recently a laboratory in South Dakota managed to get a 7 million dollar donation from a private individual, just because he thought detecting neutrinos is cool.

When judging spending on particle physics, one should also consider that a huge machinery would die, and it is very difficult to re-establish it. If you stop the research now and want to restart it in 200 years, a lot of work would have to be redone and knowledge rebuilt that will have been lost.

With ecologists it's a special story. First, I think that spending on ecology projects and on physics are not at all mutually exclusive. The spending on science is not so high that a bit more would do any damage to society. But ecologists really are badly organised. When some astrophysicists had a project in the Adriatic (ANTARES, which listens for high-energy neutrinos), they asked some ecologists whether they wanted to participate and put some instruments on the mooring chains for long-term deep-water monitoring of the Adriatic. They didn't, and that is typical. In physics, people usually come up with a huge number of projects once you have some type of large experiment.
Ecologists are not connected enough with each other to lobby for a big project. 300 million really is peanuts for a developed-world society with a yearly GDP of maybe 30 trillion.

The most important development to come out of particle physics was HTML, and spin-offs like this will probably continue to exist in the future. Whether they will be of such enormous reach is unclear, but for more specialised applications they will be good enough.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 01:36:55 PM EST
[ Parent ]
Hmm, I don't know about the 'not at all exclusionary' argument. Even if it is true that both need more spending at this moment, there is eventually a limit on the amount of money that can and will be spent on 'knowledge for the sake of knowledge'. At some point you still need to argue that spending a marginal extra euro on particle physics is better than spending it on, say, pure mathematics, or archeology, or ecology.

As for the PR: of course it's true, but it is worth considering that most fields wouldn't even dream of trying to get these amounts of money, because no one would stop and listen to their arguments for why they need billions of euros, no matter how good their PR was.

In the end, I would argue that past applications really are a large part of the reason particle physics can even consider having a PR fight over billions.

by GreatZamfir on Thu Feb 21st, 2008 at 02:52:21 PM EST
[ Parent ]
An inventory of biodiversity isn't 'knowledge for the sake of knowledge'.
If it were done once, then yes; but done every ten or twenty years it would provide a lot of data for judging how important specific protection measures are. At the moment we are driving down a straight road at night without lights and don't know when there will be a curve, although it is very likely that there is one.

On the issue of deciding where to invest money for 'knowledge for the sake of knowledge', I would suggest letting scientists decide what they most want to do. And of course there is a level of saturation, and a minimum level below which a branch of science can't operate at all any more. This minimum amount is unfortunately relatively high in particle physics, as you want to have at least one high-energy collider in the world.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 03:04:28 PM EST
[ Parent ]
It wasn't a premise that research must serve a technical purpose. For me, knowledge is one of the most noble things we can acquire. And even judged by technical purposes, the spin-offs from attempting such a difficult task are in any case much bigger than anything I ever expect from the physics itself.
But if you read about the LHC in the media, they will usually write about the Higgs, extra dimensions, string theory, sometimes Susy. There are other interesting things as well, and as a physicist working on bread-and-butter issues like meson spectroscopy I want to write about interesting stuff you can't read everywhere.

And selling is very important ;-)
Some time ago I read in the newspaper a discussion about the humanities (the word sounds strange, but that's what my dictionary gives me for 'Geisteswissenschaften') and science, in which the author challenged as a myth the humanists' claim that they are underfunded because they lack economically useful results. He wrote that astronomers and particle physicists don't produce many more useful results either, but are excellent at selling their work as important, while humanists lack any good PR for their subject.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 01:12:04 PM EST
[ Parent ]
Sorry, are people calling supersymmetry "Susy"? I hadn't heard that before. (I follow developments in this area, but only very infrequently and from afar.)

I agree that there should be research for the sake of research, but I don't think there's anything wrong with speculating about what practical benefits may accrue from the research, as long as it isn't argued that the research is conducted solely for those practical reasons. We can look for them, and even try to anticipate what they may be, but we must not allow them to become the raison d'être for research.

As for the humanists, their subject of study just isn't all that popular with people. Familiarity breeds contempt, as they say. ;)

Il faut se dépêcher d'agir, on a le monde à reconstruire

by dconrad (drconrad {arobase} gmail {point} com) on Thu Feb 21st, 2008 at 03:27:25 PM EST
[ Parent ]
Yep, 'Susy' is the short form of supersymmetry, and it's used pretty often, even for conferences.


Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 03:46:16 PM EST
[ Parent ]
Yes, they've been calling it that for over 20 years.

There was a time when supersymmetric models were all the rage. Now supersymmetry is a requirement to make string theory consistent and string theory is all the rage, so SuSy lives on, even though people have burned the original SuSy papers they wrote in the 1980's.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Thu Feb 21st, 2008 at 04:06:27 PM EST
[ Parent ]
Martin:
He wrote astronomers and particle physicists as well don't produce much more useful results, but are excellent at selling their work as important, while humanists lack any good PR for their subject.

Except for the economists - arguably not so human at all, but easily the most successful of the humanities.

PR is usually seen as 'public education' - it's not usually acknowledged that it has a direct influence on research funding and future research directions.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Feb 22nd, 2008 at 08:23:06 AM EST
[ Parent ]
Ah, but isn't the big PR point of economics that it sells itself as not a social science or liberal arts field?

More specifically, they claim they understand their subject well enough, at a quantitative level, to do detailed prediction and policy prescription, something the humanities in general are very reluctant about.

by GreatZamfir on Fri Feb 22nd, 2008 at 08:32:58 AM EST
[ Parent ]
I'm a fourth year Physicist at Bristol University and my Masters project is part of the LHC project.

My project is computer-based, simulating what they should see if the Higgs boson has a mass of 165 GeV (decaying to two W bosons, which each decay to a lepton and a neutrino). I didn't know a huge amount about CMS (the detector) or the programming aspects before I started, so it was a steep learning curve. With three weeks to go until I have to stop doing practical work, I'm reconstructing the Higgs mass OK; I just need to play around with my electron reconstruction and cuts.
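Since the two neutrinos escape undetected, the full invariant mass can't be reconstructed event by event in this channel; dilepton analyses typically use a transverse-mass variable instead. A toy sketch of one common form (illustrative made-up numbers, not actual CMS code):

```python
import math

def transverse_mass(pt_ll, met, dphi):
    """Toy transverse mass for a dilepton + missing-ET system.

    pt_ll : transverse momentum of the dilepton pair (GeV)
    met   : missing transverse energy (GeV)
    dphi  : azimuthal angle between the two (radians)
    """
    return math.sqrt(2.0 * pt_ll * met * (1.0 - math.cos(dphi)))

# Illustrative event: dilepton system back-to-back with the missing ET
print(transverse_mass(80.0, 80.0, math.pi))  # -> 160.0
```

The distribution of such a variable has an edge near the parent mass, which is what you fit against the background.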

I think, to be honest, a lot of the best real-world applications of the LHC will be indirect, much as the Apollo missions contributed a lot to technologies not originally intended for the uses they ended up in. The LHC will process so much data every second that in a year something like 1/6th of ALL of the world's yearly information will go through the detector and its computers.

The research that's gone into the various parts of the detectors (calorimeters, silicon trackers, etc) as well as the magnets, computers, triggers (that rapidly choose which of the billions of collisions are worth keeping and which should be chucked), all of this will have other applications, half of which probably haven't even been thought of yet.

So even if the physics discovered doesn't directly contribute practically, the engineering behind it definitely will.

by darrkespur on Thu Feb 21st, 2008 at 01:26:40 PM EST
I'm working at CDF, and if the Standard Model Higgs is at 165 GeV we will find it first ;-p
although as a Karlsruher I hope of course that CMS will make the big discoveries.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 01:41:49 PM EST
[ Parent ]
To be honest, even now the WW mode is looking very unlikely; that and the ZZ mode look like they might be squeezed into low probability and effectively ruled out of the running. If it is a standard model Higgs, it looks like it'll be ~114-125 GeV.
by darrkespur on Thu Feb 21st, 2008 at 03:22:41 PM EST
[ Parent ]
Can you elaborate?

And what if the standard model Higgs isn't found? What are the alternatives?

Once in a graduate course the professor mentioned in passing that the standard model without a symmetry-breaking sector violates unitarity at about 1TeV, so there has to be some new phenomenon, be it the Higgs or something else, before that energy is reached. What is there to that?

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Thu Feb 21st, 2008 at 04:09:22 PM EST
[ Parent ]
I once met a former physicist turned ERP consultant at IBM, who also told me that there was a small energy band between LEP and the LHC that was actually unreachable at the LHC (too low for the beam as designed), and there was a tiny chance the Higgs had a mass that would make the LHC miss it (but then some boosted linear accelerator should be able to find it first).

Pierre
by Pierre on Thu Feb 21st, 2008 at 04:29:27 PM EST
[ Parent ]
That rings a bell, too.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Thu Feb 21st, 2008 at 04:39:41 PM EST
[ Parent ]
Oops.
by ThatBritGuy (thatbritguy (at) googlemail.com) on Thu Feb 21st, 2008 at 05:45:29 PM EST
[ Parent ]
I don't think a linear collider will be built any time soon. There is one country in the world which said it would pay a part of it, but likely only a small part, and that is Germany, when the government cancelled a smaller version planned for DESY in Hamburg.

The US recently cut financing of R&D. And what was the reason?
The Democrats were angry because Bush had vetoed some of their pet projects. In revenge, they cut spending on some of Bush's pet projects.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 05:48:42 PM EST
[ Parent ]
There are existing infrastructures like SLAC which keep being upgraded, in part with private funding; they are not entirely out of the race.

Pierre
by Pierre on Fri Feb 22nd, 2008 at 05:53:46 AM EST
[ Parent ]
Ahhh, but you are speaking of the ILC, a superconducting juggernaut of some 50 km which can still only reach the puny collision energy of 0.5 TeV. Why go for such a thing when the much more attractive possibility of CLIC exists? A delicious 3 TeV of electron-positron collisions at ultra-high luminosity. No troublesome and gradient-limiting superconducting cavities either. With 100 MV/m we are far above the theoretical limitations of the enemy competition. First beam? Around 2023, optimistically.

(Me? Biased? Naaahhh...)
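For anyone wanting to sanity-check those numbers: a rough back-of-envelope estimate (ignoring fill factor and overhead) of the active accelerating length each linac arm would need at that gradient:

```python
# Back-of-envelope: active length per linac arm = beam energy / gradient
target_energy_ev = 1.5e12   # 1.5 TeV per beam for 3 TeV collisions
gradient_v_per_m = 100e6    # 100 MV/m accelerating gradient
arm_length_km = target_energy_ev / gradient_v_per_m / 1000.0
print(arm_length_km)  # -> 15.0 km per arm, before any overhead
```

which is consistent with a total machine footprint of a few tens of kilometres.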

by someone (s0me1smail(a)gmail(d)com) on Fri Feb 22nd, 2008 at 08:03:47 AM EST
[ Parent ]
Oh dear, what can the matter be?

You can't be me, I'm taken
by Sven Triloqvist on Fri Feb 22nd, 2008 at 08:05:11 AM EST
[ Parent ]
So make that 2023 into 2035 for real operation, if the history of the LHC is any guide.
And ILC technology is in operation already, while CLIC needs a lot of additional R&D. DESY's XFEL is operated with cavities good enough for an ILC.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 08:36:58 AM EST
[ Parent ]
Basically they have to rule out various masses by proving that they can't see them. Experiments at LEP and the Tevatron have put the mass limits between 114 and ~200 GeV.

All standard model Higgs bosons (where it's just a Higgs boson with no other weird physics) decay into two particles. Which decays occur depends on the mass of the Higgs - for instance, I'm studying 165 GeV, which is roughly twice the W mass, so in this region WW is by far the most likely decay.

So far they've not seen this above the background, and the more data they have, the more they can rule it out. The LHC will provide more data in the low (~114 GeV) region, where the main decays are b/bbar quarks, photons, tau leptons, c/cbar quarks and two gluons. All of these occur anyway in the detector, so it's hard to spot - which is why there has been the least work in this region so far, and why ruling out the less likely WW and ZZ decays has been easier. Although it's the decay that occurs the least, the photon decay is the easiest to spot.

If they don't find the Higgs at any of these energies, it means it's not just a standard model one - i.e. it behaves not just by the laws we already know, but by new physics we've never seen. Supersymmetry and extra dimensions are two of the more prominent theories for why the Higgs may not be in the standard model region.

by darrkespur on Thu Feb 21st, 2008 at 04:39:30 PM EST
[ Parent ]
That's true. But it is even better. Unless the Higgs has pretty much exactly 116 GeV, so-called loop corrections will destroy the model once more if there are not more particles to shield this effect. As this is a kind of fine-tuning people are not happy with, there are basically two classes of models introducing new particles.

One is supersymmetry. This gives each particle we already know a heavier superpartner, which has the property of cancelling the contribution of its original particle in these loops (for physicists not connected to the matter: it makes a boson partner for every fermion and vice versa. In Feynman diagrams fermions come in with a negative sign, bosons with a positive one, so the superpartners cancel their normal partners). The particle I proposed in the diary to catalyse fusion is the stau, the superpartner of the tau, itself an even heavier partner of the electron than the muon. In some of the parameter space it can be relatively long-lived.
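A purely schematic illustration of that sign bookkeeping (the real loop calculation involves couplings, masses and logarithms; this only shows why equal partner couplings kill the dangerous cutoff-squared pieces):

```python
# Schematic: fermion loops enter the Higgs mass correction with a minus
# sign, boson loops with a plus sign. With equal couplings, the
# quadratic cutoff pieces cancel between a particle and its superpartner.
cutoff_sq = (1e16) ** 2   # some huge cutoff scale squared (arbitrary here)
coupling = 0.5            # illustrative common coupling

fermion_loop = -coupling * cutoff_sq   # e.g. the top quark
boson_loop = +coupling * cutoff_sq     # e.g. its superpartner, the stop
print(fermion_loop + boson_loop)  # -> 0.0
```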

The other is some variation of 'technicolor' or 'warped extra dimensions'. This is in general the more 'natural' solution of the problem, but even more than in the case of supersymmetry one would have expected to find deviations from the standard model about 20 years ago.

The good thing about the LHC is that we really enter the region in which these models have to exist, if they are to help solve any problem.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 05:40:20 PM EST
[ Parent ]
If the super-particle is OPPOSITE--boson for fermion, fermion for boson--then I do not understand how a stau particle--which would be a boson, right?--could replace an electron (a fermion) in a deuterium atom.  

Unless a tau particle is a boson, which would mean I know even less than i thought.  Isn't the tau a fermion like the electron?  

Or is Pauli exclusion irrelevant?  Higher-order elements could then get interesting:  imagine Li with all three (-) particles sitting in the 1s shell.  

But anyway, which is it?  I AM confused.  

The Fates are kind.

by Gaianne on Sat Feb 23rd, 2008 at 03:38:05 AM EST
[ Parent ]
Any negatively charged particle will do. You can even use an anti-proton. Lifetimes may be rather short, however.
Exotic atoms cast light on fundamental questions - CERN Courier
The Paul Scherrer Institut in Villigen has investigated pionic hydrogen (π--p) and deuterium (π--d), and DAFNE in Frascati has investigated their kaonic counterparts. Other no-less-important species include kaonic and antiprotonic helium, which have been studied at the Japanese High Energy Accelerator Research Organization (KEK) and CERN, and yet another exotic variety is formed by the non-baryonic π+- π-(pionium) and πK atoms. Finally, the antihydrogen atom, pbar-e+, which CERN has copiously produced, is in a class of its own owing to its importance for testing the CPT theorem to extremely high precision.
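The reason such exotic atoms matter for the fusion-catalysis idea is that the Bohr radius scales inversely with the reduced mass of the orbiting particle, so a heavier negative particle sits far closer to the nucleus. A quick back-of-envelope comparison of muonic versus ordinary hydrogen, using standard particle masses:

```python
# Bohr radius ~ 1/(reduced mass), so the shrink factor is a mass ratio
M_E, M_MU, M_P = 0.511, 105.7, 938.3  # electron, muon, proton in MeV/c^2

def reduced_mass(m1, m2):
    return m1 * m2 / (m1 + m2)

shrink = reduced_mass(M_MU, M_P) / reduced_mass(M_E, M_P)
print(round(shrink))  # -> 186: muonic hydrogen is ~200x smaller
```

A stau, being heavier still, would shrink the orbit even further, which is why it gets mentioned as a catalyst candidate.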

by someone (s0me1smail(a)gmail(d)com) on Sat Feb 23rd, 2008 at 11:07:27 AM EST
[ Parent ]
I suppose Pauli exclusion is irrelevant since we're talking about Hydrogen. But, in addition, having light bosons filling the orbitals would mean that Pauli exclusion doesn't contribute an "exchange energy" to the interaction between various hydrogen atoms, and so you get the benefit not only of smaller atomic radius but also of the ability of the atoms to actually overlap significantly.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Sat Feb 23rd, 2008 at 11:15:32 AM EST
[ Parent ]
I mean heavy bosons.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Sun Feb 24th, 2008 at 05:14:29 AM EST
[ Parent ]
Martin:
That's true. But it is even better. Unless the Higgs has pretty much exactly 116 GeV, so called loop corrections will destroy the model once more, if there are not more particles to shield these effect. As this is a kind of fine tuning people are not heavy with it, there are basically two classes of models introducing new particles.
Is that "kind of fine tuning" a fixed point of the renormalization group flow? Because, in that case, there's nothing particularly bothersome about the fine tuning: it's an internally dynamically-determined parameter of the model, not an externally finely-tuned parameter.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Sat Feb 23rd, 2008 at 11:27:27 AM EST
[ Parent ]
I don't think so. I'm an experimentalist, but it's really about the exact Higgs mass, which should be a free parameter in the standard model - otherwise we could stop searching in a range - and it is used by about every theorist who wants to endorse Susy, so it should be a real effect.

Just as an update for you on the actual issues - why else we believe the standard model (SM) is incomplete.

Dark matter exists:

I've seen the picture already elsewhere, but it is from here. These are two galaxy clusters flying through each other. The reddish part is X-rays caused by the interaction of the gas. The blue is where the mass of the clusters is, detected by gravitational lensing. So it is clear that most mass in the clusters is not interacting hadronically or electromagnetically.
In principle this matter has to interact only through gravitation, and weak interaction is optional, but as weakly interacting particles would have decoupled in the Big Bang at pretty much the point that would explain today's dark matter, there is hope that we can find it in colliders and that it is e.g. the lightest supersymmetric particle, which is stable when supersymmetry is assumed to be conserved.

Another thing which the SM has problems explaining is neutrino masses - why they are so small, though they have mass. (Cosmological structure formation indicates that dark matter is non-relativistic, so much heavier than neutrinos, which don't contribute a lot.) We think they have mass because oscillation from one flavour into another has been detected. Several experiments are trying to find out more about the masses and the (CKM-like) mixing matrix.
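For reference, the standard two-flavour approximation behind those oscillation experiments: the transition probability depends on the mixing angle, the mass-squared splitting, and the baseline-to-energy ratio (textbook formula, with rough illustrative parameter values):

```python
import math

def p_transition(sin2_2theta, dm2_ev2, baseline_km, energy_gev):
    """Two-flavour oscillation probability (standard approximation).

    sin2_2theta : sin^2(2*theta), the mixing strength
    dm2_ev2     : mass-squared splitting in eV^2
    """
    phase = 1.27 * dm2_ev2 * baseline_km / energy_gev
    return sin2_2theta * math.sin(phase) ** 2

# Zero baseline -> no oscillation; a long baseline develops a large phase
print(p_transition(1.0, 2.5e-3, 0.0, 1.0))    # -> 0.0
print(p_transition(1.0, 2.5e-3, 500.0, 1.0))  # close to maximal
```

Measuring how this probability varies with L/E is what pins down the splitting and the mixing angle.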

It is not clear by which mechanism the SM would create an excess of matter over antimatter in the universe as it is seen. There are ideas for such processes, and even in the SM one can explain some asymmetry, but only about 10^(-5) of the observed effect. More CP-violating phases are needed. Colliders, maybe more the lower-energy B factories (Japan plans a new one), may help to find them.

Fluctuations in the cosmic microwave background radiation (CMBR) and measurements of the speed of galaxies very far away indicate that most of the energy in the universe is dark energy.

Dark energy has a negative pressure (which I personally find mind-boggling) and increases the speed of expansion of the universe. It is not clear if this is just a cosmological constant, as once introduced by Einstein to make his formulas consistent with a static universe, or if it is something dynamical. One hopes to find out more with better observation of the CMBR. It could be that an extremely precise measurement could find a shifting coupling constant alpha, if it is something dynamical.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Sat Feb 23rd, 2008 at 05:48:34 PM EST
[ Parent ]
Higgs mass

Okay, let me get this straight: if the Standard Model Higgs is only renormalizable for a particular choice of the Higgs' mass, this is not considered a prediction but a flaw of the model. However, if bosonic strings can only be made consistent in 26 dimensions, or superstrings in 10 dimensions and with the help of supersymmetry, these things for which there is zero empirical evidence are considered predictions of string theory and not flaws. Moreover, were a supersymmetric particle discovered, this would be considered evidence of string theory even though supersymmetry doesn't require string theory. You said it right, though:

the exact Higgs mass ... should be a free parameter in the standard model, otherwise we could stop searching in a range, and it is used by about every theorist who wants to endorse Susy, so it should be a real effect.
Since you're an experimentalist I hope you won't take the above personally, but anyway I'll offer you a bet that the Higgs will be found at 116 GeV.
Martin:
Unless the Higgs has pretty much exactly 116 GeV, so called loop corrections will destroy the model once more, if there are not more particles to shield these effect.

Dark Energy

As for Dark energy, I'm going to go with the Cosmological constant until there is any evidence to the contrary. As far as I know, even "exotic matter" can't produce a "negative pressure" stress-energy tensor.

Neutrino Mass

I don't consider throwing in one (or several) right-handed neutrinos and a CKM-like mixing matrix a challenge to the Standard Model. "Explaining" the masses of the various particles is a challenge, but as far as I can tell there's no candidate for a theory that does that. [No, string theory is not it: apart from having a proliferation of vacua, the only way they can get a low-energy spectrum of particles is by assuming they all have zero mass; nobody has a mechanism for supersymmetry breaking, and the supersymmetry-breaking scale just introduces a whole bunch of new parameters to explain.]

Dark Matter

My guess is as good as any other - but if dark matter only interacts gravitationally it won't be seen at the LHC. Actually, the quantum numbers match a "superheavy right-handed neutrino" too...

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Sun Feb 24th, 2008 at 05:13:22 AM EST
[ Parent ]
I won't bet against you. Indirect experimental evidence has its central value below the LEP limit, which is 114 GeV. 116 GeV really looks for the moment to be the best guess, even without the renormalization argument.

I don't regard string theory as a physical theory at all, as they can always shift their parameters in such a way that any (non)observation is explained.
My boss completely dislikes Susy, but we have another prof who has now been working for decades to prove it (without success).

It may well be that the LHC finds only a Higgs (and only after quite a long time of running, if it is so light) and nothing else. I'm not at all sure there is something else, although there are some less compelling hints. However, Susy and some other models should really be dead if the LHC finds nothing.
I only wanted to give you an overview of the reasons why people are searching for other things at all, instead of simply sitting down and saying it is not worth trying because nothing but a complete SM can be expected anyway.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Sun Feb 24th, 2008 at 08:31:02 AM EST
[ Parent ]
Oh, it is definitely worth a try; I never implied otherwise. In fact, given the accessibility of the energy range and the necessarily ad-hoc nature of the various models of the Higgs sector, it would be unforgivable not to try.

If the SM Higgs is found, with no evidence of physics beyond the SM below 1 TeV (including "corrections" due to physics at higher energies), I think it will be safe to say that theoretical high-energy physics will have "died of success". There would be no strong case for higher-energy accelerators, leaving aside how difficult it would be to build something to probe the 10 TeV range.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Sun Feb 24th, 2008 at 09:44:27 AM EST
[ Parent ]
One needs something to explain masses, basically a scalar field. There are some theories building the scalar field up as a composite of vector fields, but it's rather weird.

I think darrkespur was referring to non-standard-model Higgses. But as explained elsewhere in the comment section, if the Higgs isn't in the low-mass region, there have to be more particles to prevent other problems from making the theory inconsistent.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 05:56:14 PM EST
[ Parent ]
A serious question: what important developments do you think came out of the Apollo project? I hear these kinds of claims quite often, but there's usually less meat on closer inspection.

Someone above mentioned HTML as a spin-off of CERN. But should we really believe something similar would not have been invented elsewhere within a few years, had there been no CERN? Especially considering that the thousands of smart people who work on these projects wouldn't disappear, but would be working elsewhere, on other projects with potential spin-offs.

by GreatZamfir on Thu Feb 21st, 2008 at 03:02:53 PM EST
[ Parent ]
Just on your second paragraph: sure, I think somebody else would have come up with something similar some years later. But I think the difference in economic impact if the www had been invented 5 years later would already be very big.

And on what project do you think these 1000s of smart people would have worked? Financial innovations? Micronukes?


Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Thu Feb 21st, 2008 at 03:09:35 PM EST
[ Parent ]
I think they'd probably be working on that fart problem Pierre mentioned. It sounds pretty serious, to me.

Il faut se dépêcher d'agir, on a le monde à reconstruire
by dconrad (drconrad {arobase} gmail {point} com) on Thu Feb 21st, 2008 at 03:30:23 PM EST
[ Parent ]
You are not seriously claiming 1000s of smart people wouldn't have had anything useful to do? What would have happened if LHC funding didn't go through? I am quite sure the people involved had other plans, not just the people directly involved but also all the people working for companies that supply to LHC.

As for HTML, why not invert that idea? Perhaps without CERN it would have been invented earlier... The point is that there is so little relationship between CERN's activities and HTML that it seems too strong to claim that without CERN, the WWW would have taken 5 years more.

After all, the Web depends not just on HTML, but on a whole lot of interdependent technologies, both in hardware and software, that were growing in the 80's.

by GreatZamfir on Fri Feb 22nd, 2008 at 05:16:56 AM EST
[ Parent ]
You underestimate the importance of HTML in creating the web.

Particle physics had progressed so fast since the 1940's that the particle physics community had developed a system of "preprints", in which people circulated drafts of their papers to colleagues at their institutions and others months before they were published in journals. The story goes that Tim Berners-Lee got tired of e-mailing documents back and forth to colleagues at CERN and decided to invent HTML and code a bare-bones browser to allow him to (as we would say today) webcast his research. There is something about the pace of information exchange within CERN and in the particle physics community that supports the idea that HTML might have taken 5 more years to be developed elsewhere (and it would have been some university or other: USENET and the text-based tools to go with it, and GOPHER, developed in that environment).

The large particle physics laboratories do employ thousands of physicists, engineers and programmers specifically for particle physics experiments, and that is a non-negligible fraction of the respective academic communities. If the large labs didn't exist these people would be competing for academic jobs elsewhere, and it would result in more people going to industry, as well as fewer people getting doctorates.

If LHC funding hadn't gone through, CERN would have stagnated and maybe shrunk. You need far fewer people to run existing facilities than you do to develop a new facility, and the LHC research programme is much more intense than what can be carried out at the existing facilities (not that that isn't useful, too, but it's on a smaller scale in terms of people and resources).

Consider CERN and the LHC a Keynesian stimulus package for physics and engineering.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 05:26:34 AM EST
[ Parent ]
The key thing about CERN was that the people who work there are spread across the planet a lot of the time: HTML - and more importantly HTTP - were designed to solve exactly the problem of sharing information with a geographically dispersed community, all of whom would be publishing data. It followed on from gopher in some pretty obvious ways but was much less structured, which is its main beauty.
by Colman (colman at eurotrib.com) on Fri Feb 22nd, 2008 at 05:33:14 AM EST
[ Parent ]
As an aside, it's only now, with people producing content all over the place, that the original vision for the web is being fulfilled - the phase of company brochure sites was painful to watch.
by Colman (colman at eurotrib.com) on Fri Feb 22nd, 2008 at 06:02:16 AM EST
[ Parent ]
And we're doing it by working around the shortcomings of the current publication model, as well.
by Colman (colman at eurotrib.com) on Fri Feb 22nd, 2008 at 06:02:55 AM EST
[ Parent ]
Thanks for these elucidations. To make it more general, could I say the idea is more or less "fundamental, difficult research is likely to encounter problems ahead of the rest of society, and is therefore relatively likely to find useful spin-off solutions" ?

After all, it is possible to predict in hindsight that CERN would be perfect for developing a useful hypertext system. But if one wants to use the unexpected, unpredictable benefits of a project as one of the arguments for funding, there has to be a rationale why this particular project or field is especially likely to lead to unexpected benefits.

by GreatZamfir on Fri Feb 22nd, 2008 at 05:56:57 AM EST
[ Parent ]
In addition, "big science" projects tend to have engineering specs just outside what is possible when they are designed. LHC (and, before, LEP) have required faster electronics than existed at the time they were designed, efficient cryogenics, superconducting magnets, and so on. In that way, CERN drives technology development just like, say, the specs for the next generation of high-speed trains or the Shinkansen do. The same is true of NASA's plans for the next generation of space telescopes (including gravitational wave detectors).

So, big science drives technological developments in established fields, as well as occasionally resulting in new technology. [I distinguish two basic modes of technological progress: secular improvements in technology, and new technologies - only the latter qualifies as "innovation" IMHO, and that is not predictable in the way that one can use, say, Moore's law when designing the specs of a computer system to be deployed 5 years in the future.]

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 06:03:38 AM EST
[ Parent ]
A bit off-topic, but the improvement/innovation distinction is another view I am rather sceptical about. If you zoom in on the 'improvements', you usually see the same picture again: some of the improvements are seen as radical changes in the field itself, some still look like gradual improvements. Zoom in on the gradual improvements, same picture: what looks like gradual improvement from the outside is unexpected innovation closer up.

I would argue it's innovation all the way through. Some improvements change a subfield, and from the outside it looks like gradual, expected improvement. Some change a field, and the outside world can notice it and say it's something fundamentally different.

by GreatZamfir on Fri Feb 22nd, 2008 at 07:07:32 AM EST
[ Parent ]
Well, actually, from the point of view of I/O models of the economy there's a distinction between whether an advance just changes the productivity/cost coefficients of the model, or changes its dimensionality by adding a new process or a new product.

The difference between the dynamical systems we are used to considering in physics and biological or economic evolution is the possibility of the system of differential/difference equations changing dimensionality in response to processes within the system itself.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 07:30:00 AM EST
[ Parent ]
I would consider this more an artifact of the modelling than a fundamental point about reality. After all, how do you determine when a new product adds a dimension, or just changes existing coefficients? As long as a product is a perfect replacement for some existing product, only better along an existing axis, that's easy.

But in reality, new products and inventions, even improvements on existing ones, are usually not that simple. They add an extra dimension, more freedom to find better solutions to problems. But in a high-level, low-dimensional description, this freedom can either be collapsed into a change in parameters, or really added as an extra dimension, if the effects are important enough.

Funny thing is, I am currently working on shape optimization, where it is completely natural to change the number of parameters used to describe the shape, and thus the dimension of the problem.

A related field is order reduction, where you try to (locally) approximate a physical phenomenon by its most important modes. If there is a change in the physics, you can either modify the modes while keeping the same number of them, or you might find that the new situation requires more modes to describe it well enough.
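That mode-counting view can be sketched with a minimal proper orthogonal decomposition (just an SVD) of synthetic snapshot data. The field and its modes below are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "snapshots": a field sampled at 100 points, 30 time instants,
# built from 3 dominant spatial modes plus small noise.
x = np.linspace(0.0, 1.0, 100)
modes = np.stack([np.sin(np.pi * k * x) for k in (1, 2, 3)])
coeffs = rng.normal(size=(30, 3))
snapshots = coeffs @ modes + 0.01 * rng.normal(size=(30, 100))

# POD = SVD of the snapshot matrix; the singular values measure how
# much "energy" each mode carries.
_, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)

# How many modes does the reduced-order model need to capture 99%
# of the energy?
n_modes = int(np.searchsorted(energy, 0.99) + 1)
print(n_modes)  # 3
```

If the underlying physics changed so that a fourth mode mattered, `n_modes` would jump: the dimension of the reduced description is itself an output of the analysis, not a fixed choice.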

I would suggest this is a good analogy for your innovation/improvement distinction.

by GreatZamfir on Fri Feb 22nd, 2008 at 08:07:51 AM EST
[ Parent ]
Well, a new dimension corresponds to a new manufacturing process, with different inputs. As long as there is substitutability you don't have "true" innovation.

I am familiar with dimension reduction (proper orthogonal modes, principal components, factor analysis...) and you're right, at some level the number of variables is a matter of choice. But you still have to be able to close the system of equations. You can always ascribe the effect of all the neglected modes to "noise", though.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 03:10:26 PM EST
[ Parent ]
Well, you could say that they would create something similar if they were doing something else, and that might be true, but without the funding and supplies of such a project they wouldn't have the freedom of a living to develop these things. There's also a great deal of cross-collaboration in these things: if they aren't working in science, or are working on smaller projects, the chances of coming up with something spectacular are almost certainly lower.

One of the main things to come out of Apollo etc., IIRC, was the development of computer chips for the project. Large advances in microchips and materials science filtered out to the outside world. Whilst industry might have got there as well, I'd say it would almost certainly have got there slower, due to the very nature of business: a business looking at short-term profit is far less likely to allow its researchers the time and space to create a bigger, longer-lived project with its associated spinoffs.

Early research is expensive mainly because you don't know what the right solution is. It could be any number of different options, and until you pick one you don't know, so there has to be a lot of investment without too much immediate pressure for results on every route. A lot of those routes will be blind alleys, but without checking you'll never know whether they are the right one or not.

by darrkespur on Thu Feb 21st, 2008 at 03:17:08 PM EST
[ Parent ]
But then the million dollar question is, why did the US spend the money on Apollo, and not directly on chip research? Especially as guided missiles needing chips were not exactly unimportant outside of the Apollo/manned space flight program.
by GreatZamfir on Fri Feb 22nd, 2008 at 05:01:44 AM EST
[ Parent ]
Three related reasons: Sputnik angst, a race with the Soviet Union in every field, and aerospace technology development for military purposes.

But the fact is, the Apollo program was a one-shot thing. It was wound down and the US lost its ability to fly to the moon. It also discontinued its high-payload rockets in favour of the Space Shuttle, so now the rocket market is occupied by the European Ariane and the Russian Proton.

The Soviet manned space program made more scientific and technical sense than the American one, and the ISS owes more to the Russian Soyuz and Mir than to the American Skylab, which was also discontinued and folded into the Shuttle.

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 05:12:22 AM EST
[ Parent ]
Yeah, I know. I am a final-year student in aerospace engineering, so I have heard my fair share of space histories...

One side story that I found particularly intriguing was a note between, I think, McNamara and Lyndon Johnson in the early 60s. In it they discuss the budget surpluses they are expecting for the late '60s, and they fear Congress will call for tax reductions before they can use the surpluses for their Great Society plans. So in the meantime they consider Apollo a good and popular way to keep the budget balanced until they have better things to do with the money. Then came Vietnam...

But more on topic, the whole 'spin-off' concept seems to have been pretty much invented by NASA in later years to justify its budgets, and it is now used for many 'big science & engineering' projects when the wider public has doubts about the costs.

by GreatZamfir on Fri Feb 22nd, 2008 at 05:38:00 AM EST
[ Parent ]
(1) Spend money on chip research for what? What would the chips be used for? The great proliferation of electronics came on the back of very advanced requirements for components for space programs and the like. One could argue that only once they had been developed for such purposes was it possible to consider their use for more mundane matters. The personal computer only became possible with a maturation of integrated circuit technology, computation infrastructure, and computational techniques that allowed for cheap mass manufacture. The drivers for this technology were expensive research programmes in fields requiring the processing of large data sets, such as, say, high energy physics research. Forget about the direct spinoffs; I would argue that the influences of these expensive programmes are far more subtle. Technological diffusion requires that the basic building blocks are already lying about, looking for a different application. You don't start developing processor technology because you think that in 20 years you'll be able to make a machine to play games on.

(2) Missile programmes, you say? Because military programmes with large destructive potential are soooo useful, while high energy physics, space exploration and the like are vanity projects! And you know how the military loves to share the technology it develops and would never want to keep it secret. One of the great advantages of the large-lab high energy physics environment is exactly that it is not a military programme. We don't build things to kill people; I think this is a plus. Further, there is no tendency to keep progress secret. Quite the opposite in fact, and thus a greater chance that progress made here can diffuse faster and wider.

(Disclosure, I work at CERN.)

by someone (s0me1smail(a)gmail(d)com) on Fri Feb 22nd, 2008 at 06:06:35 AM EST
[ Parent ]
In other words (and this ties in with my comments on the HTML subthread), technological progress is largely demand-driven. If you want progress you have to create demand for advanced technology. You can choose the form your Keynesian stimulus will take: will it be big science or big guns? And other public spending is in the same category. Do you want to drive the development of medical treatments? Improvements in construction techniques and materials? Improvements in transportation technology? Energy technology? The way to do this is to publicly fund projects which push the boundaries of what's possible. The private sector could do this too, but it can't afford the solvency risk of sinking money into failed research. The public sector can. It's just a matter of priorities.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 06:42:09 AM EST
[ Parent ]
I think Apollo was a product, not a cause. After Sputnik there was a massive push towards science and engineering in the US, and Apollo fell out of that. So did most of the computer industry.

There's very little evidence to suggest that Apollo contributed directly to electronic design. The first patent for single-chip microcircuitry was granted in 1958. Computers with modular logic were built around the same time. Putting the two together was the obvious next step, and would have happened anyway.

Apollo was mostly a PR exercise for US science and engineering. There may have been some spin-offs in materials science and - obviously - rocket science. But Apollo hasn't left much of a trace in computer science history.

In fact it's rarely mentioned at all. Projects like SAGE, which was the first generation US air defence network, were much more important. NASA did buy a big IBM System/360 for Apollo, but System/360 was already around, and IBM were more interested in selling it as a tool for airline bookings than managing a space program with it.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Feb 22nd, 2008 at 08:35:45 AM EST
[ Parent ]
One bit of hearsay lore that I picked up somewhere (probably on TV) is that the physical space constraints inherent in spacecraft design prompted Apollo scientists and related engineers in various industries to work on making things like transistors work in a practical setting, as the existing vacuum-tube technologies were simply too big.
by Zwackus on Tue Feb 26th, 2008 at 12:51:39 AM EST
[ Parent ]
I don't think it's about volume; weight is more likely, and I think it was mainly the Minuteman program that really required them. But I would suggest this was only a slight influence. People tried to build integrated circuits all through the 50s, and the first successful ones appeared somewhere around 1960. So there might have been a few years when rocket programs were the main users, between their development and first commercial use in the mid-60s.

Keep in mind that the big advantage of ICs, even in those years, was the possibility of getting prices down through mass production. Not really something the space program, or even Minuteman, was very concerned about.

by GreatZamfir on Tue Feb 26th, 2008 at 03:57:56 AM EST
[ Parent ]
GreatZamfir:
Someone above mentioned HTML as a spin-off of CERN. But should we really believe something similar would not have been invented elsewhere within a few years, had there been no CERN?

That's a really hard question to answer. HTML didn't happen directly because of CERN, but it happened because CERN was an environment in which a quick mark-up system would be instantly useful, and because there was no need for 'research' to invent anything more complicated.

There were many, many alternatives to HTML, including horrible things from academia that are best forgotten.

I know people who were researching them, and while they were often better than HTML in many ways - e.g. no broken links - they were also wretchedly overcomplicated, with limited public appeal.

So HTML might well have never happened in its current form. We could easily have had some kind of Windows-ish or other system of gargantuan complexity and slowness.

If you look at academic vs 'public' computing there's a clear pattern of highly abstracted command-line tools in academia (e.g. LaTeX), and much simpler WYSIWYG colouring-book computing in the public area.

HTML broke that pattern by doing something script-ish but relatively simple inside academia, which subsequently escaped into the wild.

That hasn't really happened before, which I think means it's not something that could be relied on.

Or in other words - it's likely CERN got lucky.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Thu Feb 21st, 2008 at 06:01:00 PM EST
[ Parent ]
The semiconductor industry would likely be a few decades behind its current state if there had not been military and space applications for the transistor when it was invented. The reason is the gap between military and commercial viability, defined mostly by cost, and arguably by transistor size (and thus integrated circuit complexity) as well. That gap was filled with public money in the form of the US military budget. The industry grew, commercial viability eventually grew out of that, and from then on the industry could sustain itself.

Had that scenario not occurred, the industry would not have existed until advances in other sciences and industries had indirectly created commercial viability for it.

you are the media you consume.

by MillMan (millguy at gmail) on Thu Feb 21st, 2008 at 06:05:40 PM EST
[ Parent ]
I'm not sure it's as clean as that.

Mainframe computing was established by the late 50s, and mini-computing was just starting up. The market was already worth $billions by then. There were some prestige military projects - e.g. SAGE again - and a lot of DARPA funding for research. But the civilian market was already huge, with its own momentum.

Once TI introduced TTL logic in the early 60s, computers became a lot cheaper. At the same time a strong hobbyist culture fanned by magazines kept interest in technology running very high, so there was a steady stream of wannabe engineers with experience of digital techniques from their early teens.

Microprocessors were already being planned in the mid-60s. The biggest gap was between commercial computing and the microprocessor market, and that was bridged by developing a general-purpose microprocessor and putting it into a commercial product: a desktop calculator. It wasn't a military project.

Now you had hobbyist/hacker culture with access to microprocessors and a background of DARPA funded interface and networking research.

The rest was probably inevitable.

What's astonishing is how fast it happened. Most of the core ideas - laptops, databases, the web, GUIs and interactivity, distributed processing, networking, 3D graphics - appeared between 1958 and 1968.

There's been very little genuinely new since then. Most of what's happened has been faster and cheaper, but not so truly innovative.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Feb 22nd, 2008 at 09:15:00 AM EST
[ Parent ]
It would be interesting to compare commercial to military/government revenues over time. I should study the topic further, because it sits at the intersection of several topics I'm interested in.

The early commercial viability of mainframes is a good point that I managed to forget. I'll still make my vague 20 year claim, though.

I agree that it all happened shockingly fast.

Most of what's happened has been faster and cheaper, but not so truly innovative.

I disagree. Reading IEEE magazines since I became an EE major in college, there has been some stunning work over the years in semiconductor physics that was required to reach the commercially viable transistor sizes we have today. From the computing point of view, though, I agree with what you're saying.

you are the media you consume.

by MillMan (millguy at gmail) on Fri Feb 22nd, 2008 at 01:22:41 PM EST
[ Parent ]
There was once a plan for a really huge (I think 80 km) collider in the USA. They had already hired 2000 people. Then the program was canceled. Quite a number of those scientists made their way into finance and made complex derivatives a lot more popular.

So without CERN, Europe might simply have more financial ABS, CDO, CDS, SIV soup. In some countries this counts as a productivity increase, but for my part I prefer the Internet.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 09:21:12 AM EST
[ Parent ]
Yeah, that project was canned in 1993. The media at the time promoted it as the perfect example of government waste.

you are the media you consume.

by MillMan (millguy at gmail) on Fri Feb 22nd, 2008 at 01:24:34 PM EST
[ Parent ]
An interesting episode in the killing of the SSC was how, during the Congressional hearings, a prominent physicist (someone big, like John A. Wheeler or Murray Gell-Mann or Steven Weinberg) was asked by a congressman whether the LHC would provide evidence of the existence of God. The negative answer did not help.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 03:05:41 PM EST
[ Parent ]
Nice try, but you didn't get the second LHC->SSC.
Deleting your own comments! I see what the frontpagers are up to now... For shame!
by someone (s0me1smail(a)gmail(d)com) on Fri Feb 22nd, 2008 at 03:09:24 PM EST
[ Parent ]
I was only 16 at the time and not particularly into current events, but it was clearly a political circus if there ever was one.

you are the media you consume.

by MillMan (millguy at gmail) on Fri Feb 22nd, 2008 at 03:10:18 PM EST
[ Parent ]
Some theories in circulation today contain a particle which could be better suited. It would be even much heavier (~200,000 times the electron mass) and could have lifetimes even on the level of seconds (though not longer than 10 seconds or so, because otherwise we would see effects in cosmology from the early universe).
Can you be a bit more specific about the kind of particle, and the theories, we're talking about?

Would one of the components of a Higgs multiplet work, or do you need the particle to be a fermion?
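Whatever the candidate particle turns out to be, the reason its mass matters can be checked with a Bohr-model estimate. This is a deliberately crude sketch: screening, nuclear size, and the particle's actual quantum numbers are ignored, and the ~200,000 electron-mass figure is simply taken from the quoted text:

```python
# Bohr-model estimate of how close a heavy negative particle orbits a
# proton. The orbital radius scales as 1/(reduced mass); masses below
# are in units of the electron mass.
A0 = 5.29e-11        # Bohr radius of ordinary hydrogen, in metres
M_PROTON = 1836.0    # proton mass in electron masses

def orbit_radius(m_particle):
    """Ground-state radius for a negative particle of given mass (in m_e)."""
    mu = m_particle * M_PROTON / (m_particle + M_PROTON)  # reduced mass
    return A0 / mu

r_electron = orbit_radius(1.0)      # ordinary hydrogen, ~5.3e-11 m
r_muon = orbit_radius(206.8)        # muonic hydrogen, ~2.8e-13 m
r_heavy = orbit_radius(2.0e5)       # hypothetical ~200,000 m_e particle

# Note the saturation: the reduced mass cannot exceed the proton mass,
# so the heavy particle orbits only ~10x closer than the muon (~3e-14 m),
# though that is already comparable to nuclear length scales.
print(r_electron, r_muon, r_heavy)
```

So the gain over muon-catalyzed fusion saturates: beyond a few proton masses, extra particle mass buys almost no further shrinkage of the orbit.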

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Thu Feb 21st, 2008 at 02:29:48 PM EST
But can the physics there do anything useful?

Can't resist a final quip ... if it could destroy the Earth, that would be the ultimate in usefulness, such destruction being a certain improvement upon the current state of affairs.

(Feeling rather Schopenhauerian this morning, you see.)

by wing26 on Thu Feb 21st, 2008 at 08:19:42 PM EST
Sorry, physics can't deliver that quite so easily, but a few dozen hydrogen fusion bombs might already do it with current technology.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 09:03:15 AM EST
[ Parent ]
I thought I knew some things about modern physics before I stumbled into this thread. You know, from reading Hawking and stuff.
Now, I feel stupid again. Thanks a lot.

"If you know your enemies and know yourself, you will not be imperiled in a hundred battles." Sun Tzu
by Turambar (sersguenda at hotmail com) on Fri Feb 22nd, 2008 at 10:24:14 AM EST
"I thought I knew some things about modern physics before I stumbled into this thread. You know, from reading Hawking and stuff. Now, I feel stupid again."

I think you have, accidentally perhaps, hit the nail of this discussion squarely on the head. The whole problem with modern science is that you have to be a specialist to understand it. I know enough about physics to know that I don't know much (one of Rumsfeld's "known unknowns"), and a big part of the problem is that the math is too hard for me. But my creationist friend doesn't realize that there is any math behind all this, and he just compares what sounds to him like a bunch of gobbledygook to the stuff that comes from his evangelical minister.

Without the math (and the many preliminary layers of development you have to go through to get to the modern understanding of the universe), popularized science has no more claim to correctness than the Flying Spaghetti Monster. It's a serious, serious problem, and one result is the funding disaster hitting Fermilab and SLAC right now. Forget the gigantic new projects; even the existing good ones can't get money, because no politician in the country ever even signed up for Physics for Political Science majors. (Well, except for Jimmy Carter, maybe.)

The scientific community needs to figure out a solution to this problem or we are going to find ourselves in the 14th century before we know it.

by asdf on Mon Feb 25th, 2008 at 11:40:51 PM EST
[ Parent ]
Can you make that a diary?

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Tue Feb 26th, 2008 at 05:21:51 AM EST
[ Parent ]
As it happens, I am an economics/political science guy who has actually taken a course called Physics for Poets.

Peak oil is not an energy crisis. It is a liquid fuel crisis.
by Starvid on Tue Feb 26th, 2008 at 09:31:18 AM EST
[ Parent ]
Can you describe the course?

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Tue Feb 26th, 2008 at 09:33:23 AM EST
[ Parent ]
It covers a quarter of a semester and tries to explain the scientific world view to arts students. Physics with all but the really easy math ripped out. I guess you could call the course "The Theology of Science" as it covers the important theories, "saints", history, conundrums etc of science. I think it was a great course, even if I did study the stuff (with the math) in school before I went to university.

How did the methods of natural science develop? How did Ptolemaios, Aristotle, Brahe and Newton view the world? Why does the earth circle the sun? Is everything predestined? What are quantum mechanics and the theory of relativity, really? Can you travel in time? What is a black hole? What do we know about the Big Bang and creation? The course gives an overview of the scientific view of the world and modern physics.

It was really quite complex and covered things like helium flashes, top-down versus bottom-up theories of galaxy formation, quarks and bosons, string theory and supersymmetry, and lots of scientific history. Mixed in with understanding famous results like Kepler's laws.

Pretty much everyone who took the course loved it and felt everybody else should also take it.

Peak oil is not an energy crisis. It is a liquid fuel crisis.

by Starvid on Wed Feb 27th, 2008 at 06:17:34 AM EST
[ Parent ]
Maybe I'm prejudiced, though I don't think so, but I think there really is some non-mathematical evidence that the Big Bang theory is superior to an ad-hoc creation 10,000 years ago...
Unfortunately, it is also very difficult to explain much of this evidence in a short time, as there are so many things which fit together.

One could ask a typical GOP member who holds that the Bible must not be interpreted what he makes of "Again I tell you, it is easier for a camel to go through the eye of a needle than for a rich man to enter the kingdom of God."


Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers

by Martin (weiser.mensch(at)googlemail.com) on Tue Feb 26th, 2008 at 10:18:50 AM EST
[ Parent ]
I believe that the LHC will produce mini black holes that will not evaporate, for several reasons. First, the idea that mini black holes will evaporate is cavalier wishful thinking. Would we accept such assurances from a biotech lab? Suppose the lab reported revolutionary benefits from conducting an experiment where they insert smallpox genes into an influenza host, with the solemn assurance that any and all samples will immediately be destroyed. Would we just let them waltz on ahead? I think not.

The other reason I believe that the mini black holes will be stable is because I am advancing a revolutionary new model, the Dominium, that suggests that mini black holes will stay stable as voracious matter compacting beasts.  Debate has been hot and heavy on my Scientific American blog. http://science-community.sciam.com/blog/Hasanuddins-Blog/300005039  I invite anyone to come on over and join the "fun."  You wouldn't believe some of the harsh words that my detractors have against me.  The funniest thing is that the people with the harshest words adamantly declare that they have never read the model.  Go figure?  Of those who have read the model, they have nothing but positive things to say...though they all hope that I am wrong about the stable mini black-holes, or, if I am right, that LHC can be stopped in time.  

If you do choose to join in the discussion, please read the model first. You can download the half-version at http://www.sendspace.com/pro/dl/u56srb or you can purchase the full book (the paperback is more complete) at online bookstores.

by Hasanuddin on Fri Feb 22nd, 2008 at 05:04:09 PM EST
I have given this guy a 2 rating, as I think he is about as far outside the scientific community as a creationist is outside biology, but inexperienced readers might not notice.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 05:27:59 PM EST
[ Parent ]
[ET Moderation Technology™] That's not exactly warranted as his comment is not uncivil. However, I understand it if you find it offensive.

We have met the enemy, and he is us — Pogo
by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 05:47:57 PM EST
[ Parent ]
Personally I'd have explained why he was wrong, without the 2, and then watched him attempt to justify himself. Sooner or later his logic would have hanged him, if he's so far out of reality.


Any idiot can face a crisis - it's day to day living that wears you out.
by ceebs (ceebs (at) eurotrib (dot) com) on Fri Feb 22nd, 2008 at 06:02:42 PM EST
[ Parent ]
Thanks, next time I'll try that.

Der Amerikaner ist die Orchidee unter den Menschen
Volker Pispers
by Martin (weiser.mensch(at)googlemail.com) on Fri Feb 22nd, 2008 at 06:22:03 PM EST
[ Parent ]
What is your mechanism for halting the evaporation of microscopic black holes? Hawking radiation is a well-established result from QFT on curved spacetimes ("semiclassical quantum gravity"), and if you have a rebuttal of Hawking radiation that in itself should make waves in theoretical physics.

The issue is that when the black hole becomes small enough that the semiclassical approximations no longer hold, it could be that either the black hole becomes unstable and decays into photons, or a "smallest" black hole state is reached. This would be like a new elementary particle; it has even been suggested that elementary particles are actually microscopic black holes.

Thinking about this, another question that arises is whether a stable microscopic black hole could accrete matter faster than it radiates it back as Hawking radiation. Remember that Hawking radiation is more intense the smaller the black hole. It's possible that matter around Earth is not dense enough for a microscopic black hole to accrete and grow.
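For a sense of scale, the standard semiclassical formulas give collider-mass black holes absurdly short lifetimes. A rough numerical check, assuming plain four-dimensional gravity (no large extra dimensions) and a ~10 TeV mass picked only for illustration:

```python
import math

# SI constants
G = 6.674e-11      # Newton's gravitational constant
HBAR = 1.055e-34   # reduced Planck constant
C = 2.998e8        # speed of light
K_B = 1.381e-23    # Boltzmann constant

def hawking_temperature(m_kg):
    """T_H = hbar c^3 / (8 pi G M k_B): the smaller the hole, the hotter."""
    return HBAR * C**3 / (8 * math.pi * G * m_kg * K_B)

def evaporation_time(m_kg):
    """Semiclassical lifetime, tau = 5120 pi G^2 M^3 / (hbar c^4)."""
    return 5120 * math.pi * G**2 * m_kg**3 / (HBAR * C**4)

m_micro = 1.8e-23   # ~10 TeV/c^2 in kg, an assumed collider-scale mass

print(hawking_temperature(m_micro))  # ~7e45 K
print(evaporation_time(m_micro))     # ~5e-85 s: gone long before it could accrete
```

Extra-dimension scenarios change the numbers, but the qualitative point survives: in the semiclassical picture a collider-scale hole is far too hot and short-lived to accrete anything.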

We have met the enemy, and he is us — Pogo

by Migeru (migeru at eurotrib dot com) on Fri Feb 22nd, 2008 at 05:33:25 PM EST
[ Parent ]

