My project is computer-based, simulating what they should see if the Higgs boson has a mass of 165 GeV (decaying to two W bosons, which each decay to a lepton and a neutrino). I didn't know a huge amount about CMS (the detector) or the programming aspects before I started, so it was a big learning curve. With three weeks to go until I have to stop doing practical work, I'm reconstructing the Higgs mass OK; I just need to play around with my electron reconstruction and cuts.
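(For readers wondering how a mass gets "reconstructed" when two neutrinos escape undetected: a common approach in H → WW → lνlν searches is to build a transverse mass out of the two charged leptons and the missing transverse energy. The sketch below is purely illustrative Python, not the analysis code described above; the function name and the toy numbers are made up.)

```python
import math

def transverse_mass(pt_ll, phi_ll, m_ll, met, phi_met):
    """Transverse mass of the dilepton + missing-ET system (energies in GeV).

    pt_ll, phi_ll, m_ll : transverse momentum, azimuth and invariant mass
                          of the two charged leptons
    met, phi_met        : missing transverse energy and its azimuth
    """
    et_ll = math.sqrt(pt_ll ** 2 + m_ll ** 2)        # transverse energy of the dilepton system
    px = pt_ll * math.cos(phi_ll) + met * math.cos(phi_met)
    py = pt_ll * math.sin(phi_ll) + met * math.sin(phi_met)
    mt2 = (et_ll + met) ** 2 - (px ** 2 + py ** 2)   # m_T^2 = (E_T^ll + E_T^miss)^2 - |p_T^ll + p_T^miss|^2
    return math.sqrt(max(mt2, 0.0))

# Toy event: a 60 GeV dilepton system roughly back to back with 70 GeV missing ET.
print(transverse_mass(pt_ll=60.0, phi_ll=0.0, m_ll=40.0, met=70.0, phi_met=math.pi))
```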
I think, to be honest, a lot of the best real-world applications of the LHC will be indirect, much like the Apollo missions contributed a lot to technological developments not originally intended for the uses they ended up serving. The LHC will process so much data every second that over a year something like a sixth of all the information generated worldwide will pass through the detector and its computers.
The research that's gone into the various parts of the detectors (calorimeters, silicon trackers, etc.), as well as the magnets, computers and triggers (which rapidly choose which of the billions of collisions are worth keeping and which should be chucked), will all have other applications, half of which probably haven't even been thought of yet.
So even if the physics discovered doesn't directly contribute practically, the engineering behind it definitely will.
And what if the standard model Higgs isn't found? What are the alternatives?
Once in a graduate course the professor mentioned in passing that the standard model without a symmetry-breaking sector violates unitarity at about 1 TeV, so there has to be some new phenomenon, be it the Higgs or something else, before that energy is reached. What is there to that? We have met the enemy, and he is us — Pogo
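(Hedging on factors of two, which depend on conventions, the back-of-the-envelope version of that statement is: without a Higgs, the J = 0 partial-wave amplitude for longitudinal W scattering grows with the collision energy, and keeping it below the unitarity limit gives a scale of order a TeV.)

```latex
a_0 \simeq \frac{G_F\, s}{8\sqrt{2}\,\pi}, \qquad
|a_0| \le \tfrac{1}{2}
\;\Rightarrow\;
\sqrt{s} \lesssim \sqrt{\frac{4\sqrt{2}\,\pi}{G_F}} \approx 1.2~\mathrm{TeV}
```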
The US recently cut financing of R&D. And what was the reason? The Democrats were angry because Bush had vetoed some of their pet projects. As revenge, they cut spending on some of Bush's pet projects. The American is the orchid among people — Volker Pispers
(Me? Biased? Naaahhh...)
All standard model Higgs decays (where it's just a Higgs boson with no other weird physics) are into two particles. Which decay dominates depends on the mass of the Higgs - for instance, I'm studying 165 GeV, which is roughly twice the W mass, so in this region the decay to two W bosons is by far the most likely.
So far they've not seen this above the background, and the more data they have, the more they can rule out. The LHC will provide more data in the low-mass (~114 GeV) region, where the main decays are to b/bbar quarks, photons, tau leptons, c/cbar quarks and two gluons. All of these occur anyway in the detector, so the signal is hard to spot - which is why there has been the least work in this region so far, and why ruling out the less likely WW and ZZ decays has been easier. Although it's the decay that occurs the least, the photon decay is the easiest to spot.
If they don't find the Higgs at any of these energies, it means it's not just a standard model one - i.e. it behaves not just by the laws we already know, but by new physics we've never seen. Supersymmetry and extra dimensions are two of the more prominent theories for why the Higgs may not be in the standard model region.
The first is supersymmetry. This gives each particle we already know a heavier superpartner, which has the property of cancelling the contribution of its original particle in these loops (for physicists not close to the subject: it makes a boson partner for every fermion and vice versa; in Feynman diagrams fermion loops come in with a negative sign and boson loops with a positive one, so the superpartners cancel their normal partners). The particle I proposed in the diary to catalyse fusion is the stau, the superpartner of the tau, an even heavier partner of the electron than the muon. In some regions of parameter space it can be relatively long-lived.
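(Schematically, and with the numerical factors only indicative, the cancellation described above looks like this for the top quark and its scalar partner, where Λ is whatever cutoff scale the loops are sensitive to:)

```latex
\delta m_H^2 \;\simeq\;
\underbrace{-\frac{3 y_t^2}{8\pi^2}\,\Lambda^2}_{\text{top loop}}
\;+\;
\underbrace{+\frac{3 y_t^2}{8\pi^2}\,\Lambda^2}_{\text{stop loop}}
\;+\;
\mathcal{O}\!\left(\frac{3 y_t^2}{8\pi^2}\,(m_{\tilde t}^2 - m_t^2)\,
\ln\frac{\Lambda^2}{m_{\tilde t}^2}\right)
```

The quadratically divergent pieces cancel between each particle and its superpartner, and only a term proportional to the mass splitting survives, which is why the superpartners should not be too much heavier than their ordinary counterparts.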
The other is some variation of 'technicolor' or 'warped extra dimensions'. This is in general the more 'natural' solution to the problem, but even more so than in the case of supersymmetry, one would have expected to find deviations from the standard model about 20 years ago.
The good thing about the LHC is that we really enter the region in which these models have to show up, if they are to help solve any problem. The American is the orchid among people — Volker Pispers
Unless a tau particle is a boson, which would mean I know even less than I thought. Isn't the tau a fermion like the electron?
Or is Pauli exclusion irrelevant? Heavier elements could then get interesting: imagine Li with all three negative particles sitting in the 1s shell.
But anyway, which is it? I AM confused. The Fates are kind.
The Paul Scherrer Institut in Villigen has investigated pionic hydrogen (π⁻p) and deuterium (π⁻d), and DAFNE in Frascati has investigated their kaonic counterparts. Other no-less-important species include kaonic and antiprotonic helium, which have been studied at the Japanese High Energy Accelerator Research Organization (KEK) and CERN, and yet another exotic variety is formed by the non-baryonic π⁺π⁻ (pionium) and πK atoms. Finally, the antihydrogen atom, p̄e⁺, which CERN has copiously produced, is in a class of its own owing to its importance for testing the CPT theorem to extremely high precision.
That's true. But it is even better. Unless the Higgs has pretty much exactly 116 GeV, so-called loop corrections will destroy the model once more if there are not more particles to shield this effect. As this is a kind of fine-tuning people are not happy with, there are basically two classes of models introducing new particles.
Just as an update for you on what the actual issues are, and why else we believe the standard model (SM) is incomplete.
Dark matter exists: I've seen the picture already elsewhere, but it is from here. These are two galaxy clusters flying through each other. The reddish part is X-rays caused by the interaction of the gas in the clusters. The blue is where the mass of the clusters is, detected by gravitational lensing. So it is clear that most of the mass is not interacting hadronically or electromagnetically. In principle this matter only has to interact through gravitation, and weak interaction is optional, but as weakly interacting particles would have decoupled in the Big Bang at pretty much the point that would explain today's dark matter, there is hope that we can find it in colliders, and that it is e.g. the lightest supersymmetric particle, which is stable when assuming supersymmetry to be conserved.
Another thing the SM has problems explaining is neutrino masses, and why they are so small even though they are non-zero. (Cosmological structure formation indicates that dark matter is non-relativistic, so much heavier than neutrinos, which therefore don't contribute a lot.) We think neutrinos have mass because oscillation from one flavour into another has been detected. Several experiments are trying to find out more about the masses and the (CKM-like) mixing matrix.
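(For reference, the detected effect is usually summarised by the two-flavour oscillation probability, which depends only on a mixing angle and a mass-squared difference; this is also why oscillation experiments alone cannot pin down the absolute masses.)

```latex
P(\nu_\alpha \to \nu_\beta)
= \sin^2(2\theta)\,
  \sin^2\!\left(\frac{\Delta m^2\,L}{4E}\right)
\approx \sin^2(2\theta)\,
  \sin^2\!\left(1.27\,\frac{\Delta m^2[\mathrm{eV}^2]\;L[\mathrm{km}]}{E[\mathrm{GeV}]}\right)
```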
It is not clear by which mechanism the SM would create the excess of matter over antimatter in the Universe that is observed. There are ideas for such processes, and even in the SM one can explain some asymmetry, but only about 10^(-5) of the observed effect. More CP-violating phases are needed. Colliders, and maybe even more so the lower-energy B factories (Japan plans a new one), may help to find them.
Fluctuations in the cosmic microwave background radiation (CMBR) and measurements of the speed of galaxies very far away indicate that most of the energy content of the universe is dark energy. Dark energy has a negative pressure (which I personally find mind-boggling) and increases the speed of expansion of the universe. It is not clear if this is just a cosmological constant, as once introduced by Einstein to make his formulas consistent with a static universe, or if it is a dynamical thing. One hopes to find out more with better observation of the CMBR. It could be that an extremely precise measurement could find a shifting coupling constant alpha, if it is a dynamical thing. The American is the orchid among people — Volker Pispers
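(The statement that negative pressure speeds up the expansion comes straight from the acceleration equation of standard cosmology; roughly,

```latex
\frac{\ddot a}{a} = -\frac{4\pi G}{3}\,(\rho + 3p)
```

so any component with p < -ρ/3, such as a cosmological constant with p = -ρ, makes the expansion accelerate.)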
Okay, let me get this straight: if the Standard Model Higgs is only renormalizable for a particular choice of the Higgs' mass, this is not considered a prediction but a flaw of the model. However, if bosonic strings can only be made consistent in 26 dimensions, or superstrings in 10 dimensions and with the help of supersymmetry, these things, for which there is zero empirical evidence, are considered predictions of string theory and not flaws. Moreover, were a supersymmetric particle discovered, this would be considered evidence for string theory even though supersymmetry doesn't require string theory. You said it right, though:
the exact Higgs mass ... should be a free parameter in the standard model, otherwise we could stop searching in a range, and it is used by about every theorist who wants to endorse Susy, so it should be a real effect.
Unless the Higgs has pretty much exactly 116 GeV, so-called loop corrections will destroy the model once more if there are not more particles to shield this effect.
Dark Energy
As for Dark energy, I'm going to go with the Cosmological constant until there is any evidence to the contrary. As far as I know, even "exotic matter" can't produce a "negative pressure" stress-energy tensor.
Neutrino Mass
I don't consider throwing in one (or several) right-handed neutrinos and a CKM-like mixing matrix a challenge to the Standard Model. "Explaining" the masses of the various particles is a challenge, but as far as I can tell there's no candidate for a theory that does that. [No, String Theory is not it: apart from having a proliferation of vacua, the only way they can get a low-energy spectrum of particles is by assuming they all have zero mass; nobody has a mechanism for supersymmetry breaking, and the supersymmetry breaking scale just introduces a whole bunch of new parameters to explain.]
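(For concreteness, one standard way such right-handed neutrinos end up giving tiny masses is the see-saw: with a heavy Majorana mass M_R and an ordinary Dirac mass m_D, the light eigenvalue is roughly

```latex
m_\nu \;\approx\; \frac{m_D^2}{M_R}
```

so a Dirac mass near the electroweak scale and M_R near some very high scale gives sub-eV neutrinos.)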
Dark Matter
My guess is as good as any other - but if dark matter only interacts gravitationally it won't be seen at the LHC. Actually, the quantum numbers match a "superheavy right-handed neutrino" too... We have met the enemy, and he is us — Pogo
I don't consider string theory to be a physical theory at all, as they can always shift their parameters in such a way that any (non-)observation is explained. My boss completely dislikes Susy, but we have another prof who has now been working for decades to prove it (without success).
It may well be that the LHC finds only a Higgs (and only after quite a long time of running, if its mass is so low) and nothing else. I'm not at all sure there is something else, although there are some less compelling hints. However, Susy and some other models should really be dead if the LHC finds nothing. I only wanted to give you an overview of the reasons why people are searching for other things at all, rather than simply sitting down and saying it is not worth trying because nothing other than a complete SM can be expected anyway. The American is the orchid among people — Volker Pispers
If the SM Higgs is found, with no evidence of physics beyond the SM below 1TeV (including "corrections" due to physics at higher energies), I think it will be safe to say that theoretical high energy physics will have "died of success". There would be no strong case for higher-energy accelerators, leaving aside how difficult it would be to build something to probe the 10TeV range. We have met the enemy, and he is us — Pogo
I think darrkespur was referring to non-standard model Higgses. But as explained elsewhere in the comment section, if the Higgs isn't in the low-mass region, there have to be more particles to prevent other problems from making the theory inconsistent. The American is the orchid among people — Volker Pispers
Someone above mentioned HTML as a spin-off of CERN. But should we really believe something similar would not have been invented elsewhere within a few years, had there been no CERN? Especially considering that the thousands of smart people who work on these projects wouldn't disappear, but would be working elsewhere, on other projects with potential spin-offs.
And on what projects do you think these thousands of smart people would have worked? Financial innovations? Micronukes? The American is the orchid among people — Volker Pispers
As for HTML, why not invert that idea? Perhaps without CERN it would have been invented earlier... The point is that there is so little relationship between CERN's activities and HTML that it seems too strong to claim that without CERN, the WWW would have taken 5 years more.
After all, the Web depends not just on HTML, but on a whole lot of interdependent technologies, both in hardware and software, that were growing in the '80s.
Particle physics had progressed so fast since the 1940s that the particle physics community had developed a system of "preprints", in which people circulated drafts of their papers to colleagues at their institutions and elsewhere months before they were published in journals. The story goes that Tim Berners-Lee got tired of e-mailing documents back and forth to colleagues at CERN and decided to invent HTML and code a bare-bones browser to allow him to (we would today say) webcast his research. There is something about the pace of information exchange within CERN and in the particle physics community that supports the idea that HTML might have taken 5 more years to be developed elsewhere (and it would have been at some university or other: USENET and the text-based tools that go with it, and Gopher, were developed in that environment).
The large particle physics laboratories do employ thousands of physicists, engineers and programmers specifically for particle physics experiments, and that is a non-negligible fraction of the respective academic communities. If the large labs didn't exist, these people would be competing for academic jobs elsewhere, and the result would be more people going into industry, as well as fewer people getting doctorates.
If LHC funding hadn't gone through, CERN would have stagnated and maybe shrunk. You need far fewer people to run the existing facilities than you do to develop a new facility, and the LHC research programme is much more intense than what can be carried out at the existing facilities (not that that isn't useful too, but it's on a smaller scale in terms of people and resources).
Consider CERN and the LHC a Keynesian stimulus package for physics and engineering. We have met the enemy, and he is us — Pogo
After all, it is easy to predict in hindsight that CERN would be the perfect place to develop a useful hypertext system. But if one wants to use the unexpected, unpredictable benefits of a project as one of the arguments for funding it, there has to be a rationale for why this particular project or field is especially likely to lead to unexpected benefits.
So, big science drives technological developments in established fields, as well as occasionally resulting in new technology. [I distinguish two basic modes of technological progress: secular improvements in technology and new technologies - only the latter qualifies as "innovation" IMHO, and that is not predictable in the way that one can use, say, Moore's law when designing the specs of a computer system to be deployed 5 years in the future.] We have met the enemy, and he is us — Pogo
I would argue it's innovation all the way through. Some improvements change a subfield, and from the outside it looks like gradual, expected improvement. Some change a whole field, and the outside world notices and says it's something fundamentally different.
The difference between the dynamical systems we are used to considering in physics and biological or economic evolution is the possibility of the system of differential/difference equations changing dimensionality in response to processes within the system itself. We have met the enemy, and he is us — Pogo
But in reality, new products and inventions, even improvements on existing ones, are usually not that simple. They add an extra dimension, more freedom to find better solutions to problems. In a high-level, low-dimensional description, this freedom can either be collapsed into a change in parameters or really added as an extra dimension, if the effects are important enough.
Funny thing is, I am currently working on shape optimization, where it is completely natural to change the number of parameters used to describe the shape, and thus the dimension of the problem.
A related field is order reduction, where you try to (locally) approximate a physical phenomenon by its most important modes. If there is a change in the physics, you can either modify the modes, but keep the same number of them, or you might find that for the new situation more modes are required to describe it well enough.
I would suggest this is a good analogy for your innovation/improvement distinction.
I am familiar with dimension reduction (proper orthogonal modes, principal components, factor analysis...) and you're right, at some level the number of variables is a matter of choice. But you still have to be able to close the system of equations. You can always ascribe the effect of all the neglected modes to "noise", though. We have met the enemy, and he is us — Pogo
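(A minimal sketch of the mode-truncation idea discussed above, using a plain SVD of a snapshot matrix; the data and the choice of k are made up for illustration.)

```python
import numpy as np

# Snapshot matrix: each column is the state of the system at one time / parameter value.
# Illustrative data: a travelling wave sampled on a 1-D grid.
x = np.linspace(0.0, 1.0, 200)
t = np.linspace(0.0, 1.0, 50)
snapshots = np.array([np.sin(2 * np.pi * (x - 0.3 * ti)) for ti in t]).T   # shape (200, 50)

# Proper orthogonal decomposition via the SVD.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)

k = 4                                   # number of retained modes = dimension of the reduced model
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Fraction of the snapshot "energy" (squared singular values) captured by the first k modes.
energy = np.sum(s[:k] ** 2) / np.sum(s ** 2)
print(f"{k} modes capture {energy:.4%} of the snapshot energy")

# If the underlying physics changes, one can either recompute the modes (same k)
# or find that more modes are needed to reach the same accuracy.
```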
One of the main things coming out of Apollo etc., IIRC, was the development of computer chips for the project. Large advances in microchips and materials science filtered out to the outside world. Whilst industry might have got there as well, I'd say it would almost certainly have got there more slowly, due to the very nature of business - a business looking at short-term profit is far less likely to allow its researchers the time and space to create a bigger, longer-term project with its associated spin-offs.
Early research is expensive mainly because you don't know what the right solution is - it could be any number of different options, and until you pick one you don't know. So there has to be a lot of investment, without too much pressure for immediate results on every route, as a lot of them will be blind alleys - but without checking, you'll never know whether they are the right one or not.
But the fact is, the Apollo program was a one-shot thing. It was wound down and the US lost its ability to fly to the moon. It also discontinued its high-payload rockets in favour of the Space Shuttle, so now the rocket market is occupied by the European Ariane and the Russian Proton.
The Soviet manned space program made more scientific and technical sense than the American one, and the ISS owes more to the Russian Soyuz and Mir than to the American Skylab, which was also discontinued and folded into the Shuttle. We have met the enemy, and he is us — Pogo
One side story that I found particularly intriguing was a note between, I think, McNamara and Lyndon Johnson in the early '60s. In it they discuss the budget surpluses they are expecting for the late '60s, and they fear Congress will call for tax reductions before they can use the surpluses for their Great Society plans. So in the meantime they see Apollo as a good and popular way to keep the budget balanced until they have better things to do with the money. Then came Vietnam...
But more on topic, the whole 'spin-off' concept seems pretty much invented by NASA in later years to justify their budgets, and it is used for so many 'big science & engineering' projects when the wider public has doubts about the costs.
(2) Missile programmes, you say? Because military programmes with large destructive potential are soooo useful, while high energy physics, space exploration and the like are vanity projects! And you know how the military loves to share the technology it develops and would never want to keep it secret. One of the great advantages of the large-lab high energy physics environment is exactly that it is not a military programme. We don't build things to kill people; I think this is a plus. Further, there is no tendency to keep progress secret. Quite the opposite, in fact, so there is a greater chance that progress made here can diffuse faster and more widely.
(Disclosure, I work at CERN.)
There's very little evidence to suggest that Apollo contributed directly to electronic design. The first patent for single-chip microcircuitry was granted in 1958. Computers with modular logic were built around the same time. Putting the two together was the next obvious step, and would have happened anyway.
Apollo was mostly a PR exercise for US science and engineering. There may have been some spin-offs in materials science and - obviously - rocket science. But Apollo hasn't left much of a trace in computer science history.
In fact it's rarely mentioned at all. Projects like SAGE, which was the first generation US air defence network, were much more important. NASA did buy a big IBM System/360 for Apollo, but System/360 was already around, and IBM were more interested in selling it as a tool for airline bookings than managing a space program with it.
Keep in mind that the big advantage of ICs, even in those years, was the possibility of getting prices down through mass production. Not really something the space program, or even Minuteman, was very concerned about.
Someone above mentioned HTML as a spin-off of CERN. But should we really believe something similar would not have been invented elsewhere within a few years, had there been no CERN?
That's a really hard question to answer. HTML didn't happen directly because of CERN, but it happened because CERN was an environment in which a quick mark-up system would be instantly useful, and because there was no need for 'research' to invent anything more complicated.
There were many, many alternatives to HTML, including horrible things from academia that are best forgotten.
I know people who were researching them, and while they were often better than HTML in many ways - e.g. no broken links - they were also wretchedly overcomplicated, with limited public appeal.
So HTML might well have never happened in its current form. We could easily have had some kind of Windows-ish or other system of gargantuan complexity and slowness.
If you look at academic vs 'public' computing there's a clear pattern of highly abstracted command-line tools in academia (e.g. LaTeX), and much simpler WYSIWYG colouring-book computing in the public sphere.
HTML broke that pattern by doing something script-ish but relatively simple inside academia, which subsequently escaped into the wild.
That hasn't really happened before, which I think means it's not something that could be relied on.
Or in other words - it's likely CERN got lucky.
Had that scenario not occurred, the industry would not have existed until advances in other sciences and industries had created commercial viability for it indirectly.
you are the media you consume.
Mainframe computing was established by the late 50s, and mini-computing was just starting up. The market was already worth $billions by then. There were some prestige military projects - e.g. SAGE again - and a lot of DARPA funding for research. But the civilian market was already huge, with its own momentum.
Once TI introduced TTL logic in the early 60s, computers became a lot cheaper. At the same time a strong hobbyist culture fanned by magazines kept interest in technology running very high, so there was a steady stream of wannabe engineers with experience of digital techniques from their early teens.
Microprocessors were already being planned in the mid-60s. The biggest gap was between commercial computing and the microprocessor market, and that was bridged by developing a general-purpose microprocessor and putting it into a commercial product - a desktop calculator. It wasn't a military project.
Now you had hobbyist/hacker culture with access to microprocessors and a background of DARPA funded interface and networking research.
The rest was probably inevitable.
What's astonishing is how fast it happened. Most of the core ideas - laptops, databases, the web, GUIs and interactivity, distributed processing, networking, 3D graphics - appeared between 1958 and 1968.
There's been very little genuinely new since then. Most of what's happened has been faster and cheaper, but not so truly innovative.
The early commercial viability of mainframes is a good point that I managed to forget. I'll still make my vague 20 year claim, though.
I agree that it all happened shockingly fast.
Most of what's happened has been faster and cheaper, but not so truly innovative.
I disagree. Reading IEEE magazines since I became an EE major in college, there has been some stunning work over the years in semiconductor physics that has been required to get to the commercially viable transistor sizes we have today. From the computing point of view, though, I agree with what you're saying.
So without CERN, Europe might simply have more financial ABS, CDO, CDS, SIV soup. In some countries this counts as a productivity increase, but I prefer the Internet to that. The American is the orchid among people — Volker Pispers