As for HTML, why not invert that idea? Perhaps without CERN it would have been invented earlier... The point is that there is so little relationship between CERN's activities and HTML that it seems too strong to claim that, without CERN, the WWW would have arrived 5 years later.
After all, the Web depends not just on HTML but on a whole set of interdependent technologies, both hardware and software, that were developing through the 1980s.
Particle physics had progressed so fast since the 1940s that the community had developed a system of "preprints": people circulated drafts of their papers to colleagues at their own and other institutions months before they were published in journals. The story goes that Tim Berners-Lee got tired of e-mailing documents back and forth to colleagues at CERN and decided to invent HTML and code a bare-bones browser to allow him to (we would today say) webcast his research. There is something about the pace of information exchange within CERN and the particle physics community that supports the idea that HTML might have taken 5 more years to be developed elsewhere (and "elsewhere" would have been some university or other: USENET, the text-based tools that go with it, and GOPHER were all developed in that environment).
The large particle physics laboratories employ thousands of physicists, engineers, and programmers specifically for particle physics experiments, and that is a non-negligible fraction of the respective academic communities. If the large labs didn't exist, these people would be competing for academic jobs elsewhere, which would mean more people going to industry as well as fewer people getting doctorates.
If LHC funding hadn't gone through, CERN would have stagnated and maybe shrunk. You need far fewer people to run existing facilities than to develop a new one, and the LHC research programme is much more intense than what can be carried out at the existing facilities (not that the latter isn't useful too, but it's on a smaller scale in terms of people and resources).
Consider CERN and the LHC a Keynesian stimulus package for physics and engineering. We have met the enemy, and he is us — Pogo
After all, it is easy to "predict" in hindsight that CERN would be the perfect place to develop a useful hypertext system. But if one wants to use the unexpected, unpredictable benefits of a project as one of the arguments for funding it, there has to be a rationale for why this particular project or field is especially likely to lead to unexpected benefits.
So, big science drives technological development in established fields, as well as occasionally resulting in new technology. [I distinguish two basic modes of technological progress: secular improvements in existing technology and genuinely new technologies. Only the latter qualifies as "innovation" IMHO, and it is not predictable in the way that one can, say, use Moore's law when designing the specs of a computer system to be deployed 5 years in the future.] We have met the enemy, and he is us — Pogo
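As a back-of-the-envelope illustration of the "predictable" mode (my own sketch, not from the thread): extrapolating Moore's law, with an assumed doubling period of about two years, tells you roughly what transistor budget to spec for a system deployed 5 years out. No analogous formula exists for genuine innovation.

```python
# Back-of-the-envelope Moore's-law extrapolation. The ~2-year doubling
# period is an assumption; the historical figure varies by source.
doubling_period_years = 2.0
horizon_years = 5.0

growth_factor = 2.0 ** (horizon_years / doubling_period_years)
print(f"Expected transistor-budget growth over {horizon_years:g} years: "
      f"x{growth_factor:.2f}")
# -> x5.66: a system specced today for deployment in 5 years can plan
#    on roughly 5-6x the transistor budget. There is no such formula
#    for genuinely new technologies ("innovation").
```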
I would argue it's innovation all the way through. Some improvements change a subfield, and from the outside they look like gradual, expected improvement. Some change a whole field, and the outside world can notice it and say it's something fundamentally different.
The difference between the dynamical systems we are used to considering in physics and biological or economic evolution is the possibility of the system of differential/difference equations changing dimensionality in response to processes within the system itself. We have met the enemy, and he is us — Pogo
But in reality, new products and inventions, even improvements on existing ones, are usually not that simple. They add an extra dimension, more freedom to find better solutions to problems. In a high-level, low-dimensional description, though, this freedom can be collapsed into a change in parameters, or genuinely added as an extra dimension if its effects are important enough.
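A toy sketch of that collapse (entirely my own construction, just to make the idea concrete): a 1-D difference equation gains a second state variable, and a low-dimensional description absorbs the new variable back into an effective, drifting parameter.

```python
# Toy illustration: the 1-D difference equation
#   x[t+1] = r * x[t] * (1 - x[t])
# gains a second state variable y that modulates the growth rate,
# so the system's dimension grows from 1 to 2.
def step_2d(x, y, r=3.2, eps=0.05):
    x_next = (r + y) * x * (1.0 - x)   # y acts on the former parameter r
    y_next = y + eps * (0.5 - x)       # slow feedback internal to the system
    return x_next, y_next

# Low-dimensional view: collapse y into an effective, time-varying
# parameter r_eff[t] and keep thinking of the system as 1-D.
x, y = 0.4, 0.0
for t in range(10):
    r_eff = 3.2 + y                    # the "change in parameters" description
    x, y = step_2d(x, y)
    print(f"t={t}: x={x:.4f}, effective r={r_eff:.4f}")
```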
Funny thing is, I am currently working on shape optimization, where it is completely natural to change the number of parameters used to describe the shape, and thus the dimension of the problem.
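A minimal sketch of what that can look like (a toy parameterization of my own choosing, not the actual setup): a 2-D shape described as Fourier perturbations of a circle, where refining the design space just means appending coefficients. The dimension of the optimization problem changes while the currently represented shape does not.

```python
import numpy as np

# Toy shape parameterization: a circle of radius 1 plus Fourier
# perturbations. The design vector is the coefficient array.
def radius(theta, coeffs):
    r = np.ones_like(theta)
    for k, c in enumerate(coeffs, start=1):
        r += c * np.cos(k * theta)
    return r

theta = np.linspace(0.0, 2.0 * np.pi, 256)
design = np.array([0.10, -0.05])                 # 2 design parameters
refined = np.concatenate([design, np.zeros(3)])  # now 5 parameters, same shape

# Appending zero coefficients enlarges the design space (new dimensions
# to optimize over) without altering the shape currently represented.
assert np.allclose(radius(theta, design), radius(theta, refined))
print(f"problem dimension grew from {design.size} to {refined.size}")
```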
A related field is order reduction, where you try to (locally) approximate a physical phenomenon by its most important modes. If there is a change in the physics, you can either modify the modes while keeping the same number of them, or you may find that the new situation requires more modes to describe it well enough.
I would suggest this is a good analogy for your innovation/improvement distinction.
I am familiar with dimension reduction (proper orthogonal modes, principal components, factor analysis...) and you're right, at some level the number of variables is a matter of choice. But you still have to be able to close the system of equations. You can always ascribe the effect of all the neglected modes to "noise", though. We have met the enemy, and he is us — Pogo
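For what it's worth, a minimal NumPy sketch of that kind of reduction (synthetic snapshot data, purely illustrative): POD via the SVD of a snapshot matrix, truncated at an energy threshold, with everything below the cut lumped into the residual one would "ascribe to noise".

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshot matrix: each column is one observed state of the
# system (an illustrative stand-in for simulation or measurement data).
t = np.linspace(0, 1, 200)
x = np.linspace(0, 1, 64)
snapshots = (np.outer(np.sin(np.pi * x), np.cos(2 * np.pi * t))
             + 0.3 * np.outer(np.sin(3 * np.pi * x), np.sin(6 * np.pi * t))
             + 0.01 * rng.standard_normal((64, 200)))   # small "noise"

# POD: SVD of the snapshots; singular values rank the modes by energy.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
n_modes = int(np.searchsorted(energy, 0.99)) + 1   # keep 99% of the energy

# Reduced description: project onto the leading modes; the neglected
# modes' contribution is the residual we "ascribe to noise".
reduced = U[:, :n_modes] @ np.diag(s[:n_modes]) @ Vt[:n_modes, :]
residual = np.linalg.norm(snapshots - reduced) / np.linalg.norm(snapshots)
print(f"modes kept: {n_modes}, relative residual: {residual:.3e}")
```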