The semiconductor industry would likely be a few decades behind its current state if there had not been military and space applications for the transistor when it was invented. The reason is the gap between military and commercial viability, defined mostly by cost, and arguably by transistor size (and thus integrated circuit complexity) as well. That gap was filled with public money in the form of the US military budget. The industry grew, commercial viability eventually grew out of that, and from then on the industry could sustain itself.

Had that scenario not occurred, the industry would not have emerged until advances in other sciences and industries created commercial viability for it indirectly.

you are the media you consume.

by MillMan (millguy at gmail) on Thu Feb 21st, 2008 at 06:05:40 PM EST
[ Parent ]
I'm not sure it's as clean as that.

Mainframe computing was established by the late 50s, and mini-computing was just starting up. The market was already worth $billions by then. There were some prestige military projects - e.g. SAGE again - and a lot of DARPA funding for research. But the civilian market was already huge, with its own momentum.

Once TI introduced TTL logic in the early 60s, computers became a lot cheaper. At the same time a strong hobbyist culture fanned by magazines kept interest in technology running very high, so there was a steady stream of wannabe engineers with experience of digital techniques from their early teens.

Microprocessors were already being planned in the mid-60s. The biggest gap was between commercial computing and the microprocessor market, and that was bridged by developing a general-purpose microprocessor and putting it into a commercial product - a desktop calculator. It wasn't a military project.

Now you had hobbyist/hacker culture with access to microprocessors and a background of DARPA funded interface and networking research.

The rest was probably inevitable.

What's astonishing is how fast it happened. Most of the core ideas - laptops, databases, the web, GUIs and interactivity, distributed processing, networking, 3D graphics - appeared between 1958 and 1968.

There's been very little genuinely new since then. Most of what's happened has been faster and cheaper, but not truly innovative.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Feb 22nd, 2008 at 09:15:00 AM EST
[ Parent ]
It would be interesting to compare commercial to military / government revenues over time. I should study the topic further because it sits at the intersection of several topics I'm interested in.

The early commercial viability of mainframes is a good point that I managed to forget. I'll still make my vague 20 year claim, though.

I agree that it all happened shockingly fast.

Most of what's happened has been faster and cheaper, but not so truly innovative.

I disagree. I've been reading IEEE magazines since I became an EE major in college, and over the years there has been some stunning work in semiconductor physics - work that was required to reach the commercially viable transistor sizes we have today. From the computing point of view, though, I agree with what you're saying.

you are the media you consume.

by MillMan (millguy at gmail) on Fri Feb 22nd, 2008 at 01:22:41 PM EST
[ Parent ]