Welcome to European Tribune. It's gone a bit quiet around here these days, but it's still going.

Utopian Ideas - Digital Infrastructure

by Zwackus Tue Jun 11th, 2013 at 05:03:18 AM EST

Here on ET, it's generally acknowledged that infrastructure is something that is best done by the government, and preferably during recessions via deficit spending.  Infrastructure is commonly thought of as things that provide for the public good, and that are most useful when provided to all.

Lots of things count as infrastructure.  Roads and bridges are the obvious examples, but power generation and distribution, mail and package delivery, rail and air transport, phone and data networks, medical services, security and disaster relief, and a whole variety of other things could also be considered infrastructure.

How about software?

front-paged by afew


I wonder if there's any value to thinking about and treating software as a form of infrastructure.  PC operating systems might be the most obvious case - it's a piece of code that is necessary to run every other piece of code, and there are advantages to everyone when everyone else uses the same or compatible code.  However, those who actually work in network IT, or with any sort of networked hardware, could surely chime in with examples of the sort of programs and protocols that make everything else work.

However, it's a fact that a lot of software, if not most, is rather crappy.  It's unreliable, it's prone to attack, it frequently does not do what we want it to do, and it doesn't tell us what's wrong in a way that most users can understand.  Windows is the most obvious example of chronically crappy software, but it is far from the only example.

Why is this?   There are all kinds of complicated reasons which people other than me could explain in much better detail, but I think the list would typically include such problems as . . .

1 - Big code is complicated, and complicated things are hard.
2 - The need to write software for many different pieces of hardware, and the corresponding difficulties with compatibility.
3 - Reliability is only occasionally a matter of concern for the private companies that fund software development.
4 - Security is only occasionally a matter of concern for the private companies that fund software development.
5 - Usability is only occasionally a matter of concern for the private companies that fund software development.
6 - Employment patterns in the industry do not help much with the process of finding, developing, and maintaining talent.
7 - A lot of good code is kept secret.
8 - Making real progress in reliability, usability, or security may well require a lot of open-ended primary research, and perhaps different hardware.

If my argument holds water up until this point, then it may well make a lot of sense for software to be considered a part of the national/global infrastructure, and treated accordingly.

1 - Rationally designed specs and standards on the national or international level, to ensure that things work together and meet minimum standards of reliability and flexibility.  Designing meaningful ways to measure and evaluate such things would be a major accomplishment in and of itself.
2 - National and/or international centers for software research and design, which have the funding and the mandate to solve the really difficult issues of security and reliability and compatibility, and to build software packages from the ground up with these goals in mind.
3 - Open and free sharing of the resulting product.
4 - Monitoring and approving the private-sector add-ons to the public-sector code, to make sure they're not screwing everything up.

I'm not an IT person, so maybe this just sounds silly.  But ET is always a great place to get well-informed and reasonable commentary on silly-sounding ideas.

This is a vast topic, with each item having its own discussion (read: Shouting Match) within Computer Engineering, defined broadly.

Let me take:

Rationally designed specs and standards on the national or international level, to insure that things work together and meet minimum standards of reliability and flexibility.

as the easiest to address.

There are national and international specifications and standards.  Take good old American National Standards Institute (ANSI).  They've spent decades coming up with specs and standards for the hoary C language.  The problem is there's no national or international enforcement, and so what happens is that the people who implement the standards "improve" them in various ways.  The result is flavors of 'ANSI' C that are incompatible with each other, with the ANSI standard, and with the original language; the latter so much so that examples in the standard, basic manual on the language (The C Programming Language by Kernighan and Ritchie) no longer compile as written using gcc, the most widely available C compiler.

Before Colman jumps on me, I note there are reasons for people to improve on the standard.  Some of them are valid.  Some are, as we call it, religious, i.e., strongly held opinions based on personal preference, training, experience, and job-at-hand.  

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Fri Jun 7th, 2013 at 01:28:42 PM EST
Basically, standards aren't.

And when they are, they take so long to ratify that the industry has moved on. (HTML5 isn't due to be specified until 2014, even though it's been in browsers in some form or other since 2008. And Microsoft will misimplement it anyway, because they always do.)

But all of this is a stupid way to do software. There are smarter ways, and even self-organising smarter ways, with some nominal assured quality. But we're a couple of decades at least from seeing them in general use.

First, people need to get past the geek hand-coding-as-a-job idea, get over all the drama associated with project management, and get past the idea of creating de facto standards for commercial advantage or 'official' standards as an academic exercise.

Another drag on development is Unix/Linux culture's 'Slap it together until it sort-of works, make it public, then go do something else.'

Infrastructure has to be robust, and making software robust is currently too boring for many developers.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Fri Jun 7th, 2013 at 01:41:26 PM EST
[ Parent ]
The problems described in both above posts seem to support my idea - funding for the long-term creation of real, useful standards backed by government authority - and funding for the creation of fundamental bits of the architecture for everyone else to use that are fundamentally reliable and secure.

The situation kind of reminds me of 19th century railroads and track gauge - somebody had to step in and tell everybody to use the same gauge, to stop railroads from using alternative track gauges as a self-destructive competitive measure.

by Zwackus on Fri Jun 7th, 2013 at 09:25:07 PM EST
[ Parent ]
Except governments have no interest in making our computer systems secure, however much they try to secure their own systems.  Standards for commercially available encryption programs are purposely "broken" so messages can be read by intelligence agencies and services.  It seems to be the case that the Stuxnet malware was designed and spread by the US and Israeli governments.  It is fairly clear countries have been conducting cyber-espionage for decades; the recent outbreak of outrage from the US wrt the PRC's efforts is a case of the biter bitten.

Must Follow Standards freezes the Cybernetic Sector into existing functionality.  In some cases, e.g., ASCII, that's an undeniable good ... for a while.  ASCII has a range of codes to support teleprinters.  Who the heck uses teleprinters these days?  But there they sit, hogging space that could these days be used for other, more important purposes.  It's possible to state ASCII is obsolete; it's a 7-bit code designed around the teleprinters of its day in a 64-bit world.

In the late 70s 80 megabyte mass storage devices were the size of a small end table costing $80,000.  Today I can purchase 180 terabytes for ~$7,500.  For sheer raw computing power my desktop development system obliterates the IBM 360/70 I worked on 'back in the day.'  The Raspberry Pi at $35 a pop, is more capable than any microcomputer available in, say, 1985.  

The technological change over the past 40 years continues today.  Much of it is not reaching the consumer market because of existing "standards," e.g., WinTel.  And the fact 95% of the people on the software side know bugger-all about hardware, its design, architectural trade-offs between hardware and software, and hardware/software integration.  Putting it simply, computer systems available in 2013 are squarely based on the limitations of 1975 hardware using paradigms and heuristics developed in 1956.  

Microsoft developed Windows 8 in an attempt to force a move to 2013 technology.  BUT it was an "update" that didn't threaten their market dominance.  Apple forced a change with the various "i" devices, but only under the control of a narcissistic control freak, Steve Jobs, who was deeply interested in freeing people to consume anything ... he permitted.  Want to do your own thing?  Tough shit.

Like everything, Standards have a Good side, a Downside, and a range in between.  They are 'an' answer to some things, 'the' answer to some things, 'meh' to some things, and a real hindrance to other things.

 

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Sat Jun 8th, 2013 at 11:20:08 AM EST
[ Parent ]
ATinNM:
Except governments have no interest in making our computer systems secure, however much they try and secure their own systems.

But as more and more of government is run on computers and the internet, reliability becomes a real issue. Not thinking so much of hackers as malfunctions.

Also thinking less about the deep state than about the rest of it: taxation, emergency services, the judicial system, the lot of the non-gun-carrying parts of the state.

Sweden's finest (and perhaps only) collaborative, leftist e-newspaper Synapze.se

by A swedish kind of death on Mon Jun 10th, 2013 at 02:22:56 PM EST
[ Parent ]
Should we ask bankers how they secure their data and transactions?
by das monde on Tue Jun 11th, 2013 at 07:03:24 AM EST
[ Parent ]
Design carefully, encrypt everything, lots of firewalls, test a whole lot.

There is no rocket science involved in securing banking transactions. And in general, it's pretty secure. Almost all money leaks in banking systems are not actual security breaches these days, but social engineering scams.

It is rightly acknowledged that people of faith have no monopoly of virtue - Queen Elizabeth II

by eurogreen on Tue Jun 11th, 2013 at 09:41:56 AM EST
[ Parent ]
eurogreen:
Design carefully, encrypt everything, lots of firewalls, test a whole lot.

And fix problems quickly when they appear. Also, if possible, keep the problems out of the public eye. Which makes it look safer than it is.

Sweden's finest (and perhaps only) collaborative, leftist e-newspaper Synapze.se

by A swedish kind of death on Tue Jun 11th, 2013 at 02:25:45 PM EST
[ Parent ]
And eat the bill for your own fuckups.

And most of your customers' fuckups too.

And, most importantly, restrict access: A universal computer connected to the Internet is inherently unsafe, because the wetware can be tricked into overriding even the most stringent software controls.

There are two ways to make it safe: Either remove the wetware's ability to modify the programming in any material manner (unfortunately, this turns the device from a universal computer into a dumb console). Or disable the wetware's access to the Internet (unfortunately, this introduces the problem of who gets to censor your Internet traffic).

Smart corporate sysops will do both. But that is because most corporate machines don't need to be anything more fancy than dumb consoles, and almost no corporate machines actually need to have access to the Internet (as opposed to the child- and idiot-proofed playpen defined by your sysop's favorite censorware).

- Jake

Friends come and go. Enemies accumulate.

by JakeS (JangoSierra 'at' gmail 'dot' com) on Tue Jun 11th, 2013 at 07:36:33 PM EST
[ Parent ]
Or people falling asleep.
The hapless employee appeared before an industrial tribunal in the state of Hesse today to explain his actions. He told the tribunal that he had intended to transfer €62.40 from a retired employee's account but "momentarily fell asleep" and ended up transferring €222,222,222.22.
by gk (gk (gk quattro due due sette @gmail.com)) on Tue Jun 11th, 2013 at 04:59:48 PM EST
[ Parent ]
With technology progressing exponentially, governments can't keep up. We can expect that.

But if all this technology and economy were to slow down for a while, then there might be a chance for standardization, enforcement, some energy efficiency.

by das monde on Tue Jun 11th, 2013 at 07:06:45 AM EST
[ Parent ]
You'd be more likely to lock in inefficient tech. Wait until the curve flattens.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 07:10:50 AM EST
[ Parent ]
I think Microsoft, Apple, and the rest have been locking in inefficient tech for a while.
by ThatBritGuy (thatbritguy (at) googlemail.com) on Tue Jun 11th, 2013 at 07:13:41 AM EST
[ Parent ]
What about a "doomsday computer" project, if there is a chance of economy and energy infrastructure breakdown? Would there be a market soon for slower, clunkier but more durable hardware for personal data saving and reading, worst-scenario computing? What software would be worth saving or (re)making?
by das monde on Tue Jun 11th, 2013 at 07:33:20 AM EST
[ Parent ]
Very useful from the collapsitarian viewpoint. After reunification, they dismantled the parallel communication system that was supposed to keep working if the Cold War had turned hot. A robust backup would be a good thing to have. You wouldn't believe how commonplace system failure is. We're all waiting for the big one.

Schengen is toast!
by epochepoque on Thu Jun 13th, 2013 at 05:28:04 PM EST
[ Parent ]
Infrastructure has to be robust, and making software robust is currently too boring for many developers.

New product development is driven by VC funding.  VCs have a 5 year time horizon.  There's no funding, thus no time, for the required 'Quality Control testing - Quality Assurance testing - Redesign' iterative cycle to achieve "robustness."

Second, in most companies the product development team and product test team are in different departments with different reporting structures uniting at the uppermost of Upper Management where the final decision is taken by someone knowing bugger-all about product technology and mainly focused on short term profitability.

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Sat Jun 8th, 2013 at 11:40:29 AM EST
[ Parent ]
When discussing these things it's important to know what Market Segment the software, hardware, or software/hardware is aimed at.  Product development and the products from SAP are totally different in needs, aims, & etc. than those from Apple.  Big Business IT systems and applications tend to be more robust than those for the Consumer Market; Big Business is purchasing for the long haul and is willing to pay the costs for robustness versus the Consumer Market which isn't and doesn't.


She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Sat Jun 8th, 2013 at 11:58:46 AM EST
Obviously I'm out of my depth in my knowledge of computer science.  However, I've not yet seen anything that throws me off the idea.  Let me stick for a moment with what I know better: consumer OSes.

Why should the OS that runs on the majority of PC systems be created and controlled by a private profit seeking entity with no meaningful regulation when it can clearly be seen as a critical part of the national infrastructure?

I'm not really thinking of a set of standards, like ASCII, but rather a national institute whose job it is to write a solid, up to date, secure and reliable OS.  Private vendors could make add ons and flavors for the OS, provided they pass some sort of quality inspection.  Perhaps identify a couple of applications that are also critical aspects of the national infrastructure, like email and web interface, and write those as well.  Further, it would make sense to regulate and secure the software that runs the actual physical infrastructure, to make sure it is properly robust and secure from attack.

The mandate of the institute would be to continually update and manage the software infrastructure to keep it reasonably up to date, functional, and secure, in the short and medium and long term.  This would be backed up by long-term basic research.

The idea would be to take up the various aspects of software development that the private sector does not want to bother with, and put them in the hands of an institution with the mandate and funding to do them properly.  Let the private sector share the benefits, but keep the control and ownership and management in public hands.

Not only would a technology and reliability focused development process benefit consumers, who would no longer be screwed over by private companies seeking their own short term profits, but it would also be possible to start protecting the entire national digital infrastructure from the increasing threat of destructive cyber attacks.  

As you mentioned earlier, most existing software is built on years of legacy code and compatibility, much of which was never intended to be secure from malicious attack.  This is where primary research would come into play - figure out a way to make things inherently secure, so that malicious outside attacks are simply impossible.  Maybe that's a pie in the sky dream, but moving in that direction would seem to be a worthwhile endeavour.

Furthermore, this is the sort of labor and time intensive project that government spending in the future economy really should pick up and focus on.  In the end, all you're really spending money on are a few buildings and a lot of people.

by Zwackus on Sat Jun 8th, 2013 at 09:21:03 PM EST
[ Parent ]
I'm not really thinking of a set of standards, like ASCII, but rather a national institute whose job it is to write a solid, up to date, secure and reliable OS.

Appalling as the current corporate mess is, I think a government managed mess would be even worse.

The problem is that computer science is not like physical engineering. Bridges and nukes have obvious failure paths. If something falls over and/or explodes and/or kills people, someone has failed. The science is relatively stable, and the modelling tools are advanced enough to have some obvious relationship with reality.

There is no equivalent concept of modelling in Comp Sci. So there is no such thing as provable reliability, or provable security. The last time someone tried to create a reliable, secure system, they made something called Ada, which is pretty much a joke.

A variety of formal methods have come and gone, but most are based on managerial ad-hoc-ery - like Agile, which includes super-sophisticated techniques such as forcing programmers to work in pairs and take turns on the keyboard while the other says 'Hang on, that line isn't a good idea.'

That's where comp 'science' is. It's really just a lot of people throwing things at the market and hoping some of them stick, with the occasional addition of marketing and evangelism (including open source) for extra adhesion.

So the idea of a 'provably reliable and secure' system is - optimistic. Given the state of the art, no such thing can exist. Hackers are enormously inventive, and if you give them nation state level supercomputers to play with, they can crack pretty much any security that isn't nailed down with physical rivets and air gaps. (And even then everyone knows you can crack that by bribing people.)

As for better web and email - yes, but, you'd have to rewrite all the email and web software in the entire world.

Just because something can be improved technically - and there's plenty to dislike about email and web software - doesn't mean it can be improved in practice.

Put all of those points together and you have systems that can't be modelled accurately, can't be proven to be secure and reliable, are perpetually being tested for exploits by some very clever people, and can't be replaced with something better for practical reasons.

Or in other words - good idea, but no.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Sun Jun 9th, 2013 at 08:02:41 AM EST
[ Parent ]
Heartily agree.  The following is intended as expansion on two points, not a criticism of TBG's post.

Re: CompSci

A good example of the numbnuts things Computer Scientists come up with is their horror of unrestricted GOTOs.  Well, I've got news.  If all unrestricted GOTOs were eliminated, nobody could boot their computer.  If unrestricted GOTOs were eliminated, we wouldn't have Assembler-level Branching Instructions.  If unrestricted GOTOs were eliminated, we'd have to drop the "L" in ALU.

The CompSci folks have very, very, carefully isolated CompSci from Reality.

TBG:

... you have systems that ... can't be replaced with something better for practical reasons.

Nothing is going to change until somebody derives a "More Better" way of doing things in degree and kind.

The three biggest reasons are:

  1.  The total global market for Computers is roughly US $900 billion/year when everything from clamshell mobiles to 900-teraflop machines is added together.

  2.  The installed base - hardware, software, "mental-ware," and the hardware/software/"mental-ware" union - is vast in terms of both money and non-monetary assets.  I have no idea what these work out to in dollars, but US $200 trillion is probably conservative.

  3.  A challenge to the above that is not substantially "More Better" in degree and kind is fiercely resisted; a current example is the outcry over Microsoft Windows 8.

As a Computer Engineer I look at the awe-inspiring advances in hardware technology and am disgusted by the lack of advances in software.


She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Sun Jun 9th, 2013 at 11:59:25 AM EST
[ Parent ]
The observation was that using gotos generally makes a program much harder to understand.  It's the programming equivalent of 'you halve your readership for every equation you use'.

People have tried sneaking them back in the guise of callback based programming, and that results in programs that are even harder to understand.

Yes, they are used at the cpu instruction set level.  But almost nobody writes (or even understands) that.

by njh on Tue Jun 11th, 2013 at 12:34:12 PM EST
[ Parent ]
And Branch instructions.

Good Flow-of-Control design eliminates 95% of unrestricted GOTOs.  The other 5% are unavoidable.  I can't help the fact "almost nobody" understands the instruction set for the CPU powering the machine they are programming and for all the Demons of Stupidity that flow therefrom.  The solution for ignorance is to learn.

(Hey!  You kids.  GTFO my address bus.)

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Wed Jun 12th, 2013 at 01:21:36 PM EST
[ Parent ]
The solution is to use INTERCAL which disallowed GOTO, providing the COME FROM statement instead.
by gk (gk (gk quattro due due sette @gmail.com)) on Wed Jun 12th, 2013 at 06:56:06 PM EST
[ Parent ]
And the far more troubling COMPUTED COME FROM.
by njh on Thu Jun 13th, 2013 at 12:42:17 PM EST
[ Parent ]
I think modern architectures are simply beyond most humans.  Can you even explain what the carry-less multiplication instruction is for, or how the branch prediction system works?  And is it worth having all that understanding when you'll be writing programs which are mostly just moving bytes backwards and forwards?
by njh on Thu Jun 13th, 2013 at 01:10:05 PM EST
[ Parent ]
Going all the way back to the beginning, architectures were beyond most humans.  Most humans have no interest in computer architecture so they don't undergo the grind to learn.  

There's no Law stating everybody has to learn computer architecture.  However, when the discussion moves to Standards or cybernetic design the people involved damn well better have some idea.  Otherwise the thing will end up as a royal mess: LISA, Windows 8, etc.

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Fri Jun 14th, 2013 at 11:01:11 AM EST
[ Parent ]
And Gotos really fuck up verification/provability as well.
by Colman (colman at eurotrib.com) on Wed Jun 12th, 2013 at 01:37:23 PM EST
[ Parent ]
And maintenance.  I've done my share of trying to keep spaghetti code with a double portion of GOTO sauce from falling over.  Fun it ain't.

In a proper design GOTO is verifiable to the same extent any program is.  The trick is "proper design."  

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Fri Jun 14th, 2013 at 11:09:58 AM EST
[ Parent ]
Is it not possible to segregate program memory from applications memory, and to restrict access to program memory sufficiently that web access is impossible - starting with 'cookies'? I can see that this might be inconvenient and might trash some currently beloved commercial practices, but I would be happy to have only a few software changes per year, installed via a CD I received in the mail from a trusted source. Were this to drastically cut into the business models of Google, Yahoo, etc., they or their replacements could devise new ways to make money. Would work for me. After all, this is a Utopian thread. :-)

"It is not necessary to have hope in order to persevere."
by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Mon Jun 10th, 2013 at 01:55:33 PM EST
[ Parent ]
H'mmm.  OK.  

First you gotta understand commercially available consumer computer architecture is stuck in 1975 using "mental-ware" thunk-up in 1956.  

Second, the security folks are brought in after the hardware, hardware/software, and software folks have designed and built their stuff.  And the conversation goes something like:

Management:  OK, make this secure.

Security Folks:  We need to make fundamental design changes x, y, and z to secure the system.

Management:  Can't change anything.  OK, fake it.

Security Folks:  ^#$%^&!

So ...

Is it not possible to segregate program memory from applications memory

Yes and No.

Yes, in the sense that everything needed is lying around waiting for someone to get off their asses and plug 'em together.  (The Raspberry Pi gives a hint of what is now possible.)

No because the Decision Makers in the Computer Industry are either technologically illiterate or technologically obsolete.  

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Mon Jun 10th, 2013 at 04:03:02 PM EST
[ Parent ]
So computer insecurity is just another aspect of the present financialized economy where all decisions are made on the basis of the short term profits of the biggest economic incumbents - unless someone can find a way to make new, spectacular profits from some disruptive development. OR - unless governments or philanthropists make grants to exceptionally able individuals, such as pi suggests down thread, or some combination.

As ASKOD has noted, there is a large potential market for secure, reliable software to run important infrastructure programs, such as taxes, welfare, the entire medical records complex, vehicle registration and tracking, real estate records, etc. The problem, again, is the structure of the society and the role of finance, which, as Migeru notes, is the brain cancer of our society. Trying to get software that takes functionality and security as its most important goals, under the present paradigm, will reliably turn into a death match between politically powerful business entities - with a continuation of the current bungling incompetence with regard to functionality and security remaining the likely default outcome.

That is a very different conclusion than TINA, which is what the first response seemed to be tending towards. The structure of our societies, optimized as they are for the maximum wealth-extraction capability of the very wealthy, quite naturally makes all aspects of our existence miserable. Perhaps the situation will self-resolve, with the current plague of viruses ending in the destruction of any confidence in the existing ownership records within our private-ownership societies.

"It is not necessary to have hope in order to persevere."

by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Mon Jun 10th, 2013 at 06:16:13 PM EST
[ Parent ]
So computer insecurity is just another aspect of the present financialized economy where all decisions are made on the basis of the short term profits of the biggest economic incumbents

The entire computer industry is just another aspect of the present financialized ... & etc.  Has to be.  

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Tue Jun 11th, 2013 at 01:57:03 AM EST
[ Parent ]
And this, once again, is why I wonder if a completely different funding and production model for software should be given a shot.  If the majority of problems are due to past ignorance and present laziness, then focused attention and effort in the right circumstances oughta be able to accomplish something.

Fund people to do the actual hard work of rebuilding things the right way, without an expectation of profit in the near or medium term.

by Zwackus on Mon Jun 10th, 2013 at 08:55:54 PM EST
[ Parent ]
That would be more interesting than trying to enforce standards first and innovate around them later.

In fact there's a fair amount of research in Comp Sci (sic). There are even alternative OS models that are more interesting than anything that's been commercialised.

Problem is, it's not interesting to corporates and it would take too much time/money to make it economic.

What will probably happen instead is a new wave of stuff that's computing++ - a completely different hardware/software/philosophical model, completely new applications, and none of the baggage we have now.

Getting to there from here would be more interesting than trying to reinvent what's around today and make it the-same-but-better.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Tue Jun 11th, 2013 at 07:19:09 AM EST
[ Parent ]
I'm sorry, I guess I hadn't properly explained myself.  What I wanted to propose was . . .

1 - Establish a government body, and hire people to write a good OS (for whatever device categories need it - including but not limited to home PCs, network servers, database hubs, and the computer-bits that run industry and infrastructure hardware) and basic applications software, with an eye on the medium to long term, and with things like security and reliability built in from the beginning.

2 - As a side project, do basic research into things like software verification and whatever other basic things that we don't understand all that well, but which might be useful for the staff working on 1.

3 - When a bunch of the stuff starts to coalesce, think about standards based on the new stuff, and how to use them to bring everybody else up to par over time.

by Zwackus on Tue Jun 11th, 2013 at 09:41:08 AM EST
[ Parent ]
And I hadn't explained myself because a side-idea got stuck in my head in the writing, and went first, and obscured the bit that I thought was more important all along.  Grrr.

What I'd really meant by standards, at least when I was writing it, was something less like ASCII and more like an objective way of measuring how secure a piece of software is.  I don't think there's really any way right now to formally state or measure something like that, and this seems like a problem.  Maybe it's utterly impossible, but it would be useful to have a proper security rating that is properly testable, with legal restrictions based on it.  For example, anything that accesses the internet must score 8/10 on the formal security scale, or something.

by Zwackus on Tue Jun 11th, 2013 at 09:45:49 AM EST
[ Parent ]
There are lots of security ratings. They're mostly useless or so time consuming and expensive to pass that they apply to previous generations of tech and can only be passed by the big corporates.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 09:50:06 AM EST
[ Parent ]
Well - that's been my point here. Such a thing is simply not possible given the current state of the art, no matter how much money you throw at it and how many clever people you hire.

Even if you devised a perfectly secure system - using quantum signalling, or something - there's still a key on file somewhere, or stuck on a postit note next to someone's desk. Etc.

Even if not, security services will demand a back door, which can be exploited.

Security is relative. Most security is non-existent. A few applications pretend to offer 'almost good enough', with hope rather than certainty.

All information has a market value, and if the cost of breaking security is higher than the value, you're safe, up to a point.

But some hackers like breaking into things just because they can. So 'secure' is pretty much meaningless in absolute terms, and certainly not something you can rely on with any confidence.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Tue Jun 11th, 2013 at 11:22:18 AM EST
[ Parent ]
As computing is a fairly new thing, in terms of human endeavors, there may still be a fair bit of wiggle room when thinking about what might be possible or impossible.  Throwing steady, full-time employment at people and asking them to think about the problem may be a waste of time if all one looks at is the final success of the project.  However, this sort of job creation program seems no more harmful or misguided than most, and if worst comes to worst, the engineers and programmers so employed, and their families, and the people from whom they purchased goods and services, will have been better off for it.

And even if the project fails in terms of its main goal, it's possible that something good may well come of it.  It's a heck of a lot more likely than putting people to work on weapons tech, where success is its own form of failure.

by Zwackus on Wed Jun 12th, 2013 at 12:52:12 AM EST
[ Parent ]
Fund people to do the actual hard work of rebuilding things the right way, without an expectation of profit in the near or medium term.
Government...

Finance is the brain [tumour] of the economy
by Carrie (migeru at eurotrib dot com) on Tue Jun 11th, 2013 at 08:16:16 AM EST
[ Parent ]
"Government..."... which so regularly has such spectacular failures while trying to buy custom software from private sector vendors. If 20% of the money regularly wasted on such fiascoes were put into an ongoing program to develop basic secure, reliable software for a range of governmental operations...we would blow a major hole in the private software business.


"It is not necessary to have hope in order to persevere."
by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Tue Jun 11th, 2013 at 10:35:26 AM EST
[ Parent ]
ThatBritGuy:
There is no equivalent concept of modelling in Comp Sci. So there is no such thing as provable reliability, or provable security.

I was under the impression that using formal methods, one can prove the reliability of a program. It's just slow and expensive, and the complexity rises sharply if the program runs on a machine that also runs other programs. So it's only used for things like nukes and such.

But perhaps the state of formal methods could be improved if a ton of money were thrown at it. This being a diary for utopian ideas and all.

Sweden's finest (and perhaps only) collaborative, leftist e-newspaper Synapze.se

by A swedish kind of death on Mon Jun 10th, 2013 at 02:41:32 PM EST
[ Parent ]
See Formal verification. I remember some of this from an introductory programming course in my math degree 20 years ago. It was fascinating, but evidently the amount of resources needed for algorithm verification of anything but the simplest algorithms was prohibitive. In addition, we only saw how this applied to simple procedural languages. Real, modern computer systems are full of event-driven programs, hardware interrupts, declarative languages, and what-not. Algorithm verification is probably only useful as a debugging technique, not for a priori certification of real-life programs.
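[Ed.: a toy illustration of the kind of reasoning meant here, not from any course or verifier. The annotations below are the proof obligations a real tool would discharge statically; the runtime asserts are just a poor man's stand-in. All names are made up.]

```python
# Hoare-style verification in miniature: a summation loop annotated with
# a loop invariant. A verifier would prove these conditions symbolically;
# here the asserts merely test them at runtime for one execution.

def verified_sum(xs):
    """Postcondition: returns sum(xs)."""
    total, i = 0, 0
    # Invariant: total == sum of xs[0..i-1]
    while i < len(xs):
        assert total == sum(xs[:i])   # invariant holds on loop entry
        total += xs[i]
        i += 1
        assert total == sum(xs[:i])   # invariant preserved by the body
    # Invariant plus exit condition (i == len(xs)) imply the postcondition
    assert total == sum(xs)
    return total

print(verified_sum([3, 1, 4, 1, 5]))  # 14
```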

Finance is the brain [tumour] of the economy
by Carrie (migeru at eurotrib dot com) on Mon Jun 10th, 2013 at 02:48:50 PM EST
[ Parent ]
Basically you can prove reliability if there's a finite number of program states and execution paths, you can specify the transitions through/between them, and then prove you're not in some kind of NP tar pit.
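[Ed.: the finite-state case described above can be sketched as explicit-state checking: enumerate every reachable state and test an invariant in each. A toy example of my own, not from any real tool - a two-process lock protocol checked for mutual exclusion.]

```python
# Explicit-state "model checking" in miniature: breadth-first search over
# all reachable states of a two-process lock protocol, checking the
# mutual-exclusion invariant in each. Real checkers do this with far
# better state encodings, but the idea is the same.

from collections import deque

def successors(state):
    pcs, lock = state  # pcs: per-process locations; lock: holder or None
    for i, pc in enumerate(pcs):
        if pc == "idle":
            yield (pcs[:i] + ("trying",) + pcs[i+1:], lock)
        elif pc == "trying" and lock is None:
            yield (pcs[:i] + ("critical",) + pcs[i+1:], i)  # acquire
        elif pc == "critical" and lock == i:
            yield (pcs[:i] + ("idle",) + pcs[i+1:], None)   # release

def check(initial, invariant):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return False, state          # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, len(seen)               # invariant holds in every state

mutex = lambda s: s[0].count("critical") <= 1
ok, info = check((("idle", "idle"), None), mutex)
print(ok)  # True: no reachable state has both processes in the critical section
```

The search terminates precisely because the state space is finite - which is the premise of the paragraph above, and exactly what event-driven systems don't give you.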

Which is fine as far as it goes. But - as you say - most software is event driven, and doesn't run like a sausage factory where stuff goes in, you wait a while as things happen, and stuff comes out.

So in practice you can't even completely specify the inputs, never mind define states/paths/transitions.

And then there are horrible things like natural language interfaces, where the semantics are inherently fuzzy and there is often no unambiguous meaning at all.

In fact I think there's going to be an eventual singularity in software development, where formal systems become smart enough to define improved formal systems. We'll probably get software proving that software sort-of works by the end of the decade, and certainly by the end of the 2020s.

Once humans no longer have to write code by hand, development will speed up a lot and become much tidier and more reliable.

by ThatBritGuy (thatbritguy (at) googlemail.com) on Mon Jun 10th, 2013 at 03:16:37 PM EST
[ Parent ]
Formal Systems will never be able to verify formal systems until we can make an end-run around Gödel's incompleteness theorems.  Which, at the moment, is a lot like saying, "We can achieve FTL when we make an end-run around the Theory of Relativity."

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Mon Jun 10th, 2013 at 04:11:52 PM EST
[ Parent ]
That there are undecidable questions of arithmetic doesn't mean that every algorithm verification is undecidable.

Finance is the brain [tumour] of the economy
by Carrie (migeru at eurotrib dot com) on Mon Jun 10th, 2013 at 04:23:49 PM EST
[ Parent ]
I agree some can be.  

At a former job I verified that a program's algorithm would accomplish the required task and then halt.  I also found it would halt after ~600 years.  Thus it wasn't formally NP ...

& whoop-de-do

Computer Engineering is engineering.  As such it is different than the tools: Logic, Maths, etc., it uses.  Axiomatic Deductive Logic is a nice tool.  In all too many cases using only ADL is a sure route to, "How the %$^@#$! did we get into THIS mess?"
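[Ed.: the halting argument described above is usually made by exhibiting a "variant" - a non-negative integer that strictly decreases on every iteration, so the loop must terminate. A sketch of my own, checked at runtime rather than proved:]

```python
# Termination argument for a single loop: the variant (here, the second
# argument of Euclid's algorithm) is non-negative and strictly decreases
# each iteration, so the loop halts. A prover would establish the two
# asserted facts symbolically.

def gcd_with_variant(a, b):
    assert a > 0 and b > 0
    while b != 0:
        variant = b                  # candidate variant: current value of b
        a, b = b, a % b
        assert 0 <= b < variant      # stays non-negative, strictly decreases
    return a

print(gcd_with_variant(252, 105))  # 21
```

Of course, knowing that the loop halts says nothing about whether it halts within the heat death of the universe - hence the ~600 years.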

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Wed Jun 12th, 2013 at 01:36:44 PM EST
[ Parent ]
Axiomatic Deductive Logic

No, no it's not. Just no. Bleurgh.

by Colman (colman at eurotrib.com) on Wed Jun 12th, 2013 at 01:38:55 PM EST
[ Parent ]
We have to be gentle with Migeru.  His mind has been warped by Physics.

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Wed Jun 12th, 2013 at 03:42:41 PM EST
[ Parent ]
Gödel's incompleteness theorem was formulated explicitly for number theory. There are two parts (from 40-year-old memory):

  1. Any formal system that proves as true only statements that are true is not capable of proving all statements known to be true.

  2. Any formal system that can prove as true all statements known to be true will also prove as true statements known not to be true.

Of the two types of systems described, the first seems the more useful. Even though the theorem was first developed for number theory, it has not prevented us from doing accurate calculations. (Which was always my problem to begin with  :-)
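[Ed.: for the record, the modern statements are phrased in terms of formal systems rather than individual theorems; roughly, in the usual notation:]

```latex
\textbf{First incompleteness theorem.} If $T$ is a consistent, effectively
axiomatizable theory interpreting enough arithmetic, then there is a
sentence $G_T$ with
\[ T \nvdash G_T \quad\text{and}\quad T \nvdash \neg G_T . \]

\textbf{Second incompleteness theorem.} For the same $T$,
\[ T \nvdash \mathrm{Con}(T) , \]
i.e.\ $T$ cannot prove its own consistency.
```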

"It is not necessary to have hope in order to persevere."
by ARGeezer (ARGeezer a in a circle eurotrib daught com) on Tue Jun 11th, 2013 at 10:55:10 AM EST
[ Parent ]
In fact I think there's going to be an eventual singularity in software development, where formal systems become smart enough to define improved formal systems. We'll probably get software proving that software sort-of works by the end of the decade, and certainly by the end of the 2020s.

I'd say about fifty years.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 05:25:34 AM EST
[ Parent ]
Model checking is already useful for finding subtle bugs in reactive controller programs, but those methods are only used on safety-critical projects: avionics, nuclear reactor controllers, etc.

However, the general verification/synthesis problem cannot be automatically solved (this is provable, by reduction from the halting problem).

Schengen is toast!

by epochepoque on Thu Jun 13th, 2013 at 07:11:23 PM EST
[ Parent ]
Donald Knuth, Frequently Asked Questions
Beware of bugs in the above code; I have only proved it correct, not tried it.
by gk (gk (gk quattro due due sette @gmail.com)) on Mon Jun 10th, 2013 at 07:07:52 PM EST
[ Parent ]
LOL

Finance is the brain [tumour] of the economy
by Carrie (migeru at eurotrib dot com) on Mon Jun 10th, 2013 at 07:27:18 PM EST
[ Parent ]
Formal methods etc. are still pre-Newtonian, while software development is busy building the equivalent of relativistic starships (massive networks with feedback loops everywhere). Not a good match.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 05:24:47 AM EST
[ Parent ]
In my experience, formal methods become mature enough for industrial applications when the researchers involved decide they need to buy a house and settle down.
by Colman (colman at eurotrib.com) on Tue Jun 11th, 2013 at 06:01:45 AM EST
[ Parent ]


She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre
by ATinNM on Mon Jun 10th, 2013 at 08:46:37 PM EST
[ Parent ]
Marketing & Sales: Fine, we'll put it out there and let the punters (customers) debug it.
by afew (afew(a in a circle)eurotrib_dot_com) on Tue Jun 11th, 2013 at 01:41:37 AM EST
[ Parent ]
In the US the funding for leading edge computer research is government funded.  Most of the funding comes from ARPA, DARPA, and other military and intelligence organizations for military and intelligence applications.

Despite the mythological horseshit spread by the ignorant, Silicon Valley wouldn't exist if it hadn't been for US Federal government funding.  

Watch:

for the real story of how SV came to be.

She believed in nothing; only her skepticism kept her from being an atheist. -- Jean-Paul Sartre

by ATinNM on Tue Jun 11th, 2013 at 01:50:02 AM EST
[ Parent ]
Software isn't like infrastructure. You don't need big teams to produce good results.

  1. One solution to writing good software lies in accepting that there is some other motivational factor beyond and orthogonal to money, and probably also beyond fame. There seems to be a motivational factor of doing the right thing, which maybe correlates to the motivation of doing pure science. It sometimes might be the wrong 'right' thing, so someone else has to backtrack and go some other direction. It is like searching for a path in a landscape full of small and large hills, where some hills (local optima) aren't that optimal, in hindsight.

  2. Writing good software needs openness, so avoid patents or IPR that is too stringent to allow reuse of ideas.

  3. Writing good software takes time, so how should society provide the time?

  4. Writing good software needs a certain brilliance in thinking and it is still undecided whether this skill can be acquired (and how) or what else brings it into the world.

One idea is to provide scholarships, like we do for literary works. Or give 5-year grants to people who have already shown that they might be able to turn out results. If they do not, there should be no harm in failing. If they do provide results, allow them to re-apply for additional 5-year terms.

It is probably similar to a professorship.

We can even give journalism room to breathe with this kind of process.


Now what ?

by pi (etribu-at-opsec.eu) on Sun Jun 9th, 2013 at 04:44:05 PM EST
About the process that produces good standards: there are quite a few institutions that produce standards, and not all of them are good. For example, have a look at the International Telecommunication Union's technical committee work. Its standards are mostly unreadable, very slow to progress, and sometimes even ambiguous.

In contrast, take a look at the way internet standards are developed in the IETF, the Internet Engineering Task Force. They are written by volunteers, and with an eye to openness and running code. Here is a short overview.


Now what ?

by pi (etribu-at-opsec.eu) on Sun Jun 9th, 2013 at 04:53:32 PM EST
One part of my day job is to prepare a software system for the Single Euro Payments Area (SEPA), a standard based on another standard (ISO 20022). The EU deemed it wise to standardize financial transactions across Europe, and before that industry and government types went about creating a new standard in the heyday of XML.

SEPA is supposed to save billions every year, but the way forward has been difficult. Every country has its own cavalcade of systems, or no real electronic payment system at all. E.g. in Germany there is something called DTAUS to initiate direct debits, which works reasonably well. You send a file to the bank and they pull the money for you as soon as they can. SEPA direct debits are another animal entirely. You need a creditor ID, issued by the good folks of the Bundesbank, and every debtor needs a unique mandate ID. Every debtor needs to be prenotified at least x days before any transaction. The changeover to SEPA itself requires notification to the debtor. At first an entirely new agreement (with signature) was required, but because of some legal changes the old ones are still valid. Then, depending on whether it's the first or a recurring transaction for that debtor, the file needs to be sent 5 or 2 TARGET/bank days before the actual transaction. If you mess up, the transaction can be rolled back by the debtor for more than a year, not just the usual six or eight weeks.

This is painful and potentially disastrous stuff, because come February the old system is verboten. Currently, less than 0.2 percent of direct debits are issued using SEPA [in Germany]. Insolvency looms for a lot of people. After a lot of haggling, a new intra-German "local instrument type" called COR1 will be introduced in November. Which basically means trying to do something similar to DTAUS within the confines of SEPA: you only need to send the file one TARGET day before the transaction.

Meanwhile, SEPA also introduces the common citizen to the pleasures of the International Bank Account Number (IBAN). It consists of 16 to 30 characters, and in most countries it's more than 20 (why do the Palestinians need 29?). Problem: there has only recently been movement towards a somewhat reliable conversion of old account numbers to IBANs, because some banks have their own special subaccount numbers. And of course every bank has its own implementation of the 'standard', of which there are numerous official versions. This goes on and on, and reminds me of the ETCS chaos.
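[Ed.: the check-digit scheme behind those numbers is at least simple. ISO 13616 validates an IBAN with a mod-97 test: move the first four characters to the end, map letters to 10..35, and the resulting integer must leave remainder 1 modulo 97. A minimal sketch - the length check is a rough lower bound, not the per-country table:]

```python
# IBAN check-digit validation per ISO 13616 (ISO 7064 MOD 97-10):
# rotate the first four characters to the end, expand letters to the
# digits 10..35, and require the resulting integer to be 1 mod 97.

def iban_valid(iban: str) -> bool:
    s = iban.replace(" ", "").upper()
    if not s.isalnum() or len(s) < 15:
        return False
    rearranged = s[4:] + s[:4]
    digits = "".join(str(int(c, 36)) for c in rearranged)  # 'A'->10 ... 'Z'->35
    return int(digits) % 97 == 1

print(iban_valid("DE89 3704 0044 0532 0130 00"))  # True (the standard example IBAN)
```

It catches transposed digits and most typos, but of course says nothing about whether the account actually exists - which is where the bank-specific conversion mess comes in.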

So, a lot of blabla. In total: data integration is hard. Standardization is anything but simple. Standards go one way and reality goes another. If a standard finally becomes workable, it's also close to irrelevance.

This is the kind of infrastructure that necessarily involves government because a lot of law making is involved. But government should not be counted upon to deliver the technical side of it. It is unsafe to regard yourself as the only reliable player.

Schengen is toast!

by epochepoque on Thu Jun 13th, 2013 at 06:47:34 PM EST
[ Parent ]
I think the first thing is to rewind back to the question of where the dividing line is between what should be public and what should be private. Perhaps the rule is something like "whatever is required to maintain a minimum acceptable standard of living should be under public control." With that rule, you might get the following as public:

  • water supply
  • housing
  • a base minimum income
  • medical care
  • productive employment opportunity
  • transportation options
  • education

Then anything beyond that is private, with varying levels of public regulation. Under this rule, you would have to decide whether Internet access is to be considered part of the minimum acceptable standard of living.

My local public library has around 50 free PC workstations where anybody can get an account and use mail, web, etc. The homeless community's daily routine is to sleep in the park, eat breakfast at the soup kitchen, and then break out into a group that goes to the library and a group that goes out to panhandle. The library workstations are completely occupied all day long.
http://www.ppld.org/computers-ppld

This suggests that the user interface, at least, for general-purpose routine computer use extends across the entire population, and should therefore be controlled by the government.

(However, I will remain on my Mac.)

by asdf on Sat Jun 15th, 2013 at 10:24:29 AM EST
The expectation now is for a significant UI change rate of one major release per year.

Google, for example, is on the cusp of releasing a new version of Android. Android 5.0 "Key Lime Pie" is expected to debut this fall, at about the same time iOS 7 will leave beta and become available to everyone. Although Google provides minor system and app updates for Android fairly regularly, it hasn't announced a major new version of Android since October 2012.

http://www.informationweek.com/mobility/smart-phones/ios-7-changes-smartphone-battle/240156718?google_editors_picks=true
by asdf on Sat Jun 15th, 2013 at 11:37:40 AM EST

