Monday, 21 July 2008

Mulling over architecture

I've been mulling in no specific direction today, letting the wind take me wherever it wishes; that often leads me to think more clearly.

I was talking to my dad over the weekend. He was involved in avionics for most of his working life and was an examiner for the Institute of Quality Assurance in the UK. We were talking about the Farnborough Airshow - which I attended on Friday - and about fly-by-wire. The European Fighter Aircraft (Typhoon) is fly-by-wire, as is the A380. I don't know about anyone else, but the standard of software development would make me concerned if I had to fly in one of those. I try to forget it all when I board an aircraft so I don't worry.

It turns out that much of what they do to increase the reliability of fly-by-wire is to use redundancy. Triple-redundant systems are one approach. The components are developed independently and then monitored; when they diverge, majority voting is used to determine the likely correct behavior. It is statistical because there is no guarantee that the behavior of the two that agree is correct. It is simply the case that two agreeing is likely to indicate correctness.
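The voting idea is simple enough to sketch. Below is a minimal, illustrative 2-out-of-3 voter in Python; the function name, tolerance parameter and error handling are my own invention for illustration, not taken from any avionics standard:

```python
def vote(readings, tolerance=0.0):
    """2-out-of-3 majority vote over three redundant channel readings.

    Returns the mean of the two agreeing channels, or raises if no two
    channels agree within the tolerance (i.e. no majority exists).
    """
    a, b, c = readings
    if abs(a - b) <= tolerance:
        return (a + b) / 2
    if abs(a - c) <= tolerance:
        return (a + c) / 2
    if abs(b - c) <= tolerance:
        return (b + c) / 2
    raise ValueError("no majority: all three channels diverge")
```

Note that when one channel fails, the voter masks it silently; a real system would also log the disagreement so the faulty channel can be serviced.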

In high-availability systems of this nature they use multiple compilers too, and they might choose to use different chip sets.

Given my stance on top-down it is heartening to see that project definition is what often guides these systems. No code is cut until after project definition, which in turn provides the specification of the system. So what happens when they change things? They negotiate the change and provide a plan for its introduction. It cannot be done on the fly because the complexity of the change often militates against a more agile approach. So they simulate, they test, and so on. What they have are very good governance processes which document the change, the impact and the resulting tests prior to productionisation. All of this is similar to the approaches we use in commerce-oriented solutions, except that we do it without simulation and without testing the architecture in any way.

The role of project definition is to provide the requirements against which testing can be measured and against which simulation can occur. The simulation provides a first step towards testable architecture, ensuring that the overall design is commensurate with the requirements.

As an aside, High Integrity Software development became a real vogue in the 1980's. One of the earliest proponents of the formal methods which have underpinned High Integrity Software was Tony Hoare (Elliott Brothers, 1960 until 1968), who oddly enough was at Elliott Brothers during my dad's tenure (John Talbot, 1961 until 1965 and then again 1968 until retirement in the 1990's).

So what does this mean to the world of software and commerce? Formal methods are valuable but not a panacea. They need to be introduced by stealth, with their benefits laid out. They need to be employed early. As Anthony Hall elaborates:

"It is well known that the early activities in the lifecycle are the most important. According to the 1995 Standish Chaos report, half of all project failures were because of requirements problems. It follows that the most effective use of formal methods is at these early stages: requirements analysis, specification, high-level design. For example it is effective to write a specification formally rather than to write an informal specification then translate it. It is effective to analyse the formal specification as early as possible to detect inconsistency and incompleteness. Similarly, defining an architecture formally means that you can check early on that it satisfies key [functional and non-functional] requirements such as [message order and content], security [and performance SLA's]."
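Hall's point about detecting inconsistency and incompleteness mechanically can be illustrated with a toy check. This is my own sketch, not any real formal-methods tool: given a specification expressed as named guard predicates over a small input domain, we can find inputs that no rule covers (incompleteness) and inputs that more than one rule claims (potential inconsistency):

```python
def check_spec(rules, domain):
    """Check a rule-based spec for completeness and consistency.

    rules  - list of (name, guard) pairs, guard being a predicate
    domain - iterable of inputs the spec should cover

    Returns (gaps, overlaps): inputs no rule covers, and inputs
    covered by more than one rule.
    """
    gaps, overlaps = [], []
    for x in domain:
        matches = [name for name, guard in rules if guard(x)]
        if not matches:
            gaps.append(x)                 # incompleteness: no rule applies
        elif len(matches) > 1:
            overlaps.append((x, matches))  # overlap: rules may conflict
    return gaps, overlaps
```

For example, a spec with rules for "negative" (x < 0) and "positive" (x > 0) checked over -2..2 reveals a gap at 0: nobody said what should happen there. Catching that before coding is exactly the early analysis Hall is describing.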

Formal methods have been used in avionics for some time - hence Tony Hoare's involvement at Elliott Brothers. They are for the most part hidden, becoming a function of the tools that are used for design. The use of the Z notation in the early 1980's found popular acclaim in Ada-based systems. The problem was its cryptic nature and the resulting lack of skills in using it. Which is why stealth is a good thing: tie it up in a tool and make it easy (just like type systems in programming languages).

We have a much clearer formal understanding today of distributed computing through the work of Tony Hoare and Robin Milner. What is needed are tools that help us define architectures, removing the ambiguity of human translation and providing a mechanism for the analysis that is needed; hence the cry for formalism. The pi4soa tool suite is but a start. It can become more refined and integrated with other tools (such as Archimate and the specific tools that support Archimate). Architecture tooling and tooling for design are not the most popular of directions because of the lack of runtime scale for remuneration. But they are much needed: in the end they will enable solutions to be built faster, with higher quality and at lower cost, whilst remaining suitably agile and aligned to the business. This is what formalism (suitably hidden) can provide.

Tuesday, 1 July 2008

The Industrialisation of IT

Possibly the most important invention that gave rise to the industrial revolution was the micrometer, invented by William Gascoigne in the 17th century. It was directly responsible for the engineering discipline that went into constructing the steam engine and the Enfield rifle used during the Civil War in the United States.

What the micrometer did was remove ambiguity. It gave rise to a language of design that enabled precision engineering, which by extension gave rise to industrialisation, with bullets being made in one place and gun barrels in another.

What has this got to do with CDL?

CDL is possibly as important in the industrialisation of IT. It gives us a precise language for describing a system of services, which in turn ensures that the services themselves are precise (by design) and so interoperate properly - just as the micrometer did for Enfield and Stephenson's Rocket.

In classic engineering today, simulation is used along with some formal mathematics to test out a design and ensure that it will work. In CDL the same principle is used to simulate and to test, so that before a line of code is cut the CDL description is shown to be valid against the requirements and to be correct in computational terms (i.e. free from livelocks, deadlocks and race conditions).
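To make the deadlock-freedom claim concrete, here is a toy sketch of the underlying idea: exhaustively explore every reachable state of a set of services that communicate by synchronous rendezvous, and report any state where some service still has work to do but no message exchange is possible. This is my own illustration of state-space exploration, not how the CDL tooling is actually implemented:

```python
from itertools import combinations

def deadlocks(procs):
    """Return reachable deadlock states of synchronously communicating processes.

    Each process is a list of actions ('send'|'recv', peer_index, message).
    A send and a matching recv on the two peers step together (rendezvous).
    A deadlock is a reachable state that is not final yet has no enabled step.
    """
    start = tuple(0 for _ in procs)
    seen, frontier, stuck = {start}, [start], []
    while frontier:
        state = frontier.pop()
        moves = []
        for i, j in combinations(range(len(procs)), 2):
            for a, b in ((i, j), (j, i)):            # try both directions
                if state[a] < len(procs[a]) and state[b] < len(procs[b]):
                    act_a, act_b = procs[a][state[a]], procs[b][state[b]]
                    if (act_a[0] == 'send' and act_b[0] == 'recv'
                            and act_a[1] == b and act_b[1] == a
                            and act_a[2] == act_b[2]):
                        nxt = list(state)
                        nxt[a] += 1                  # both sides advance
                        nxt[b] += 1
                        moves.append(tuple(nxt))
        if not moves and any(state[k] < len(procs[k]) for k in range(len(procs))):
            stuck.append(state)                      # no step, yet not finished
        for nxt in moves:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return stuck
```

Two services that each insist on sending first to the other never progress, and the exploration finds that state immediately; the same check run over a description of the whole system is how a design can be shown deadlock-free before any implementation code exists.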

Testable architecture, along with a language of discourse that is precise, removes the ambiguity between implementation and requirements. It enables industrialisation and facilitates offshoring of implementation in the same way that Enfield used the micrometer, and the precision it gave to design, to manufacture solutions in different locations and yet ensure that things worked when they were put together.