I don't usually get excited by articles I read in the trade press. There are lots of people writing, all with their own, often slanted perspectives. However, I like BP Trends, and the people who write for them are usually informed and articulate. Anyway, this morning one article caught my attention: "Is Your SOA a Disaster Waiting to Happen?" by Keith Harrison-Broninski
( http://www.bptrends.com/publicationfiles/1-08-COL-HumanProcesses-Harrison-Broninski-rev-12-12-cap-final.pdf )
In this article, Harrison-Broninski offers a very provocative quote:
"I recently heard a BPM Suite vendor speak proudly of a BPMN process they had implemented that contained 250,000 steps. This represents a vastly complicated piece of business software, so I asked how they had tested it. He replied that testing was unnecessary since their tools did not require the user to write lines of code (just to draw diagrams), and anyway their products contained simulation features if you really wanted to prove that an application worked as expected. He then went on to say how this approach must be OK, since other organizations are using their suite to build safety-critical applications."
Now, I don't know which vendor this was, but if the article is accurate, SOA and our industry are in for some very hard times. The approach suggested in this quote is not just questionable, it represents "malprogramming". The first thing to note is that complexity doesn't scale. If you have that many components, you have to have a way to test each one independently of all the rest, and to test groups of components independently as well. You have to pay attention to time dependencies and data dependencies. Otherwise, it is truly a house of cards.
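To make the point concrete, here is a minimal sketch (in Python, with hypothetical names of my own invention) of what testing a component independently looks like: the component's dependency is stubbed out, so the test exercises just that one piece, with no live services and no hidden data dependencies.

```python
import unittest
from unittest.mock import Mock

# Hypothetical component: approves an order if an external credit
# service reports a balance that covers the order total.
class OrderApprover:
    def __init__(self, credit_service):
        # The dependency is injected, so a test can substitute a stub.
        self.credit_service = credit_service

    def approve(self, customer_id, total):
        balance = self.credit_service.get_balance(customer_id)
        return balance >= total

class OrderApproverTest(unittest.TestCase):
    def test_approves_when_balance_covers_total(self):
        # Stub the credit service: the component is tested in
        # isolation, independent of every other component.
        credit = Mock()
        credit.get_balance.return_value = 100.0
        approver = OrderApprover(credit)
        self.assertTrue(approver.approve("cust-42", 75.0))
        credit.get_balance.assert_called_once_with("cust-42")

    def test_rejects_when_balance_too_low(self):
        credit = Mock()
        credit.get_balance.return_value = 50.0
        approver = OrderApprover(credit)
        self.assertFalse(approver.approve("cust-42", 75.0))

if __name__ == "__main__":
    unittest.main()
```

Drawing a diagram, or running a simulation, proves nothing about any individual component; tests like these are what let you trust the parts before you trust the whole.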
More on this subject later...