Sunday, March 23, 2008
I was speaking at a conference in San Diego last week. At the end of a very interesting demo of a state-of-the-art application generation tool, there was a Q&A session. After a number of good questions, someone at the back of the room asked, "Isn't this like CASE?" A little perplexed, I responded, "This is what CASE would have been 20 years later." To which the fellow at the back of the room said, "But we tried CASE."
But behind this dialog was a failure to understand the evolution of technologies, especially important ones. Just because a technology didn't live up to over-hyped expectations doesn't mean that it will never work. Major technologies often take a long time to perfect. That was true of the steam engine, it was true of the electric light bulb, and it was true of the Internet. If we are going to make the most of our technology investments, we need to keep history in mind.
One of the problems with computer technology is that the industry is always overselling. Not surprisingly, most new technologies fail, at least the first time around. In general, the industry overestimates the short-term effects of a new technology and underestimates the long-term ones.
Thursday, January 10, 2008
He knows nothing -- Nicholas Carr is back
From time to time I watch Jim Cramer's Mad Money show. Despite the histrionics, there is often a lot of very good investing information. I especially like the fact that Jim is not afraid to take on the experts. One of his favorite phrases, when he is taking on some supposed industry expert, is "He knows nothing!!!"
That's sort of how I feel about Nicholas Carr. Carr has taken a series of extreme positions which, unfortunately, don't help anyone in IT management. Carr's original judgment was that for most organizations IT is a commodity. Now Carr has a new book arguing that large IT organizations will all be moving to a utility model. It is all about economics, Carr argues: information is just like electricity or telephone service, and history has shown that utilities, now international utilities, are much better equipped to operate worldwide networks than individual in-house IT organizations.
At this point, I'm tempted to invoke the spirit of Jim Cramer: "He knows nothing." If IT were such a commodity, why is it that so few organizations have been able to replace their aging legacy systems and keep up with technological change? Why have the big winners of the Internet age been companies like Google and Amazon, which have not only continued to run their own applications, but run their own computer centers and networks and have even developed their own operating systems?
My day job often involves analyzing the Enterprise Architecture of large organizations, and what I can tell you is that most of them have such complex environments that it is laughable to imagine some outside organization taking the whole thing over and running it as a utility. Many of these organizations don't even know what they have. Moving the whole operation to a group that is more interested in making money on change orders than in corporate success is not going to improve the situation. As IT becomes more and more critical, if I were a CEO there are a few things I would be totally unwilling to let out of my control, and IT would be right at the top of the list.
What gets lost in the discussion of IT and economics is the fact that there is a huge knowledge gap between what a good outsourcer knows about an organization's systems and data and what a good internal IT organization knows. This is going to come to a head over the next decade, when a great many of the people who built all those old systems a generation ago retire.
Unfortunately, as misguided as Carr's ideas are, they are sure to gain currency--it is just what large outsourcing organizations want their customers to hear. So don't be surprised to see quotes from Carr's latest book crop up in the next marketing presentation you see.
Tuesday, January 8, 2008
Doing SOA wrong
I don't usually get excited by articles I read in the trade press. There are lots of people writing, each with their own, often slanted, perspective. However, I like BP Trends, and the people who write for them are usually informed and articulate. Anyway, this morning one article caught my attention-- "Is Your SOA a Disaster Waiting to Happen?" by Keith Harrison-Broninski
( http://www.bptrends.com/publicationfiles/1-08-COL-HumanProcesses-Harrison-Broninski-rev-12-12-cap-final.pdf )
In this article, Harrison-Broninski offers a very provocative quote:
"I recently heard a BPM Suite vendor speak proudly of a BPMN process they had implemented that contained 250,000 steps. This represents a vastly complicated piece of business software, so I asked how they had tested it. He replied that testing was unnecessary since their tools did not require the user to write lines of code (just to draw diagrams), and anyway their products contained simulation features if you really wanted to prove that an application worked as expected. He then went on to say how this approach must be OK, since other organizations are using their suite to build safety-critical applications."
Now, I don't know which vendor this was, but if the article is accurate, SOA and our industry are in for some very hard times. The approach suggested in this quote is not just questionable; it represents "malprogramming". The first thing to note is that complexity doesn't scale. If you have that many components, you have to have a way to test each of them independently of all the rest, and then to test groups of them together. You have to pay attention to time dependencies and data dependencies. Otherwise, it is truly a house of cards.
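To make the point concrete, here is a minimal sketch in Python of what testing one component in isolation looks like. The OrderValidator and the credit-check service are hypothetical names, not taken from any particular SOA or BPM product; the point is simply that the downstream dependency is stubbed out so the component's own logic can be verified before the pieces are ever wired together.

import unittest
from unittest.mock import Mock

# Hypothetical component of a larger composite service: it validates an
# order and delegates the credit decision to a separate credit-check service.
class OrderValidator:
    def __init__(self, credit_service):
        self.credit_service = credit_service  # injected dependency

    def validate(self, order):
        if order["quantity"] <= 0:
            return "rejected"
        if not self.credit_service.is_credit_ok(order["customer_id"]):
            return "rejected"
        return "accepted"

class OrderValidatorTest(unittest.TestCase):
    def test_rejects_nonpositive_quantity(self):
        # The credit service is stubbed, so only the validator's logic runs.
        validator = OrderValidator(credit_service=Mock())
        self.assertEqual(
            validator.validate({"customer_id": 1, "quantity": 0}), "rejected")

    def test_rejects_when_credit_check_fails(self):
        credit = Mock()
        credit.is_credit_ok.return_value = False
        validator = OrderValidator(credit_service=credit)
        self.assertEqual(
            validator.validate({"customer_id": 1, "quantity": 5}), "rejected")

    def test_accepts_good_order(self):
        credit = Mock()
        credit.is_credit_ok.return_value = True
        validator = OrderValidator(credit_service=credit)
        self.assertEqual(
            validator.validate({"customer_id": 1, "quantity": 5}), "accepted")

if __name__ == "__main__":
    unittest.main()

The same idea applies one level up: once each component passes its own tests, small groups of real components can be wired together and tested against their time and data dependencies, rather than trusting a 250,000-step diagram to simulate itself into correctness.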
More on this subject later...
Thursday, November 22, 2007
What will the next generation of development look like?
A long-term associate and friend who has worked in the business of assessing software productivity told me not long ago that his data showed that productivity in 75% of large organizations has not improved significantly over the last 10-12 years. Of the remaining organizations, about 13% had improved their productivity, but the other 12% had actually declined.
As I have written elsewhere, the reason the software industry has stagnated, if it has, has to have something to do with the complexity of the current software development environment. The software that we develop today is, at one level, much more sophisticated. It is object-oriented, distributed, web-based, and so on. The user interfaces are much more sophisticated: we have video objects, Flash objects, Google Maps mashups, and the like. The data stores are growing exponentially, with unstructured data (email, attachments, multimedia) growing fastest of all.
The price of all this complexity is that many of our primary software activities have become much more difficult than they were just a few years ago. A decade ago, you hardly ever heard the word "deploy"; today, every product has to be deployed across larger and larger universes. Clearly, as we are seeing everywhere, software is becoming more complex, and the price is being paid in cost, reliability, and auditability.
In the next couple of blog entries, I'm going to talk about what we need to do to change our development paradigm.