IT Transformation with SOA

Sunday, February 21, 2010

Interlude Three: On Technology

This being my 50th blog entry, it represents a good vantage point from which to take stock of the road traversed and the reason we are on this journey in the first place. I started my blog describing the promise of technology and the importance of technology transformation in fulfilling that promise. I moved on to a discussion of how to make the business case to start the technology transformation ball rolling. I then proceeded to cover more technical matters, such as the characteristics of Service Oriented Architecture and the various classifications of services, before delving even deeper into the detailed considerations for SOA design and management.

In fact, we have gone so deep that I am reminded of this “gedanken” (thought experiment):

Assume there is a tunnel so deep that it reaches the center of the Earth. In fact, imagine digging this tunnel until it reaches the surface on the opposite side of the Earth (the antipode). Now, let’s have a brave athlete jump into the tunnel. What would happen?

Setting aside other physical considerations, such as air resistance, temperature, and pressure, the athlete should feel less and less gravity as she approaches the center of the Earth. At the center of the Earth she should be completely weightless: the force of gravity is zero down there. The reason is that gravity is caused by the Earth’s mass, and at the center the pull of the surrounding mass cancels out in every direction.

So far, so good, but the athlete now has inertia and will continue to “fall upwards” towards the surface on the other side of the Earth! As she falls upward, the gravitational pull will increase (more and more of the Earth’s mass will be behind her), slowing her down until the “upward fall” is halted just as she reaches the surface at the antipode. At this point, our athlete will begin to fall once again toward the center of the planet until she returns to the entrance of our tunnel, only to fall again. In this hypothetical, frictionless environment, our athlete would act like a perpetual yo-yo, oscillating endlessly between the two ends of the tunnel.
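For the curious, the numbers behind this yo-yo are easy to sketch. Assuming a uniform-density Earth (a simplification), gravity inside the planet grows linearly with distance from the center, which makes the fall simple harmonic motion. Here is a rough back-of-the-envelope calculation in Python:

```python
import math

# Back-of-the-envelope numbers for the tunnel-through-the-Earth gedanken,
# assuming a uniform-density Earth (a simplification): inside such a sphere,
# gravity grows linearly with distance from the center, g(r) = g0 * r / R,
# so the motion is simple harmonic with period T = 2*pi*sqrt(R/g0).

R = 6.371e6   # mean radius of the Earth, in meters
g0 = 9.81     # surface gravity, in m/s^2

T = 2 * math.pi * math.sqrt(R / g0)   # full round trip, in seconds
v_center = math.sqrt(g0 * R)          # speed when passing the center, in m/s

print(f"One-way trip to the antipode: ~{T / 2 / 60:.0f} minutes")
print(f"Full round trip (back to the start): ~{T / 60:.0f} minutes")
print(f"Speed passing the center: ~{v_center / 1000:.1f} km/s")
```

Under these assumptions the one-way trip takes roughly 42 minutes and the full round trip about 84, regardless of where on Earth the tunnel is dug.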

So, imagine that this SOA blog is a bit like this athlete. It feels like we have reached the center and that it is now time to “fall” upwards towards the surface. Next I will be covering detailed engineering considerations (remember, we are still near the SOA core!), followed by less technical discussions. These items will be related to program execution governance, project management, and organizational and people matters. That is, we will return from the detailed to the general.

Still, while we are knee-deep in the details, it is good to remind ourselves why we are on this journey. In the end, this is not about SOA or even technology, but about what we can do with SOA and with technology. Yes, there is the technologist viewpoint regarding the power of SOA. While you can certainly run non-SOA systems in a Cloud Computing environment, without SOA it is almost impossible to truly leverage the power of Cloud Computing on behalf of an enterprise-wide system. Then again, the labor involved in creating SOA systems has an objective beyond Cloud Computing or using Software-as-a-Service. The most exciting goals are all about shaping the future of technology. That is, our ability to make technology so flexible that it eventually becomes hidden.

Arthur C. Clarke’s famed third law states that any sufficiently advanced technology is indistinguishable from magic. I would add a fourth law: The best indication that a technology has matured is that it has become invisible.

Think of electricity, the water supply, or even the internal workings of an automobile. In all these cases, we operate these technologies almost obliviously, on a switch-on/switch-off basis.

For the most part, technologies follow a well-defined life-cycle that takes them from inception in a lab all the way to invisibility. The time spent within a cycle is technology-dependent, but the average time to maturity can span decades.

Many futurists believe that one of the main evolutionary steps for computing is for it, too, to become invisible, embedded in the fabric of the thing we call “reality”. Instead of screens, keyboards, and mice, users will interface with computers in a seamless manner.

The ultimate interface achievement will be to hide the fact that a user is accessing, or even programming, a computer. This latter attribute is often confused with the famed Turing Test of Artificial Intelligence (AI). However, the Turing Test establishes that Artificial Intelligence will have been achieved only when a computer can hide the fact that it is a computer while communicating with a human across a broad domain. AI has been long in coming; many believe it is still a century away, while others think it is just around the corner. But AI requires common-sense and pattern-recognition capabilities if it is to work, and progress has been fairly slow on these fronts. I tend to agree that AI as originally envisioned will take a long time to be achieved. However, once it happens, AI will not appear as an overnight invention; instead, we will continue to see improvements in computer systems that gradually make them appear smarter and smarter.

Think of your car’s navigation system, which already appears quite smart, or of the novel capabilities of your digital camera, such as face recognition. Pseudo-AI behavior in narrow knowledge domains is arriving thanks to the growing computing power made possible by Moore’s Law. Consider that in the beginning it was assumed that a chess program capable of beating a chess grandmaster would require a full-fledged AI system. However, this feat was achieved thanks to the brute force of massively parallel processors and the ingenuity of sophisticated heuristics, not by the invention of a human-mind emulator. In May 1997 an IBM computer nicknamed Deep Blue beat world chess champion Garry Kasparov, much to the chagrin of the Grandmaster, who found it difficult to accept that he had been beaten by a computer! For all intents and purposes, playing against a chess computer does convey the eerie feeling of competing against an “intelligent” device. The machine behaves like AI, but it operates only in the narrow domain of chess playing, making the computer an “idiot savant” of sorts.
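To make the “brute force plus heuristics” point concrete, here is a minimal sketch of minimax search with alpha-beta pruning, the family of techniques chess engines are built on. Real engines generate positions on the fly and score them with elaborate evaluation heuristics; the toy game tree and leaf scores below are placeholders of my own, not anything Deep Blue actually used:

```python
# Minimax search with alpha-beta pruning over a toy game tree.
# A leaf is a number (the heuristic score of that position, from the
# maximizing player's point of view); an internal node is a list of
# child positions (the legal moves).

def alphabeta(node, alpha, beta, maximizing):
    """Return the best heuristic score achievable from `node`."""
    if isinstance(node, (int, float)):
        return node  # leaf: heuristic evaluation of the position
    if maximizing:
        best = float("-inf")
        for child in node:
            best = max(best, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, best)
            if alpha >= beta:   # the opponent will never allow this line,
                break           # so prune the remaining siblings
        return best
    else:
        best = float("inf")
        for child in node:
            best = min(best, alphabeta(child, alpha, beta, True))
            beta = min(beta, best)
            if alpha >= beta:
                break
        return best

# Toy tree: three candidate moves, each answered by the opponent.
tree = [[3, 5], [6, [9, 2]], [1, 2]]
print(alphabeta(tree, float("-inf"), float("inf"), maximizing=True))  # -> 6
```

Running this on a real chess position would simply swap the toy tree for a move generator and the leaf numbers for a material-and-position evaluation; the search logic stays the same, which is why the result feels intelligent without there being a mind behind it.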

As discussed earlier, most transformative technologies are the result of synergistic combinations of various evolutionary advances. To the degree that we see continued advances in user-interface paradigms, as represented by gestures à la iPhone or voice recognition, combined with improved algorithms and the availability of ultra-fast communication bandwidth, we will see a wealth of interesting applications, many of them with truly transformative effects. For example, enhanced user interfaces, combined with more advanced artificial intelligence heuristics and the emerging social networking paradigms, could deliver a suite of Virtual Sidekick capabilities:

· Attain complete knowledge of your preferences; in fact, complete knowledge of you as a person.

· Exercise controlled empowerment to take independent action.

· Have immediate access to all sources of information available electronically, with the ability to alert you to the specific developments that interest you, such as breaking news or TV specials.

· Adopt different service personalities based on context.

· Monitor actions performed on your behalf in an unobtrusive manner, where certain events automatically initiate pre-approved actions. For example, a change to a calendar event would automatically trigger your Virtual Sidekick to initiate a flight change (a pattern sketched in the code below).
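As a purely illustrative sketch of that last capability, here is how event-triggered, pre-approved actions might be wired up for such a hypothetical Virtual Sidekick. Every name in it (the event type, the request_flight_change helper, the Sidekick class) is invented for the example; it does not describe any real product or API:

```python
# A purely illustrative sketch of event-triggered, pre-approved actions for a
# hypothetical Virtual Sidekick. All event names and handlers are invented.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Sidekick:
    # Maps an event type to the list of actions the user has pre-approved for it.
    rules: Dict[str, List[Callable[[dict], None]]] = field(default_factory=dict)

    def approve(self, event_type: str, action: Callable[[dict], None]) -> None:
        """Register an action the user has pre-approved for a given event type."""
        self.rules.setdefault(event_type, []).append(action)

    def observe(self, event_type: str, payload: dict) -> None:
        """Watch events unobtrusively and fire only pre-approved actions."""
        for action in self.rules.get(event_type, []):
            action(payload)


def request_flight_change(payload: dict) -> None:
    # Hypothetical placeholder: a real sidekick would call a travel service here.
    print(f"Requesting a flight change to arrive before {payload['new_start']}")


sidekick = Sidekick()
sidekick.approve("calendar.meeting_rescheduled", request_flight_change)

# A calendar change arrives; the pre-approved flight change is triggered.
sidekick.observe("calendar.meeting_rescheduled", {"new_start": "2010-03-02 09:00"})
```

The design point is the rule table: the sidekick only ever executes actions the user has explicitly pre-approved for a given kind of event.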

This type of automated avatar will spawn new industries, just as the Internet spawned the multi-billion-dollar Google. The Virtual Sidekick is but one example of the kind of thinking that should be propelling your R&D efforts. There are others. For example, it is logical to imagine a future in which web access devices have become so small and unobtrusive that they can be implanted into our bodies. In a world permeated with wireless access to the Web (the “Infosphere” I discussed earlier), imagine a scenario where you can search and access the Internet by simply thinking about it; where you can “Skype” your wife and talk to her using your own embedded phone. You won’t even need to speak to communicate: a microprocessor embedded in your brain will convert your brain waves into speech. Think of this scenario as technology-enabled telepathy! These and other interesting possibilities can be extrapolated from the intriguing technology forecasts by author Ray Kurzweil in his book “The Singularity Is Near: When Humans Transcend Biology”.

There can be no doubt that the transformative effects of such future inventions will generate heated debates about the ethics and dangers associated with their use, but that’s a subject matter for a future blog.


Friday, June 12, 2009

Creating the Technology Strategy

So far, so good: the business strategy is understood, the requirements are known with a reasonable degree of certainty, and by now you have framed the relevant scope and priorities of your company (are you in a business that requires viewing technology as a competitive weapon, or as a utility?). To boot, you have also decided how you will deal with complexity. It is finally time to develop the overall technology strategy that most closely matches the technology plan of your business.

At its simplest, the technology strategy should provide a summary, a digest of the technical “Whats” and “Whens”, that translates the key business needs into technical terms. These are really the technical-strategy answers to the business requirements, defined in terms that business folks can understand. Here are some examples:

What: “We will replace the entire system with new technology components via gradual investments.”

When: “Over a two-year period, starting with a Phase 1 deliverable that will support new customer analysis tools.”

What: “We will buy a new ERP from vendor XYZ and outsource all new development to India.”

When: “We will hold off until next year. In the meantime we will research outsourcing partners and develop an RFP.”

What: “We will continue to use the operating system ‘as is’, but we will use new technologies to develop any new business systems.”

When: “We will do so on an on-going basis as new business systems become approved.”

If you’ve ever watched a group of children playing a casual game of soccer, you may have been amused at the way they wildly chase after the ball as a pack (believe me, this happens whenever children are let loose with a soccer ball!). There is no structure to the way they play and, as a result, their efforts are largely wasted as they comically interfere with one another. The children know the “What”: to play soccer. They also know the “When”: right now. What they lack is knowledge of the “How”.

Just as the “What” and the “When” should have emerged from the definition of the agreed transformation previously discussed, the “How” is the essence, the secret sauce, of the technology strategy. The “Whats”, the “Whens”, and the comprehensive explanation of the “Hows” represent the detailed strategy that is to serve as the blueprint for the multi-year IT transformation plan. This technology strategy blueprint, depicted in the diagram below, comprises the core technical solution, including the overall technical approach, the high-level architecture, and the standards and tenets you will adopt.

The technical strategy is not meant to be a photograph, something static, but rather an evolving movie. To become a living, breathing strategy, it must be continuously cross-checked against the changing business requirements. If adjustments are needed, you should ensure that the solution continues to conform to the business requirements, and that any resulting changes to the architecture and standards occur only on an as-needed basis, not as a result of whims or sudden changes of direction driven by the dictates of fashion or politics. You will need to implement a governance process to oversee the evolution of the strategy and to manage change control, in order to avoid the dangers that so often plague large projects: scope diffusion, runaway requirements, or irrelevance of the solution.

One of the key roles of strategic governance is to ensure that the strategy is communicated and understood at all levels of the organization. The technical staff, from the most junior programmer to the most senior manager, should be intimately familiar with all the elements of the technology strategy.

While your communication to the technical staff will most likely emphasize the “How” of the strategy, the message to less technical constituencies, such as the executives, will have to be framed in an easier-to-understand ‘elevator-chat’ form dealing with the “Whys” and the “Whats” of the strategy.

In this elevator talk you can still speak about the core technical solution and the general principles regarding the transformation: Will you develop the solution internally? Will you rely entirely on the services of a third-party vendor? Will you use a hybrid of internally developed modules and modules available externally as a service (Software-as-a-Service, or SaaS)? If you are going to build the solution internally, what role will the IT department play? Will you hire new developers, or integrate third-party staff-augmentation services? Will you bring in temporary contractors for software development, or will you offshore development? If so, which components will you offshore? Will you outsource the whole damn thing? What are the timeframes and general costs? What are the benefits? When will the benefits be realized? Which vendors do you consider strategic?

Clearly, this (possibly long—it’s a tall building) elevator talk will not easily accommodate the explanation of the detailed architecture or standards, but you should at least be able to cover the general concepts, such as whether you plan to use open source software or whether the solution will be accessible via the web.

Now, why are there no longer elevator operators? Remember the days when operating an elevator was considered such a specialized role that it required a (usually bored-looking) attendant to press those buttons as a service to you?

Come to think of it, you may want to mention to the CFO that you will be using something called Service Oriented Architecture . . .
