IT Transformation with SOA: April 2009

Thursday, April 23, 2009

Virtual Presence—A Prognostication Example

Let’s play around a bit with the techniques covered in my prior blog post. My most recent experience is from the travel and hospitality industry. It’s my belief that this industry is on the verge of major tectonic shifts due to changes in travel patterns caused by rising energy costs, economic restructuring, and, yes, technology. For example, when travel costs rise and the economy tightens, people change their leisure travel habits, and business travel shrinks as more business is conducted via telephone, emerging online meeting tools, or old-style videoconferencing.

Still, anyone who has ever tried to hold a productive meeting via a 25-inch TV screen knows how hard it is to detect nuances in facial expressions or to read the body language of other attendees—aspects of human interaction that are routinely underrated. The quality of such business meetings usually turns out to be less than satisfactory. Good business communication is about sensing moods, detecting reactions, and establishing the kind of warm rapport that only seamless proximity can provide.

Could there be room here for an emerging technology in this area? I suggest we can try to answer this question by following the prognostication techniques I covered earlier:

· Reinterpret the past; don’t ignore the value of second-generation pruning

· Extrapolate what is known and imagine what would happen if a known element were to become pervasive

· Identify the various technology trends whose trajectories will make them combine in novel ways

· Define a likely frame of reference with assumptions bound by bracketed extremes

Original videoconferencing can rightly be seen as a first-generation technology, something like what the Atari was for videogames, but technology has now advanced to a degree that overcomes some of the original limitations, thanks to the use of large high-definition screens and better control of the interaction. Clearly, companies like Cisco and HP have already identified enhanced teleconferencing as an area of great opportunity for the future. However, even though both companies refer to their enhanced teleconferencing products as “Telepresence”, what if an economic and effective form of actual 3-dimensional telepresence were developed? What I have in mind is something more like the stuff shown in the latest Star Wars movies—something I will refer to as “Virtual Presence”.

Just as with that very important meeting between Ki-Adi-Mundi, Yoda, and Mace Windu[1], wouldn’t it be great to use Virtual Presence to avoid having to fly from LA to New York for just a two-hour meeting?

If Virtual Presence were to become ubiquitous, efficient, and economical, it could definitely become a transformational technology. The real question is this: How close are we to developing it?

Economically speaking, if past trends are an indication, chances are the initial engine for introduction of this type of capability will come from the so-called adult entertainment industry—especially if Virtual Presence is complemented with technologies intended to enhance sensory experience. There could be the addition of gloves with actuators to simulate the sense of touch, and . . . you get the idea. Smirk all you want, but economics wouldn’t be a problem in that area!

What would be next? Just as travel became a leading application in the early Internet days, travel avoidance will surely drive Virtual Presence technology. After all, business travel takes time, money, and energy (in a world that’s becoming more aware of energy consumption, this is no small issue). Also, let’s be frank: business trips are not often as fun or as effective as we would all like them to be.

I once participated as a panelist in a conference, answering questions related to the aftermath of 9/11. One of the questions was how I would recommend companies reduce travel. Being a representative of the hospitality industry, my answer was plain and sincere: “Please don’t cut your travel, just stay in one of my company’s brands!”

Clearly, business travel avoidance is not good if you happen to be in the travel and hospitality business. Now, I know some of you may argue that travel avoidance was precisely the goal original videoconferencing was meant to achieve, but I’m not talking here about dated NTSC TV screen resolutions. Virtual Presence is all about truly replicating the experience of being in the same room with someone who may be several thousand miles away. Compared to that, existing videoconferencing systems are just primitive Tinkertoys.

To create Virtual Presence, we will need a 3D scanner—perhaps a laser-based system that rapidly traces each participant, digitizing the contours of their bodies and faces in real time.

Next, we are going to need an extremely fast network to transfer the digitized information generated by the scanner: a very fast pipe to move what are certain to be terabytes of scanned data, even after compression. And to compress this information rapidly, we’re going to need a very fast computer. After all, we still won’t be able to exceed the speed of light, and there is only so much information that can be transferred on a sub-second basis.
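As a back-of-the-envelope sketch of this reasoning, here is one way to size the pipe and the physical latency floor. Every figure below (points per scan, frame rate, compression ratio) is an illustrative assumption of mine, not a measurement:

```python
# Back-of-envelope estimate of the network pipe Virtual Presence might need.
# All input figures are illustrative assumptions, not measurements.

POINTS_PER_SCAN = 2_000_000      # assumed 3D points per full-body scan
BYTES_PER_POINT = 12             # assumed x, y, z stored as three 4-byte floats
SCANS_PER_SECOND = 30            # assumed frame rate for life-like motion
COMPRESSION_RATIO = 20           # assumed lossy compression factor

raw_bits_per_second = POINTS_PER_SCAN * BYTES_PER_POINT * SCANS_PER_SECOND * 8
compressed_bps = raw_bits_per_second / COMPRESSION_RATIO

print(f"Raw stream:        {raw_bits_per_second / 1e9:.2f} Gbit/s")
print(f"After compression: {compressed_bps / 1e9:.2f} Gbit/s")

# The speed-of-light point: signals in fiber travel at roughly 2/3 of c,
# so physics alone imposes a minimum one-way delay between distant sites.
DISTANCE_KM = 4_000              # roughly LA to New York
SPEED_IN_FIBER_KM_S = 200_000    # ~2/3 the speed of light in vacuum
print(f"One-way latency floor: {DISTANCE_KM / SPEED_IN_FIBER_KM_S * 1000:.0f} ms")
```

Even with these generous assumptions, a single participant’s stream lands in multi-gigabit territory before compression, which is exactly why the bandwidth records discussed below matter.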

Fast computers on the receiving end will be needed to decode the scanned images and 3D technology will then be needed to project the resulting image, using hologram techniques, somewhere in the virtual conference room.

Do we really have the technologies to make this happen? Well, not quite yet, but we could be very close…

Huge bandwidth: Take the shift from analogue to digital, which spearheaded novel uses for telecommunication networks. 1997 was the first year ever in which telephone wires carried more digital data than voice conversations. Yet it took only 7 more years before voice made up just 3% of all the data transmitted across telephone networks.

In 1999, Bell Labs was able to transmit 1.65 gigabits of information across a single fiber-optic line in one second—the equivalent of transmitting the entire contents of the Library of Congress in about six seconds. Only five years later, at the Spring 2004 Internet2 Conference, a record was announced for transmitting data over nearly 11,000 kilometers at an average speed of 6.25 gigabits per second; in 2007 the record rose to 9.08 gigabits per second.

Moore’s Law to the max: hardware costs continue to be driven down

3D holographic display technology is currently being tested in labs

Basic 3D scanners are now on the market and their prices are dropping



Virtual Presence could be just around the corner. In my prognosticating opinion, we should expect real, albeit expensive, applications no later than 2015 (remember when 50” plasma TVs went for $100K?). So the main questions are these: When will economics allow for its broad deployment? And better: What is the business impact of such a technological development likely to be?

Next week, I’ll go over some thoughts on how to envision the business impact of technology.

Till then!



[1] Okay, as Star Wars fans we forgive the movies’ scientific inaccuracies, such as sound being generated in the vacuum of space, just as we shall ignore the way the teleconference is riddled with static more closely associated with analogue transmissions than with digital ones!



Friday, April 17, 2009

Prognosticating the Future

So far, I have covered these themes in my previous blog posts:

· What is IT Transformation

· Why is the transformation needed

· The drivers that are used to justify transformation

Ultimately, though, IT Transformation should be aligned with our vision of the future. After all, if we lack this vision, then when it comes to defining the direction of the transformation we would be no better than the proverbial drunken man searching for his lost keys under a lamppost because “that’s where it is illuminated”. Now, before you say, “Hogwash! No one can predict the future,” I’m not suggesting you engage in an exercise of mindless divination, but rather in one of informed assessment. There’s a method to the madness. After all, there are different degrees of confidence in forecasting, predicting, prognosticating, and prophesying—listed here in decreasing order of rigor.

Financial experts forecast into the future by extrapolating known variables, and domain experts can predict the future on the basis of their specialized knowledge—a doctor can predict the progression of a disease, given a certain treatment. But although predicting is great, far more intriguing is the field of prognostication.

The palm reader in the hippie district performs divination, and the mystic interprets dreams as prophecy, but I’m not suggesting you plan your business’s future based on these techniques (at least not yet!). Instead, most of us poor mortals can prognosticate the future given the right methodology. To prognosticate is “to foretell from signs or symptoms”. It is prediction with a bit less certainty, but it is nevertheless based upon informed opinion. There’s a reason why experts like author Michio Kaku are recognized for their ability to provide sensible prognostications.

No magic is needed to prognosticate. Prognostication’s first technique involves the identification of the various technology trends whose trajectories will make them combine in novel ways. In hindsight (and isn’t everything easier in hindsight!), most major technology events are the result of a novel synergistic convergence or an evolutionary technology development. Revolutions that come about as a result of unexpected scientific discoveries, such as the mastery of fire, the invention of the wheel, or the invention of writing, are a rarity. The chances of serendipitous transformation in a world blanketed by thousands of scientists who are engaged in all forms of structured research are now low, so most of tomorrow’s developments are likely to be the result of inventions that are already well understood today.

Think about it. In hindsight, the “invention” of the Internet should have been easily prognosticated.[1] After all, the appearance of the Internet is almost a foregone conclusion once you combine the concept of a computer on every desk, the availability of faster communication bandwidths, a move to global networking and presentation standards, and lowered technology costs. 

A second prognostication technique is to extrapolate what is known and to imagine what would happen if a known element were to become pervasive. You do this, regardless of how outlandish the extrapolation might appear at first. This is where a Western Union committee, chartered with evaluating the telephone invention from a certain Alexander Graham Bell, got it wrong when they concluded: “Bell’s proposal to place his instrument in almost every home and business is fantastic. The central exchange alone would represent a huge outlay in real estate and buildings, to say nothing of the electrical equipment. In conclusion, the committee feels that it must advise against any investment in Bell’s scheme. We do not doubt that it will find uses in special circumstances, but any development of the kind and scale which Bell so fondly imagines is utterly out of the question.”

Extrapolate this technique to computers. Recall the now famously wrong prediction of IBM’s CEO, Mr. Watson, back in 1943, that the world market for computers would amount to about five machines? Computers are now everywhere. Do you know how many computers can be found in your car? Indeed, Bill Gates’s genius, encapsulated by his original motto, “A computer on every desk”, was his ability to rightfully prognosticate the PC revolution no matter how outlandish his vision might have seemed at the time. Contrast Gates’s view with Ken Olsen’s (DEC’s CEO) 1977 statement, “There is no reason for any individual to have a computer in his home,” and look at what happened to DEC. Bill Gates applied the principle of pervasiveness; Mr. Olsen did not, and as a result of this failure to respond adequately to the advent of PCs and open networking, DEC faltered until it was ultimately acquired by Compaq, which was then acquired by HP. Last I checked, Microsoft is still around and going strong despite the emergence of new challengers, so good prognostication is undoubtedly essential to the future of a company.

A third technique of prognostication is this: reinterpret the past, and don’t ignore the value of second-generation pruning. Let me explain. Let’s say an invention comes along and it becomes popular. The smart investor knows that something even better will be showing up before long. Remember the Atari videogame console? Many at the time thought the videogame market was saturated and had nowhere to go. Even the Atari people thought so, and basically gave up on a business which today exceeds the movie and music businesses combined!

I plead guilty to having believed, back in the late nineties, that the market for Internet search engines was saturated and done for. After all, wasn’t Yahoo the perfect indexing tool? And weren’t there very capable search engines out there, such as AltaVista, Excite, and others? Good thing our friends at Google didn’t agree with that limiting point of view! So, look around for first-generation ideas with the potential to really stir things up if a new and improved version can be developed.

A fourth prognostication technique is to define a likely frame of reference with assumptions bound by bracketed extremes. For example, when trying to prognosticate the future of Earth, one can assume that it will fall somewhere in between the following possible extremes:

The future that can ruin your retirement plans:

· World economy continues into a protracted recession that collapses civilization into “Mad Max” territory

· Nuclear terrorism tilts civilization towards a new dark age

· Yellowstone eruption wipes out Earth

· Plagues: Ebola virus mutates to produce air-borne contagion, Avian Flu

· Asteroid crashes on Earth

· Global warming continues unabated; polar caps melt, flooding most coastal areas

· Energy crisis: no oil left

The future that would make the Jetsons jealous:

· World peace via global social awareness

· Environment is managed

· World hunger is conquered thanks to artificially produced food

· Water is made available to all people

· Hypersonic flying makes global travel even more practical

· Humanity takes to space: space tourism becomes common

· Genetic therapies cure diseases

· Genetic regeneration: grow body parts

Chances are that the future will turn out to be a Goldilocks choice between these two extremes: not all good, but hopefully not all bad. As technology leaders, it is our job to assess changes in terms of their probability and of the impact they could have on our business. Your IT strategy should match this assessment. Look again at the good-news and bad-news prognostications listed above. It doesn’t take a genius to know that climate change and energy supply issues will surely be at the forefront over the next several decades. Which items on the good-news list could be driven by those on the bad-news list? Look closely and you will see that every major bad-news risk could drive advances on the good-news list. There is something to the cliché that says, ‘for every problem there is an opportunity’:

· Nuclear risks could lead to additional focus on defense and safety technology spending

· Plagues could bring greater focus on genetic research

· Climate change can motivate increased research and development of low-carbon-emission energy sources

Do observations determine that an asteroid will be heading our way in the near future? You can be sure that investments in space technologies would become the top priority. Then again, that might be one challenge we wouldn’t ever want to have to deal with.

That’s it. If a crystal ball is needed, it is just for effect and nothing more. In my next blog post I’ll go over an example of how to apply these techniques to prognosticate a specific future opportunity.



[1] To be fair, legendary MIT professor Michael L. Dertouzos is on record as having predicted the WWW as early as the late seventies.


Friday, April 10, 2009

The Drivers for Transformation

Let’s face it, everything turns into a legacy pretty much the moment it goes into production. Systems deteriorate or eventually become obsolete due to their inability to meet new objectives. The world changes, new problems emerge, new discoveries are made, disaster strikes, people are mortal, and new generations bearing new ideas and fresh visions ultimately take over.

But this still leaves these questions open: what precisely is it that should be changed? And what drives what?

I’m thinking of the age-old debate about whether it is technology that ultimately drives the needs and wants of the business, or the other way around. My own view is that technology’s impact on business and business’s impact on technology are dialectic in nature. Oftentimes it is “business” that drives technology (the 1960s NASA Moon-shot comes to mind), but sometimes it is technology that drives business by opening previously untapped revenue-generation areas. Take the videogame industry: a direct result of the invention of the microprocessor, which was itself spearheaded to meet the requirements of an electronic calculator manufacturer.

The initial driver sometimes kicks up other drivers in a feedback-loop manner. For example, new satellite communication technology encouraged globalization. International broadcasts of sporting events became common, so that people in Brazil could watch the same soccer match as people in England. On the other hand, globalization in communications also drove the need for significantly higher bandwidth and shorter propagation delays than satellites could offer. This encouraged the development of higher-bandwidth fiber optics, which led to new uses of that bandwidth, and, well, you get the picture.

Business and technology are but two of the key transformation drivers. I suggest that competition and the internal dynamics of your company are another two. To me, the list of drivers for change can then be narrowed to these four:

Business Drivers. Ultimately, if you are going to justify a new project to remedy problems with an aging infrastructure, or need to introduce a new reporting system, or, frankly, anything that requires a purchase order, you are going to have to justify the results in business terms. Rarely will the business directly request that you change the technology. Do not expect your CEO to come up to you and say, “Hey Joe, here’s a budget for us to move to SOA”. The truth of the matter is that you will be asked to deliver a specific list of business functionalities and capabilities, and you will then be expected to do so with a much-reduced budget and tightened timeframes (“need this by yesterday”). Your challenge, should you decide to accept it, is to explain why satisfying these requirements merits the necessary funding and patience to launch an IT transformation effort.

Technology Drivers. Often, it’s not a business requirement that drives change, but rather the pure fact that new technologies emerge with breakthrough functionalities, lower costs, or novel usages, and your refusal to adopt them would rightly place you and your company on the nomination list of the International Luddites Association (forget about Googling them; they don’t believe in having a Web site). Still, when the driver is technology alone (not the competition and not the business), it is usually wisest to introduce the change in line with normal replacement lifecycles. Take, for instance, the recent popularity of flat-screen monitors. At first, executives drove their adoption (status symbols are a business driver, right?); then the artist types demanded them, with the legitimate request to live in a flicker-free world; soon after, the senior technical staff acquired them (how come those folks in graphic design have those monitors and we don’t?); until they eventually became cheap and prevalent enough to be the natural replacement technology for everyone with an outdated CRT monitor.

Ignore the forces presented by technology drivers at your own peril. Failure to change with the advent of transformational technologies is perhaps one of the biggest reasons that previously secure companies have tumbled precipitously. The list of examples can be quite long, but suffice it to note that Polaroid’s misreading of digital photography and Wang’s failure to understand the PC’s ability to serve as a word processor show how serious the need to understand the impact of new technologies on our businesses really is.

Your Competition. The world would be a much better place without those damn competitors. Alas, they do exist, and they sometimes introduce products and services that challenge your business. Back in the early 60s, American Airlines revolutionized the way travel was booked by allowing travel agencies direct access to its central reservations system (SABRE). Competing airlines had no choice but to react, and to react quickly, lest they face the prospect of a quick and painful death. After all, travel agencies using the SABRE system were more likely to book flights with American Airlines than with any competitor. Being the earliest to use automated travel reservations gave AA a tremendous boost in market share and a reduction in distribution costs.

Your Organization. That is: you and your own organization. You can be the competitor who is first to introduce a new feature or capability. Someone inside your R&D group may come up with that one game-changing idea that truly deserves support. A new business process may be identified that cries out to be used within your system to effectively reduce costs now and into the future. Perhaps your company has acquired another concern, or there’s been a merger; or simply, process and control inefficiencies are preventing the system from scaling up.

Planning for your IT transformation involves assessing the future for each of these drivers. Next week, I'll talk about some techniques useful in prognosticating what's yet to be. . .



Friday, April 3, 2009

Assessing the current state of your IT system

A key step of the initial IT transformation process is to understand the true state of your IT system today.

In medieval times, parchment was so scarce that monks transcribing books were forced to scrape off previous text and reuse the page. A “palimpsest” is a manuscript page that has been scraped off and then reused. Chances are that your company’s IT heritage is a palimpsest of a series of Rube Goldberg tactical solutions deployed throughout the years.

You can deconstruct your current IT system just as a seasoned archeologist can figure out how different civilizations emerged by analyzing the ground strata and digging up past artifacts. The difference is that while the archeologist’s analysis will reveal the clues of long-extinct tools and civilizations, your analysis will reveal that nothing has really been laid to rest—most legacy IT data centers are like the town of Macondo in ‘One Hundred Years of Solitude’. In that town everything remains forever; the spirits of the past never go away.

There is a traditional pattern to a legacy information system. This pattern will most likely include some form of mainframe or mini-computer complex still faithfully executing relic languages like COBOL, or some of those fourth-generation languages that were so popular in the last decade—software gingerly maintained by a dwindling group of developers still able to remember these computer languages[1].

Located next to that central complex, you are likely to encounter a cluster of mini-computers, so popular in the seventies, executing an obscure network-protocol translation or running programs that serve no clear purpose but that everyone is fearful to remove, lest the entire system collapse. Yes, as you dig deep into your eight-plus-year-old system documentation (last partially updated thanks to your company’s internship program), you will find a bunch of PCs still emulating dumb terminals and, in topsy-turvy fashion, a few dumb terminals emulating PCs.

You will also find out that, unbeknownst to most, the most critical business intelligence reports come not from that data-mining OLAP system deployed by expensive consultants, but from a bunch of spreadsheets and MS Access databases on a PC, scripted by a programmer-wannabe in the accounting department.

Compounding this, chances are that your IT environment will have a legacy of ossified technologies that never made it to prime time and are no longer supported. These are the kind of technologies that author Ray Kurzweil labels “False Pretenders”, such as transatlantic blimps, quadraphonic sound systems, IVRs, and the original Apple Newton PDA[2].

All this would be amusing were it not the case that this very IT system is being tasked with supporting the new online world, where it is expected to handle millions of real-time transactions and very large information volumes. According to U.C. Berkeley, the entire world’s print and electronic media production amounts to about 1.5 exabytes[3] worth of data per year (i.e. 500 billion U.S. photocopies, 610 billion e-mails, 7.5 quadrillion minutes of phone conversations). As a comparison, every word spoken by all humans throughout the history of the world could be stored in around 5 exabytes. Consider now that these statistics were compiled prior to the social-networking explosion and the massive data storage growth represented by Web 2.0 features. Today, companies are expected to exploit knowledge of every nuance, preference, detail, and characteristic of a customer, would-be customer, partner, or business event. Ultimately, the IT revolution is all about how to best master that thing called “Information.”
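To put those cited figures in perspective, here is a quick unit-conversion sketch. It uses the U.C. Berkeley numbers quoted above plus a rough 2009 world-population figure of my own as an assumption:

```python
# Sketch: putting the cited data-volume figures into perspective.
# Decimal units throughout: 1 EB = 1000 PB = 10**18 bytes.
EXABYTE = 10**18

yearly_production = 1.5 * EXABYTE   # cited: world media output per year
all_human_speech = 5 * EXABYTE      # cited: every word ever spoken by humans

world_population = 6.8e9            # rough 2009 figure (my assumption)
per_person_per_year = yearly_production / world_population
print(f"Media produced per person per year: {per_person_per_year / 1e6:.0f} MB")

# How many years of today's media output would equal all human speech?
years_to_match_speech = all_human_speech / yearly_production
print(f"Years of output to match all human speech: {years_to_match_speech:.1f}")
```

In other words, by these numbers the world generates a few hundred megabytes of media per person per year, and barely three and a half years of that output already rivals everything humanity has ever said aloud.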

Utilizing and accessing this monstrous volume of information with legacy systems is simply a non-starter proposition. This is due not to constraints in mainframe technology, or even in the storage capabilities inherited from distributed file-server solutions, but to the haphazard way legacy architectures were constructed, which led to an unstructured and heterogeneous mix of systems and databases.

Now that real industry standards have finally taken hold and technology costs have dropped, it actually makes sense to apply computer resources with improved usability and more enduring flexibility to support future changes. It’s time to begin the arduous process of re-architecting the new systems to use SOA. Unlike in earlier “distributed-processing” epochs, be assured that SOA is not a false-pretender technology, but the real thing.
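As a minimal sketch of what re-architecting toward SOA means in practice: rather than letting every application reach into the legacy system directly, you wrap it behind a published service contract. The service name, field names, and legacy routine below are hypothetical examples of mine, not anyone’s actual API:

```python
# Minimal SOA-style sketch: wrap a legacy lookup behind a service contract,
# so consumers depend on the interface rather than the legacy implementation.
# The service, fields, and legacy routine are hypothetical examples.
import json

def legacy_inventory_lookup(sku):
    # Stand-in for a call into a legacy system (e.g., a COBOL program
    # reached through a message queue or screen-scraper).
    return {"SKU": sku, "QTY_ON_HAND": 42, "WHSE": "07"}

def inventory_service(request_json):
    """Service contract: accepts and returns JSON, hiding legacy formats."""
    request = json.loads(request_json)
    record = legacy_inventory_lookup(request["sku"])
    # Translate legacy field names into the published service schema.
    response = {
        "sku": record["SKU"],
        "quantityOnHand": record["QTY_ON_HAND"],
        "warehouse": record["WHSE"],
    }
    return json.dumps(response)

print(inventory_service('{"sku": "AB-100"}'))
```

The point is the contract: when the mainframe behind `legacy_inventory_lookup` is finally retired, consumers of `inventory_service` never notice, which is precisely the flexibility a haphazard legacy architecture lacks.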

The need for the IT system to support the incredible emergence of new business can be a starting point in the justification for transformation. But keep in mind that technology alone cannot be the main reason for this investment. Ultimately, the justification for the IT Transformation should come from the need to support the business and indirectly from other drivers. In my next blog I will cover what those drivers might be. . .

[1] Actually, remembering a language is easy; what’s hard is keeping current the tools used to develop and compile programs in that particular language.
[2] The Singularity is Near—Ray Kurzweil.
Let’s call a spade a spade: Interactive Voice Recognition (IVR) is today’s king of false pretenders. I will agree that IVR works only when it manages to understand my accent!
[3] An exabyte is equivalent to 1000 petabytes. A petabyte is equivalent to 1000 terabytes. A terabyte is equivalent to 1000 gigabytes, which is about what you could get with two external disk drives for less than $200 in 2008.

