Wednesday, December 19, 2007

What's Next? Take One

The end of every year brings 1) a retrospective of accomplishments and failures in the current year; and 2) a look ahead to expected developments in the coming year and the longer-term future. Herewith is my review of the past year and prognostication for the coming year.

The movement against global warming reached critical mass:

Going back to the early 1970s, issues of ecology have been discussed and debated, and have slowly evolved. Anyone remember Silent Spring by Rachel Carson (1962)? How about The Population Bomb by Paul Ehrlich (1968)? These books had a profound impact on me, but concern about the environment has been slow to develop. Indeed, the current administration discounts much of the concern even today. However, over the last year, it would appear that a critical mass has been reached where individuals, institutions, and governments now understand the dire consequences and have taken steps to remediate the damage already done. Whether it will be too little, too late remains to be seen. However, from Al Gore's Nobel Peace Prize to the accords signed in Bali, action appears to be gaining strength.

Central to the environmental discussion is carbon emissions. Combustion, whether of coal in a power plant, gasoline in an automobile engine, or propane in our barbecue grills, releases carbon that combines with oxygen to form carbon dioxide (CO2). CO2 is great for dry ice, but bad for global warming: increased CO2 in the atmosphere creates a barrier much like the glass in a greenhouse, and the result is the same as in a greenhouse, increased warming. Therefore, global warming is all about CO2.
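To put rough numbers on the chemistry, here is a quick back-of-the-envelope sketch in Python. The molar masses are standard chemistry; the gasoline figures (roughly 2.8 kg per U.S. gallon, roughly 87 percent carbon by mass) are approximations assumed purely for illustration.

    # Rough sketch: mass of CO2 produced when a given mass of carbon is burned.
    C_MOLAR = 12.011     # g/mol, carbon
    CO2_MOLAR = 44.009   # g/mol, carbon dioxide

    def co2_from_carbon(kg_carbon):
        """Each kg of carbon fully oxidized yields about 3.67 kg of CO2."""
        return kg_carbon * (CO2_MOLAR / C_MOLAR)

    gallon_of_gasoline_kg = 2.8   # approximate mass of one U.S. gallon of gasoline
    carbon_fraction = 0.87        # approximate carbon content by mass

    print(round(co2_from_carbon(gallon_of_gasoline_kg * carbon_fraction), 1), "kg CO2 per gallon")
    # prints roughly 8.9 kg (around 19-20 pounds) of CO2 for every gallon burned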

Today, the passions around reducing carbon emissions take many forms. First is a movement to make things no worse than they are today. These are the people pushing "carbon credits." Since countries are in the process of committing to limits on carbon emissions, their approach is to continue doing what they have always done, but buy the "excess" carbon allowance of a country that emits carbon at a rate lower than it is allowed by treaty. These are usually underdeveloped nations with high levels of agriculture and large standing forests. The deal is that developed, carbon-spewing countries pay money to underdeveloped countries in exchange for their excess carbon credits. The approach is doomed over the long term as underdeveloped countries become developed and need their carbon credits for themselves. The rich countries keep spewing carbon; they just feel better about it because they can point to the offsets they have bought at the expense of some underdeveloped country. Unfortunately, this seems to be the U.S.'s approach.

A second approach is to recapture the CO2--remove it from the atmosphere. On paper, this sounds good. The problems are 1) how do you recapture it; and 2) what do you do with it? Recapturing CO2 takes energy, and most likely generating that energy emits more carbon. Recaptured CO2 is also like an ever-growing block of dry ice: it takes up room. Therefore, one solution is to bury it. Evidently, there is plenty of room for excess CO2 underneath the numerous trash landfills, but above the spent nuclear fuel that is also being buried. Does anyone else see a problem with this?

A third approach is being investigated at Sandia National Laboratories. This one is to break down CO2 into carbon monoxide (CO) and oxygen. In turn, the CO could be reprocessed into burnable fuels, which would emit the CO2 back into the atmosphere. Nowhere in the discussion is it mentioned how much energy will be needed to break down the CO2, but something tells me that generating it will emit more CO2 than gets reprocessed.

The fourth approach is evidently the least attractive--decrease CO2 emissions. This one does not require more energy. This one does not require reprocessing CO2. This one does not take the lazy path of buying off poor countries in exchange for their carbon credits. All that is required is simply not emitting CO2 in the first place. So committed are we to petroleum, internal combustion engines, coal-fired power plants, and industrial-age manufacturing processes that the U.S. evidently sees this as the least attractive option. Who knows, if we play our cards right, all our manufacturing will move overseas and we will once again become an agrarian economy. The only problem is no one will need our excess carbon credits.

Monday, November 26, 2007

Peak Oil versus Peak Demand

I was listening to the Diane Rehm Show (NPR) this morning, and there was a discussion about whether the energy industry has reached "peak oil"--the point at which oil production peaks, levels off, and then declines. While I did not have time to listen to the whole discussion, the debate seemed to revolve around when oil production would peak; the years mentioned ranged from 2012 to 2030. However, I would propose that the argument is misguided. Debating when peak oil production will occur is a little like sitting in a leaky boat and debating when the volume of water coming into the boat will peak. The question becomes irrelevant once you notice that people are climbing aboard faster than water is being bailed out: at some point the boat is going to sink, regardless of whether water intake has peaked. Indeed, even if the leak is fixed, as long as people keep coming aboard, the boat is going to sink.

In the same way, whether the oil industry has reached peak oil production is irrelevant: The "peak" point that is important to understand is when demand exceeds production. Once that point is reached (for world oil production, it has already been reached), what is important to understand is the "oil gap."

The oil gap is the difference between oil production and oil demand. As the gap increases, oil prices increase (price is determined by supply and demand). At some point, the price reaches a point where substitute energy sources become price competitive. This is the environment we are in today.

Therefore, the important question is not when oil production peaks, nor when oil demand exceeds oil production. Rather, it is when demand drives oil prices to the "tipping point" where alternative energy sources become cost competitive. At that point, those organizations developing alternative energy sources, particularly renewable or sustainable ones, become competitively advantaged in the market, and those that have been investing in alternative energy will be the hot stocks for the near-term future. The next Googles are out there ready to take off.
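To make the argument concrete, here is a minimal sketch in Python of the oil-gap idea. Every number in it (production, demand, growth rate, price sensitivity, break-even price) is a made-up placeholder, not a forecast; the only point is that the date worth debating is when the price crosses the alternatives' break-even, not when production peaks.

    production = 85.0        # million barrels/day, assumed roughly flat (illustrative)
    demand = 86.0            # million barrels/day today (illustrative)
    demand_growth = 0.015    # 1.5% per year (illustrative)
    base_price = 90.0        # $/barrel when supply and demand balance (illustrative)
    price_per_gap = 10.0     # $/barrel of price pressure per million-barrel gap (illustrative)
    alt_break_even = 120.0   # $/barrel at which alternatives become competitive (illustrative)

    for year in range(2008, 2031):
        gap = max(demand - production, 0.0)          # the "oil gap"
        price = base_price + price_per_gap * gap     # price rises with the gap
        if price >= alt_break_even:
            print("Tipping point reached around", year, "at roughly $%.0f/barrel" % price)
            break
        demand *= 1 + demand_growth                  # demand keeps growing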

Friday, November 23, 2007

Closer to the Edge

Edge computing is all about moving computing capability out to a variety of devices, enabling users to perform activities when and where they need to. Most often, edge computing is interpreted to mean mobile devices, but it could just as easily be your refrigerator, your television, your entertainment system, your mobile phone, or the various systems in your automobile.

Much of edge computing is already in place; it is just not obvious. For example, the TiVo used to track, schedule, record, and play back television programming is nothing more than a media center computer that has been programmed to record broadcast programming. In this context, the AppleTV is nothing more than an alternative programming computer tied to the television. The Slingbox is another programming device that captures video and streams that video to a variety of devices--from your TV, to your computer, to your mobile phone. In these examples, the device is a server providing services to edge devices.

Another example is the OnStar system in your automobile. Through this system, your car's performance can be tracked, your doors unlocked, or automatic alerts sent to a centralized monitoring center. X10 and similar technology provides monitoring of your house, enabling you to remotely turn lights on and off, set alarms, and control edge devices such as a television or appliance. In these examples, the device is a monitoring and control unit providing manipulation and control over remote devices.

All of this edge technology is available now. Add to this the ability to view television programming on mobile phones (VCast, SprintTV, MobiTV, YouTube, and the increasing number of streaming videos available on Web sites); the ability to listen to audio programming through tools such as iRadio and streaming sites on broadcast Web sites; and the increasing number of sites that enable word processing, spreadsheet manipulation, and presentation development (Google Apps is a primary example). All of these are providing expanded access to computing by any device that can access those sites and capabilities.

As a result, computing is no longer limited to the desktop or the laptop computer. Today, browsing the Web can be accomplished on your phone and your game console. Word processing can be accomplished on any device that accesses the Web. Is the computer dead? Not by a long shot. However, its central importance in information management is being reduced as processing is delivered through the Web and access and control devices become more incorporated into edge devices. We are moving closer to the edge.

Saturday, October 13, 2007

Top Tech Strategies for 2008

The title is from an article in ComputerWorld by Paul Thibodeau recapping Gartner, Inc.'s key strategies that will be driving IT over the next year. (Refer to the PCWorld Web site, http://www.pcworld.com/businesscenter/article/138327/top_tech_strategies_for_2008.html.) Although not that far in the future, the strategies are worth noting here:

Green IT: The issue of energy efficiency and improved energy utilization will become a driving strategic issue for IT. As Thibodeau notes, if IT does not address it, governmental regulators will. Aside from these threats, increasing energy costs and access to adequate energy sources will increasingly become an organizational issue, not just an IT one. Organizations that address this issue early will gain a significant competitive advantage because energy costs, while variable today, are trending toward ever faster increases.

Unified Communications: Simply put, running everything down the same pipe. Once information is converted to digital, a bit does not know whether it is a voice bit, video bit, image bit, or data bit. As a result, communication convergence enables capabilities such as email being read through an audio device, phone messages being retrieved through an email system, and similar simplifications of communication interchanges. The result is better communication, improved information transfer, and improved interaction among organizational entities and customers.

Business Process Management (BPM): While Thibodeau describes this as not being technology, I would disagree. BPM is a technology issue. It represents the development of common communication standards, common technology interfaces, and common transaction types that enable multiple departments within an organization and multiple organizations to efficiently and effectively trade information. In turn, this ability will reduce cycle time between transaction initiation and transaction fulfillment. The result will be increased responsiveness, improved accuracy, reduced rework, reduced cost, and increased competitiveness. It also offers the opportunity for an organization to more closely couple with their customers, thus effectively increasing customer loyalty.

Metadata Management: The vast majority of information an organization needs to operate effectively and position for the future already exists in the organization's various IT repositories. Unfortunately, the ability to find, retrieve, relate, join, consolidate, analyze, and use that information is problematic at best. Increasingly, competitive advantage is being obtained by organizations that can effectively use this data. Doing so requires extensive understanding of what data are available, creating common definitions of data types and data usage, and providing integrated retrieval capabilities. The good news is that most companies are behind in this area. The bad news is that those that first achieve this ability will enjoy significant competitive advantage in terms of reduced time to market, increased organizational flexibility, and the ability to effectively integrate customers directly into the infrastructure via the Business Process Management technologies discussed earlier.

Virtualization: Utilization is the key here. Whenever you have a dedicated anything that is not being used at optimal capacity, there is waste. The solution is to work with "virtual" devices--servers, communication circuits, and display devices. By virtualizing the device, the ability to scale it is increased so that utilization remains high. The result is that one large device can serve multiple uses and can adjust to meet peak demands from various users without the need to augment overall capacity. This provides cost savings (you use what you buy), responsiveness (you have what you need when you need it), and increased manageability (working with logical devices is simpler than working with physical devices). As an aside not addressed by Gartner, the ability to virtualize certain devices provides improved customer satisfaction. For example, the iPhone is a virtualized device (it is in fact a network-connected computer) that provides phone access, Internet access, video access, and information access--morphing as the need arises.
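As a quick illustration of the utilization argument, here is a small Python sketch. The server counts and utilization figures are assumptions picked purely for illustration.

    import math

    dedicated_servers = 40     # hypothetical count of dedicated, single-purpose boxes
    avg_utilization = 0.12     # each runs at about 12% of capacity on average (assumed)
    target_utilization = 0.65  # what a consolidated, virtualized host can sustain (assumed)

    # Total useful work being done, measured in "whole-server equivalents"
    total_work = dedicated_servers * avg_utilization
    hosts_needed = math.ceil(total_work / target_utilization)

    print("Useful work:", total_work, "server-equivalents")
    print("Virtualized hosts needed:", hosts_needed, "instead of", dedicated_servers)
    # With these assumptions, roughly 8 shared hosts do the work of 40 dedicated servers.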

Mashups: Mashups are the embodiment of what John Naisbitt called "mass customization." They provide the ability for the end-user to combine a variety of Internet information into new information views that meet a particular need. The objective is that the mashup is easy to accomplish and that the various data sources can be "mashed." In turn, the ease with which a particular organization provides its data in a format that can readily be used in a mashup determines how many customers will actually access and use that information. Once customers have a mashup they like, they will be reluctant to change. The result is that if you are not first in, you may be excluded for a long time.
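Mechanically, a mashup is just a join of independent data sources on a shared key. The toy Python sketch below shows the idea; the two feeds and their records are invented for illustration (a real mashup would pull them from separate Web services).

    store_feed = {                 # source 1: a hypothetical retailer feed, keyed by ZIP code
        "12345": {"store": "Main St.", "in_stock": True},
        "67890": {"store": "Elm Ave.", "in_stock": False},
    }
    weather_feed = {               # source 2: a hypothetical weather feed, keyed by ZIP code
        "12345": "light rain",
        "67890": "clear",
    }

    def mash(zip_code):
        """Combine the two feeds into one consolidated view for a ZIP code."""
        view = dict(store_feed.get(zip_code, {}))
        view["weather"] = weather_feed.get(zip_code, "unknown")
        return view

    print(mash("12345"))
    # {'store': 'Main St.', 'in_stock': True, 'weather': 'light rain'}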

The Web Platform: Closely aligned with virtualization, this strategic development defines the move from local, dedicated applications to virtualized, Web-based applications that are effectively network-centric. As mentioned in the article, this strategy describes the shift from owned hardware and software devices to the Internet "cloud" computing, storage, and networking environment. Probably the most interesting thing about this development is that it is not new: it is a trend that started in the late 1990s and has gained momentum ever since. It will continue to do so for the foreseeable future.

Computing Fabric: Gartner treats this development as something new that is in the early stages of development. In actuality, it has been in development and rolling out for more than two decades. Back in the mid-1980s, AT&T (the original one) developed a concept called the "Closely-coupled Computing Ensemble" (C3E). It basically consisted of a high-speed bus with various computing resources attached--compute, storage, math processors, graphics processors, and input/output devices. Sitting on top of this ensemble was an operating system supervisor that directed work to available resources. The result was maximum utilization, reduced bottlenecks, and improved throughput. Storage-area networks, network-attached storage, specialized servers (such as print servers), grid computing, on-demand computing, Web-based applications, and cloud computing are all examples of the deployment of a computing fabric. The point is that this is not a future development, but rather a rapidly accelerating trend.
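To illustrate the supervisor idea, here is a tiny Python sketch that dispatches queued work across the units attached to the "bus." The resource counts and task names are invented, and simple round-robin dispatch stands in for whatever scheduling the real supervisor used; the point is only that work goes to whichever unit of the right type is available.

    from collections import defaultdict

    resources = {"compute": 2, "storage": 1, "graphics": 1}   # units attached to the bus (assumed)
    tasks = [("compute", "payroll run"), ("graphics", "render chart"),
             ("compute", "simulation"), ("storage", "archive logs"),
             ("compute", "nightly report")]

    assignments = defaultdict(list)
    next_unit = defaultdict(int)

    for kind, name in tasks:
        unit = next_unit[kind] % resources[kind]   # round-robin over units of that type
        assignments["%s-%d" % (kind, unit)].append(name)
        next_unit[kind] += 1

    for unit, work in sorted(assignments.items()):
        print(unit, "->", work)
    # e.g. compute-0 -> ['payroll run', 'nightly report'], compute-1 -> ['simulation'], ...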

Real-World Web: Similar to my thoughts on "Computing Fabric," the real-world Web is all about how computing, access tools, and the network are increasingly being used. As with Mashups, end-users are finding increasing value in Internet services, the overwhelming majority of which are provided through the Web. Retail sales have long been established, dating back to text-based services such as Prodigy in the 1980s. Today, that simple service has developed into highly sophisticated retail/distribution/warehousing/fulfillment/logistics ecosystems such as Amazon.com. What I believe Gartner is getting at here is that these ecosystems will continue to multiply and, through Mashups, evolve into ever more sophisticated combinations that provide more valuable information and services to the end-user. As an example, I can now tie retail services such as Circuit City to Google Maps and Weather Channel forecasts to obtain a specific product, ensure that it is in stock, get the best price, find the closest location, obtain the best route (with the fewest delays and construction), and determine the weather along that route--all from a consolidated Web page. The result is better service, reduced seek time, maximized value, increased customer satisfaction, and the establishment of a customer loyalty chain (if it worked this time, it will likely work the next, thus eliminating competition). The real-world Web is not about looking for things; it is all about finding things and solving real-world problems.

Social Software: Humans are by nature social animals. That's why we form groups, societies, cultures, organizations, and nations. Throughout the development of the computing/information age, there has been a concern that we humans are becoming increasingly isolated and non-social. Lost in these beliefs is the fact that people write emails far more often than they ever wrote letters, and that almost everyone I see is talking on a cell phone, something not possible with a landline phone. Add the fact that the typical cell phone has a single rate regardless of location within a country (and, increasingly, international access), and there is a strong argument that technology has made us more socially connected. What differentiates this technology-based social connection from social software is the difference between point-to-point communication and multi-point, collaborative communication--it re-establishes the social network so that communication, exchange, ideation, interests, recreation, and work can be shared, group experiences. This strategic development probably started with telephone party lines and has morphed through chain email letters to today's online shared environments such as Google's Docs and Spreadsheets, MySpace, and Facebook (not to mention these blog spaces).

Regardless of your view on these strategies or trends, they are developments that will directly impact how organizations perform work, reach customers, interact, inform, and compete in the future. From that viewpoint, organizations must look at them, determine how and when they should engage, and determine whether value lies in development, acquisition, or partnering to take full advantage of these developments.

Sunday, September 9, 2007

Security Cameras

There is currently a lot of discussion of information and intelligence gathering related to anti-terrorist activities. It is a two-edged sword: How do you protect against terrorist attacks here in the U.S. and abroad while at the same time protecting individual rights and liberties? On the one hand, we pass Patriot Act laws knowing that they could be abused to invade the personal lives and liberties of completely innocent people. We are assured by the government that they will only be used for anti-terrorist activities. We subsequently find that not only terrorist suspects are tracked and eavesdropped upon, but also anyone with whom they have had contact. Using the six degrees of separation idea, it becomes very easy to justify eavesdropping on everyone in the world.

However you feel about the work of the NSA, warrantless eavesdropping, and domestic intelligence gathering, it is often difficult to relate it to our personal, day-to-day lives. Let's face it: hopefully the overwhelming majority of us are law-abiding citizens (of whatever country you are in) going about our daily lives, trying to make a living. Therefore, from a personal point of view, these programs are abstractions, far removed from what we do day-in and day-out. In addition to our "life, liberty, and the pursuit of happiness," there is a reasonable expectation of privacy--personal privacy, communications privacy, and geographic (locational) privacy. We just don't think that any of this Patriot Act stuff applies to us.

It was this train of thought that drew my attention to the subject of cameras. Specifically, a lot of money has been allocated by the Department of Homeland Security for camera systems that track traffic through urban and suburban areas. Add to that the increased use of red-light traffic enforcement through camera systems. Also don't forget the camera systems that record your vehicle as it passes through highway toll plazas. And finally, most organizations have camera systems that track the entry and exit of individuals--whether that organization is a local quick-stop market, a bank, a department store, a shopping mall, or a corporate campus, not to mention the ATM you use.

So, how private are our lives today? As one example, I counted the number of cameras that I pass each day going to and coming from work. I travel almost exactly 6 miles to work, a total of 12 miles round-trip. As it happens, I take a different route home because traffic patterns are different in the evening.

The result of the tally: going to work I pass 35 intersection cameras and 13 private cameras that seem to have a view of the streets. Coming home from work, I pass 38 intersection cameras and 9 private cameras that seem to have a view of the streets. That works out to:

  • Going to work, 8 cameras per mile or 1 camera every 660 feet
  • Coming home from work, 7.8 cameras per mile or 1 camera every 674 feet

Remember, these cameras view all traffic, not just terrorist suspects. They're watching 24/7. The odds of them actually capturing the image of a terrorist suspect are probably infinitesimally small. The odds of them catching me and my fellow innocent citizens are 100 percent.
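For anyone who wants to check the arithmetic, here is the tally reproduced in a few lines of Python. The camera counts and the 6-mile commute are the figures above; the rest is just unit conversion.

    FEET_PER_MILE = 5280
    MILES_ONE_WAY = 6

    def density(intersection_cams, private_cams):
        """Cameras per mile and average feet between cameras for one leg of the commute."""
        per_mile = (intersection_cams + private_cams) / MILES_ONE_WAY
        return per_mile, FEET_PER_MILE / per_mile

    print("To work:     %.1f cameras/mile, one every %.0f feet" % density(35, 13))
    print("Coming home: %.1f cameras/mile, one every %.0f feet" % density(38, 9))
    # To work:     8.0 cameras/mile, one every 660 feet
    # Coming home: 7.8 cameras/mile, one every 674 feet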

Tuesday, July 31, 2007

A Different View of Technology Cycles

There are at least two types of technology cycles--those that describe the evolving capabilities of technology (for example, memory capacity and compute power doubling every 18 to 24 months) and those that describe how technology is acquired, managed, and controlled. This post addresses the latter: the management and control cycles.

A whole science (or perhaps art) has been made out of tracking and predicting various technology and product lifecycles. Indeed, consulting organizations have based practices around this subject and companies have attempted to differentiate themselves from the competition based on their expertise in "thought leadership" and ability to keep customers current through technology refresh capabilities.

However, what these various entities seem to overlook are the larger (and from my perspective more important) organizational and cultural implications of these cycles.

Specifically, the cycles I am referring to are those that define the relative relationships between technology managers (CIOs, systems managers, IT directors, and the like) and technology users (business end-users and consumer end-users).

Looking at the history of information technology, it can be seen that the 1950s, 1960s, and much of the 1970s were controlled by the corporate computer services, data processing, information services, or information technology department--whatever the corporate department was called. This control took the form of allocating who would have access to scarce and expensive computing resources, standardizing technology to gain economies of scale, and driving down corporate costs by automating repetitive tasks. In the process, pent-up demand often went unmet.

In response, departments and individual users began to look for alternative access to compute resources. To improve responsiveness to IT needs, users resorted to contracting for services (indeed, the whole IT services market developed from fulfilling this need), "borrowing" or sharing compute time among end-users through time-sharing, and purchasing packaged software to reduce development time.

A second technology cycle began when IT departments rapidly regained control by declaring time-sharing systems to be the responsibility of IT, not of individual departments. Once the compute resource was under IT control, the software that could run on it was also controlled.

Again, whenever compute resources become constrained (as they always will when they come under the control of a department whose sole purpose is improving cost and gaining economy of scale), users seek alternatives that will address the users' unmet needs. During the mid- to-late 1970s, departments and end-users sought out mini-computers that could be used for departmental tasks outside the purview of the IT department.

The third cycle began when IT attempted to reassert control over computing resources by gaining control over these departmental computers. Under the guise of maximizing the capability of distributed computing, the IT department argued that data redundancy and duplicate development on mini-computers were actually driving up IT and total administrative costs. However, the IT department was never totally successful in regaining control.

The reason was the invention of the personal computer (PC). As IT departments took control of mini-computers during the early 1980s, departments (mainly technically savvy end-users) began to adopt emerging PCs such as the Apple II and the Coleco Adam to perform repetitive tasks. The PC--soon evolving into the predominant IBM-compatible PC--along with versatile applications such as the spreadsheet, greatly expanded the end-user's reach. The combination required minimal expertise to operate, and the applications were flexible enough for the end-user to quickly re-purpose and adapt them. It was also during this time that the PC and its applications moved into the consumer space for the first time, multiplying productivity because the end-user could continue work at home (and with the development of the "luggable," the "portable," the "laptop," and finally the "notebook" computer, information technology could be taken on the road).

The fourth technology cycle took almost a decade before the corporate IT department could regain any real control. The controls that eventually emerged had to accommodate the very capabilities that had allowed the end-user (who was now more often encountering technology as an individual consumer) to take control of much of that technology. First, business became accustomed to the increased productivity that resulted from working at home and on the road. Therefore, the IT controls that emerged were required to have the flexibility to allow access from outside the enterprise. Second, since the compute resource had also become a consumer device, there were some areas where the IT department did not have the opportunity to gain control.

Just as the IT department was reasserting a limited level of control over the exploding PC population through security and access controls, and through technology standardization, the end-user found another approach to satisfying unmet IT needs: Access to the Internet.

The end-user began to disconnect from large corporate compute resources and connect to the Internet at large. Through the PC, and later through laptops and phones, the end-user gained access to information and data that was previously available only through a library in paper form. In many cases, the access was to information that an IT department had no desire to accumulate and store within the corporation.

The fifth cycle began as corporate IT attempted to rein in external Internet access. They did so through the implementation and expansion of corporate intranets (as portals to trusted information) and through repeated attempts to filter external information sources. Success has been limited at best--success at filtering pornography and similar sources, but continued failure at filtering less obviously objectionable sources. Adding to the difficulty has been the continued evolution of the end-user device: from a simple pager, to a text pager, to a cellular phone, to an Internet mail services device, to a Web-enabled Internet device, and currently to a multi-function smart device that can do most, if not all, desktop compute activities (though on a much smaller scale, so productivity has been sacrificed for portability).

The result of this current shift is that the IT department defines the technology (often stating what they will and will not support), and the end-user ignores the restriction and acquires what they think will provide the best use. In turn, the adoption reaches critical mass when enough corporate users (especially executives) demand support so that the IT department has no choice but to support it.

It could be argued that the sixth technology cycle has begun, but each successive cycle has become significantly shorter. As a result, it is becoming increasingly difficult to tell when one cycle has ended and another one has begun. Indeed, it appears that the cycles have begun to overlap. This is occurring because emerging, end-user technologies are coming to market faster than IT departments can gain control over them.

As an example, the iPhone went on sale on June 29th without the benefit of robust third-party support (Apple has made the iPhone a closed architecture, meaning access to the underlying hardware and software is severely restricted). As a result, few "Web 2.0" applications (the one open standard made available to third-party vendors) were available at launch. Yet, by the end of the weekend following the launch, a wide variety of useful business applications had been written, tested, and published for general use.

These cycles will continue to vex and challenge the corporate IT department (and the end-user, depending on your perspective). Rapidly emerging technical devices and capabilities, open-process standards, and developing business process management standards will continue to present solutions to the end-user and control issues for the IT department.



Wednesday, July 18, 2007

More Thoughts on iPhone as the Future

My last post related to positioning technology such as the iPhone in the context of what that meant for the future. The intention of this post is to expand further on the iPhone's impact now that I have had a chance to work with the platform and the ecosystem that is rapidly developing around it.

The iPhone as Communication Device: The more I have played with it, the more I believe the iPhone to be a new class of network access device. It provides both telephonic and digital data access in a small, lightweight, large-screen package. The key for the future is that it is fully capable of wireless cellular voice and data as well as Wi-Fi voice (VoIP) and data. Future versions (whether from Apple or another vendor) can be expected to expand on this to offer 3G, WiMAX, or whatever else comes along in the next couple of years. The key here is that the communications platform is becoming independent of the communications network. This results in more freedom for the consumer and commoditization and consolidation of the network services sector.

The iPhone as Presentation Device: There is a lot of talk about the lack of a finder, the inability to save documents, and the inability to add applications to the iPhone. I suppose that is true, and the OS X operating system definitely has the power and capability to provide these. However, suppose for a moment that the intention of the iPhone is not to add more to the phone, but rather to provide improved access to content, applications, storage, and capability through the network. If that is the case, the iPhone could change not just phones, but our whole view of mobile computing. Seen in this context, the iPhone becomes the disruptive enabler of the next-generation Software-as-a-Service (SaaS) model. Supporting Apple in this venture are Google, Yahoo, and Salesforce.com, to name a few. Indeed, a quick search for "iPhone applications" through Google will bring back a number of individual applications as well as a number of application aggregators. If interested, a few that will indicate the range of applications are http://www.mockdock.com and http://getleaflets.com. Both of these sites provide a Web page that looks like the main screen on the iPhone and enable the user to place application icons on that page so that applications can be accessed by pressing a linked icon.


At present, these applications are really "gadgets" or "widgets" like those available through tools such as Google's gadget bar or Windows Vista's gadget bar. While fun and convenient on a desktop, these small applets become highly usable portable tools when provided through a mobile communication and presentation device such as a smart phone. For example, there are gadgets available that will find Wi-Fi hotspots simply by keying in a ZIP code (although in all fairness, similar information can be gained by keying "wifi" and a ZIP code into the iPhone's Google Maps application). Other gadgets range from applications that list the lowest gas prices in a ZIP code to news readers and social network interfaces.

Together, the communication and presentation capabilities of this class of device represent quite a force in the emerging mobile communication space, especially when you consider that a number of these applications target small to medium-sized businesses--the enterprises that are typically early technology adopters. Indeed, it was this market sector that was the early adopter of the PC, the Web as a market, and SaaS as an application delivery mode.

Wednesday, July 4, 2007

Disconnecting

Okay, this may seem like a review of the iPhone, but the intention is to place technology SUCH AS the iPhone into perspective when considering where technology is going. This discussion is not about new marketing and sales models between device manufacturers and network providers. This discussion is not about Steve Jobs' ability to pull together hardware, software, and content providers to sell an integrated whole. This discussion is all about the impact that a new class of device, and the capabilities associated with it, will have on the ability to communicate effectively, efficiently, and effortlessly.

To date, wireless technology has made (by my count) nine major advances:
  • Yelling loudly
  • Sound signalling using drums and other devices to make sound carry over a longer distance
  • Visual signalling using semaphore flags, smoke signals, fire, mirrored surfaces, and the like to enable recognition over line-of-sight distances
  • Point-to-point radio transmission enabling communication over long distances, but limited to users that have compatible equipment, using the same modulation methods, and using the same frequencies
  • Two-way radio systems that enabled anyone with compatible equipment to communicate with each other and provided multiple channels for higher capacity
  • Pagers enabling basic signalling to a unique individual from virtually any telephone
  • Analog cellular telephone systems enabling the dial-up telephone to become truly mobile
  • Digital cellular/Personal Communication Services (PCS) increasing the carrying capacity of the wireless network while providing additional wireless communication capabilities such as caller ID, SMS messaging, email, and various abbreviated versions of Internet browsing
  • The integrated wireless communications device such as the iPhone

Yes, I think the iPhone is and will be making that big an impact on wireless communications. The prediction is based as much on the concept as on the product itself. Apple has designed a truly innovative user interface. To be sure, there are features that are missing (more about that later), but it still represents a giant leap forward. A comparison to other smart devices will show why.

The first of the lot was RIM's Blackberry. When it started, it was a pure-play messaging device, with a smaller screen than a computer and a smaller keyboard to match. The innovation was shrinking the features while still making it usable and providing an efficient interface to popular corporate email systems (this latter feature remains one of its main draws). Additional features such as a thumb wheel for scrolling or the "pearl" button for navigating were present on other, less smart phones at the time.

Then came the Treo. It took its cue from the Blackberry by using a similar shrunken keyboard and adding a phone to the popular Palm PDA. Once Handspring/Palm moved away from script (Graffiti) input, the real innovation was combining a full-featured PDA with a phone. To be sure, later models expanded on that capability to approach a handheld computer, but the combination itself was the innovation.

Then came Windows Mobile phones. These took their cues from various smart phones by attempting to shrink a desktop operating system into one for phones that would provide much the same feel. The compromises made in early versions transferred the look and feel of desktop Windows to a phone, but lost in the effort was the fact that doing so made for a lousy phone interface. Subsequent Windows Mobile releases have significantly improved on the initial versions, to the point where Windows Mobile has managed to mimic much of the Palm Treo's features. In this context, all the Windows Mobile operating system (and the resulting phones) has managed to do is capture market share from Palm. There was really no true innovation in feature or function.

In the case of RIM, Palm, and Microsoft, all three (along with their supporting vendors) have managed to add MP3, Web browsing, and email capabilities. However, there have been significant trade-offs made that make these devices mediocre MP3 players, Web content viewing instruments, and email applications. In the case of MP3 playback, the interface has not been particularly elegant or easy to use. This is where the Apple iPod distinguished itself--it created a totally new interface that was intuitive, easy to use, space efficient, and very effective.

In the case of phone browsers, they have all had severe limitations. Some lacked popular plug-ins (such as Flash, which is a major drawback of the current iPhone), while others attempted to institute new ways of presenting standard-size Web pages in compact, readable formats suitable for a phone or smart-phone screen. For example, the Palm Blazer browser made a valiant effort at this, providing both a narrow (Palm-sized screen) and a wide (scrollable full-size Web page) format. For the most part, this approach provided a passable Web browser. However, in many cases the Blazer browser was ill-behaved, resulting in weird rendering. Surprisingly, the scaled-down Internet Explorer embedded in Windows Mobile had many of the same issues.

In the case of email applications, all have had their faults. Palm's Versamail was ill-behaved when dealing with HTML formatted messages. Microsoft's mobile Outlook provided much of the look and feel of its big desktop brother, but there were still problems with rendering media-rich emails.

In contrast, the iPhone provides a giant innovative leap on all these fronts. First, the iPod capabilities of the iPhone have moved the user interface forward from Apple's already industry-leading interface--the touch-sensitive wheel. I would suspect that the iPhone previews how the next generation of iPods will look and work.

Second, while there is still a lot to be desired in the Safari browser, the iPhone's Safari is simply the best Web rendering application available on a phone. As mentioned earlier, there are drawbacks in the lack of Flash support and some Java holes, but I fully expect these to be fixed fairly rapidly as adoption of the iPhone continues to expand. With that said, with few exceptions, the iPhone's browser provides the best rendering match to a desktop browser available. This is particularly true since the page can be viewed sideways in a wider view. Add to this the innovative "double-tap" and "pinch" magnification, and Apple has provided a capability that can effectively replace a desktop browser. Now, if they can just add the features to edit this blog through the iPhone (the Java available evidently does not allow me to write or edit posts through the iPhone--however, I can post messages). To underscore this, today is the first time I have been on a laptop in three days. All browsing has been done on the iPhone.

Third, the iPhone's email interface is positively outstanding. It is easy to navigate, read, and delete. Add to these basic features the fact that it renders full HTML email, and the iPhone's email system is simply the best around. However, it could be better (it may be better and I just haven't found the features yet). For example, I would like to be able to create email folders and move messages to those folders on the iPhone. A "delete all" capability for the "Trash" folder would also be a nice addition. Yet, even with features such as these missing, the visual look and the gesture control are a leap forward in mobile email applications. Again, this is the first time in three days I have looked at email on a laptop.

Taken together, the iPhone's various features represent the first of a new generation of devices that could well begin to replace the laptop, much as the laptop has replaced the desktop computer. The iPhone has already done that for email and Web browsing. If Apple decides to expand the iPhone's OS X capabilities so that files can be saved and searched via a "Finder" application, it will have moved significantly closer. At present, the iPhone is a benchmark for email, Web browsing, and messaging. I would also add that Google's Maps application is easier to use than the Web-based version. The Yahoo Stocks application is passable, but it would be nice to be able to re-order stocks to individual liking, and for the application to take a person to the Finance page for a specific stock instead of to a specialized combined page that places too much irrelevant information on the screen. If a person is looking at the Stocks application and wants to go to the Web, they want to drill down into the specific stock, not do a general search.

The same problem arises with the Yahoo Weather application. When you go to the Web, you want more detail on the weather for that location, not an additional summary followed by a general search result for the location. What I would really like is a view of the radar for that location. I have bookmarked the Weather Underground page in Safari to provide the radar view, but it would be a lot more usable if pressing the "Y!" icon at the bottom of the application took you to a radar view. A final gripe about the weather application: it would be nice if the icon on the main iPhone page showed the current temperature, much as the appointments icon shows the current day. With all the attention to detail, go figure how these simple aspects were overlooked.

As you can see, I had to dig deep to find something to complain about on the iPhone. There are others. However, for a first generation attempt, especially one that can fix many of its shortcomings in software upgrades, Apple has more than lived up to the hype created for this phone. If they keep their development momentum and continue to expand their partnerships with Google and Yahoo, Apple will have no problem selling 10 million phones by the end of 2008 as well as taking a commanding lead in the mobile communication device market. It is the disruptive technology that will truly enable users to disconnect and remain in contact. It is that good.

Saturday, June 2, 2007

Innovation versus Quality

It is amazing how two decades can bring the same issues facing U.S. corporations back to the top of the pile. In an article in the June 11, 2007 BusinessWeek, "At 3M, A Struggle Between Efficiency and Creativity," the discussion centers on how the introduction of Six Sigma has brought cost reductions, improved efficiencies, and increased profit. At the same time, 3M has fallen from its time-honored position as the number one corporate innovator.

The rush to Six Sigma has concentrated attention on improved operating efficiency, driving up bottom-line profits. However, Six Sigma as most often implemented stifles creativity and innovation because it is aimed at improving existing processes, not at determining when existing processes are outdated or no longer adequate. That is, Six Sigma does not describe how to address a process when that process is not capable of producing the desired output. I suppose one could argue that Design for Six Sigma is a process framework that enables an organization to design capable systems. Even so, what happens when an idea is conceptualized for which there is no practical use? Or what happens when there is a practical use for a concept, but one does not immediately recognize that use? From a Six Sigma point of view, it has no use and constitutes waste. Therefore, it should be discarded so as to simplify existing processes and reduce costs.

It reminds me of the days of Crosby quality where the goal was zero defects. It didn't matter whether the customer liked your product, just that you produced it with quality. As Dr. W. Edwards Deming used to note in his lectures, the last buggy whip manufacturer probably produced excellent buggy whips. Too bad nobody wanted them.

Innovation is not a process. Rather, it is a discontinuous act of creativity. It may spring forth while the mind is otherwise engaged. It may occur through the linkage of two or more unrelated ideas or concepts. It may emerge from the fog of vague ideas to a crystal clear concept in an instant. That is not to say that innovation cannot be planned. However, innovation must mature and ferment in the mind until it is ready to emerge. While the plan may be executed, innovation emerges on its own schedule.

Therefore, can a formal quality process such as Six Sigma coexist with innovation? It is doubtful, at least in the context of a formal Six Sigma process for innovation. Six Sigma is a tactical approach to improving an organization's performance in terms of efficiency, cost, and profit. Innovation is a strategic approach to providing the means for an organization to survive and thrive in the future.

Tuesday, May 29, 2007

Over Technologized

I took a trip over the U.S. Memorial Day holiday to attend a friend's wedding. For the most part, it was uneventful. However, an interesting thing happened when I stopped to fill up with gasoline. I got out of the car, and as I was about to enter my credit card information, the store attendant came out and said that I would be unable to pump gas because "the computer system is down." This wasn't a big imposition; I simply drove to another gas station.

It got me thinking about an event several years back when I stopped in at a local grocery store (a national chain) to pick up some plastic cups for a business exercise I was giving the next day. It was about 2:00 AM, and after looking all over the store for the cups, I finally found them and took them to the checkout lane. The store clerk looked at me and stated that it would be approximately 30 minutes before she could check me out because the system was being updated.

This technological world is a real marvel of convenience, responsiveness, and availability. We can email from our phones; take pictures with digital cameras; talk globally from our computers; receive HD television over our fiber-optic cable; carry, view, or listen to thousands of pictures or songs on our iPods; determine our location within a few feet with our global positioning system; and even view the temperature on our watches. Well, we can do all those things if everything works right. But as the two preceding examples indicate, if they don't work right, we may very well just be standing around.

As our world becomes more automated and electronically integrated, one has to wonder what would happen if a "glitch" caused a cascading failure. Of course there are back-ups, redundancies, and alternate circuits and paths, so the chance of a catastrophic failure is remote--or so it would seem. Then again, Apollo 13 was a masterpiece of back-ups and redundancies. So was Challenger. So was Columbia. Yet they all failed.

Self-configuring and self-healing systems can help, but what happens if they can't? Or what happens if they are not fast enough? It could be a great plot for a novel. Think what would happen if a large number of these systems failed at once. I am reminded of the 1970 science fiction movie "Colossus: The Forbin Project," in which a super-computer is joined to the grid and proceeds to take control of all the computers in the world. It may be science fiction and it may be far-fetched, but so were some of our space failures.

Sunday, April 29, 2007

The Verbal Commons

Historically, I have not been a big user of Internet community applications. I looked at chat rooms years ago, but my impression was of a large number of people in a noisy room, with pairs attempting to talk to each other across that large room. Not my idea of discourse.

I have been a member of a number of special-interest discussion groups, which have proven to have a lot less noise than chat rooms, but they tend to be cliquish. If you are a trusted member, things are great. If you are an infrequent poster or a lurker, the results can be less than satisfactory. Typically, you can be restricted from posting through moderation (with some of the flames that occur on a discussion group, I can understand why), or it can be difficult to get a respectful response from the other members.

The result is that I have been a loyal user of email as the primary communication tool. However, this too has its limitations in that the community is self-selecting--one chooses who can read and with whom to interact. And there can be disagreements and flames even in that medium. The pro is that an intelligent discourse can take place, albeit asynchronously.

After looking through the blogs over the last couple of weeks and reviewing some blogs of acquaintances of mine, I thought I would give this medium a try. I'm not sure what to expect, but it seems that the blog approach is good for capturing thoughts, whether to publish publicly or to use as a parking place for random thoughts.

Therefore, I present the Peripheral Futurist blog. Through this blog, hopefully I will be able to communicate my perspectives on developments impacting how we live, how we work, and how we survive--in the future. By virtue of the fact that I have started this blog, I believe that this type of publishing will become a dominant approach for the statement, documentation, and challenge of ideas, experiments, thoughts, and concepts. Obviously some of you have more experience than I. I would be very interested in your experiences--both pro and con--with the blogosphere.

Walt