Cloud #1 – Historical Background


The First Generation

Let’s begin with the first generation. This generation of computer professionals spanned the decades of the 1950s and 1960s. It created our first Operating Systems, programming languages & compilers, device drivers, and basic data management (sequential and indexed file systems). For perspective, the only engineering effort in the entire world larger than the development of IBM's OS/360 was the Apollo moon landing program! From these large projects, this generation also codified our first lessons in managing software complexity.

This generation laid the foundation of modern Computer Science. Some of the “achievements” of this generation highlight their challenges. Delivering the first high-level programming language (FORTRAN, 1957) and its successors demanded a formal way to describe a language; this led to the BNF (Backus-Naur Form) syntax, first used to specify ALGOL, which supported both language and compiler development efforts. What would become one of the world’s important data storage technologies, the IBM Information Management System (IMS), was developed in the late 1960s to support inventory management for the Apollo space program. What was to become the world’s largest transactional environment, the IBM Customer Information Control System (CICS), was also developed in the late 1960s and provided to its first public utility customer. For this generation, everything was a first. Everything was a challenge. Foundations of the future were being laid in the process of serving the current needs.

Business computing of this generation was defined by the chosen hardware vendor (IBM, Burroughs, Sperry Rand, NCR, CDC, Honeywell, GE, and RCA). At the time, these companies were collectively termed “IBM and the Seven Dwarfs”. Almost all computing took place on the hardware provided by the vendor. Many organizations, even fairly large ones, had only a single computer! It was only in the last half of this generation that these computers could run more than one program at a time.

Input and output were primarily punched cards, magnetic tape, and print. All processing was in “batch” mode. There were no online terminals, except a console for the computer operator. Direct Access Storage Devices (DASD), i.e. disk drives, were primarily reserved for Operating System use; business data was stored on tape. There was no Off-the-Shelf (OTS) software available, so all business applications had to be developed in-house. With the development of COBOL, VSAM, CICS, and IMS, the foundation was laid for the use of computers in business applications.

The major focus of this generation was to build the initial computer systems used by businesses and organizations. These efforts began with the hardware manufacturers building computers, Operating Systems, and programming languages (FORTRAN & COBOL). The manufacturers also began to develop sophisticated software to solve real-world problems; IMS for the Apollo project and CICS for power utilities were examples of this. The foundation for modern computing had been successfully laid.


The Second Generation

The second generation of computer professionals spanned the decades of the 1970s and 1980s. This generation continued the work of the first, building new hardware platforms and peripheral devices. Hardware vendors competed to deliver computers at various price points, which led to the terminology of Mainframe (large), Midrange (medium), and Desktop (small) computers. These new computers resulted in the development of UNIX, of what would become the iSeries, and of what would become Windows & Mac.

Some of the original computer manufacturers left the industry and new companies emerged. Burroughs and Sperry Rand merged to create a new company called Unisys. Honeywell dropped out of the computer business. Sun Microsystems was founded. Peripheral and plug-compatible manufacturers (PCMs), Memorex for example, began to appear.

The need to communicate with the burgeoning array of peripheral devices led to an explosion of communication protocols. The need to directly connect these peripherals led to the development of the Ethernet (IEEE) and Token Ring (IBM) network protocols. TCP/IP was initially developed through the Defense Advanced Research Projects Agency (DARPA), was adopted by the Department of Defense (DoD), and began its long ascent to becoming the universal connection protocol. Almost all of the initial computer terminals, then called Video Display Units (VDUs), had their own private communication protocols. These eventually sorted out into four major families: ADM (from the ADM terminals), VT-100 (from the DEC terminals), 3270 (from the IBM mainframe terminals), and 5250 (from the IBM iSeries terminals).

In addition to developing more computers and more peripherals, this generation also rapidly expanded the number of business applications. Most businesses began the long process, which continues to this day, of becoming fully computerized. Hardware manufacturers were slow to understand the value of software, so the development of these business applications largely took place within the customer organizations. However, this generation also started to create Off-the-Shelf (OTS) software: SAP was founded in 1972, and PeopleSoft in 1989.

Much of the knowledge gained in the first generation was codified into a number of seminal publications that laid the foundations of Computer Science. Fred Brooks published “The Mythical Man-Month” to describe the lessons learned in building OS/360. Authors like Abelson, Constantine, Dijkstra, Hoare, Knuth, Sussman, and Wirth codified the lessons learned in building multiple generations of software and in coping with increasing levels of system complexity. The concepts and principles of Algorithms (Knuth, Wirth); Structured Programming (Dijkstra); Structured Design, Loose Coupling, and High Cohesion (Constantine); and Complexity Management (Abelson, Sussman, Wirth) were all formally described in this generation. These theoretical foundations and principles remain unchanged to this day.

The major focuses of this generation were twofold. The first was to build more, faster, better, cheaper computers and peripheral devices; this was, essentially, a continuation of the focus of the first generation. These new devices, however, needed to intercommunicate, and developing the network and VDU protocols that allowed this intercommunication was a large part of that work. Software produced by the first generation (IMS and CICS) was greatly expanded in capability. IBM invented relational database technology and deployed DB2 as its commercial implementation of this technology.

The development of business solutions was the second major focus of this generation. This development saw the usage of computers become ubiquitous within organizations. Transactional systems like IMS/DC and CICS provided a core business platform. This enabled computerized processing to move out of the nightly batch window and to support business activity during the day. Computers had begun to move into everyday life. For example, the transactional technology enabled the deployment of banking ATMs, something we take for granted today.


The Third Generation

The third generation of computer professionals spanned the decades of the 1990s and 2000s. By its start, computer platforms had become standardized and were undergoing continuous incremental enhancement. Virtualization was a trend that continued to accelerate at the platform and infrastructure levels. The new computing platforms that would evolve during this generation were mobile devices! At the beginning of this generation, the convergence of Personal Digital Assistants (e.g. the Palm) and cellular phones had not yet occurred; smartphones and tablets were still in the future. This generation would also see the emergence of laptop computers.

The evolution of the DARPA internet into the public Internet, and the rise of the World Wide Web on top of it, happened in this generation. Along with the web, Java arose as a significant programming language. For perspective, at the beginning of this generation (only 28 years ago), internet commerce did not yet exist! The SSL/TLS protocols would be developed to allow secure communication over the internet, and the SOAP protocol would be developed to carry structured XML messaging over HTTP.

This generation also inherited a computer environment in which many business processes were either partly or fully computerized. Computers were deeply embedded in business-to-business (B2B) computing, with the CICS, DB2, IMS, and Oracle software products at the center of Enterprise computing. Business-to-consumer (B2C) computing was to be one of the new frontiers.

Despite the maturity of the hardware and software environments, different software environments were islands. The protocols existed to connect all of the necessary environments at a hardware level, but not at a software level. One of the major mechanisms for transferring data across hardware platforms was the File Transfer Protocol (FTP). Intercommunication mechanisms between software environments were virtually non-existent, and building them would become one of the two major focuses of this generation. The second major focus was the extension of computing outside of businesses and toward human beings. This generation would thus begin to split into two separate camps with two very different focuses.

One part of this generation focused “inward” on the interconnection of the existing software environments. This led to the concept of software integration via a “Bus” and the emergence of Service Oriented Architecture (SOA). SOA brought a wealth of new software products. From IBM alone we had: MQSeries (aka MQ), MQ Integrator (aka Message Broker), MQ Workflow (aka Process Manager), WebSphere Application Server (WAS), and Information Server (implementing information as a service). IBM added other products to this stack (WebSphere Commerce and Portal), and all of these products evolved and changed names over time. Other vendors had similar product histories.

SOA led to the development of a software segment called “Middleware” and, within it, MOM (Message-Oriented Middleware). This was, by architectural design, supposed to be a loosely coupled, highly cohesive, small-component-based solution. The overall goal was to decouple software components and to allow customers to be more agile in the marketplace. Some customers fell far short of this mark, implementing the technology without understanding it; they simply succeeded in replicating their “old” solution architecture in a “new” environment.
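As a minimal sketch of what that loosely coupled messaging style looks like in code, consider the standard JMS API, which products like MQSeries came to support. The JNDI names and queue name here are illustrative assumptions, not taken from any particular product:

    import javax.jms.ConnectionFactory;
    import javax.jms.JMSContext;
    import javax.jms.Queue;
    import javax.naming.InitialContext;

    // A minimal MOM sketch: the producer drops a message on a queue and moves
    // on. The consumer may run in another process, on other hardware, at
    // another time; the queue is what decouples the two components.
    public class OrderProducer {
        public static void main(String[] args) throws Exception {
            // Assumption: these JNDI names are illustrative and provider-specific.
            InitialContext jndi = new InitialContext();
            ConnectionFactory factory =
                    (ConnectionFactory) jndi.lookup("jms/ConnectionFactory");

            try (JMSContext context = factory.createContext()) {
                Queue orders = context.createQueue("ORDERS.IN"); // illustrative name
                context.createProducer().send(orders, "order-id=42;qty=3");
                // The sender neither knows nor cares which component consumes
                // this message, nor whether that component is running right now.
            }
        }
    }

The design point is that sender and receiver share only a queue name and a message format; either side can be replaced, moved, or scaled without the other changing. That is the loose coupling SOA promised, and what the less successful adopters failed to exploit.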

Meanwhile, a second part of this generation became focused “outward” on the interconnection of people and computers. This would come to be called the Internet of Things (IoT) or Internet of People & Things (IoPT). A significant part of this effort was devoted to Business to Consumer (B2C) software development. The foundational platform for this effort was the World Wide Web. This led to an explosion of web technology, including HTTP, Java, JavaScript, SSL/TLS, REST, and SOAP.

Since human-to-machine programming has always been attractive to programmers, because of the immediate feedback of visible results, it became the focus of many new developers entering the field. And since what was being developed was human-to-machine rather than machine-to-machine software, error handling could be mediated by a human being. This allowed much laxer, and therefore easier, programming demands. One of the unintended consequences of this bifurcation of effort was that, for the first time, a new generation of developers was not standing on the shoulders of those who had gone before.

The challenge for these “outward”-looking developers was to keep up with a dynamic and rapidly changing environment. They had neither the opportunity nor the time to work with the “legacy” SOA and transactional systems, so some of the preceding lessons of software design went unlearned. Transactional processing and ACID properties proved too difficult when implemented over SOAP and went largely unused. High-cohesion principles were stressed in REST, but tight coupling through HTTP became the norm simply because it was faster and easier for developers.
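For contrast with the queued sketch above, here is what that tightly coupled, synchronous style looks like. This is a hedged sketch using Java's modern java.net.http client for brevity (an anachronism for the era, but the shape of the call is the same); the endpoint URL is hypothetical:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class InventoryClient {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();

            // The caller is wired to this exact host, path, and payload shape,
            // and it blocks until the service answers: classic tight coupling.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://inventory.example.com/items/42"))
                    .GET()
                    .build();

            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            // If the service is down or slow, this caller fails or stalls with it.
            System.out.println(response.statusCode() + " " + response.body());
        }
    }

Every such call binds the caller to one endpoint, one payload shape, and one availability window, which is exactly the brittleness the earlier design principles warned against.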

By not being bound to the past, or to past requirements, this group was able to achieve significant results in human-to-machine computing. Smartphones, tablets, web commerce, and the transmission of knowledge through the web were some of its significant achievements. These achievements, however, came at a cost: connectivity and time-to-market took precedence over design, reliability, and data integrity. While these trade-offs were understood by people straddling the generations, they were increasingly lost on the upcoming younger developers, who never had the chance to be exposed to the more rigorous requirements of B2B processing.

During this same time period, at least in the United States, there was a concerted effort to drive down software development costs. Rather than looking at the Total Cost of Ownership (TCO) of different development options, the single metric of dollars per developer-hour became the focus. This drove American companies to use less experienced and less skilled developers, which, in turn, drove companies to simplify requirements. Coping with reduced skill sets and experience drove much of this development to be entirely web-centric. This “one size fits all” approach required simplified design patterns, which came to be seen as the only design patterns.

Since there was no longer a relatively common career path ensuring that everyone was exposed to the same concepts, many of the web simplifications were never recognized as simplifications; they came to be assumed to be “Best Practice”. Many younger developers in this second group never developed an understanding of transactions, the XA protocol, two-phase commit, and ACID properties. Many learned that “loose coupling” was a good thing and assumed that, since they were doing Web Services, their services must be loosely coupled. They are not.
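For readers who never encountered those concepts, here is a minimal sketch of them using the standard JTA API, assuming a Java EE container that exposes UserTransaction and two XA-capable data sources; the JNDI names and helper methods are illustrative, not from any specific product:

    import java.sql.Connection;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;
    import javax.transaction.UserTransaction;

    // Minimal sketch of a distributed (XA) transaction spanning two databases.
    public class TransferService {
        public void transfer(long fromAcct, long toAcct, long cents) throws Exception {
            InitialContext ctx = new InitialContext();
            UserTransaction utx =
                    (UserTransaction) ctx.lookup("java:comp/UserTransaction");
            DataSource bankA = (DataSource) ctx.lookup("jdbc/BankA"); // hypothetical
            DataSource bankB = (DataSource) ctx.lookup("jdbc/BankB"); // hypothetical

            utx.begin();
            try (Connection a = bankA.getConnection();
                 Connection b = bankB.getConnection()) {
                debit(a, fromAcct, cents);  // update in database A
                credit(b, toAcct, cents);   // update in database B
                // commit() drives XA two-phase commit across BOTH databases:
                // phase 1 asks each resource to prepare; phase 2 commits them
                // only if every resource voted yes.
                utx.commit();
            } catch (Exception e) {
                utx.rollback(); // either both updates happen or neither does (ACID)
                throw e;
            }
        }

        private void debit(Connection c, long acct, long cents)  { /* SQL UPDATE */ }
        private void credit(Connection c, long acct, long cents) { /* SQL UPDATE */ }
    }

A chain of independent REST calls offers no equivalent guarantee: if the second call fails after the first has succeeded, the application itself must detect and undo the partial work.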

To summarize, the third generation interconnected the software of legacy business systems using SOA software and design principles. It developed mobile computing, smartphones, tablets, laptops, the web, and massive device and software interconnections. Some of this development came at the cost of creating a tightly coupled and brittle network of interconnecting services. This problem, like prior problems, was a side effect of the requirements driving computing. The cleanup will be left to future generations.


The Fourth Generation

The fourth generation of computer professionals will span the decades of the 2010s and 2020s. We are not yet half-way through this generation and, if anything, its rate of change is accelerating! This generation is starting with multiple, interconnected computing platforms. Virtualization, rather than new and emerging hardware platforms, is driving the hardware environment. This means that, while the hardware environment is relatively stable, the software infrastructure is undergoing rapid change. This makes the fourth generation similar to the third, with virtualized containers bringing the same challenges that SOA did.

A number of major trends that this generation is coping with can already be seen. One is the consolidation of the various computing platforms into a single virtual application run-time. This is happening through virtualization, in the form of the “Cloud” and containers (e.g. Docker). This is quite a revolution in infrastructure and, much as in the previous generation, it will require a split in focus. Some developers in this generation will have to look “backward”, interfacing legacy applications with containers in the Cloud. Other developers will look forward, forging ahead into new frontiers.

Another trend is the emergence of natural language processing in computing. Apple’s Siri, Amazon’s Alexa, etc. are examples of the chatbots that are pioneering voice interfaces. The major software vendors (IBM, Amazon, etc.) are providing APIs to integrate these services with existing computing. This trend will surely intensify: just as the mouse was once a new interface device, so voice will continue to become a major I/O vehicle.

A third trend is the increasing injection of machine learning and artificial intelligence into computing. Some of this can already be seen in the natural language processing area, but that is only the tip of the iceberg. Machine learning, for example, can be applied to software log-file analysis; Splunk is already pioneering in this area.

Another trend is the extension of computing into the human experience. Experiments with wearable computing, the FitBit or the Apple Watch for example, indicate that this trend will continue to grow. Augmented Reality (AR) is in its infancy, but will clearly continue. Both hardware and communications bandwidth are improving quickly enough to soon make this feasible: Verizon's projected 5G speeds of 2-3 Gbps (gigabits per second) to devices would enable entirely new software experiences. Self-driving cars are another example of this trend.

The final trend may ultimately be the most important of all: the democratization of computing. The “Cloud” is enabling anyone to participate in software development. The web is enabling distributed collaboration; GitHub, for example, has approximately 60 million repositories. Open Source, GNU, etc. are not only allowing distributed development but making it normal. Marketplaces like Apple’s allow anyone to monetize an investment in small-scale software development.

We will have to wait and see what the total impact of this fourth generation turns out to be. There has already been so much development and change that it looks to be the most significant generation yet.


Summarizing the Trends across the Generations

(1) Computers continue to become more pervasive. They continue to become smaller, lighter, less power-hungry, and more computationally powerful. In the first generation, an organization might have had a single large computer, one that filled an entire room. Today, we wear them on our wrists.

(2) Computers continue to become less expensive. In the first generation, they were shown off behind glass doors and windows with pride. Now we don’t even see them; they’re in the “Cloud”. You can “rent” rather than buy in the “Cloud”, with low levels of usage often being free.

(3) Computers continue to become more integrated. This has been true at a hardware level (standards) and at a software level (design). A first-generation computer talked only to its card punch and tape machines; today’s computers are potentially networked across the entire planet. Software has been following hardware in this pattern of integration.

(4) Computers continue to become more accessible to humans. In the first generation, the care and feeding of a computer was a daunting task, and the first generation of programmers required a high level of skill. The computational “barriers to entry”, to use a phrase from economics, continue to drop. We continue on an arc from the “High Priests” of computing to the “Everyman” usage model. Natural language processing may bring computational power to almost everyone on the planet.

(5) Software is moving from a proprietary model to a more Open Source model. I never thought that Linux (Open Source) would be embraced; it is now the standard distributed OS and the OS of the Cloud future. The Cloud is almost entirely an Open Source software stack, and the Cloud vendors will continue to compete in enhancing Open Source based products. Can software vendors, with their limited resources, ever compete with the productivity of an entire world? Vendors will probably only be able to compete in niche areas. This means the trend is toward a greater “democratization” of computing.

(6) Computers continue to “compete” with human beings to perform workloads. This “competition” is driven by economics, and computers become increasingly competitive as they get less expensive and more powerful.


Looking Ahead

To a certain extent, these four generations of computers and their human cadre are examples of the trend of machines performing work that used to be performed by individual human beings. To the extent that these developments released human beings from “lower”-quality work and allowed them to participate in “higher”-quality work, this was good for humanity. There is nothing in this technological trend, however, that couples technology with economics, or productivity with the supply of and demand for human labor. Robotics, AI, and machine learning have the ability to completely disrupt society, the demand for labor, and the corresponding distribution of wealth. At some point in the not-too-distant future, these economic realities may come to dominate computing.


Summary

If I had a crystal ball to the future, I wouldn’t be writing this blog; I’d be on a beach on my own tropical island. My guesses for the future are based upon two factors. One is that trends continue, so we can extrapolate them. The second is that economics, more so than technology, drives computing. These two factors allow us to make some reasonable guesses.

To paraphrase the Chinese proverb: “We live in exciting times”.






Note: This Whitepaper was first published by this author in the IBM Middleware User Community (April 2018).
