
Building Out the Internet Platform: A History of Application Servers

It wasn't long ago that many developers didn't know what an application server did. These days it's become part of our common vocabulary. The main reason for this shift has been the rapid growth in the importance of the Internet as a platform for business applications. Without application servers the Internet would be a much less exciting place. This article shows how these vital pieces of the Internet infrastructure have evolved and explores where they're headed.

Why We Have Application Servers
Viable product categories emerge for a very simple reason: they fulfill a compelling need for a large group of users. In the case of application servers, developers needed a product that could assemble all the low-level resources used in Internet applications and encapsulate them in a productive framework that would enable the construction of rich and interactive Internet applications. In this sense application servers closely resemble desktop operating systems, which continue to enable an entire industry of productivity tools for business users and consumers alike. However, whereas an operating system encapsulates a particular set of hardware devices and windowing services, application servers encapsulate the protocols and data types used on the Internet as well as a different set of user interface services.

As with all infrastructure products, application servers have had to evolve with the needs of developers. Just as operating systems have changed dramatically in the past decade, the application servers of today are but a shadow of the Internet business platforms that will emerge in the next few years. To better illustrate where application servers are headed, it's useful to review the steps that brought us here.

History of Application Servers
The history of development on the Internet is relatively short, but the challenges have grown by several orders of magnitude in that period. As illustrated in Figure 1, the goals of Web developers have evolved from adding dynamic data to previously static pages and building interactive sites to the full-fledged applications deployed by Amazon.com or your online bank. At each of these stages the requirements of what became the application server have expanded to include new resources and functionality. We'll revisit the stages to come a little later on; first we'll take a look at the products and technologies that formed the genesis of the application server.

The technology that first enabled the transition from static to dynamic pages was CGI (Common Gateway Interface). Unfortunately, using it left plenty of work for developers. They not only had to write the business logic for their page, but each CGI program had to implement its own solution for simple tasks such as opening connections to a database, formatting the results for delivery in HTML and maintaining session information between requests. As more users tackled the same problems, they began to circulate libraries of reusable code that solved some of them. However, as projects became more ambitious, assembling these point pieces into a complete application that worked reliably became an increasingly difficult task.
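To make the repetition concrete, here is a minimal, hypothetical sketch (in Java, the language the article's later sections center on) of the plumbing every CGI program had to hand-roll: parsing the query string and formatting results as HTML. The class and method names are invented for illustration, not taken from any real CGI library.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical illustration of the boilerplate each CGI script repeated
// before reusable libraries and application servers took it over.
public class HandRolledCgi {

    // Parse "name=Ada&city=London" into a map -- one of the chores
    // every CGI program reimplemented on its own.
    static Map<String, String> parseQuery(String query) {
        Map<String, String> params = new LinkedHashMap<>();
        for (String pair : query.split("&")) {
            String[] kv = pair.split("=", 2);
            params.put(kv[0], kv.length > 1 ? kv[1] : "");
        }
        return params;
    }

    // Hand-format the results as HTML -- another per-script chore.
    static String renderPage(Map<String, String> params) {
        StringBuilder html = new StringBuilder("<html><body><ul>");
        for (Map.Entry<String, String> e : params.entrySet()) {
            html.append("<li>").append(e.getKey())
                .append(" = ").append(e.getValue()).append("</li>");
        }
        return html.append("</ul></body></html>").toString();
    }

    public static void main(String[] args) {
        System.out.println(renderPage(parseQuery("name=Ada&city=London")));
    }
}
```

Session tracking and database access would have added still more hand-written code on top of this, which is exactly the burden the first server products lifted.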

At this point some of the more entrepreneurial types recognized an opportunity, and the Internet application server market was born. These pioneers wrote their own set of reusable resources (e.g., database connections, state management, results formatting, pooled CGI connections) and packaged them so developers could be more productive than they had been with ad hoc code libraries.
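One of those packaged resources can be sketched in a few lines. The hypothetical Java class below shows the idea behind pooled connections: reuse an expensive resource across requests instead of creating a fresh one each time. It is a deliberately simplified, single-purpose illustration, not any vendor's actual implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Supplier;

// Hypothetical sketch of a pooled resource, one of the reusable building
// blocks the early application server vendors packaged for developers.
public class SimplePool<T> {
    private final Deque<T> idle = new ArrayDeque<>();
    private final Supplier<T> factory;
    private int created = 0;

    public SimplePool(Supplier<T> factory) {
        this.factory = factory;
    }

    // Hand out an idle resource if one exists; otherwise create a new one.
    public synchronized T acquire() {
        if (!idle.isEmpty()) return idle.pop();
        created++;
        return factory.get();
    }

    // Return the resource so the next request can reuse it.
    public synchronized void release(T resource) {
        idle.push(resource);
    }

    public synchronized int totalCreated() {
        return created;
    }
}
```

With database connections as the pooled type, the saving is the cost of opening a connection per request, which in the CGI era often dwarfed the cost of the query itself.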

Initially, most of these products were sold as tools for Web developers. Like RAD tools before them, they made their runtime resources available through a scripting language and/or a set of objects that encapsulated the underlying plumbing. Some of the earliest examples include Allaire ColdFusion, HAHT Software HAHTsite and Netscape LiveWire. They're pictured collectively in Figure 1 as dynamic page engines.

Meanwhile, people and businesses were flooding onto the Web and recognizing the incredible opportunities offered by applications deployed on it. As a result, the more aggressive developers quickly moved beyond simply generating dynamic pages. To ease this transition, they wanted tools that could help them build full-fledged applications with much more complicated requirements.

To respond to these demands, software companies began to expand the range of plumbing they provided. To increase scalability and performance, they incorporated design principles and/or technology from TP (transaction processing) monitors. To simplify integration with existing corporate systems, they began to build adapters for e-mail systems and directories and to integrate ORBs (object middleware) for linking to legacy applications. As the applications being built became more important to the business, they also began to incorporate security tools such as authentication and encryption.

As these pieces were being added, server providers continued to enhance the tools and programming languages that provided access to those resources with the goal of making developers even more productive. Thus the scripting languages grew in power and the collections of objects became systems in and of themselves.

As the runtime services provided by these products gradually became richer, it became obvious that this new breed of product was not merely a tool but a new piece of infrastructure required by anyone building significant applications for the Internet. It took an additional step toward standardization, however, before the term application server became widely accepted.

Each of the early application servers took its own path to maturity, but by the middle of 1999 most began to converge around a similar architecture (see Figure 2). The plumbing consisted of a set of core runtime services (e.g., load balancing, transactions) as well as integration services that offered connectivity into other systems (e.g., databases, file servers, e-mail servers, applications). To access these resources, developers generally had a two-layer development model. This model consisted of a Web application logic layer that handled presentation of data to users and a business logic layer that handled data access and data processing. Once this architecture gained broad acceptance, the term application server entered common use.
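The two-layer division of labor can be sketched directly. The following hypothetical Java illustration (all names invented for the example) keeps data access in the business logic layer and leaves the Web application logic layer with nothing to do but format the result for the user.

```java
// Hypothetical illustration of the two-layer model: a business logic
// layer that owns data access and processing, and a Web application
// logic layer that only handles presentation.
public class TwoLayerDemo {

    // Business logic layer: data access and processing -- no HTML here.
    static class AccountService {
        double balanceFor(String customerId) {
            // Stand-in for a database lookup behind the server's plumbing.
            return "alice".equals(customerId) ? 250.0 : 0.0;
        }
    }

    // Web application logic layer: presentation only -- no data access.
    static String renderBalancePage(String customerId, AccountService service) {
        return "<html><body>Balance for " + customerId + ": $"
                + service.balanceFor(customerId) + "</body></html>";
    }

    public static void main(String[] args) {
        System.out.println(renderBalancePage("alice", new AccountService()));
    }
}
```

The separation is what later let each layer standardize independently, with Servlets/JSP or ASP on one side and EJB or COM+ on the other.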

Emergence of Standards
In the past 18 months the application server arena has seen even greater change as application server providers have moved to standardize the architectural model and the interfaces through which developers access the server's resources.

Initially this standardization came in pieces. For instance, Java Servlets and later JavaServer Pages caught the interest of many Web developers as a powerful way to use Java for Web application logic (i.e., generating dynamic Web pages). Microsoft's ASP (Active Server Pages) and Allaire's CFML (ColdFusion Markup Language) have gained widespread support at this layer as well. Similarly, EJB (Enterprise JavaBeans) and Microsoft's COM/COM+ have emerged as standard models for developing business logic components.

More recently, Sun and its partners in the Java community have tried to rationalize these individual pieces into a uniform and complete collection of APIs, now termed the Java 2 Platform, Enterprise Edition (J2EE). The APIs in J2EE provide a comprehensive set of services and a two-layered programming model for developers building applications on the Internet, with Servlets/JSP at the Web application logic layer and EJB at the business logic layer. The collection of APIs known as J2EE now has the support of most of the major players in the Internet software market, including Allaire, Art Technology Group, BEA, Bluestone Software, IBM, Oracle and Sybase.

The most notable exception on the list is Microsoft. In opposition to Sun's efforts to promote Java, Microsoft has developed its own set of APIs, referred to as Windows DNA (Distributed interNet Applications Architecture). There are significant differences between Windows DNA and J2EE. For instance, Windows DNA is tied to the Windows operating system whereas J2EE runs across operating systems. J2EE has been developed and implemented by a number of vendors; Windows DNA is available only from Microsoft. However, at an architectural level Windows DNA closely resembles J2EE, providing a Web application layer (ASP) and a business logic layer (COM+) as well as a set of core services and integration services (Windows 2000 and BackOffice). Moreover, given the ubiquity of Microsoft technology, Windows DNA will remain an important programming model for the foreseeable future.

While the shift toward standardized programming models (Windows DNA and J2EE) is still underway, it promises to bring enormous change to the industry and to developers. Just as the standardization of desktop APIs around Win32 enabled a rich applications market to emerge, a common set of programming interfaces for application servers should enable independent software houses to build packages that run on multiple application servers. While complete portability across application servers that support the same standard remains an unrealized promise, standardization should still let independent software vendors dramatically reduce the cost of porting their applications from one server to another. Even without code portability, a key benefit for developers will be the ability to apply the same development skills to multiple servers and operating systems.

Unfortunately, standardization also has its drawbacks. For instance, because J2EE hides the underlying details of the operating system on which the server is running, applications built with it often don't take advantage of the rich services offered by today's server operating systems.

In addition, J2EE is a complex model for developers to absorb. Accommodating all of the needs of developers under one umbrella has meant adding features that are required only by advanced developers. As a result, one of the major challenges for application server providers will be to encapsulate the J2EE platform in a way that makes all developers productive, even those building simple applications. One promising approach taken by several vendors is to build high-level JSP tag libraries that encapsulate Java platform services for developers accustomed to tag-based languages. Another is to take an existing language that has proved itself in the market and integrate it much more closely so it can take advantage of Java platform services while affording developers the same productivity they've enjoyed in the past.

The other major challenge is one application servers have faced since their origin: the need to incorporate additional technologies as they become part of the developer arsenal. For instance, as portable devices become increasingly connected, their users want access to the same business logic that's now available on the Web. However, since these devices have user interfaces and bandwidth restrictions dramatically different from those of a browser, they require a different set of technologies at the Web application logic layer. Thus we're already seeing a flurry of activity as vendors incorporate technologies like WML (Wireless Markup Language) to support these new types of applications within the application server.
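The multi-channel point can be illustrated with a small, hypothetical sketch: the same stand-in business logic rendered once as HTML for a browser and once as pared-down WML for a handset. The markup shown is schematic rather than a complete, valid WML deck, and all names are invented for the example.

```java
// Hypothetical sketch: one piece of business logic, two presentation
// channels -- HTML for browsers, WML for early wireless handsets.
public class MultiChannelRender {

    // Stand-in business logic shared by both channels.
    static String quote(String symbol) {
        return symbol + ": 42.50";
    }

    static String asHtml(String symbol) {
        return "<html><body><p>" + quote(symbol) + "</p></body></html>";
    }

    static String asWml(String symbol) {
        // WML organizes content as a deck of cards rather than pages,
        // reflecting the tiny screens and low bandwidth of the devices.
        return "<wml><card id=\"q\"><p>" + quote(symbol) + "</p></card></wml>";
    }
}
```

Only the outer rendering layer changes between channels, which is exactly why the pressure lands on the Web application logic layer rather than on the business logic beneath it.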

Along the same lines, many companies are now looking to connect applications built using application servers directly to other application servers, rather than to new users. Thus the product definition is evolving to include new technologies for application integration and B2B messaging. Two promising technologies in this area have received wide support: the Java Message Service (JMS), a Java-based API for asynchronous messaging, and XML-SOAP (Simple Object Access Protocol), an XML-based protocol for invoking applications over HTTP.
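The appeal of XML-SOAP is easiest to see in code: the request is ordinary XML that travels over HTTP and can be handled by any XML parser. The hypothetical sketch below builds a simplified envelope and shows how a receiver would extract the method being invoked; it omits the namespaces and headers a real SOAP 1.1 message carries, and the element and method names are invented for the example.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

// Hypothetical, heavily simplified sketch of the idea behind XML-SOAP:
// a method invocation expressed as plain XML over HTTP.
public class SoapSketch {

    // Caller side: wrap the call in a simplified XML envelope.
    static String buildEnvelope(String method, String arg) {
        return "<Envelope><Body><" + method + "><symbol>" + arg
                + "</symbol></" + method + "></Body></Envelope>";
    }

    // Receiver side: parse the XML and find the method being invoked.
    static String invokedMethod(String envelope) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(
                            envelope.getBytes(StandardCharsets.UTF_8)));
            return doc.getDocumentElement()  // Envelope
                    .getFirstChild()         // Body
                    .getFirstChild()         // method element
                    .getNodeName();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Because both sides speak nothing more exotic than XML and HTTP, two application servers from different vendors can interoperate without sharing an object model, which is precisely the B2B integration scenario described above.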

Future of Application Servers: Internet Business Platforms
As this short history illustrates, application servers have been successful because they evolved with the changing needs of developers, incorporating new technologies (e.g., XML) and supporting new types of applications as they emerged. However, even as application servers continue to change, the market is already entering its next phase - the expansion of the application server into a full Internet business platform (see Figure 1).

Just as developers learned that many of the most common tasks involved in Web development could be packaged in a reusable fashion, companies building their businesses on the Web are recognizing that many of the most common applications can be bought and customized instead of built from scratch. The next few years will see application server vendors building suites of applications that meet the most common needs of businesses.

Obviously, many applications will be built by independent solution providers, and many others will be custom built by companies that use them. However, just as all businesses need systems for managing finances and human resources (the foundations of the ERP market), we're now seeing that businesses using the Internet for strategic advantage will need content management, commerce software and personalization to be successful.

Content management applications will enable developers and business users to coordinate the creation, deployment and syndication of Internet content. Commerce applications will provide the building blocks for doing business on the Internet, whether selling directly to customers or to businesses, participating in online marketplaces or managing relationships with partners and suppliers. Finally, personalization systems will underlie both of these systems, enabling companies to customize their content and applications according to the needs and roles of a particular user.

Since application servers provide the platform for development on the Internet, who better to provide these core applications than the platform providers themselves? They have the technology and expertise, and by combining an integrated set of applications with a standards-based platform they can provide business users with rapid time-to-market without taking away the flexibility and standards support valued by developers. Moreover, since the application server itself is becoming increasingly standardized, vendors need additional ground on which to differentiate themselves.

.   .   .

The application server market has come a long way from its humble origins. Moreover, as the Internet becomes a vital channel for global business, the application server market is poised for even greater change, evolving to incorporate new technologies such as WML and XML-SOAP as well as more application-level services. In an effort to provide developers with a productive way to develop applications on the Internet, the application server has grown from a simple collection of code libraries to a robust, standardized platform for modern business.

More Stories By Phil Costa

Phil Costa is the senior manager of strategic marketing at Allaire Corporation. His responsibilities include marketing for the Allaire Business Platform as well as research into the future directions of the Internet software market.
