The Evolution of App Deployment and Packaging: 1990 to Now

Overview

While app infrastructure and architecture were evolving, so was the face of software deployment and packaging. Over the past three decades, app deployment has become more efficient, a must in a world that increasingly lives and works online. The evolution of deployment and packaging since 1990 has ultimately led to improved efficiency, scalability, and profitability.

Achieve Cloud Elasticity with Iron

Is your company looking to save money? Contact us to find out how IronWorker can help.


The Evolution of Deployment and Packaging—It Starts with Physical Machines

Thirty years ago, there was one way to deploy software: directly to a physical machine. Whether that machine was a personal computer or a server, the software ran directly on the operating system. This approach had several drawbacks:

  • Development (and use) required physical access.
  • Software was limited by the machine's physical resources.
  • Resource-intensive apps could impact the performance of other apps.

If developers wanted to create apps that worked on different setups, they had to program multiple versions of the same app and often needed direct access to the machine to install the software. It's clear how this could be costly, both in time and money. Even after a successful deployment, users could only access the software from the physical machine or, perhaps, over a network connection.

Another downside to apps that ran directly on the physical machine was how much impact the software could have on that machine, whether by hogging resources such as processing power and RAM or by causing problems through glitches and other unforeseen interactions with the operating system.

Finally, the software was limited by the server's upper bounds. It wasn't always cost-effective to upgrade hard drive capacity or memory, and it wasn't always possible to upgrade the processor. As a result, apps might struggle or fail to work entirely when approaching those limits.

1990 Introduces Virtualization

The next era involved placing a layer between the operating system and the software to increase convenience and reduce the impact the software could have on the system. This was known as virtualization. Apps weren't quite neatly bound within containers, but they weren't running rampant on servers, either. Virtualization also offered a way to test software before all the kinks were worked out.

While apps themselves could be virtualized, so could the entire operating system. For example, a virtualized Linux operating system could be created to run Linux-based apps, while the same computer could include a virtualized Windows box for software that simply couldn't run on Linux. Both of those virtualized boxes could potentially run on the same server. However, the more virtualization environments, the more the physical machine was taxed.
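
As a rough illustration of the idea (using the open-source QEMU emulator as a modern stand-in, with hypothetical disk image names and resource sizes), two guests could be booted side by side on one physical host:

    # Boot a Linux guest with 2 GB of RAM and 2 virtual CPUs
    qemu-system-x86_64 -name linux-guest -m 2048 -smp 2 -hda linux-guest.img

    # Boot a Windows guest on the same host, drawing on the same physical hardware
    qemu-system-x86_64 -name windows-guest -m 4096 -smp 2 -hda windows-guest.img

Each guest sees its own virtual hardware carved out of the host's real resources, which is exactly why adding more virtualized environments taxes the physical machine.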

Virtualized software could run on a company's own computers or on machines owned by other companies and leased for a small fee. This allowed some businesses to reduce their on-site IT costs and labor. It also led to an increase in data centers, and the increasing speed of Internet connections is what made this kind of remote virtualization practical.

An added benefit of virtualization was the ability to combine resources to get around the limitations of a single machine. Virtualization allowed several machines to pool their hard drive space, memory, and processing power. In fact, virtualization is the foundation upon which the cloud was built. This brings us to the current era of software deployment and packaging.


Iron.io Serverless Tools

Speak to us to learn how IronWorker and IronMQ are essential products for making your application cloud elastic.

2000: Containerization and Beyond

The most recent development in software deployment is the container. A container is a controlled environment in which software runs, and it offers several advantages:

  • Convenience
  • Safety
  • Remote access

First, because the software only needs to run inside the container, it can easily be deployed to various environments and perform reliably in them. This means that developers don't have to learn how different environments or operating systems work in order to program apps for those contexts, which makes both initial deployment and maintenance more efficient. Scalability has also increased with containerization. Because of all this, containerization has increased ROI virtually across the board, and developers can bring apps to market much more quickly.

Furthermore, software that's nestled safely inside a container cannot directly impact the greater environment. In this way, containers act as a form of damage control, whether an app goes rogue or simply uses too many resources. In fact, much software exists in containers in the cloud, which makes it accessible remotely.

App deployment and maintenance can often be done remotely as well. Companies now rely on rented cloud resources for both computing and storage, which has led to the rise of everything-as-a-service subscriptions such as Platform as a Service (PaaS). In short, containers allow developers and companies to do more with less.
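
To make this concrete, here is a minimal sketch of a container image definition using a Dockerfile; the base image, app layout, and port are hypothetical, but the same image would run unchanged on a laptop, an on-premises server, or a cloud host:

    # Start from a small, widely available base image
    FROM python:3.11-slim

    # Copy the application code into the image
    WORKDIR /app
    COPY . /app

    # Install dependencies inside the container, not on the host
    RUN pip install -r requirements.txt

    # Document the port the app listens on and define how it starts
    EXPOSE 8080
    CMD ["python", "app.py"]

Everything the app needs travels with the image, which is what makes deployment to different environments so predictable.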

Containers are sometimes proprietary. For example, Google used Borg internally for years. Others are released for anyone to use: Borg eventually morphed into the open-source Kubernetes. At Iron.io, we've created solutions that work with another container solution, Docker.
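
Assuming Docker is installed and the sketch above is saved as a Dockerfile in the current directory, the same two commands would build and run the image on any machine with a Docker engine (the image name and port mapping here are illustrative):

    # Build the image once
    docker build -t myapp:latest .

    # Run it anywhere Docker is available, mapping the container's port to the host
    docker run --rm -p 8080:8080 myapp:latest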

Unlock the Cloud with Iron.io

Do you want to learn more about how Docker containers can help you? Contact us today!
