From massive mainframes to microservices

Smaller is better has been a guiding philosophy in the world of computers and IT for many years, especially in the hardware space. Those of us who have been around for a while were first-row witnesses to the “Honey, I shrunk the computer” movement, watching as mainframes that consumed a city block morphed into (more powerful) systems that fit on a desktop, which then went portable. (Who else remembers the suitcase-sized “luggable” Osborne and Kaypro machines of the 1980s?)

Next came true laptops, weighing in at less than 10 pounds, and like someone on a crazy fad diet, they kept getting smaller and lighter, culminating in the under-two-pound full-fledged devices of today. My Surface Pro 4, with 16GB of RAM and a Core i7 processor, is many times more capable than the huge and heavy tower for which I paid more than twice the price in the mid-90s.

This miniaturization trend doesn’t stop there. Smartphones are getting more and more functionality, and some people are able to use them, or small tablets, as their only computers. My Samsung Galaxy Note, which (barely) fits in my pocket, has more RAM and storage space than some servers of a decade ago. Next up: smartwatches are bound to get smarter and become more like “real” computers as the technology is refined. Of course, there have also been countertrends that buck the ever-diminishing size pattern. Phones grew tinier and tinier – until they sprouted touch screen displays, at which time they started to grow again and seem to have settled somewhere between 5 and 6 inches.

Overall, though, the idea of making things smaller is a popular one, and now it is beginning to extend to software as well as hardware.

All in the same bloat

For many years, at the same time our devices were getting more compact, our software programs were growing – in some cases, almost uncontrollably. Some viewed the bloat as akin to a cancer that proliferates in code rather than living cells, taking over and crowding out the essential functionalities of a program in favor of more and more features that are actually used by fewer and fewer people. However, that didn’t stop designers and developers, who had their own pet features, from throwing in everything but the kitchen sink in the effort to differentiate a new version of software from its predecessors and justify the cost of upgrading.


The problem with bloat is twofold: it makes applications unnecessarily complicated, because users have to slog through all those unneeded features to find the ones they actually want to use, and all the extra code makes the software sluggish. There is even a well-known adage, Wirth’s Law, which observes that software is getting slower more rapidly than hardware is getting faster. That means higher hardware costs, as we have to buy ever more powerful machines just to stay ahead of the game (or at least break even).

According to some experts, if the feature list won’t fit on one page, there are so many settings and preferences that a manual is required to set up the program, and the application contains millions of lines of code, it’s probably bloatware.

Why does software get bloated? As companies attempt to improve software, incorporate customer requests for new features, and include security mechanisms to protect against more and more threats and attack types, a once lean-and-mean program “just grows that way.”

Of course, we’ve all seen the pendulum swing that resulted from users and administrators having to deal with bloated software that has become user-unfriendly and requires vast amounts of memory and storage space for installation: the “apps” phenomenon. Apps are small programs that perform only one or a few tasks and do so quickly, efficiently and with minimal user training required. Apps originated in smartphones and tablets, which have limited resources. Necessity being the mother of invention, developers learned to create simple apps with small footprints.

Users love apps because when done right, they just work. Their interfaces are intuitive, you don’t spend hours – or days – trying to figure out how to use them, and they accomplish what you want done, no more and no less. It also doesn’t hurt that the cost of a typical app ranges from nothing (free apps) to twenty dollars, whereas desktop productivity software can cost in the hundreds for a single license.

Apps have become so popular that the concept has moved from mobile devices to the PC. Microsoft introduced Modern apps – formerly known as Metro apps – in Windows 8 and has refined the idea somewhat in Windows 10 with its “universal” apps model that will run on smartphones, tablets, laptops and desktop computers. Although many of us still depend on our “big” programs such as the Microsoft Office suite and graphics applications such as Adobe Photoshop or CorelDraw to do complex work, apps are slowly making inroads for performing quick tasks, especially when we need to access information or communicate with others via a phone or tablet.

Software becomes a service

At the same time that the applications we install on our machines have been shrinking into inexpensive and simple apps, another trend in software sees us using applications that aren’t installed locally at all. Software as a Service (SaaS), which has been marketed unsuccessfully for decades, has finally caught fire.

SaaS addresses one of the problems that bloated software caused: the issue of disk space for installing a multiplicity of applications that each needed to use up gigabytes of drive space. Once you factored in the temp files and data files that were created by the programs, you could find yourself running low on space even with a terabyte-sized hard drive. Coupled with the move toward faster, non-moving-parts solid state drives (SSD), which cost more per unit of storage, it could be a real problem.

Software that’s delivered as a service that runs on a remote server somewhere in the cloud rather than on each individual PC can greatly reduce the hardware system requirements for users’ computers and allow you to work with complex programs via low-resource devices. Thus, SaaS miniaturizes the software footprint on your local computer and lowers hardware costs, along with shifting the administrative burdens of maintaining, updating, and troubleshooting the software to the SaaS provider.

We created a monolith

Simply running applications and services on somebody else’s hardware, however, doesn’t really do anything to address the existence of big, unwieldy programs. It merely makes them somebody else’s problem. Developers had to figure out how to reduce the real size of services they were delivering – without simplifying those services to the point where users couldn’t get the things done that they wanted to accomplish.

A big part of the reason software bloat was so difficult to do anything about was that programs tended to be monolithic – written as one big entity. Software makers had already attempted to tackle this problem in their largest software products: operating systems. The solution that was put forth by Microsoft and other OS vendors was to design operating systems around a modular model instead of a monolithic one. Modular programming is based on the practice of dividing software into parts, or modules, that can be changed out, removed, or replaced without much impact on the other parts.
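As a toy illustration of that modular idea (a sketch, not drawn from any particular operating system – the `Storage` interface and class names here are invented for the example), the key is that the rest of the program depends only on a shared contract, so one module can be swapped out without touching anything else:

```python
# Minimal sketch of modular design: components depend only on an
# abstract interface, so one implementation can be replaced without
# changes rippling through the rest of the program.
from abc import ABC, abstractmethod


class Storage(ABC):
    """The contract every storage module must honor."""

    @abstractmethod
    def save(self, key: str, value: str) -> None: ...

    @abstractmethod
    def load(self, key: str) -> str: ...


class InMemoryStorage(Storage):
    """One interchangeable module; a disk- or cloud-backed one
    could be dropped in later with no other edits."""

    def __init__(self):
        self._data = {}

    def save(self, key, value):
        self._data[key] = value

    def load(self, key):
        return self._data[key]


class App:
    """The rest of the program sees only the Storage interface."""

    def __init__(self, storage: Storage):
        self.storage = storage

    def remember(self, key, value):
        self.storage.save(key, value)

    def recall(self, key):
        return self.storage.load(key)


app = App(InMemoryStorage())  # swap in another Storage module here
app.remember("greeting", "hello")
print(app.recall("greeting"))  # prints: hello
```

The point is the seam: `App` never learns which module is behind `Storage`, which is exactly what lets a vendor change out or replace one part without disturbing the others.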

That basic premise is in use in many areas other than software design: most are familiar with modular office design (also known as cubicles), which preceded the current “open office” trend. Some may be aware of the modular design of the U.S. Navy’s relatively new class of ships, called Littoral Combat Ships or LCS, which can be reconfigured for different roles (intel, surveillance, anti-submarine or anti-surface warfare, intercept, special ops, etc.) unlike older ships that were designed for a single purpose.

When it comes to creating software and services, this modular design concept has morphed into what we now know as microservices architecture.

Microservices to the rescue

Microservices are separate processes that work together and communicate with one another, taking the modular idea a step further by using non-proprietary protocols. A key factor in microservice architecture is to keep the individual services small, with responsibilities distributed across a number of services.


The services can be written in different programming languages and utilize different hardware platforms and software environments, but they interoperate. It’s easy and fast to make changes to one part of the application by modifying or replacing only the service that provides that particular function (for example, the user interface).
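To make the idea concrete, here is a hedged sketch of two cooperating services talking over a non-proprietary protocol – plain HTTP with JSON. Both run in one Python process for demonstration purposes; in a real deployment each would be its own process, possibly in a different language. The service names, ports, and endpoints are invented for illustration:

```python
# Two tiny services communicating over plain HTTP + JSON.
# Each has a single, narrow responsibility.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


class GreetingService(BaseHTTPRequestHandler):
    """Responsibility: produce a greeting, nothing more."""

    def do_GET(self):
        body = json.dumps({"greeting": "hello"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo output quiet
        pass


class FrontendService(BaseHTTPRequestHandler):
    """Responsibility: compose a page by calling the other service."""

    def do_GET(self):
        # Cross-service call over standard HTTP, not a shared library.
        with urlopen("http://localhost:8901/") as resp:
            greeting = json.load(resp)["greeting"]
        body = json.dumps({"page": greeting + ", world"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass


def serve(handler, port):
    server = HTTPServer(("localhost", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


greeter = serve(GreetingService, 8901)
frontend = serve(FrontendService, 8902)

with urlopen("http://localhost:8902/") as resp:
    page = json.load(resp)["page"]
print(page)  # prints: hello, world

greeter.shutdown()
frontend.shutdown()
```

Because the two services share only an HTTP contract, the greeting service could be rewritten in another language, or replaced entirely, and the frontend would never notice – which is the property that makes changing one part of a microservices application so fast.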

It’s important to note that miniaturization, like everything else, can reach a point of diminishing returns – both in the hardware and software realms. Just as cell phones reached a point where their tiny size made them more difficult to use, or as the extreme miniaturization of microSD cards makes them more prone to loss, the microservices architecture can be taken to an extreme that renders it so finely grained that the overhead offsets the advantages. This extreme has been dubbed “nanoservices” by critics.

The microservices architecture is currently in use by tech giants such as Amazon, Google, and Microsoft in their clouds, and we’re sure to see and hear more about microservices in the near future. For more information about this new development model, see Microsoft’s perspective, “Why a microservices approach to building applications,” on the Azure web site.

Photo credits: Timitrius and Pixel Fantasy.
