We’ve long been comparing and contrasting cloud-based infrastructure with the infrastructure sold by traditional enterprise hardware and software players. Now some hard pricing data is starting to reshape that comparison.
The Bureau of Labor Statistics Producer Price Index reported a 3.9% month-over-month decline in the cost of host computers and servers, meaning that the hardware has gotten cheaper. At the same time, cloud services saw prices increase by 2.3% since the third quarter of 2022. The reality is cloud computing may not be the slam dunk we once thought, at least in this moment.
It’s not just the cost
Of course, you should look at more than cost when considering any technology. For instance, open source software is free, but there are many times when you should purchase expensive licenses because of the total value a specific technology can generate. Picking the cheaper option just because it’s cheaper is the very definition of penny-wise and pound-foolish.
That said, if you compare apples to apples, such as object storage in the cloud versus object storage in the data center, the value that each type of technology can generate is relatively equal, but the prices are not.
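To see what an apples-to-apples comparison might look like in practice, here is a minimal back-of-envelope sketch in Python. Every figure in it is a placeholder assumption rather than a quoted price; the point is the shape of the comparison (list price plus egress for cloud object storage, amortized hardware plus an operational overhead multiplier for on-premises), which you would fill in with your own vendor quotes and internal cost data.

```python
# Illustrative back-of-envelope comparison of object storage costs.
# All figures below are placeholder assumptions, not quoted prices;
# substitute your own vendor quotes and internal cost data.

def cloud_cost_per_gb_month(list_price_per_gb_month: float,
                            egress_per_gb: float,
                            monthly_egress_fraction: float) -> float:
    """Effective monthly cost per GB stored in a cloud object store."""
    return list_price_per_gb_month + egress_per_gb * monthly_egress_fraction

def onprem_cost_per_gb_month(hardware_cost_per_gb: float,
                             service_life_months: int,
                             ops_overhead_factor: float) -> float:
    """Amortized monthly cost per GB for on-premises HDD-backed storage.

    ops_overhead_factor folds power, space, and admin labor into a
    multiplier on the amortized hardware cost.
    """
    return (hardware_cost_per_gb / service_life_months) * ops_overhead_factor

if __name__ == "__main__":
    cloud = cloud_cost_per_gb_month(
        list_price_per_gb_month=0.023,   # hypothetical list price
        egress_per_gb=0.09,              # hypothetical egress rate
        monthly_egress_fraction=0.10,    # assume 10% of data read out monthly
    )
    onprem = onprem_cost_per_gb_month(
        hardware_cost_per_gb=0.015,      # hypothetical raw HDD cost per GB
        service_life_months=60,          # 5-year amortization
        ops_overhead_factor=4.0,         # assumed overhead multiplier
    )
    print(f"Cloud:   ${cloud:.4f} per GB-month")
    print(f"On-prem: ${onprem:.4f} per GB-month")
```

Even a crude model like this shows how quickly the answer flips when egress patterns or amortization assumptions change, which is exactly why the comparison has to be made case by case rather than by reflex.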
Public cloud computing prices have been creeping up because providers are for-profit companies that need to show a return. Running a public cloud service is costly, and the billions invested over the past 12 years must eventually pay off for investors. That’s why prices have been increasing, not to mention that cloud providers now bundle additional value, such as integrated AI, finops tooling, and managed operations.
At the same time, the cost of producing hardware, such as traditional HDD storage, has dropped far enough to muddy the calculation. On-premises storage is now a viable alternative to cloud-based storage systems, so picking cloud computing over traditional hardware is no longer a quick decision.
What this means for enterprises
I’ve never trusted platform decisions that seem to have a religious undertone. I’ve seen people who are singularly devoted to open source, to the cloud, to specific architectures (e.g., cloud native and microservices), or to other hype-driven trends. In many instances they’re putting feelings over facts and could be buying technology that is not optimal for their specific use case.
Of course, you can fit a square peg into a round hole if you use a hammer. Poor architectural decisions are often overlooked because the end-state solution “works.” It may cost you $10 million more than a better-optimized solution, but perhaps nobody will notice. I’m seeing too many of these to count.
Again, this is about being entirely objective when looking at all potential solutions, including cloud and on-premises. Cost being equal, cloud computing will be the better choice nine times out of 10, but now that the prices are very different, that may not be the case.
If you’re the person making these calls, you must weigh all aspects of these solutions, including how your requirements are likely to change over time. A particular solution could provide better business value in the long run despite its higher cost. As I mentioned, there are many reasons to pick the more costly technology.
I suspect that the cost of traditional hardware will drop even further, and the trade-offs between cloud computing and on-premises will become even less clear-cut. Determining which solution brings the most value back to the business will take real work.
This makes the role of the architect much more critical and emphasizes the need for objective decisions. Given these business-case considerations, we’ll likely end up with more on-premises systems than expected. I think that’s just fine as long as it’s the solution that brings the most value back to the business. Easy enough.