
Written by Bill Burnham
Bill Burnham is CTO, US Public Sector at Hewlett Packard Enterprise. Prior to joining HPE, he served as CTO for the United States Special Operations Command, overseeing the enterprise architecture and technical modernization of SOCOM’s global network supporting users in more than 90 countries.
For much of the past decade, commercial and public sector organizations have moved a growing portion of their IT portfolio to the cloud. And for good reason. The cloud has given businesses not only the ability to use on-demand IT services, but also a more flexible financial model to modernize their infrastructure and applications.
But this financial model has a problem: while storage and compute in the cloud offer huge economies of scale, the costs of moving petabytes of data into and out of commercial clouds have grown dramatically, costing far more than most organizations anticipated. As a result, many large organizations, including Dropbox, which was built in the cloud, have taken the dramatic and seemingly paradoxical step of "repatriating" their data, according to a recent analyst report from Andreessen Horowitz, the Silicon Valley venture capital firm.
What also pushes companies to take this step is that they can now build and operate their own private clouds and deliver the same elastic cloud capabilities, with the same on-premises pay-per-use consumption model, that they get from the big cloud service providers. The big differences: they don't pay data ingress and egress charges; they don't have to move petabytes of data, with the latency that creates; and their data can remain secure in their own data center, eliminating the risk posed to data in transit.
For state and local government agencies, the savings can be significant. Suppose your agency contracts with one of the large cloud service providers and their data egress or transaction fees average 3 cents per gigabyte. That doesn't sound like much. But if you pull a petabyte of data out two or three times a week at $30,000 per petabyte, say, to look for potential unemployment claims fraud, the bills add up astronomically. Or say you want to move 20 petabytes of data to another cloud provider, or bring it back on premises. That's $600,000 that could be better spent on upgrading your applications.
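As a back-of-the-envelope sketch, that arithmetic looks like this in a few lines of Python. The 3-cents-per-gigabyte rate and the petabyte volumes are the illustrative figures from the example above, not any particular provider's price list:

```python
# Back-of-the-envelope egress cost estimate using the article's illustrative figures.
# The rate and volumes are assumptions for illustration, not actual provider pricing.

EGRESS_RATE_PER_GB = 0.03   # 3 cents per gigabyte
GB_PER_PB = 1_000_000       # decimal petabyte

def egress_cost(petabytes: float, rate_per_gb: float = EGRESS_RATE_PER_GB) -> float:
    """Estimated cost to move the given volume of data out of the cloud."""
    return petabytes * GB_PER_PB * rate_per_gb

# Pulling 1 PB out for a fraud-detection scan: about $30,000 per pass
print(f"Single 1 PB scan: ${egress_cost(1):,.0f}")

# Three scans per week, sustained over a year
weekly_scans = 3
print(f"Annual cost at {weekly_scans} scans/week: ${egress_cost(1) * weekly_scans * 52:,.0f}")

# One-time move of 20 PB to another provider or back on premises: about $600,000
print(f"Repatriating 20 PB: ${egress_cost(20):,.0f}")
```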
That doesn’t mean businesses are moving away from commercial cloud service providers, and neither should they. After all, cloud providers offer a level of flexibility, reliability, and security that has proven to be a game-changer for almost any business.
But as Target CIO Mike McNamara described in a recent interview, it is simply more cost-effective to create an advanced hybrid cloud environment. That means running most of a company's IT workloads on premises and relying on commercial clouds when and where it makes more strategic sense, such as, in Target's case, Cyber Monday and Christmas week, when transaction volumes can run 20 times higher than in a typical week.
Alternative approach
The challenge that most state and local agencies face, of course, is budget constraints. In the corporate world, IT investments can generally be associated with revenue generation and profits, while in the public sector, IT costs are seen only as an expense to be managed.
But another challenge is the tendency of executives to associate the "cloud" with a place or a company, rather than a style of computing. As the Cloud Native Computing Foundation aptly explains: "Cloud native computing uses an open source software stack to deploy applications as microservices, packaging each part into its own container and dynamically orchestrating those containers to optimize resource utilization."
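To make that definition concrete, here is a minimal sketch of what one such microservice might look like, using only Python's standard library. The service name and endpoint are hypothetical, purely for illustration; in a cloud native deployment this process would be packaged into its own container image, and an orchestrator would start, stop, and scale replicas of it on demand:

```python
# Minimal sketch of a single microservice: one small, independently deployable unit
# exposing one narrow capability over HTTP. Names and the /health endpoint are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ClaimsStatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            # A real service would own one business capability; here we just report liveness.
            body = json.dumps({"service": "claims-status", "status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Packaged in a container, this process would be one replica among many,
    # placed and scaled by an orchestrator rather than run by hand.
    HTTPServer(("0.0.0.0", 8080), ClaimsStatusHandler).serve_forever()
```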
The future of enterprise IT builds on this model in the form of a hybrid cloud environment that lets organizations run elastic workloads at the edge, in the core data center, or on a cloud service provider's infrastructure, capitalizing on the power and portability of containers and microservices.
Broadening our view of cloud computing in this way also opens up alternative approaches to the financial challenges that are slowing modernization.
Consider this example: HPE currently has approximately 1,100 enterprise customers who have modernized their on-premises IT operations by upgrading to the latest available hardware and moving to a pay-per-use consumption model. We offer a cloud native environment but charge for it as a service, based only on what you use, just as you do with commercial cloud providers, except without the significant transaction fees or the requirement to move your trove of data off site.
This frees up the capital that agencies would normally spend on refreshing their infrastructure, allowing those funds to be directed toward modernizing software workloads and refactoring applications into cloud native services.
Suppose you refresh 20% of your hardware every year on a budget of $1.5 million per year. Take those funds, allocate $300,000 to pay for infrastructure consumption, and spend the rest hiring Python coders to modernize your applications over the next two years. Ideally, by the third year you won't need $1.5 million for another round of infrastructure upgrades, because modern, containerized cloud native applications spin up only when needed and don't require as much infrastructure as traditional applications built on virtual machines. You'll always have the latest infrastructure available, and more importantly, you'll have greater flexibility and cost control as you manage your business-critical IT workloads.
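A simple sketch of that reallocation, using the illustrative figures above (the $1.5 million refresh budget and $300,000 consumption payment are assumptions from the example, not a quoted price):

```python
# Sketch of the budget reallocation described above, using the article's illustrative figures.

ANNUAL_REFRESH_BUDGET = 1_500_000   # former yearly hardware refresh spend
CONSUMPTION_PAYMENT = 300_000       # assumed yearly pay-per-use infrastructure cost

for year in (1, 2):
    modernization_funds = ANNUAL_REFRESH_BUDGET - CONSUMPTION_PAYMENT
    print(f"Year {year}: ${CONSUMPTION_PAYMENT:,} on infrastructure consumption, "
          f"${modernization_funds:,} for developers refactoring apps into cloud native services")

# By year 3, the expectation is that containerized workloads need far less hardware,
# so another $1.5M refresh cycle is no longer required.
```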
There is a bigger reason, however, to adopt a more advanced model of hybrid cloud computing, beyond the additional cost control it offers.
As organizations enter the data-driven knowledge age, where artificial intelligence and machine learning platforms will consume ever larger volumes of data, it will become prohibitively expensive and less and less practical to move all of that data to and from commercial cloud providers.
At the same time, as cities and states continue to develop and integrate their operational technology systems to manage buildings, traffic, utility capacity, emergency response and countless other public services, they will have to handle increasingly large data workloads at the edge. That means a growing amount of data, by its very nature, will need to be processed far from the cloud service provider's infrastructure: managing smart cities will require immediate insight, and the datasets will be too large to be moved. High-performance computing can now be placed at the edge of your environment, however, while still operating under a pay-as-you-go consumption model.
Going forward, agencies need to move beyond the legacy concept of 'cloud migration' and plan for a broader hybrid cloud computing environment, one that spans from edge to core to one or more cloud service providers and is available entirely "as a service."
We've been watching, and helping drive, this development for some time in the commercial and government markets, where the edge-to-core-to-cloud platform-as-a-service model called GreenLake now represents more than $5 billion in business and continues to grow. The National Security Agency's recent $2 billion award to HPE for complete large-scale data centers, delivered under a GreenLake consumption model, is proof of the solution's maturity and of the trust we have earned working with government agencies and their data.
This should give IT leaders at state and local government agencies confidence not only in the direction hybrid cloud computing is taking, but also in the financial mechanisms available to modernize faster and more cost-effectively.
Learn more about how GreenLake, HPE's edge-to-core-to-cloud platform-as-a-service model, helps businesses manage their IT needs more cost-effectively.