Cloud vs. On-Premise: Where to Deploy AI Applications
AI applications are no longer exclusive to tech giants like Facebook or Google. Increased accessibility to storage technologies and GPUs is enabling the masses to access AI capabilities, like machine learning and robotic process automation, with the swipe of a credit card. But where exactly should your organization store the massive amount of data often needed for AI initiatives?
As organizations of all sizes begin exploring AI applications, cloud vs. on-premise deployment will become a pivotal issue for IT service providers and other channel partners.
The question that every company must ask as they tackle AI is “where do we fall on the AI continuum?” Answering this question can help you determine whether you’ll require on-prem data center infrastructure to support AI needs or whether consuming prebuilt models in the cloud can help you reach your objective.
Cloud vs. On-Premise Hosting for AI Applications
The differences between cloud and on-prem hosting are often compared to renting vs. buying a home. Cloud hosting is a lot like renting: AI applications stay only as long as the contract terms dictate, and maintenance of the hardware is the hosting provider's responsibility. On-premise hosting, on the other hand, is like buying a home: the application can stay on the hardware for as long as the business requires. Deciding between the two depends on a few factors:
Cost: On-premise deployment eliminates recurring contract renewals, which reduces usage costs over time, though in-house data center maintenance costs can rise as a result. In the cloud, pay-as-you-go options keep costs low while experimenting with AI, but those recurring fees can mount over time.

Scalability: On-premise hosting offers complete control over the hardware, so a company's administrators can tightly control updates. However, scaling on-premises requires advance planning, because procuring and provisioning new hardware takes time. Cloud resources, by contrast, can be adjusted rapidly to accommodate specific demands, though the additional software layers in cloud stacks can erode some of that scalability.

Security: On-premises, the enterprise retains full control over data stored on its premises; no third party has access unless systems are breached. In the cloud, hosting providers must keep systems updated and data encrypted to avoid breaches, yet a company cannot always verify where its data is stored or how often it is backed up, and the data remains accessible to the provider.

Data location: Given the cost of transferring the massive datasets used to train neural networks, companies may prefer to deploy AI applications on-premises if the data already resides there. Conversely, if the data required to build AI applications lives in the cloud, it is usually best to deploy the applications there.
The location of an enterprise's largest source of data tends to determine the location of its most critical applications, a phenomenon known as "data gravity": the ability of data to attract applications, services, and other data toward itself. It is among the most important factors to weigh when choosing between cloud and on-premise platforms.
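The data-gravity point above is ultimately an economic one. A back-of-the-envelope sketch makes it concrete; note that the per-GB rate and dataset size below are hypothetical assumptions for illustration, not actual provider pricing.

```python
# Back-of-the-envelope data-transfer cost estimate for the "data gravity"
# argument: moving a large training dataset across a cloud boundary has a
# per-GB price. All figures are hypothetical assumptions, not real quotes.

def transfer_cost_usd(dataset_tb: float, egress_per_gb: float = 0.09) -> float:
    """Cost to move a dataset out of (or into) a cloud at a flat per-GB rate."""
    return dataset_tb * 1024 * egress_per_gb

# Moving an assumed 50 TB training set at an assumed $0.09/GB rate:
print(f"${transfer_cost_usd(50):,.0f}")  # prints "$4,608"
```

At that scale, repeatedly moving data toward remote compute quickly dwarfs other costs, which is why applications tend to follow the data rather than the reverse.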
The Case for Cloud-Based AI Applications
Cloud-based AI services are an ideal solution for many organizations. Instead of building out a massive data center to gain access to compute, you can use the infrastructure someone else already maintains. In fact, one reason why AI has become so pervasive is cloud providers offering plug-and-play AI cloud services, as well as access to enough compute power and pre-trained models to launch AI applications. This significantly reduces the barriers to entry.
But be aware that in many cases the pre-trained models or storage requirements of the cloud can be cost-prohibitive; higher GPU counts get expensive fast, and training on large datasets in the public cloud can be too slow. Still, the cloud is often the best option for "testing the waters" of AI and experimenting with which initiatives work best for an organization.
The Case for On-Premise AI
So then what moves customers on-premise?
There's a whole ecosystem of tools built for on-premise infrastructure that can work with massive amounts of compute power, which can be very expensive in the cloud. Thus, some IT directors find it more economical to run these workloads on-premises, or simply prefer a capital expense to an operational expense model. Furthermore, if your organization decides to go deeper into AI or roll it out at scale, it may make more sense to invest in on-premises infrastructure than to keep consuming cloud-based services.
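The capex-vs-opex trade-off above is easy to sketch as a break-even calculation. Every number here is a hypothetical assumption, not a quote from any provider or vendor; the point is only that sustained, heavy GPU utilization shifts the economics toward buying.

```python
# Illustrative break-even comparison between renting cloud GPUs (opex) and
# buying on-premises hardware (capex). All dollar figures are hypothetical.

def breakeven_months(server_capex: float, monthly_upkeep: float,
                     cloud_monthly: float) -> float:
    """Months of steady use after which buying beats renting."""
    return server_capex / (cloud_monthly - monthly_upkeep)

# Assume a $120,000 GPU server with $2,000/month power and maintenance,
# versus an $8,000/month cloud GPU bill at the same utilization:
months = breakeven_months(120_000, 2_000, 8_000)
print(f"{months:.0f} months")  # prints "20 months"
```

Under these assumed numbers, steady use beyond roughly two years favors the capital purchase; intermittent or experimental use favors the cloud, which matches the "testing the waters" guidance earlier in the article.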
Challenges to Implementing AI
As you evaluate how to implement AI initiatives, it’s important to remember that much of the data driving AI is actually siloed in legacy infrastructure and is not necessarily in the right format nor is it easily accessible. What’s more, there’s a massive amount of unstructured data to be processed! And in many cases, that data has grown beyond a company’s infrastructure.
Dealing with much larger data sets demands more computation and more sophisticated algorithms; in truth, most of your time could be spent cleaning data, de-identifying it, and getting it to a point where it can yield insights. And once you do decide where to deploy AI applications, giving engineers and data scientists access to that data can be a challenge in itself.
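The de-identification step mentioned above can be as simple as replacing direct identifiers with one-way pseudonyms before data reaches engineers and data scientists. Here is a minimal sketch; the field names, records, and salt are illustrative assumptions, and real projects would follow a formal de-identification standard.

```python
# Minimal de-identification sketch: replace direct identifiers with salted,
# truncated SHA-256 pseudonyms while passing other fields through untouched.

import hashlib

SALT = b"rotate-me-per-project"  # hypothetical per-project secret salt

def pseudonymize(value: str) -> str:
    """One-way pseudonym for a direct identifier (name, email, etc.)."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:12]

def deidentify(record: dict, id_fields=("name", "email")) -> dict:
    """Return a copy of the record with identifier fields pseudonymized."""
    return {k: pseudonymize(v) if k in id_fields else v
            for k, v in record.items()}

row = {"name": "Ada Lovelace", "email": "ada@example.com", "age_band": "40-49"}
clean = deidentify(row)
# Identifiers become opaque tokens; non-identifying fields pass through.
```

Because the same input always maps to the same token, records can still be joined across tables for analysis without exposing the underlying identities.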
However, if there’s one thing that is certain, it’s that AI will be driving innovation and competitive differentiation for years to come. So deciding on the location of infrastructure for training and running a neural network for AI is a very big decision that should be made with a holistic view of requirements and economics.
Source: Brandon Ebken and Juan Orlandini. “AI Applications: Cloud vs. on-Premises Deployment Options.” SearchITChannel, TechTarget, 3 Apr. 2019, searchitchannel.techtarget.com/tip/AI-applications-Cloud-vs-on-premises-deployment-options.