
Kubernetes use cases extend beyond container orchestration

Enterprises use Kubernetes for much more than they did when Google released version 1.0 of the platform in 2015. Discover how Kubernetes' uses have grown, and where it might be heading.

While Kubernetes has become the industry standard for container management, some enterprises now apply the technology to a broader range of use cases.

Kubernetes, a Cloud Native Computing Foundation project, supports a range of IT operations needs beyond container orchestration, including those related to multi-cloud deployments, service discovery and serverless platforms.

Explore a mix of mainstream and emerging Kubernetes use cases with these recent SearchITOperations articles.

Orchestrate containerized environments

The primary Kubernetes use case remains container management and orchestration. More specifically, Kubernetes helps IT admins automate container deployment, scaling and scheduling across clusters.
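For a sense of what that automation looks like in practice, here is a minimal sketch that uses the official Kubernetes Python client to create a small deployment and then scale it. The image, object names and replica counts are illustrative only, not a recommended configuration.

# Minimal sketch: create a deployment, then scale it up.
# Names, image and replica counts are illustrative.
from kubernetes import client, config

config.load_kube_config()  # reads the local kubeconfig, as kubectl does
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="web"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "web"}),
            spec=client.V1PodSpec(
                containers=[client.V1Container(name="web", image="nginx:1.25")]
            ),
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=deployment)

# Scaling is a patch against the same object; the scheduler then places
# the additional pods across the cluster's available nodes.
apps.patch_namespaced_deployment_scale(
    name="web", namespace="default", body={"spec": {"replicas": 5}}
)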

Even though Kubernetes reigns supreme as the enterprise container orchestration platform, it is not without its challenges. As TechTarget executive editor Meredith Courtemanche reports, some organizations continue to face difficulties with Kubernetes usage in production, especially around security and monitoring.

Support data-intensive apps

Kubernetes -- and containers in general -- weren't always associated with data-intensive or stateful applications. Recent advancements in the market, however, have started to change that.

As TechTarget senior news writer Beth Pariseau writes, Google Cloud unveiled a service in September that runs Apache Spark -- a parallel processing framework for large-scale data analytics apps -- on Kubernetes. The managed service, called Cloud Dataproc for Kubernetes, will enable IT ops teams to run Spark jobs via Google Kubernetes Engine (GKE). Several areas of the service, currently in alpha, still need improvement, such as differences between Spark and Kubernetes scheduling functionality.
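Under the hood, open source Spark can already target a Kubernetes cluster directly through spark-submit. The sketch below shows that general mechanism, with the API server address, container image and example job as placeholders; it illustrates Spark on Kubernetes generically, not the Cloud Dataproc for Kubernetes interface itself.

# Sketch: submit a Spark job with Kubernetes as the cluster manager.
# The API server URL and container image are placeholders.
import subprocess

subprocess.run(
    [
        "spark-submit",
        "--master", "k8s://https://<kubernetes-api-server>:6443",
        "--deploy-mode", "cluster",
        "--name", "spark-pi",
        "--class", "org.apache.spark.examples.SparkPi",
        "--conf", "spark.executor.instances=3",
        "--conf", "spark.kubernetes.container.image=<spark-image>",
        "local:///opt/spark/examples/jars/spark-examples.jar",
    ],
    check=True,
)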

Other vendors, including Robin Systems, have taken steps to expand Kubernetes' support for data-intensive and stateful apps. The company offers a hyper-converged Kubernetes platform designed for big data and machine learning workloads.

Streamline multi-cloud management

Kubernetes' management capabilities extend from on-premises environments to the public cloud, including across multiple public clouds. As TechTarget contributor Chris Tozzi explains, an enterprise can use the orchestration platform to manage containers that span multiple clouds, which increases resiliency and configuration options.

In this Kubernetes use case, for example, an enterprise could have nodes in two public clouds, or even nodes in both private and public clouds, and use only Kubernetes for orchestration.
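As a rough illustration, the sketch below uses the Kubernetes Python client to group one cluster's nodes by a custom cloud label that admins would apply as nodes from different providers join. The label key is an assumption for the example, not a Kubernetes default.

# Sketch: group a cluster's nodes by an assumed provider label.
from collections import defaultdict
from kubernetes import client, config

config.load_kube_config()
v1 = client.CoreV1Api()

nodes_by_cloud = defaultdict(list)
for node in v1.list_node().items:
    labels = node.metadata.labels or {}
    provider = labels.get("example.com/cloud", "unlabeled")  # assumed label key
    nodes_by_cloud[provider].append(node.metadata.name)

for provider, names in nodes_by_cloud.items():
    print(provider, names)

Workloads could then be steered toward a particular provider by putting a nodeSelector on that same label in the pod spec.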

Enable service discovery

With Kubernetes, users can automate and customize service discovery for containerized applications. Kubernetes usage for microservices and container management is fairly routine, but when admins wade through the platform's features, they can add service discovery to its resume.

Service discovery, the automatic detection of devices and available services over a network, becomes paramount in environments where containers continually start and stop, as new containers replace old ones. As TechTarget contributor Twain Taylor describes, there are two ways to use Kubernetes for service discovery: through DNS or through environment variables. Other features of the platform, including replication controllers, also play a role.
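The sketch below, assumed to run inside a pod, shows both discovery paths for a hypothetical service named my-service in the default namespace; cluster.local is the default cluster DNS domain.

# Sketch: two ways a pod can discover a service named "my-service".
import os
import socket

# 1. DNS-based discovery: cluster DNS resolves the service name to its
#    ClusterIP, which stays stable as the pods behind it come and go.
ip = socket.gethostbyname("my-service.default.svc.cluster.local")

# 2. Environment-variable discovery: Kubernetes injects
#    {SERVICE}_SERVICE_HOST and {SERVICE}_SERVICE_PORT for services that
#    existed when the pod started.
host = os.environ.get("MY_SERVICE_SERVICE_HOST")
port = os.environ.get("MY_SERVICE_SERVICE_PORT")

print(ip, host, port)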

Perform container cluster federation

TechTarget contributor Tom Nolle explains why the Apache Mesos-Marathon combination has been the preferred container federation approach for large-scale IT deployments -- and why Kubernetes, alongside third-party tools, will become an increasingly common option for this use case.

Kubernetes cluster federation is still working out the kinks with version 2.0, which focuses on managing workloads across multiple clusters from a single point of control. This use case best suits environments with primarily autonomous Kubernetes clusters that occasionally need to work together.
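The sketch below illustrates the basic multicluster goal -- pushing the same deployment to several clusters addressed by kubeconfig context -- using the plain Kubernetes Python client rather than the federation API itself. The context names, workload and namespace are placeholders.

# Sketch: push the same workload to multiple clusters by kubeconfig context.
# This shows the multicluster idea, not the KubeFed v2 API.
from kubernetes import client, config

def simple_deployment(name, image):
    # One-replica deployment object, same shape as in the first sketch.
    return client.V1Deployment(
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": name}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": name}),
                spec=client.V1PodSpec(
                    containers=[client.V1Container(name=name, image=image)]
                ),
            ),
        ),
    )

for ctx in ["cluster-onprem", "cluster-cloud"]:  # placeholder context names
    apps = client.AppsV1Api(config.new_client_from_config(context=ctx))
    apps.create_namespaced_deployment(
        namespace="default", body=simple_deployment("web", "nginx:1.25")
    )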

Embrace next-gen PaaS

Knative is an open source platform that extends Kubernetes with support for serverless workloads. With Knative and Kubernetes, focus shifts from the platform to code deployment, as Knative abstracts away infrastructure management. As TechTarget's Pariseau reports, industry experts predict this serverless adaptation of Kubernetes could kick off the next wave of PaaS infrastructures.

While there is still much debate over whether serverless improves operations, enterprises with workloads that follow an event-driven model see many benefits, such as granular billing, lower costs, automated scalability and improved reliability.
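To make the focus-on-code point concrete, here is a minimal sketch of the kind of container Knative Serving could run: the platform injects a PORT environment variable and scales instances, including down to zero, based on request traffic. Everything else in the example is illustrative.

# Sketch: a tiny HTTP app that listens on the injected PORT variable,
# the basic contract for a container run by Knative Serving.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Hello from a serverless container\n")

port = int(os.environ.get("PORT", "8080"))
HTTPServer(("", port), Handler).serve_forever()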

Prepare for future Kubernetes use cases

Kubernetes is a tool in flux, with a degree of flexibility that lends it to many use cases -- some of which are not yet established processes. At VMworld 2019, for example, VMware revealed Project Pacific, an initiative to integrate Kubernetes with vSphere. With that integration, IT admins could use the familiar vSphere interface to deploy and manage both containers and VMs.

Next Steps

How to tackle container orchestration challenges

How to run ML workloads with Apache Spark on Kubernetes
