
Knative serverless Kubernetes bypasses FaaS to revive PaaS

Knative's emerging claim to fame is that it can turn Kubernetes into a next-gen platform as a service with both a Heroku-like developer experience and deep infrastructure control.

When the Knative project launched in summer 2018, the term serverless was usually associated with function-as-a-service (FaaS) offerings such as AWS Lambda.

However, a year on, industry experts believe Knative's serverless adaptation of Kubernetes will also facilitate the next generation of platform-as-a-service (PaaS) infrastructures, as enterprise DevOps pros establish application deployment platforms for internal developers. Within about six months of the project's release, early adopters had begun to see its potential for event-driven infrastructure provisioning beyond function as a service.

"We want to be able to have a developer push code out and not care about the platform [under DevOps]," said Tom Petrocelli, an analyst at Amalgam Insights in Arlington, Mass. "And right now, developers have to care too much about the platform."

Knative brings Kubernetes full circle to PaaS roots

Knative solves a thorny technical problem of scaling Kubernetes resources down to zero, rather than requiring at least a minimal number to sit idle, waiting for workloads. It also abstracts most of that infrastructure management away from the developer through integrations with function-as-a-service interfaces. Such abstractions were also common to the generation of PaaS platforms such as Heroku, Engine Yard and Google App Engine -- along with the company that would become Docker, which began as a PaaS startup called dotCloud before its container orchestration prowess prompted it to pivot and set off the container movement in enterprise IT.
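The scale-to-zero behavior described above is configured declaratively on a Knative Service. A minimal sketch follows, assuming a recent Knative Serving release; the service name and container image are hypothetical:

```yaml
# Illustrative Knative Service manifest -- names and image are placeholders.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    metadata:
      annotations:
        # Allow revisions to scale all the way down to zero pods
        # when no requests arrive (Knative Serving's default behavior),
        # instead of keeping a minimum number idle.
        autoscaling.knative.dev/minScale: "0"
        autoscaling.knative.dev/maxScale: "10"
    spec:
      containers:
        - image: gcr.io/example/hello:latest
```

The developer pushes a container image and a manifest like this; Knative handles routing, revisioning and autoscaling, which is the abstraction the earlier PaaS generation offered.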


But unlike the earlier generation of VM-based, fully managed PaaS offerings, Knative and Kubernetes allow enterprise DevOps and SRE teams to retain control over the Kubernetes infrastructure behind the scenes. They also enable control over infrastructure costs, since admins can pack many containers into their choice of cloud computing environment, and because Kubernetes and Knative are available as free and open source software.

In fact, some bleeding-edge users had begun to move away from Heroku and toward Kubernetes for exactly those reasons when Knative was open sourced by Google last year. Engineers from companies such as Rainforest QA, Salsify, Fairwinds and Periscope each posted publicly about such transitions, motivated by the cost savings that arose from managing their own automated infrastructure based on containers and open source software, tighter control over infrastructure security and database scalability.

Google Cloud Run shows Knative in action

Users of Google's Cloud Run service, a managed stateless container autoscaling platform that runs on Knative, have also sought finer-grained control over PaaS as their application needs grew and matured.

Perceptual Inc., better known as Percy, a visual software testing SaaS provider used by tech firms such as Shopify, Fastly, Spotify and Google for UI and customer experience tests, began using the fully managed version of the Cloud Run service at its launch in April to rapidly process hundreds of millions of visual snapshots.

After a few weeks on the fully managed platform, however, Percy engineers realized the fully managed version of the service would be too costly to sustain because of the nature of its particular workload. The company switched to a variant called Cloud Run on Google Kubernetes Engine (GKE), which allows users to tune the underlying infrastructure through the Knative interface.

"[Fully managed] Cloud Run uses a Docker container that gives you concurrency up to 60 [workloads]," said David Jones, director of engineering at Percy. "But in our case, our workload is so heavy that we just cannot allow that concurrency to happen, and can only really have one Docker container handling one request at a time."

Instead of many Docker containers provisioned for concurrency, Percy's workload runs more efficiently on fewer, larger containers. Configuration tweaks it made through the Knative interface in Cloud Run on GKE made that possible, Jones said.
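A one-request-per-container setup like the one Jones describes maps to Knative's `containerConcurrency` field. The sketch below is illustrative, not Percy's actual configuration; the service name, image and resource sizes are assumptions:

```yaml
# Hypothetical revision template showing single-request concurrency.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: snapshot-worker
spec:
  template:
    spec:
      # Serve exactly one request per container at a time,
      # trading request concurrency for fewer, larger instances.
      containerConcurrency: 1
      containers:
        - image: gcr.io/example/snapshot-worker:latest
          resources:
            requests:
              cpu: "2"
              memory: 4Gi
```

Setting `containerConcurrency: 1` tells the Knative autoscaler to add pods rather than multiplex heavy requests onto one container.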

Knative also made the switch from Google Cloud Run fully managed to Cloud Run on GKE trivial from a developer standpoint, Jones added.

"The HTTP endpoints remain the same; it's just a different name," Jones said. "We made no changes to the way we were calling those endpoints, which was certainly lovely, because everything was compatible."

Jones said Percy's engineers would like to delve even deeper into Knative's autoscaler controls in future releases of Cloud Run.

"You can't use custom metrics for Knative's autoscaler, and we'd like to be able to do that," he said. "It'd be great if Knative's autoscaler could predict demand, [which] would allow us to better have supply meet demand, rather than relying on hand-configured rules."

Custom metrics support for the Knative autoscaler has been discussed in the upstream community but isn't yet ready for users to test.
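The "hand-configured rules" Jones mentions are today expressed as per-revision annotations. A sketch, assuming a recent Knative Serving release; the service name is hypothetical:

```yaml
# Illustrative autoscaling annotations -- the service name is a placeholder.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: demand-sensitive-app
spec:
  template:
    metadata:
      annotations:
        # Hand-configured rule: target 50 concurrent requests per pod.
        autoscaling.knative.dev/target: "50"
        # The closest thing to a custom metric today is switching the
        # autoscaler class to the Kubernetes HPA and scaling on CPU:
        # autoscaling.knative.dev/class: hpa.autoscaling.knative.dev
        # autoscaling.knative.dev/metric: cpu
```

Predictive or fully custom-metric-driven scaling, as Jones requests, would go beyond these static targets.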

Kubernetes platform vendors prep for Knative demand

The preference for infrastructure automation that masks complexity from developers while retaining organizational control over underlying systems is widespread in enterprise IT shops as cloud, DevOps and containers surge in popularity. Thus, Kubernetes platform vendors such as Red Hat and Pivotal have already lined up as Knative contributors and incorporated it into their products. Red Hat officials said the developer preview for Knative on OpenShift, expected to be generally available in the next six months or so, has been popular so far. Red Hat has also partnered with Microsoft on KEDA, which uses Knative to host Azure Functions on OpenShift.

"Knative doesn't actually have function-as-a-service components at all, but it provides the building blocks for a Kubernetes-native serverless ecosystem, whether it's [based on] functions or just general applications," said Reza Shafii, vice president and general manager of cloud platform services for Red Hat OpenShift. "That's why we jumped on it, because for us, that's what's needed."

Knative is also making waves beyond infrastructure automation, in the application delivery part of the DevOps world. Tekton, a Knative spinoff project that provides a specification for event-driven CI/CD pipelines, is a key component of Jenkins X, a cloud-native update for Jenkins, and is available on Google Cloud Platform.
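To give a flavor of the Tekton specification, here is a minimal Task sketch, assuming a recent Tekton Pipelines release; the task name and image are illustrative:

```yaml
# Hypothetical Tekton Task -- names and image are placeholders.
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: run-tests
spec:
  steps:
    - name: test
      image: golang:1.12
      # Each step runs as a container in the Task's pod.
      script: |
        go test ./...
```

Tasks like this are chained into Pipelines, and event triggers kick off PipelineRuns, which is the event-driven CI/CD model Jenkins X builds on.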
