
Kubernetes, But Make It Fashionable: Breaking Down OKD, ARO, ROSA, OpenShift Virtualization, AKS, EKS, and GKE

  • Writer: Shannon

I'm genuinely starting to realize there's still a lot of confusion out there, and arguably I'm pretty confused myself a lot of days (especially with new tech or terminology I'm not familiar with). Thankfully, the confusion can be quelled (at least a little bit) when you find good reference articles, and part of the mission of my blog is to make things consumable, understandable, and easy to grok (yes, I used "grok," and no, I'm not talking about X's AI chatbot).


There was a time when running Kubernetes just meant, well, running Kubernetes. Today though, it feels like you need a translator, a strategy consultant, and maybe a tarot card reader just to figure out which flavor of Kubernetes works best for your team. Between open source options like OKD and cloud-native managed services like AKS, EKS, and GKE (not to mention the buffet of OpenShift offerings) it’s no wonder people are confused.


So let’s simplify it, shall we? Maybe the best advice is to start by using this post as your field guide to the various Kubernetes and OpenShift deployment options, complete with quirks, strengths, and what they’re actually good for (shout out to my teammate Reese for inspiring this blog). Whether you’re a cloud-native evangelist, a VM hoarder trying to modernize, or an enterprise architect juggling six compliance frameworks, this post is for you (P.S. one thing to note is that I identify as the perpetual green belt here, and even I need a cheat sheet to reference in the future). Also, this might prove timely for folks looking to move off of Broadcom-owned VMware.

Grab your kubeconfig and LESSSSSSSSSSSSSSSSSSSSS GOOOOOOOOOOOOOOOOOOO!


OKD: The OpenShift That Lives Rent-Free in the Community (Possibly also in Your Mind)


What it is: OKD is the community distribution of Kubernetes that serves as the upstream for Red Hat OpenShift. It includes many of the same developer tools, build pipelines, and cluster management features you’d find in commercial OpenShift, but without the Red Hat branding or support subscription.


What it feels like: Imagine building a custom muscle car in your garage. It might not have a dealer warranty, but you can tweak every part of it to match your needs.


When to use it: OKD is a solid choice for development teams that want OpenShift features without enterprise licensing. It also appeals to educational settings, homelab enthusiasts, or organizations looking to test workloads before moving to a supported platform like ARO or ROSA.


Tradeoffs: You are responsible for everything: setup, upgrades, maintenance, monitoring, and troubleshooting. If your team is small or inexperienced, you might spend more time fighting fires than deploying apps.
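
If you want to kick the tires, the install flow mirrors commercial OpenShift. The sketch below assumes you've already downloaded the openshift-install binary from an OKD release, have a pull secret, and have cloud (or bare-metal) credentials in place; the directory name is just an example.

```bash
# Generate install-config.yaml interactively (platform, region, pull secret, SSH key)
./openshift-install create install-config --dir=okd-demo

# Review okd-demo/install-config.yaml, then let the installer stand up the cluster
./openshift-install create cluster --dir=okd-demo --log-level=info

# Credentials land in the asset directory when the install finishes
export KUBECONFIG=$(pwd)/okd-demo/auth/kubeconfig
oc get nodes
```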


ARO: Azure Red Hat OpenShift


What it is: ARO is a fully managed OpenShift environment that runs inside Azure. Red Hat operates and supports the OpenShift components, while Microsoft handles the Azure infrastructure. You get unified billing, integrated support, and a smooth operator experience.


What it feels like: ARO is like ordering OpenShift from a restaurant where both the chef and the server work together to make sure the meal is perfect. You don’t need to worry about how the ingredients come together...you just focus on enjoying the dish.


When to use it: ARO works best for enterprises that have standardized on Azure and want a consistent application platform that includes guardrails for developers, baked-in DevSecOps workflows, and simplified operational overhead.


Tradeoffs: ARO is less flexible than raw Kubernetes but makes up for it with consistent operations, enterprise security integrations, and lifecycle management. You’re trading configurability for peace of mind.
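
For a feel of what "fully managed" looks like from the CLI, here's a rough sketch of creating an ARO cluster with the Azure CLI. The resource names, address spaces, and region are placeholders, and in practice you'd also register the Microsoft.RedHatOpenShift resource provider and (optionally) supply a Red Hat pull secret.

```bash
# ARO needs a VNet with separate master and worker subnets
az group create --name aro-rg --location eastus
az network vnet create -g aro-rg -n aro-vnet --address-prefixes 10.0.0.0/22
az network vnet subnet create -g aro-rg --vnet-name aro-vnet -n master-subnet --address-prefixes 10.0.0.0/23
az network vnet subnet create -g aro-rg --vnet-name aro-vnet -n worker-subnet --address-prefixes 10.0.2.0/23

# Create the managed OpenShift cluster (this one takes a while)
az aro create -g aro-rg -n aro-demo --vnet aro-vnet \
  --master-subnet master-subnet --worker-subnet worker-subnet

# Console URL and kubeadmin credentials
az aro show -g aro-rg -n aro-demo --query consoleProfile.url -o tsv
az aro list-credentials -g aro-rg -n aro-demo
```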


ROSA: Red Hat OpenShift Service on AWS


What it is: ROSA is Red Hat’s managed OpenShift offering inside AWS. It is jointly developed with AWS, allowing you to use AWS-native tools like IAM and CloudWatch while deploying workloads using OpenShift’s developer and admin interfaces.


What it feels like: ROSA is like moving into a fully furnished apartment in a neighborhood you already know. You don’t need to bring your own couch or worry about utilities. You just move in and start working.


When to use it: Choose ROSA if you’re all-in on AWS and want an enterprise platform with Red Hat support. It’s especially useful if you already have compliance frameworks that prefer OpenShift’s multi-tenancy and audit features.


Tradeoffs: Just like ARO, you lose some of the flexibility of native Kubernetes. But for organizations already deep in AWS, the tradeoff makes sense. ROSA also offers tight alignment with AWS billing and tagging, making cost tracking easier.
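
The rosa CLI handles most of the ceremony. Here's a minimal sketch using STS mode; the cluster name and region are placeholders, and you'd first log in with a token from the Red Hat console.

```bash
# Authenticate against your Red Hat account and sanity-check AWS quota
rosa login
rosa verify quota

# Create the IAM account roles ROSA needs, then the cluster itself
rosa create account-roles --mode auto
rosa create cluster --cluster-name rosa-demo --sts --mode auto --region us-east-1

# Follow the install and create a temporary admin user once it's ready
rosa logs install --cluster rosa-demo --watch
rosa create admin --cluster rosa-demo
```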


OpenShift Virtualization: Run VMs Without Running Away from Containers


What it is: OpenShift Virtualization allows you to run virtual machines alongside containers inside your OpenShift cluster. This is useful for legacy workloads that cannot yet be containerized or for teams that want a single control plane for both types of compute.


What it feels like: It’s like renting out one room in your modern containerized house to an old friend who still listens to CDs and brings their own microwave. It might not be the future, but it’s still part of your life.


When to use it: Use OpenShift Virtualization when your organization is on the path to modernization but still has critical apps tied to virtual machines. This approach lets you converge your infrastructure while planning for gradual transformation.


Tradeoffs: Running VMs in Kubernetes is powerful but not always efficient. Expect some complexity in storage, performance tuning, and monitoring. This solution is ideal when you are consolidating platforms but not ready for a full rewrite.
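
Under the hood this is KubeVirt, so a VM is just another object you apply with oc. A minimal sketch, assuming the OpenShift Virtualization operator is already installed; the container disk image and sizing are illustrative.

```bash
# Define and start a VM as a Kubernetes resource, right next to your pods
oc apply -f - <<'EOF'
apiVersion: kubevirt.io/v1
kind: VirtualMachine
metadata:
  name: legacy-app-vm
spec:
  running: true
  template:
    spec:
      domain:
        devices:
          disks:
            - name: rootdisk
              disk:
                bus: virtio
        resources:
          requests:
            memory: 2Gi
      volumes:
        - name: rootdisk
          containerDisk:
            image: quay.io/containerdisks/fedora:latest
EOF

# The running VM instance shows up alongside your other workloads
oc get vmi
```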


AKS: Azure Kubernetes Service


What it is: AKS is Microsoft’s managed Kubernetes service. You get a streamlined deployment experience, integration with Azure Monitor and Azure Policy, and the option to use features like Microsoft Defender for Containers, DevOps pipelines, and Azure Arc.


What it feels like: AKS is like driving a mid-range SUV with automatic everything. It doesn’t try to be fancy, but it gets the job done reliably and comes with the right plugs for your phone.


When to use it: AKS is great for teams already invested in Azure and looking to build containerized applications without the overhead of managing Kubernetes themselves. It’s also well-suited for teams adopting GitOps or CI/CD pipelines using GitHub Actions or Azure DevOps.


Tradeoffs: You don’t get all the higher-level platform capabilities of OpenShift, such as built-in CI/CD tooling or developer portals. But you do get a lightweight Kubernetes experience with solid integrations.
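
Getting a small cluster up is a couple of CLI calls. A minimal sketch; the resource group, names, region, and node count are placeholders.

```bash
# Resource group first, then a three-node cluster with a managed identity
az group create --name aks-rg --location eastus
az aks create --resource-group aks-rg --name aks-demo \
  --node-count 3 --enable-managed-identity --generate-ssh-keys

# Merge credentials into your kubeconfig and confirm the nodes are up
az aks get-credentials --resource-group aks-rg --name aks-demo
kubectl get nodes
```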


EKS: Amazon Elastic Kubernetes Service


What it is: EKS is AWS’s managed Kubernetes service. It offers high availability, native integration with AWS services, and support for both EC2 and Fargate compute types. You still manage your own nodes (unless using Fargate), but the control plane is operated by AWS.


What it feels like: EKS is like owning a pickup truck that you have to fuel and maintain, but it hauls everything you need. It’s reliable and gets along well with the rest of your AWS gear, but you need to know how to drive it.


When to use it: Use EKS if your team has strong AWS knowledge and prefers to keep infrastructure decisions in their hands. It’s ideal for teams that want full access to AWS networking, IAM, and storage options while using Kubernetes.


Tradeoffs: EKS can be complex to set up properly, especially when managing IAM permissions and networking. The platform gives you power and flexibility but requires strong operational discipline.
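
eksctl is the shortest path to a working cluster; it drives CloudFormation to build the VPC, control plane wiring, and a node group for you. A rough sketch with placeholder names, region, and sizes.

```bash
# One command creates the control plane, VPC, and a managed node group
eksctl create cluster --name eks-demo --region us-east-1 \
  --nodegroup-name ng-1 --node-type m5.large --nodes 3

# eksctl writes your kubeconfig, but this is the plain AWS CLI equivalent
aws eks update-kubeconfig --name eks-demo --region us-east-1
kubectl get nodes
```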


GKE: Google Kubernetes Engine


What it is: GKE is Google Cloud’s managed Kubernetes service. Google was one of the first cloud providers to offer managed Kubernetes, and it has invested heavily in automation, stability, and features. You can choose between Standard and Autopilot modes depending on how much control you want.


What it feels like: GKE is like riding a high-speed train with a conductor that anticipates every turn. It’s smooth, well-maintained, and full of helpful features that most people won’t notice until something breaks elsewhere.


When to use it: GKE is a great choice for AI/ML teams, startups building cloud-native apps, or organizations that want cutting-edge Kubernetes features. Autopilot mode is especially appealing for teams that want to focus only on workloads and forget about nodes entirely.


Tradeoffs: While GKE is feature-rich, it can be intimidating if you’re not familiar with Google Cloud. Pricing can also be tricky to predict, especially when using multiple advanced services.
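
Autopilot makes the "forget about nodes" pitch literal: you ask for a cluster and GKE provisions and sizes nodes as workloads arrive. A minimal sketch with gcloud; the name and region are placeholders, and it assumes a project is already configured.

```bash
# Autopilot cluster: no node pools to manage, billing follows your pods' resource requests
gcloud container clusters create-auto gke-demo --region us-central1

# Fetch credentials and deploy like on any other Kubernetes cluster
gcloud container clusters get-credentials gke-demo --region us-central1
kubectl create deployment hello --image=nginx
kubectl get pods
```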


Final Thoughts (a.k.a. Shannon's Opinion)


Kubernetes might be the common thread, but the experience you get depends entirely on the platform you choose. Some options give you more control but require more effort. Others handle the hard parts but limit your flexibility. It all comes down to what your team needs today, and how much you’re willing to manage yourself.


OpenShift-based options like ARO and ROSA offer an opinionated experience with developer productivity in mind. Cloud-native services like AKS, EKS, and GKE prioritize tight integration with their respective ecosystems. And tools like OpenShift Virtualization give you a way to move forward without leaving old workloads behind.

In other words, Kubernetes is no longer just a container orchestrator. It’s now a menu of deployment strategies, cloud philosophies, and operational tradeoffs.


Choose wisely. Or better yet, choose the one that lets you ship fast, sleep at night, and avoid the 3 a.m. on-call page.
