You want to deploy and manage containerized applications. Which service should you use?

Overview

Container orchestration automates the deployment, management, scaling, and networking of containers. Enterprises that need to deploy and manage hundreds or thousands of Linux® containers and hosts can benefit from container orchestration. 

Container orchestration can be used in any environment where you use containers. It can help you to deploy the same application across different environments without needing to redesign it. And microservices in containers make it easier to orchestrate services, including storage, networking, and security. 

Containers give your microservice-based apps an ideal application deployment unit and self-contained execution environment. They make it possible to run multiple parts of an app independently in microservices, on the same hardware, with much greater control over individual pieces and life cycles.

Managing the lifecycle of containers with orchestration also supports DevOps teams who integrate it into CI/CD workflows. Along with application programming interfaces (APIs) and DevOps teams, containerized microservices are the foundation for cloud-native applications.

What is container orchestration used for?

Use container orchestration to automate and manage tasks such as:

  • Provisioning and deployment
  • Configuration and scheduling 
  • Resource allocation
  • Container availability 
  • Scaling containers up or down to balance workloads across your infrastructure
  • Load balancing and traffic routing 
  • Monitoring container health
  • Configuring applications based on the container in which they will run
  • Keeping interactions between containers secure

Container orchestration tools provide a framework for managing containers and microservices architecture at scale. There are many container orchestration tools that can be used for container lifecycle management. Some popular options are Kubernetes, Docker Swarm, and Apache Mesos.

Kubernetes is an open source container orchestration tool that was originally developed and designed by engineers at Google. Google donated the Kubernetes project to the newly formed Cloud Native Computing Foundation in 2015.

Kubernetes orchestration allows you to build application services that span multiple containers, schedule containers across a cluster, scale those containers, and manage their health over time.

Kubernetes eliminates many of the manual processes involved in deploying and scaling containerized applications. You can cluster together groups of hosts, either physical or virtual machines, running Linux containers, and Kubernetes gives you the platform to easily and efficiently manage those clusters. 

More broadly, it helps you fully implement and rely on a container-based infrastructure in production environments.

These clusters can span hosts across public, private, or hybrid clouds. For this reason, Kubernetes is an ideal platform for hosting cloud-native apps that require rapid scaling.

Kubernetes also assists with workload portability and load balancing by letting you move applications without redesigning them. 

Main components of Kubernetes:

  • Cluster: A control plane and one or more compute machines, or nodes.
  • Control plane: The collection of processes that control Kubernetes nodes. This is where all task assignments originate.
  • Kubelet: This service runs on each node, reads the container manifests, and ensures the defined containers are started and running.
  • Pod: A group of one or more containers deployed to a single node. All containers in a pod share an IP address, IPC, hostname, and other resources.
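
To make these components concrete, here is a minimal sketch, assuming the official Kubernetes Python client (the `kubernetes` package) and a kubeconfig that points at an existing cluster; it asks the control plane for the cluster's nodes and for the pods scheduled onto them.

```python
# Minimal sketch: inspect a cluster's nodes and pods.
# Assumes `pip install kubernetes` and a kubeconfig for an existing cluster.
from kubernetes import client, config

def main():
    # Load credentials and the API server address from ~/.kube/config.
    config.load_kube_config()
    core_v1 = client.CoreV1Api()

    # Nodes are the compute machines managed by the control plane.
    for node in core_v1.list_node().items:
        print(f"node: {node.metadata.name}")

    # Pods are groups of one or more containers scheduled onto a node.
    for pod in core_v1.list_pod_for_all_namespaces().items:
        print(f"pod: {pod.metadata.namespace}/{pod.metadata.name} "
              f"on node {pod.spec.node_name}")

if __name__ == "__main__":
    main()
```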

How does container orchestration work?

When you use a container orchestration tool such as Kubernetes, you describe the configuration of an application in a YAML or JSON file. The configuration file tells the orchestration tool where to find the container images, how to establish a network, and where to store logs.
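
To make this concrete, here is a minimal sketch of such a configuration, built as a Python dictionary and printed as JSON; the same structure is usually saved as a YAML manifest. The deployment name, image, and port are placeholder values, not anything prescribed by this article.

```python
# Minimal sketch of a Kubernetes Deployment spec, built as a Python dict.
# The same structure is usually written by hand as a YAML manifest;
# the name, image, and port below are placeholder values.
import json

deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "web"},
    "spec": {
        "replicas": 3,  # how many copies of the pod to keep running
        "selector": {"matchLabels": {"app": "web"}},
        "template": {
            "metadata": {"labels": {"app": "web"}},
            "spec": {
                "containers": [
                    {
                        "name": "web",
                        "image": "nginx:1.25",  # where to find the container image
                        "ports": [{"containerPort": 80}],  # how the container is reached
                    }
                ]
            },
        },
    },
}

# Print the spec as JSON; Kubernetes accepts JSON or YAML manifests.
print(json.dumps(deployment, indent=2))
```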

When deploying a new container, the orchestration tool automatically schedules the deployment to a cluster and finds the right host, taking into account any defined requirements or restrictions. The orchestration tool then manages the container’s lifecycle based on the specifications in the configuration file.
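
As a hedged sketch of that lifecycle management, the snippet below uses the official Kubernetes Python client to read, scale, and remove a Deployment. It assumes a kubeconfig for an existing cluster and a Deployment named "web" in the "default" namespace, both of which are placeholder details.

```python
# Minimal sketch: lifecycle operations on an already-submitted Deployment.
# Assumes `pip install kubernetes`, a kubeconfig, and a Deployment named
# "web" in the "default" namespace (placeholder names).
from kubernetes import client, config

config.load_kube_config()
apps_v1 = client.AppsV1Api()

# Inspect the current state the orchestrator is maintaining.
dep = apps_v1.read_namespaced_deployment(name="web", namespace="default")
print(f"{dep.metadata.name}: {dep.status.ready_replicas} of "
      f"{dep.spec.replicas} replicas ready")

# Scaling is declarative: patch the desired replica count and Kubernetes
# starts or stops pods on suitable nodes to match it.
apps_v1.patch_namespaced_deployment(
    name="web", namespace="default", body={"spec": {"replicas": 5}}
)

# Deleting the Deployment tells the orchestrator to tear its pods down.
apps_v1.delete_namespaced_deployment(name="web", namespace="default")
```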

You can use Kubernetes patterns to manage the configuration, lifecycle, and scale of container-based applications and services. These repeatable patterns are the tools a Kubernetes developer needs to build complete systems.
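
One widely used example is the sidecar pattern, in which a helper container runs next to the main container in the same pod and shares its volumes. The sketch below, with placeholder names, images, and paths, pairs a web server with a small log-tailing helper.

```python
# Minimal sketch of the sidecar pattern: two containers in one pod sharing
# a volume. Names, images, and paths are placeholders.
import json

pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "web-with-log-shipper"},
    "spec": {
        "volumes": [{"name": "logs", "emptyDir": {}}],  # shared scratch space
        "containers": [
            {
                # Main container: serves traffic and writes logs to the volume.
                "name": "web",
                "image": "nginx:1.25",
                "volumeMounts": [{"name": "logs", "mountPath": "/var/log/nginx"}],
            },
            {
                # Sidecar container: follows the logs the main container writes.
                "name": "log-shipper",
                "image": "busybox:1.36",
                "command": ["sh", "-c",
                            "touch /logs/access.log && tail -f /logs/access.log"],
                "volumeMounts": [{"name": "logs", "mountPath": "/logs"}],
            },
        ],
    },
}

print(json.dumps(pod, indent=2))
```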

Container orchestration can be used in any environment that runs containers, including on-premises servers and public or private cloud environments.

Enterprise container orchestration

Real production apps span multiple containers. Those containers must be deployed across multiple server hosts. That’s where Red Hat® OpenShift® comes in. Red Hat OpenShift is Kubernetes for the enterprise—and a lot more.

Red Hat OpenShift includes all of the extra pieces of technology that make Kubernetes powerful and viable for the enterprise, including registry, networking, telemetry, security, automation, and services.

With Red Hat OpenShift, developers can make new containerized apps, host them, and deploy them in the cloud with the scalability, control, and orchestration that can turn a good idea into new business quickly and easily.

Certified software for container-based deployments

Try, buy, and manage certified software across public clouds, private clouds, and your datacenter. That’s the power of Red Hat Marketplace. It’s a simpler way to access the software you already rely on, build in a unified Kubernetes-based environment, and deploy anywhere.

Red Hat Marketplace means you’ll spend more time developing innovative solutions, not tracking down licenses, entitlements, and expirations.

Which service is used to run containerized applications on AWS?

Amazon Elastic Container Service (Amazon ECS) is a fully managed container orchestration service that provides a secure, reliable, and scalable way to run containerized applications on AWS.
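
As a hedged illustration of that workflow, the sketch below uses the boto3 SDK to register an ECS task definition and run it as a service. The cluster name, service name, and image are placeholders, and it assumes AWS credentials plus an existing ECS cluster with registered EC2 container instances.

```python
# Minimal sketch: run a containerized app on Amazon ECS with boto3.
# Assumes AWS credentials, an existing ECS cluster named "demo-cluster"
# with registered EC2 container instances, and placeholder names/images.
import boto3

ecs = boto3.client("ecs")

# A task definition describes the container(s) to run, much like a pod spec.
ecs.register_task_definition(
    family="web",
    containerDefinitions=[
        {
            "name": "web",
            "image": "nginx:1.25",
            "memory": 256,
            "portMappings": [{"containerPort": 80}],
        }
    ],
)

# A service keeps the desired number of copies of the task running.
ecs.create_service(
    cluster="demo-cluster",
    serviceName="web",
    taskDefinition="web",
    desiredCount=2,
    launchType="EC2",
)
```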

Which service is used to run containerized applications on Amazon SageMaker?

Kubernetes is an open source system used to automate the deployment, scaling, and management of containerized applications, and Kubeflow Pipelines is a workflow manager that offers an interface to manage and schedule machine learning (ML) workflows on a Kubernetes cluster. Amazon SageMaker integrates with both, so you can launch SageMaker jobs from a Kubernetes cluster or a Kubeflow pipeline.

Is ECS the same as Kubernetes?

No. Amazon ECS is similar to Amazon EKS, but it relies on a proprietary control plane instead of Kubernetes. With the EC2 launch type, you are responsible for provisioning the host infrastructure, while ECS handles the container orchestration.

Which AWS service can be used to store, manage, and deploy Docker container images?

Amazon Elastic Container Registry (Amazon ECR) is a fully managed, highly available, and secure private container registry that makes it easy to store, manage, and deploy Docker container images.
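
As a brief, hedged sketch, the snippet below uses boto3 to create an ECR repository and list the images it holds; the repository name is a placeholder and AWS credentials are assumed. The image push itself is normally done with the Docker CLI after authenticating to the registry.

```python
# Minimal sketch: manage container images with Amazon ECR via boto3.
# Assumes AWS credentials in the environment; the repository name is a placeholder.
import boto3

ecr = boto3.client("ecr")

# Create a private repository to hold the image.
repo = ecr.create_repository(repositoryName="web")
print("repository URI:", repo["repository"]["repositoryUri"])

# The Docker CLI pushes images to that URI after authenticating with
# `aws ecr get-login-password`; once pushed, images can be listed via the API.
for image in ecr.describe_images(repositoryName="web").get("imageDetails", []):
    print(image.get("imageTags"), image.get("imageSizeInBytes"))
```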