EPAM is a leading global provider of digital platform engineering and development services.
We are committed to having a positive impact on our customers, our employees, and our communities.
We embrace a dynamic and inclusive culture.
Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow.
No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are looking for a
Senior DevOps Data Engineer
to take a leading role in developing, scaling, and optimizing cloud-native analytics and data infrastructure.
This position demands mastery of AWS, EKS, Terraform, and Python, with cross-functional expertise in DevOps, SRE, Data Engineering, and Cloud Architecture.
Responsibilities
- Design Terraform-based Infrastructure as Code (IaC) to enable scalable and automated environments
- Manage and optimize EKS-based platforms, including services like Starburst/Trino, Airflow, and JupyterHub
- Build robust CI/CD pipelines to standardize deployments and improve observability workflows
- Implement monitoring systems using tools such as Datadog and Splunk, integrating OpenTelemetry standards
- Collaborate on developing Python-based internal tools to enhance operational efficiency for data workflows and developers
- Operate containerized solutions using Kubernetes, Helm, and Docker for secure, scalable orchestration within AWS EKS
- Drive migration initiatives to Starburst/Trino to improve analytics platform performance and reliability
Requirements
- 3+ years of experience in DevOps or Site Reliability Engineering
- Advanced knowledge of the AWS ecosystem, including EKS, EC2, IAM, ELB, and S3
- Proficiency in Terraform and Ansible for infrastructure automation, alongside CI/CD pipeline management
- Expertise in Kubernetes, Helm, and Docker for container orchestration and deployment
- Strong programming skills in Python for tooling and scripting, alongside Bash for automation tasks
- Experience with monitoring tools such as Datadog and Splunk, including integration into cloud environments
- Strong written and verbal English communication skills (B2+)
Nice to have
- Knowledge of Vault, Packer, Databricks, ShinyProxy, and Tableau dashboards
- Familiarity with observability standards like OpenTelemetry and OpenLineage integrations
Technologies
- Orchestration Technologies: Kubernetes, Helm, Docker
- Cloud & Infrastructure as Code (IaC): AWS, EKS, Terraform, Ansible, Vault, Packer
- Observability Tools: Datadog, Splunk
- Data & Analytics Tools: Trino, Starburst, JupyterHub, Apache Airflow, Spark, Databricks, Tableau
- Programming Languages: Python, Bash
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling, and certification courses
- Unlimited access to the LinkedIn Learning library with 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek, and LinkedIn