
Lead Data Engineer – EPAM Systems (Colombia)



Job description


EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities, and we embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to innovative projects that deliver creative and cutting‑edge solutions, and have the opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

We are looking for a skilled Lead Data Engineer to join our team and drive innovative data solutions. The ideal candidate will have a solid foundation in software engineering, preferably within contact center solutions, and strong expertise in project management, stakeholder engagement, and leadership.

Responsibilities

  • Collaborate with peers and leaders across the organization to foster effective communication and build strong partnerships
  • Establish credibility as a technical expert and leader, driving collaboration with architects, principal engineers, and other technical specialists
  • Promote the use of advanced techniques and technologies, such as process automation and innovative engineering practices, to enhance business value and reduce operational complexity
  • Create and manage cloud-based resources in AWS
  • Design and implement data ingestion pipelines from various sources, including RDBMS, REST APIs, flat files, streams, and time-series data
  • Utilize Big Data technologies to implement data ingestion and processing workflows
  • Process and transform data using technologies like Spark and cloud services
  • Apply business logic to data processing using programming languages supported by the data platform
  • Develop automated data quality checks to ensure the accuracy and integrity of ingested data
  • Build infrastructure to collect, transform, combine, and distribute customer data
  • Identify opportunities for process improvement to optimize data collection, insights, and reporting
  • Ensure data accessibility, scalability, and accuracy while maintaining flexibility and efficiency in data solutions
  • Analyze complex data sets to identify trends and patterns
  • Create frameworks and utilize data visualization tools to present actionable insights to stakeholders
  • Actively participate in Scrum ceremonies with agile teams
  • Develop queries, generate reports, and present findings clearly to stakeholders
  • Mentor and guide junior team members, sharing best practices and industry standards
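To illustrate the automated data-quality checks mentioned above, here is a minimal sketch in plain Python (standing in for a Spark or cloud-native implementation); the field names `customer_id`, `event_time`, and `amount` are hypothetical, not taken from the posting:

```python
# Minimal sketch of an automated data-quality check on ingested records.
# Field names and rules are illustrative assumptions, not EPAM's actual schema.
from datetime import datetime


def check_record(record: dict) -> list[str]:
    """Return a list of quality violations found in one ingested record."""
    errors = []
    # Required fields must be present and non-empty.
    for field in ("customer_id", "event_time", "amount"):
        if not record.get(field):
            errors.append(f"missing field: {field}")
    # Timestamps must parse as ISO-8601.
    ts = record.get("event_time")
    if ts:
        try:
            datetime.fromisoformat(ts)
        except ValueError:
            errors.append(f"bad timestamp: {ts!r}")
    # Amounts must be non-negative numbers.
    amount = record.get("amount")
    if amount:
        try:
            if float(amount) < 0:
                errors.append(f"negative amount: {amount}")
        except (TypeError, ValueError):
            errors.append(f"non-numeric amount: {amount!r}")
    return errors


records = [
    {"customer_id": "C1", "event_time": "2024-05-01T10:00:00", "amount": "19.99"},
    {"customer_id": "", "event_time": "not-a-date", "amount": "-5"},
]
for r in records:
    print(r["customer_id"] or "<unknown>", check_record(r))
```

In a production pipeline the same per-record rules would typically run as Spark column expressions or a dedicated data-quality framework, with violations routed to a quarantine table rather than printed.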

Requirements

  • Bachelor’s or Master’s degree in math, statistics, computer science, data science, or a related discipline
  • A minimum of 5 years of experience as a Data Engineer in industries such as consumer finance, consumer loans, collections, servicing, optional products, or insurance sales
  • At least 1 year of experience leading and managing development teams
  • Advanced knowledge of Snowflake and of at least one programming language: Java, Scala, Python, or C#
  • Hands‑on experience with HDFS, YARN, Hive, Spark, Kafka, Oozie/Airflow, AWS, Docker, Kubernetes, and Snowflake
  • Proficiency with data mining and programming tools such as SAS, SQL, R, or Python
  • Expertise in database technologies such as PostgreSQL, Snowflake, and Greenplum
  • Willingness to learn and deploy new technologies and tools
  • Strong organizational skills to manage multiple projects and meet deadlines effectively
  • Excellent written and verbal communication skills with the ability to present technical results to non‑technical stakeholders
  • Knowledge of business intelligence and analytical tools, technologies, and methodologies
  • Fluent English skills, both written and spoken, at a B2+ level or higher

Nice to have

  • AWS certification
  • Experience with Spark Streaming
  • Knowledge of Kafka Streaming or Kafka Connect
  • Familiarity with ELK Stack
  • Experience with Cassandra or MongoDB
  • Hands‑on experience with CI/CD and collaboration tools such as Jenkins, GitLab, Jira, or Confluence
  • Knowledge of database technologies like Redshift

We offer

  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award‑winning culture recognized by Glassdoor, Newsweek and LinkedIn

Seniority level

  • Mid‑Senior level

Employment type

  • Full‑time

Job function

  • Engineering, Information Technology, and Business Development

Industries

  • Software Development, IT Services and IT Consulting, and Financial Services



Required Skill Profession

IT Management and IT Project Management


