About Katapult
At Katapult, we connect top-tier talent from Latin America with innovative companies across the globe.
We specialize in building strong engineering teams that deliver high-quality software solutions while working in collaborative and dynamic environments.
Role Overview
We are looking for a Data Engineer with strong backend experience in Python to join our client's team.
The ideal candidate will design, develop, and optimize data pipelines, APIs, and backend services that enable scalable, efficient data solutions.
This role requires a strong backend engineering mindset combined with expertise in modern data technologies.
Responsibilities
- Design, build, and maintain data pipelines and ETL processes using Python (a minimal sketch follows this list).
- Develop and optimize backend services and APIs for data integration and processing.
- Collaborate with data scientists, analysts, and backend engineers to deliver scalable data-driven solutions.
- Ensure data quality, reliability, and performance across data systems.
- Implement best practices for data modeling, storage, and retrieval in SQL and NoSQL databases.
- Monitor, troubleshoot, and optimize pipelines and backend services for high availability.
- Work with cloud platforms (AWS, GCP, or Azure) to deploy and scale data solutions.
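To give a rough flavor of the day-to-day pipeline work, here is a minimal Python ETL sketch; the file path, table name, and connection string are hypothetical placeholders, not details of the client's actual stack:

```python
# Minimal ETL sketch: extract a CSV, clean it, load it into PostgreSQL.
# File path, table name, and connection string are hypothetical.
import pandas as pd
from sqlalchemy import create_engine


def extract(path: str) -> pd.DataFrame:
    """Read raw order records from a CSV export."""
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Normalize column names, then drop incomplete rows."""
    df.columns = [c.strip().lower() for c in df.columns]
    return df.dropna(subset=["order_id", "amount"])


def load(df: pd.DataFrame, table: str, conn_uri: str) -> None:
    """Append the cleaned records to a Postgres table."""
    engine = create_engine(conn_uri)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    orders = transform(extract("orders.csv"))
    load(orders, "orders_clean", "postgresql://user:pass@localhost:5432/warehouse")
```

In practice the same extract/transform/load structure scales from a one-off script like this to orchestrated production pipelines.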
Requirements
- 5+ years of professional experience as a Data Engineer or Backend Engineer.
- Strong proficiency in Python for backend development and data engineering.
- Hands-on experience with ETL tools, data pipelines, and workflow orchestration (e.g., Airflow, Prefect, Luigi); see the DAG sketch after this list.
- Solid understanding of SQL (query optimization, stored procedures) and experience with relational databases (PostgreSQL, MySQL, etc.).
- Experience with NoSQL databases (MongoDB, DynamoDB, or similar).
- Familiarity with cloud services (AWS, GCP, or Azure) and containerization (Docker, Kubernetes).
- Knowledge of CI/CD practices and version control (Git).
- Strong problem-solving skills and a backend engineering mindset.
- Intermediate to advanced English communication skills to work with international teams.
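As an example of the orchestration experience we mean, here is a minimal Airflow DAG sketch wiring an extract, transform, and load step into a daily schedule; the DAG id, schedule, and callables are hypothetical:

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load chain.
# DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    ...  # pull raw records from a source system


def transform():
    ...  # clean and reshape the extracted data


def load():
    ...  # write the result to the warehouse


with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3
```

Comparable experience with Prefect or Luigi, which express the same task-dependency idea with different APIs, counts equally.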
Nice to Have
- Experience with data streaming technologies (Kafka, Kinesis, or Pub/Sub); a minimal consumer sketch follows this list.
- Familiarity with data warehousing solutions (Snowflake, BigQuery, Redshift).
- Exposure to DevOps practices and infrastructure-as-code (Terraform, CloudFormation).
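For the streaming item above, this is the kind of consumer work involved, sketched with the kafka-python client; the topic name and broker address are hypothetical:

```python
# Minimal streaming sketch with kafka-python: consume JSON events from a topic.
# Topic name and broker address are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value  # already deserialized to a dict
    print(event.get("order_id"), event.get("amount"))
```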