Company Description
We are a multinational team of individuals who believe that, with the right knowledge and approach, technology is the answer to the challenges businesses face today.
Since 2016, we have brought this knowledge and approach to our clients, helping them translate technology into their success.
Lead Data Engineer
We are seeking a highly skilled Lead Data Engineer with expertise in PySpark, SQL, and Python; hands-on experience with Azure Data Factory, Synapse, Databricks, and Fabric; and a solid understanding of end-to-end ETL and data warehousing principles.
Responsibilities
- Design and develop scalable data pipelines using PySpark to support analytics and reporting needs.
- Write efficient SQL and Python code to transform, cleanse, and optimize large datasets.
- Collaborate with machine learning engineers, product managers, and developers to understand data requirements and deliver solutions.
- Implement and maintain robust ETL processes to integrate structured and semi‑structured data from various sources.
- Ensure data quality, integrity, and reliability across pipelines and systems.
- Participate in code reviews, troubleshooting, and performance tuning.
- Work independently and proactively to identify and resolve data‑related issues.
- Contribute to Azure‑based data solutions, including ADF, Synapse, ADLS, and other services.
- Support cloud migration initiatives and DevOps practices.
- Provide guidance on best practices and mentor junior team members when needed.
Qualifications
- 8+ years of overall experience working with cross-functional teams.
- 3+ years of hands-on experience developing and managing data pipelines with PySpark.
- 3–5 years of experience with Azure-native services, including ADLS, ADF, Databricks, Synapse, Fabric, and Azure SQL.
- Strong programming skills in Python and SQL.
- Solid experience building ETL processes and end-to-end data modeling/data warehousing solutions.
- Self‑driven, resourceful, comfortable working in dynamic, fast‑paced environments.
- Advanced written and spoken English is a must (B2, C1, or C2 only).
- Strong communication skills.
Nice to have
- Databricks certification.
- Knowledge of DevOps, CI/CD pipelines, and cloud migration best practices.
- Familiarity with Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
- Basic understanding of SAP HANA.
- Intermediate‑level experience with Power BI.
Additional Information
- Contract type: Independent contractor (no PTO, tax deductions, or insurance).
- Location: 100% remote for nearshore candidates in Central/South America; client based in the U.S.
- Contract duration: Initially 6 months, with extension possibility.
- Working hours: Full-time, Monday to Friday, 8:00 AM–5:00 PM PST (U.S. Pacific time).
- Equipment: Contractors use their own laptop/PC.
- Start date: As soon as possible.
- Payment methods: International bank transfer, PayPal, Wise, Payoneer, etc.
- Bertoni process steps: requirements verification video interview.
- Partner/client process: CV review, technical video interview, client interviews.
- Benefits: Be part of an innovative, collaborative team with opportunities for professional development and career growth.