We are seeking an experienced Senior Data Engineer specializing in data integration and ETL/ELT development to join our dynamic team.
This role focuses on building robust data pipelines, managing complex data transformations, and implementing scalable data solutions across multiple platforms and technologies.
Key Responsibilities:
Data Integration & ETL Development:
• Design, develop, and maintain ETL/ELT processes using modern tools including Azure Data Factory, Apache Airflow, dbt, SSIS, and Databricks.
• Build and optimize data transformation workflows to ensure efficient data processing and quality.
• Implement data integration solutions across various source systems and target platforms.
• Develop automated data pipelines that support business intelligence and analytics initiatives (a minimal Airflow sketch follows this list).
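For illustration, here is a minimal sketch of the kind of automated pipeline this role builds, written as an Apache Airflow DAG. The DAG id, task logic, and sample data are hypothetical placeholders, not a description of our production stack:

```python
# A minimal, hypothetical daily ETL pipeline in Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (stubbed with sample data).
    return [{"id": 1, "amount": "42.50"}]


def transform(ti, **context):
    # Cast and clean the rows produced by the extract task (via XCom).
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "amount": float(row["amount"])} for row in rows]


def load(ti, **context):
    # Write the transformed rows to the target system (stubbed as a print).
    print(ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="daily_sales_etl",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # the `schedule` argument requires Airflow 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```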
Database Management & Development:
• Work with multiple database platforms including SQL Server, Oracle, PostgreSQL, MySQL, SQLite, and MongoDB.
• Design and implement stored procedures, functions, and complex queries for data processing.
• Perform database optimization and performance tuning.
• Administer Snowflake environments, including query development and tuning, dynamic data masking, and Snowflake Cortex features (see the masking sketch after this list).
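As a concrete illustration of the Snowflake masking work, the sketch below creates and attaches a dynamic data masking policy through snowflake-connector-python. The account settings, role, table, and column names are all assumptions, and masking policies require Snowflake Enterprise Edition or higher:

```python
# A minimal sketch of dynamic data masking in Snowflake; all object
# names are hypothetical and credentials come from environment variables.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
    database="CRM",            # hypothetical database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Reveal email addresses only to a privileged (hypothetical) role.
    cur.execute("""
        CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
        RETURNS STRING ->
        CASE
            WHEN CURRENT_ROLE() = 'DATA_ADMIN' THEN val
            ELSE '***MASKED***'
        END
    """)
    # Attach the policy to a hypothetical customers.email column.
    cur.execute(
        "ALTER TABLE customers MODIFY COLUMN email "
        "SET MASKING POLICY email_mask"
    )
finally:
    conn.close()
```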
Team Leadership & Project Management:
• Lead data integration development teams and mentor junior developers.
• Manage end-to-end data integration projects from requirements gathering to deployment.
• Define and implement company-wide standards for ETL development and data management.
• Create comprehensive technical and functional requirements documentation.
Data Governance & Quality:
• Implement Master Data Management (MDM) solutions.
• Establish data quality standards and validation processes.
• Ensure data security and compliance with organizational policies.
• Design and implement data masking and privacy protection measures.
Reporting & Analytics:
• Build reports and dashboards using Excel, Power BI, Hyperion, and Cognos.
• Support business intelligence initiatives with reliable data delivery.
• Collaborate with stakeholders to understand reporting requirements.
Required Technical Skills:
Programming & Scripting:
• Python: Advanced proficiency with pandas, NumPy, LangChain, Selenium, and BeautifulSoup.
• SQL: Expert-level SQL development and optimization.
• Jinja: Template engine for dynamic SQL generation (illustrated in the sketch after this list).
• Shell scripting for automation tasks.
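To illustrate the Jinja requirement, here is a minimal sketch of Jinja-templated SQL generation using the jinja2 library (the same templating pattern dbt builds on). The schema, table, columns, and watermark are hypothetical:

```python
# A minimal sketch of dynamic SQL generation with Jinja; all names
# are placeholders chosen for the example.
from jinja2 import Template

SQL_TEMPLATE = Template("""
SELECT {{ columns | join(', ') }}
FROM {{ schema }}.{{ table }}
{% if incremental %}
WHERE updated_at > '{{ watermark }}'
{% endif %}
""")

sql = SQL_TEMPLATE.render(
    columns=["order_id", "customer_id", "amount"],
    schema="staging",
    table="orders",
    incremental=True,
    watermark="2024-01-01",  # hypothetical high-water mark
)
print(sql)
```

In production, values like the watermark would be bound as query parameters rather than interpolated into the SQL string.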
Data Platforms & Tools:
• Cloud Platforms: Azure (Data Factory, DevOps), Databricks, Snowflake.
• ETL Tools: SSIS, Apache Airflow, dbt (data build tool).
• Databases: SQL Server, Oracle, PostgreSQL, MySQL, MongoDB.
• Data Formats: JSON, CSV, XML, YAML, Parquet, and Excel/flat text files (see the pandas sketch after this list).
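As a small example of moving data between these formats, the pandas sketch below reads JSON and writes Parquet. File paths and column names are hypothetical, and Parquet output assumes pyarrow or fastparquet is installed:

```python
# A minimal, hypothetical JSON-to-Parquet conversion with pandas.
import pandas as pd

# Load semi-structured source data (hypothetical path and schema).
df = pd.read_json("exports/orders.json")

# Light cleanup: enforce a datetime type and drop exact duplicates.
df["order_date"] = pd.to_datetime(df["order_date"])
df = df.drop_duplicates()

# Persist in a columnar format suited to downstream analytics.
df.to_parquet("staging/orders.parquet", index=False)
```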
DevOps, Version Control & Tooling:
• Git & GitHub for source code management.
• Azure DevOps/TFS for project management and CI/CD.
• SQL Server Master Data Services (MDS) for master data management.
• Linux and Windows operating systems.
Required Experience:
• 5+ years of experience in data engineering and ETL development.
• 3+ years of hands-on experience with cloud-based data platforms (Azure, Snowflake, Databricks).
• 2+ years of team leadership experience in data integration projects.
• Proven track record of managing complex data integration initiatives.
• Experience with Master Data Management implementation.
Preferred Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• Experience in financial services, insurance, or similar regulated industries.
• Certification in Azure Data Engineering or Snowflake.
• Experience with modern data architecture patterns (data mesh, lakehouse, etc.).
• Knowledge of data governance frameworks and best practices.
What We Offer:
• Opportunity to work with cutting-edge data technologies.
• Leadership role in defining data engineering standards.
• Remote work flexibility.
• Professional development opportunities.
• Collaborative team environment focused on innovation.