EPAM is a leading global provider of digital platform engineering and development services.
We are committed to having a positive impact on our customers, our employees, and our communities.
We embrace a dynamic and inclusive culture.
Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow.
No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
We are seeking a proactive and seasoned Lead Data Software Engineer to guide our team and drive the development of a cutting-edge platform for our client.
In this role, you'll not only implement and optimize technical solutions but also take on leadership responsibilities, helping shape the project strategy and mentoring team members during the migration of the back-end application.
Our client is one of the world's leading providers of reinsurance, insurance, and other forms of insurance-based risk transfer, working to make the world more resilient.
This is a unique opportunity to take the lead on a transformative project, collaborating with a talented team to deliver innovative solutions using industry-leading technologies.
Responsibilities
- Collaborate with cross-functional teams to design, build, and implement scalable data solutions on the Palantir platform, while acting as a technical leader to drive alignment on best practices
- Lead the development and optimization of Spark-based data pipelines, ensuring high performance for large-scale data processing and mentoring the team to maintain quality standards
- Establish coding guidelines and standards while writing efficient and clean Python code to support team workflows
- Utilize PySpark for advanced data manipulation and transformation within distributed frameworks; provide guidance and troubleshooting on complex tasks
- Conduct root cause analysis on internal and external data to address business questions, while driving the team to identify and execute improvement opportunities
- Define and oversee the implementation of DevOps/DataOps practices, such as Agile workflows, sprint planning, continuous integration, and deployment strategies
- Architect scalable and reusable data models and structures, ensuring they align with current and future business requirements
- Oversee relational database operations with a focus on advanced query optimization, while actively supporting team members to develop database skills
- Serve as a trusted advisor, providing strategic and technical expertise to evaluate solutions and make high-impact decisions
- Communicate effectively with key stakeholders, ensuring project alignment, scope clarity, and excellence in delivery
- Mentor junior and mid-level engineers on technical execution, project strategy, and data engineering best practices
- Promote a culture of innovation within the team, encouraging experimentation with new tools and driving continuous learning
Requirements
- 5+ years of experience in Data Software Engineering
- At least 1 year of leadership experience relevant to the role
- Proven hands-on expertise with Palantir, PySpark, and Python
- Deep knowledge of Spark data pipelines and optimization in large-scale and diverse environments
- Advanced proficiency in relational databases and SQL query optimization, with proven experience working across various database platforms
- Strong background in data analysis and analytics, with a track record of performing root cause analysis on critical business processes and transformations
- Demonstrated expertise in implementing and overseeing DevOps/DataOps methodologies, including Agile project management, sprint planning, and CI/CD architectures
- Ability to translate detailed business requirements into strategic technical decisions, while fostering a systematic and solutions-driven approach
- Strong leadership and interpersonal communication skills, with the ability to build positive relationships across teams and stakeholders
- Excellent command of English (B2+ level), both written and spoken, with a strong emphasis on technical communication skills
Nice to have
- Familiarity with insurance, reinsurance, or financial industries
- Hands-on experience with Azure Databricks
- Experience working with Azure Data Services
We offer
- International projects with top brands
- Collaboration with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn