Requirements:
- 3+ years of experience in backend development with Python 3.
- Advanced knowledge of SQL (PostgreSQL preferred).
- Strong understanding of API development (FastAPI, Flask, or similar).
- Understanding of data modeling fundamentals.
- Knowledge of Git and collaborative development workflows.
Nice to have:
- Experience with Polars, DuckDB, or similar data analysis tools.
- Experience with Redis or similar caching.
- Experience with GCP.
- Familiarity with dimensional modeling/data warehouse concepts.
- Knowledge of TypeScript.
- Background in financial/operational data.
Responsibilities:
- Design pipelines that connect to over 100 systems (APIs, CSV, ODBC).
- Build scalable connectors for ERPs, CRMs, HRIS, and production systems.
- Use Polars and DuckDB to process diverse datasets.
- Combine operational and financial data into KPIs.
- Enhance the multidimensional time-series modeling engine.
- Build high-performance APIs (FastAPI, Arrow Flight).
- Implement caching and enrichment layers.
- Develop anomaly detection and predictive analytics.
- Generate intelligent mapping suggestions that link operational drivers to financial outcomes.
Your first 90 days:
- Week 1–2: Ship your first small improvement.
- Month 1: Build or improve a data connector.
- Month 2: Own a performance optimization.
- Month 3: Lead design of a new AI-powered feature.