Python Developer — Data Engineering & AI Automation
(GCP, VoIP/Telephony, LLMs & MLOps)
TL;DR: Build production-grade data pipelines and real-time voice automation. You'll ship Python services on GCP (BigQuery, Dataflow, Pub/Sub, Cloud Functions/Composer/Vertex), integrate Twilio/Asterisk/Vicidial, and operationalize LLM + TTS/STT workflows with strong data/infra reliability.
Why this role matters
Our products power AI-driven communications (including ringless voicemail automation) at scale.
Your work will turn messy call events, transcripts, and analytics into reliable, low‑latency systems that drive customer outcomes and revenue.
What you'll do (Outcomes)
- Ship reliable data pipelines in GCP (BigQuery, Dataflow, Pub/Sub, Cloud Functions, Composer) with SLOs for latency, throughput, and cost per GB processed.
- Build Python backends/APIs (FastAPI/Flask), codify contracts with Pydantic, manage persistence via SQLAlchemy + Alembic, and keep schemas and migrations healthy across environments.
- Own VoIP integrations: call routing, voicemail drops, and campaign automations with Twilio (Voice, Messaging, Studio, Conversations, TaskRouter); customize and scale Asterisk/Vicidial for outbound dialing.
- Operationalize voice AI: integrate TTS/STT providers (Cartesia, ElevenLabs, Google, AWS Polly), wire real-time STT → intent → TTS loops, and measure WER and latency.
- Enable LLM workflows: build dataset-prep, evaluation, and inference pipelines; support Vertex AI jobs, registries, CI/CD, monitoring, and rollbacks.
- Make systems observable: metrics, traces, and logs (e.g., OpenTelemetry, Cloud Monitoring) with dashboards and actionable alerts.
- Collaborate with data/ML/DevOps/product on architecture, reviews, and production readiness.
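For a sense of what "measure WER" means in the voice-AI work above: word error rate is just a word-level edit distance divided by the reference length. A minimal stdlib sketch (function name and whitespace tokenization are illustrative, not our production code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word Error Rate: word-level Levenshtein distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting i words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting j words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + sub,  # substitution (or match)
            )
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```

Production pipelines would also normalize case, punctuation, and numerals before scoring, but the core metric is this simple.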
Tech you'll touch
Python, FastAPI/Flask, Pydantic, SQLAlchemy, Alembic, PostgreSQL/MySQL
GCP: BigQuery, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, Composer (Airflow), Vertex AI, Cloud Storage
VoIP: Twilio (Voice/Messaging/Conversations/Studio/TaskRouter), Asterisk/Vicidial, SIP/RTP
Voice/LLM: Cartesia, ElevenLabs, Google TTS/STT, AWS Polly, Whisper (as applicable)
Platform/Tooling: Docker, Terraform, GitHub Actions, pytest, Ruff/mypy, OpenTelemetry, Prometheus/Grafana/Cloud Monitoring
Must‑have skills
- Python expertise for APIs, services, and data pipelines.
- GCP experience (BigQuery, Dataflow, Pub/Sub, Storage, Composer, Vertex AI).
- Pydantic for validation; SQLAlchemy for ORM; Alembic for migrations.
- Strong SQL and relational schema design.
- Practical VoIP/telephony knowledge (SIP/RTP, call flows) and Twilio APIs.
Nice‑to‑have skills
- Cartesia/ElevenLabs and other TTS/STT providers.
- Asterisk/Vicidial customization and scaling (e.g., AMI/ARI).
- LLM integration (GPT/PaLM/OSS), prompt/latency optimization in real‑time paths.
- MLOps (Vertex pipelines, model registry, evaluation/monitoring).
- Event streaming (Kafka, Pub/Sub).
- Security/compliance awareness for voice/data (e.g., PII redaction, call recording governance; STIR/SHAKEN/TCPA considerations where relevant).
How we work
- Remote/hybrid flexibility; async‑first with clear ownership and fast feedback loops.
- Production‑minded: testing, canaries, rollbacks, and on‑call rotation shared across the team.
- Ship small, measure impact, iterate.
Success looks like (30/60/90)
- 30 days: Production access and dashboards; ship a small service or pipeline; add unit/integration tests around a Twilio webhook or Pub/Sub consumer.
- 60 days: Replace a fragile ETL with Dataflow; cut pipeline cost/latency by ≥25%; harden Alembic migrations with a blue/green strategy.
- 90 days: Launch a real-time STT → LLM → TTS microservice with SLOs.
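To make the SLO idea concrete, a latency SLO check usually reduces to "is the p95 at or below a target?" A toy stdlib sketch (the 800 ms default is an arbitrary placeholder, not a committed number):

```python
import statistics


def p95_ms(latencies_ms: list[float]) -> float:
    """95th-percentile latency using stdlib quantiles (99 cut points)."""
    return statistics.quantiles(latencies_ms, n=100)[94]


def meets_slo(latencies_ms: list[float], target_ms: float = 800.0) -> bool:
    """True when observed p95 latency is at or below the target."""
    return p95_ms(latencies_ms) <= target_ms
```

In production you would track this via Cloud Monitoring or Prometheus histograms rather than raw samples, but the decision rule is the same.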
Compensation & benefits
- Competitive salary + performance incentives.
- Remote/hybrid, learning budget (GCP certs, ML/telephony), modern tooling, and clear growth paths.
Hiring process (what to expect)
- Intro call (role fit, past systems you've owned).
- Technical deep-dive (data pipeline and telephony scenarios).
- Practical exercise (see below) or code walkthrough.
- Panel with ML/DevOps/product on cross-functional design.
- Offer.
Practical exercise (2–4 hours, take‑home or live)
- Build a small FastAPI service that:
  - Validates an inbound Twilio Voice webhook with Pydantic,
  - Publishes events to Pub/Sub,
  - Streams to BigQuery via Dataflow (template or Python SDK),
  - Manages tables/migrations with Alembic,
  - Includes pytest tests and basic OpenTelemetry traces.
- Bonus: integrate a TTS/STT provider and measure E2E latency.
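A hint for the webhook-validation step: Twilio signs each request by HMAC-SHA1-ing the full request URL plus the POST parameters (sorted by key, concatenated as key+value) with your auth token, and base64-encodes the result into the `X-Twilio-Signature` header. A stdlib-only sketch of the check (in real code you'd typically reach for `RequestValidator` from the official `twilio` package instead):

```python
import base64
import hmac
from hashlib import sha1


def validate_twilio_signature(auth_token: str, url: str,
                              params: dict[str, str], signature: str) -> bool:
    """Recompute X-Twilio-Signature and compare it in constant time."""
    # Full URL followed by POST params sorted by key, each as key+value
    payload = url + "".join(k + v for k, v in sorted(params.items()))
    digest = hmac.new(auth_token.encode(), payload.encode(), sha1).digest()
    expected = base64.b64encode(digest).decode()
    return hmac.compare_digest(expected, signature)
```

Rejecting requests that fail this check (with a 403) is what keeps your webhook from processing spoofed call events.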
Equal opportunity
We welcome applicants of all backgrounds and identities.
If you're excited by the role but don't meet 100% of the bullets, please apply; skills grow quickly in our environment.