Staff Software Engineer
Snowplow Analytics
Software Engineering
Poland, OH, USA
About the Team
The Snowplow Signals team builds the intelligence layer on top of Snowplow's behavioral data infrastructure. Signals enables customers to derive real-time scores, predictions, and AI-powered interventions directly from their behavioral event streams. The team works at the intersection of data engineering, machine learning, and product development — shipping capabilities that make Snowplow's behavioral data actionable in real time.
This is a high-impact, high-ownership role. You will work alongside a small, senior team and have direct influence over the architecture, ML model design, and roadmap of one of Snowplow's most strategically important product areas.
The Role
As a Staff Software Engineer on the Snowplow Signals team, you will be a technical leader responsible for the full lifecycle of ML-powered features — from prototype to production to ongoing operation. You will design, build, release, and support machine learning models that process real-time behavioral event streams at scale, and own the SQL data models that power downstream analytics and customer-facing Signals outputs.
At the Staff level, we expect you to operate across team boundaries, set technical direction, mentor engineers, and drive engineering excellence through architecture decisions, code quality, and delivery practices.
Required
Proven experience building, releasing, and supporting machine learning models in production — not just prototype or research environments
Strong command of SQL and experience supporting SQL-based data models at scale
Hands-on experience with dbt (data build tool) for data modeling, testing, and pipeline management
Proficiency in Python and/or Scala for ML and data engineering workloads
Experience with cloud-native data infrastructure (AWS, GCP, or Azure) and streaming or event-driven architectures
Strong software engineering fundamentals: version control, CI/CD, testing, code review, and observability
Excellent written and verbal communication; able to explain complex technical trade-offs to varied audiences
Nice to Have
Experience with agentic coding workflows and LLM tooling such as Claude Code, GitHub Copilot, or similar AI-assisted development environments
Familiarity with behavioral event data, clickstream analytics, or customer data platforms
Contributions to open source projects, technical writing, or public engineering work
Experience working in early-stage or scaling product companies where ownership and autonomy are the norm
Machine Learning & AI Engineering
Design, train, deploy, and operate machine learning models in production environments, with a focus on reliability, performance, and observability
Build real-time and batch ML pipelines on top of Snowplow's behavioral event streams
Own the operational health of ML models, including monitoring, retraining pipelines, and incident response
Define and enforce ML model standards — versioning, evaluation frameworks, feature engineering, and deployment practices
Collaborate with Product and Data teams to translate business objectives into ML model specifications
Data Engineering
Design and maintain SQL data models that power Signals outputs and customer-facing analytics
Own dbt model development, testing, documentation, and deployment within the Signals domain
Ensure data model correctness, performance, and scalability as customer event volumes grow
Partner with Analytics Engineers and data consumers to evolve schemas and enrichments