Snowplow

Data Engineer

London/Hybrid - Europe/Remote
Mid level

#LI-Remote

About Snowplow:

Snowplow is the global leader in customer data infrastructure for AI, enabling every organization to transform raw behavioral data into governed, high-fidelity fuel for AI-powered applications—including advanced analytics, real-time personalization engines, and AI agents.

Digital-first companies like Strava, HelloFresh, Auto Trader, Burberry, and DPG Media use Snowplow to collect and process event-level data in real time, delivering it securely to their warehouse, lake, or stream, and integrating deep customer context into their applications.

Thousands of companies rely on Snowplow to uncover customer insights, predict customer behaviors, hyper-personalize customer experiences, and detect fraud in real time.

The Opportunity:

You’ll be joining the AI team at an exciting moment, as we are leading the charge in building Snowplow Signals, our new solution for enabling real-time, AI-powered customer experiences. Your work will contribute to the infrastructure that powers these capabilities, unlocking value from behavioral data in real time.

Alongside this, you’ll work on the foundational components that continue to make Snowplow powerful and flexible for data teams: developing and maintaining dbt packages that help customers accelerate their time to insight.
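
Later in this posting, Python is described as the tool we use for auto-generating data models. Purely as a hedged illustration of that idea, the sketch below renders a dbt model file from a small declarative spec; the spec, columns, and model name are invented and are not Snowplow's actual tooling.

```python
# Hypothetical sketch: generating a dbt model (SQL + Jinja) from a spec.
# The spec, table, and column names are invented for illustration.
from pathlib import Path

spec = {
    "name": "snowplow_sessions",
    "source": "atomic.events",          # enriched event table (illustrative)
    "session_key": "domain_sessionid",  # Snowplow's session identifier
    "metrics": {
        "page_views": "count(*)",
        "engaged_time_s": "sum(engaged_time_in_s)",  # invented column
    },
}

def render_model(spec: dict) -> str:
    """Render an incremental dbt model from the declarative spec."""
    metric_lines = ",\n    ".join(
        f"{expr} as {alias}" for alias, expr in spec["metrics"].items()
    )
    header = (
        "{{ config(materialized='incremental', unique_key='"
        + spec["session_key"]
        + "') }}"
    )
    body = (
        f"select\n"
        f"    {spec['session_key']},\n"
        f"    {metric_lines}\n"
        f"from {spec['source']}\n"
        f"group by 1"
    )
    return header + "\n\n" + body + "\n"

# Write the generated model where dbt would pick it up.
out = Path("models") / f"{spec['name']}.sql"
out.parent.mkdir(exist_ok=True)
out.write_text(render_model(spec))
```

In practice a generator like this would sit behind tests and CI, in line with the engineering practices described below.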

This is a role for someone who enjoys bridging data collection, transformation through SQL-based models, and high-level intelligence — enabling real-time pipelines that support analytics, personalization, and emerging agentic experiences.

What you’ll be doing:

Developing and maintaining a suite of production SQL data models used by our customer base and the community, focusing primarily on dbt for packaging and execution.
Building out our offering around data modeling. You won’t just work on the data models themselves: you’ll work closely with Product and the wider Engineering team to shape the way we collect data via our trackers to build better data models, and drive what data model tooling we provide as part of our commercial offering.
Building and optimizing real-time and batch data pipelines that power personalization, recommendation engines, and predictive models, from streaming ingestion and transformation using Benthos to materializing features for agentic use cases and customer-facing AI systems (see the pipeline sketch after this list).
Supporting our prospect- and customer-facing teams by showcasing the possibilities with Snowplow data, such as powering personalisation and recommendation systems, or developing advanced models (marketing attribution, lifetime value, etc.).
Taking an active part in deciding what we build to help our customers get more value out of Snowplow, and how we deliver it. You’ll bring a different perspective and we’ll want your input!
Developing and productizing data models with a focus on scalability, performance, and maintainability, while building an in-depth understanding of cloud data warehouses and common web and mobile analytics use cases.
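
The streaming work above is done with Benthos, which is configured declaratively in YAML rather than written as application code. Purely to illustrate the ingest-transform-deliver shape of such a pipeline, here is a Python stand-in using kafka-python; the topics, broker address, and derived fields are all hypothetical.

```python
# Conceptual stand-in for a streaming transform step. The pipelines
# described above use Benthos (declarative YAML), not hand-rolled Python;
# topics, broker, and fields here are hypothetical.
import json

from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

consumer = KafkaConsumer(
    "enriched-events",  # hypothetical input topic of enriched behavioral events
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda obj: json.dumps(obj).encode("utf-8"),
)

for message in consumer:
    event = message.value
    # Transform: reduce a raw event to a per-user feature update that a
    # personalization or recommendation system could consume downstream.
    feature_update = {
        "user_id": event.get("domain_userid"),
        "event_name": event.get("event_name"),
        "occurred_at": event.get("collector_tstamp"),
    }
    producer.send("user-feature-updates", feature_update)  # hypothetical topic
```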

We’d love to hear from you if:

SQL and dbt are your thing. You have mastered SQL, including manipulating large data sets and performance tuning.
You're adept with both batch and streaming data processing. You have experience building streaming pipelines using tools like Benthos, enabling real-time data ingestion, transformation, and delivery across various systems.
You understand feature engineering and management. You're familiar with tools like Feast for defining, materializing, and serving features in both real-time and batch contexts (see the sketch after this list).
You have extensive experience with Python, which we use for auto-generating data models.
You are not new to engineering. You use CI/CD and Git source control as part of your daily job. You have experience with testing frameworks.
You are a proactive learner. You're eager to expand your software engineering knowledge and adapt to new technologies essential for automating models and advancing our engineering practices.
You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery).
You have experience with a modern data modeling technology such as dbt.
You document and communicate clearly. Some experience with technical content writing would be a plus.
You are excited to work autonomously. You can drive technical initiatives from discovery through to delivery.
You know we can’t do everything today. You’ll be pragmatic and balance our speed of delivery with our commitment to providing a reliable and trusted service to customers.
You want to join a remote team that depends on expert collaborators to work effectively. You’ll be a great communicator and enjoy working closely with the team.
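
As referenced in the feature-management point above, here is a minimal Feast sketch of defining and then serving a feature; the entity, feature names, and source file are invented for illustration, and it assumes a Feast feature repo has already been initialised.

```python
# Minimal Feast sketch (invented names; assumes an initialised feature repo).
from datetime import timedelta

from feast import Entity, FeatureStore, FeatureView, Field, FileSource
from feast.types import Float32, Int64

user = Entity(name="user", join_keys=["user_id"])

# Offline source the features are materialised from (hypothetical file).
engagement_source = FileSource(
    path="data/user_engagement.parquet",
    timestamp_field="event_timestamp",
)

user_engagement = FeatureView(
    name="user_engagement",
    entities=[user],
    ttl=timedelta(days=1),
    schema=[
        Field(name="page_views_7d", dtype=Int64),
        Field(name="engaged_time_7d", dtype=Float32),
    ],
    source=engagement_source,
)

# At serving time: fetch fresh feature values for one user from the
# online store, e.g. to feed a real-time personalization model.
store = FeatureStore(repo_path=".")
features = store.get_online_features(
    features=["user_engagement:page_views_7d", "user_engagement:engaged_time_7d"],
    entity_rows=[{"user_id": "abc-123"}],
).to_dict()
```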

Snowplow is dedicated to building and supporting a brilliant, diverse and hugely inclusive team. We don't discriminate on the basis of gender, race, religion or belief, disability, age, marital status or sexual orientation. Whatever your background may be, we welcome anyone with talent, drive and emotional intelligence.

Top Skills

Benthos
BigQuery
Databricks
dbt
Python
Redshift
Snowflake
SQL
