As a Data Engineer at Spotify's User Platform, you'll work to produce, process, and analyze data to generate insights for user identity and access patterns. You'll contribute to data modeling, design data pipelines, enhance data quality, and collaborate with engineers and data scientists to build solutions that aid strategic decisions and improve user acquisition and retention.
We are seeking a dedicated and engaged Data Engineer to join User Platform, a studio within Platform Mission at Spotify. Our team plays a pivotal role in generating trusted user data—some of the most critical metrics at Spotify—used for strategic decision-making, supporting global growth initiatives, and essential internal and external reporting.
We want to enable anyone at Spotify to understand users’ identity and access patterns effectively, so we can shape better experiences in user acquisition and retention, strengthen account security, and improve product decision-making across Spotify globally. Achieving this requires a significant investment in enhancing the health and structure of our identity and access management (IAM) data.
What You'll Do
- Explore new ways of producing, processing, and analyzing data to unlock insights into both our users and our product features
- Contribute to data modeling efforts with a data-as-a-product approach, expanding our current metrics to meet evolving business requirements
- Design, develop, and test robust data pipelines capable of processing billions of data points, using groundbreaking data processing frameworks, technologies, and platforms (see the illustrative sketch after this list)
- Enhance data quality through testing, tooling, and continuous performance evaluation
- Work with other software engineers, data scientists, and decision-makers, such as engineering and product managers, to build solutions and gain novel insights
- Act as the bridge between our Engineering and Insights teams, working on data cataloging and management and building and maintaining crucial data pipelines
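To give a flavor of the pipeline work described above, here is a minimal, hypothetical Apache Beam sketch (Python SDK) that counts login events per country from newline-delimited JSON records. The bucket paths and field names are illustrative assumptions only, not Spotify's actual data or tooling.

```python
# Minimal, hypothetical Beam pipeline: count login events per country.
# Paths and field names are illustrative assumptions.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions()  # in practice: runner, project, region, etc.
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/login_events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "KeyByCountry" >> beam.Map(lambda event: (event["country"], 1))
            | "CountPerCountry" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda country, count: f"{country},{count}")
            | "WriteCounts" >> beam.io.WriteToText("gs://example-bucket/output/login_counts")
        )


if __name__ == "__main__":
    run()
```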
Who You Are
- You have a strong foundation in software engineering and are proficient in Python, Java, and SQL, which are essential for this role. Experience with Scala and familiarity with cloud solutions are a plus.
- You have experience with JVM-based data processing frameworks such as Beam, Spark, and Flink. You understand their APIs and can debug their internals
- You care deeply about bringing software to life in an agile way, about reliability, and about responsible experimentation, and you are a strong advocate for engineering best practices such as continuous integration and delivery.
- You have a proven understanding of data modeling, data access, and data storage, as well as caching, replication, and optimization techniques.
- You understand the value of collaboration within teams. You are comfortable with asynchronous communication and able to work independently while always sharing context with your team members.
- Experience with Backend Engineering is a plus.
- Working knowledge of Kubernetes (GKE) is a plus.
Where You'll Be
- This role is based in London (UK) or Stockholm (Sweden)
- We offer you the flexibility to work where you work best! There will be some in-person meetings, but the role still allows for flexibility to work from home.
Top Skills
Beam
Cloud Solutions
Flink
GKE
Java
Kubernetes
Python
Scala
Spark
SQL