
SatoshiLabs

Data Engineer

Remote
28 Locations
Mid level

Trezor invented the world’s first hardware wallet and created an entirely new industry. We set industry standards including Passphrases, Shamir Backup and Recovery seeds. We value transparency with open-source hardware and software, and we’re dedicated to securing individual freedoms and privacy. We are a self-funded, independent innovator.

We’re looking for an experienced Data Engineer who will help us maintain existing systems (and create new ones) for collecting, validating, and preparing high-quality data. Your main responsibility will be building pipelines that bring data together from different source systems and make it easily accessible to our analysts and other stakeholders. Today we’re a company of over 100 people shipping Trezor hardware wallets globally, and your work will support most of our teams: Product, E-shop, Logistics, Finance, Customer Support and HR.

Let’s have a look at some details.

👉 What will your duties be?

  • Collaborate closely with data analysts and other stakeholders to understand and meet data needs

  • Implement data requirements from extraction through processing to deployment, including troubleshooting and maintenance

  • Build, test, and maintain data pipelines to create reliable data sources for further use (see the sketch after this list)

  • Assess stakeholder needs through communication and system evaluations, modifying data sources to ensure successful integration

  • Provide consultation and advice on data collection and processing methods and on optimising the use of existing systems
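
To give a flavour of the pipeline work described above, here is a minimal sketch in Python using the requests library. The endpoint, field names and pagination scheme are hypothetical and simply stand in for whichever source system is being integrated.

    import requests

    API_URL = "https://api.example.com/v1/orders"  # hypothetical source system

    def extract(updated_since: str):
        """Pull raw records from the (hypothetical) REST endpoint, page by page."""
        page = 1
        while True:
            resp = requests.get(
                API_URL,
                params={"updated_since": updated_since, "page": page},
                timeout=30,
            )
            resp.raise_for_status()
            batch = resp.json()
            if not batch:  # an empty page means everything has been read
                return
            yield from batch
            page += 1

    def transform(record: dict) -> dict:
        """Keep only the fields analysts need and normalise their types."""
        return {
            "order_id": int(record["id"]),
            "amount_eur": round(float(record["amount"]), 2),
            "created_at": record["created_at"],  # ISO 8601 string, parsed downstream
        }

    if __name__ == "__main__":
        rows = [transform(r) for r in extract("2024-01-01")]
        print(f"prepared {len(rows)} rows for loading")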

✍️ What makes you the perfect candidate?

  • At least 3 years of experience in a similar position, ideally with a wide range of collected data types

  • Proficiency in SQL and relational databases

  • Good knowledge of Python

  • Experience with ETL/ELT processes and tools for data integration and management (e.g., Keboola Connection, Google Cloud)

  • Experience with API connectors

  • Familiarity with cloud-based data warehouses and analytics platforms (e.g., Snowflake, BigQuery); see the sketch after this list

  • (Optional) Experience with app and web tracking tools

  • (Optional) Experience with BI tools (e.g., Tableau) and the ability to create interactive dashboards

  • (Optional) Experience with AI in data engineering

  • Strong problem-solving skills and attention to detail, independence, reliability and thoroughness

  • Great (clear, factual) communication and teamwork abilities
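
And for the warehouse side, a minimal sketch of loading prepared rows into BigQuery (one of the platforms named above) with the google-cloud-bigquery client. The project, dataset and table ID are placeholders, the table is assumed to already exist with a matching schema, and authentication is assumed to come from Application Default Credentials.

    from google.cloud import bigquery

    # Hypothetical destination table; project, dataset and schema are placeholders.
    TABLE_ID = "my-project.analytics.orders"

    def load(rows: list[dict]) -> None:
        """Stream prepared rows into BigQuery and surface any per-row errors."""
        client = bigquery.Client()  # uses Application Default Credentials
        errors = client.insert_rows_json(TABLE_ID, rows)
        if errors:
            raise RuntimeError(f"BigQuery rejected some rows: {errors}")

    load([{"order_id": 1, "amount_eur": 49.00, "created_at": "2024-01-02T10:15:00Z"}])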

🤝 What will you get in return?

  • Unique opportunity to be a part of a brand that has revolutionized the crypto industry more than once

  • Possibility to receive part of your compensation in bitcoin

  • Flexible working hours, as well as the possibility of working from home

  • Budget for professional development (training programs, courses, and workshops of your choice)

  • Renovated offices (including gym, football table, billiards, PlayStation and 3D printer)

  • Other benefits include a MultiSport card, company mobile phone tariff, and more

  • Free on-site parking

👋 Sounds good? We want to hear from you: just submit your CV along with a cover letter. We’ll get in touch as soon as we’ve reviewed your application, most likely within a week.

Top Skills

APIs
BigQuery
ELT
ETL
GCP
Keboola Connection
Python
Snowflake
SQL
Tableau
