
Keyrock

Data Architect (Trading)

Posted 19 Days Ago
In-Office or Remote
29 Locations
Mid level

Data Architect

About Keyrock

Since our beginnings in 2017, we've grown to be a leading change-maker in the digital asset space, renowned for our partnerships and innovation. 

Today, we rock with over 180 team members around the world. Our diverse team hails from 42 nationalities, with backgrounds ranging from DeFi natives to PhDs. Predominantly remote, we have hubs in London, Brussels, Singapore and Paris, and host regular online and offline hangouts to keep the crew tight.

We trade on more than 80 exchanges and work with a wide array of asset issuers. As a well-established market maker, our distinctive expertise has driven rapid expansion. Today, our services span market making, options trading, high-frequency trading, OTC, and DeFi trading desks.

But we’re more than a service provider. We’re an initiator. We're pioneers in adopting the Rust programming language for our algorithmic trading, and champions of its use in the industry. We support the growth of Web3 startups through our Accelerator Program. We upgrade ecosystems by injecting liquidity into promising DeFi, RWA, and NFT protocols. And we push the industry's progress with our research and governance initiatives.

At Keyrock, we're not just envisioning the future of digital assets. We're actively building it.

Position Overview

The Data Architect is responsible for designing, implementing, and maintaining the organization's data architecture and strategy, ensuring that data is collected, stored, and processed efficiently and securely to support business intelligence, data analytics, and machine learning operations (MLOps).

Key Responsibilities
  • Designing Data Architecture: Plan and implement a robust, scalable data architecture that integrates data from various sources and supports diverse analytical needs, while optimizing costs and meeting business requirements.

  • Implementing Data Engineering Pipelines: Design and develop data pipelines for data extraction, transformation, and loading (ETL) processes, ensuring data quality and consistency.

  • Enabling Data Intelligence and Analytics: Build and maintain data warehouses, data marts, and data lakes to support business intelligence and data analytics initiatives.

  • Supporting MLOps Practices: Collaborate with data scientists and machine learning engineers to design and implement data infrastructure and processes that support machine learning model development, deployment, and maintenance.

  • Ensuring Data Security and Compliance: Implement security measures, policies, and procedures to safeguard data privacy and comply with relevant regulations.

  • Data Governance and Management: Establish and enforce data governance policies and standards to ensure data quality, integrity, and accessibility.

  • Collaborating with Cross-Functional Teams: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and translate them into technical solutions.

  • Staying Abreast of Technological Advancements: Keep up to date with emerging technologies and trends in data architecture, data engineering, and MLOps to identify opportunities for improvement and innovation.

  • Optimizing Data Performance: Monitor and analyze data processing performance, identify bottlenecks, and implement optimizations to enhance efficiency and scalability.

  • Documentation and Knowledge Sharing: Create and maintain comprehensive documentation of data architecture, models, and processing workflows.

Technical Requirements
  • Extensive experience in data architecture design and implementation.

  • Strong knowledge of data engineering principles and practices.

  • Expertise in data warehousing, data modelling, and data integration.

  • Experience in MLOps and machine learning pipelines.

  • Proficiency in SQL and data manipulation languages.

  • Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and ClickHouse) and cloud-based infrastructure on AWS.

Education & Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent experience.

  • Preferred certifications:

    • AWS Cloud Data Engineer

    • AWS Machine Learning Ops Engineer

Leadership & Collaboration
  • Passion for building scalable, reliable, and secure systems in a fast-paced environment.

  • Ability to translate complex technical concepts into clear, actionable insights for technical teams.

  • Strong interpersonal skills with the ability to work effectively across cross-functional teams.

  • Excellent problem-solving and analytical skills.

Our recruitment philosophy

We value self-awareness and strong communication skills in our recruitment process. We seek fiercely passionate people who understand themselves and their career goals. We're after those with the right skills who have made a conscious choice to join our field. The perfect fit? A crypto enthusiast who’s driven, collaborative, acts with ownership, and delivers solid, scalable outcomes.

Our offer

  • Competitive salary package

  • Autonomy in your time management thanks to flexible working hours and the opportunity to work remotely 

  • The freedom to create your own entrepreneurial experience by being part of a team of people in search of excellence 

As an employer, we are committed to building a positive and collaborative work environment. We welcome employees of all backgrounds, and hire, reward, and promote entirely based on merit and performance.

Due to the nature of our business and external requirements, we perform background checks on all potential employees, passing which is a prerequisite to join Keyrock.

https://keyrock.com/careers/

Top Skills

Apache Arrow
Apache Iceberg
Spark
AWS
ClickHouse
Rust
SQL
