
Input Output

Data Engineer - Technical Intelligence

Reposted 2 Days Ago
Remote
Hiring Remotely in United Kingdom
Mid level
You will be responsible for maintaining and setting up automated data ingestion pipelines, simplifying existing data pipelines, designing a cloud-native data warehouse, and collaborating with data scientists for ML model deployment. Key tasks include data storage, data manipulation, and cloud infrastructure management.

Description

Who are we?

IOG is a technology company focused on blockchain research and development. We are renowned for our scientific approach to blockchain development, emphasizing peer-reviewed research and formal methods to ensure security, scalability, and sustainability. Our projects include decentralized finance (DeFi), governance, and identity management, aiming to advance the capabilities and adoption of blockchain technology globally.

We invest in the unknown, applying our curiosity and desire for positive change to everything we do. By fueling creativity, innovation, and progress within our teams, we design products and services for people to be fearless, to be changemakers.

What the role involves:

As a Data Engineer, you are part of the Technical Intelligence (TechInt) team. The team’s main function is reconnaissance of the blockchain industry, feeding the company insight on new trends and projects. TechInt has automated this reconnaissance by building a data lake and applying machine learning: data is harvested from a variety of sources and fed into downstream systems that present it as reports.

You are responsible for setting up and maintaining data solutions and services. A key part of the role is helping to mature the data ingestion pipeline and its surrounding processes. You will also be expected to build a state-of-the-art, cloud-native data warehouse. Day to day, you will do a mix of data engineering and cloud infrastructure management.

  • Develop and maintain automated data ingestion (API or crawling) pipelines from source code repositories, social media, and on-chain analytics (a minimal sketch follows this list).
  • Simplify existing data pipelines - re-architecting where necessary.
  • Research existing datasets to figure out their relevance - and remove irrelevant data pipelines and sources.
  • Design a data warehouse that can be queried by analysts and APIs, and that will serve as a data backend for a reporting web application.
  • Collaborate with data scientists to operationalize ML models and deploy them into production environments.
  • Work closely with leadership to understand and define requirements, ensuring alignment with the department’s strategy and roadmap.
  • Collaborate with a Data Scientist and an Intelligence Engineer to implement technical solutions that meet project goals.
  • Ensure systems are functional, available, and carefully monitored for continuous performance and reliability.
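
To make the ingestion work above concrete, the sketch below shows one minimal, hypothetical shape such a pipeline step could take: an AWS Lambda handler that pulls JSON from a source API and lands the raw payload in S3, partitioned by ingestion date. The endpoint, bucket name, and key layout are placeholder assumptions for illustration, not IOG's actual setup.

    # Illustrative sketch only: the API endpoint, bucket, and key layout below
    # are placeholders, not IOG's real pipeline.
    import urllib.request
    from datetime import datetime, timezone

    import boto3

    S3_BUCKET = "example-raw-data-lake"          # placeholder bucket
    SOURCE_URL = "https://api.example.com/feed"  # placeholder source API


    def handler(event, context):
        # Pull the latest payload from the source API.
        with urllib.request.urlopen(SOURCE_URL, timeout=30) as resp:
            payload = resp.read()

        # Partition raw landings by ingestion date so downstream jobs
        # (e.g. Glue or Athena) can prune on dt=YYYY-MM-DD.
        now = datetime.now(timezone.utc)
        key = f"raw/example_feed/dt={now:%Y-%m-%d}/{now:%H%M%S}.json"

        boto3.client("s3").put_object(Bucket=S3_BUCKET, Key=key, Body=payload)
        return {"status": "ok", "s3_key": key, "bytes": len(payload)}

In practice a step like this would typically sit behind a scheduler (for example EventBridge) and feed curated layers built with Glue, Athena, or Redshift, but those pieces are beyond the scope of this sketch.
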
Requirements

Who you are:

  • Minimum 3–4 years of recent, hands-on experience with AWS cloud services:
  • Knowledge of Infrastructure as code (such as Terraform, AWS CloudFormation, Python AWS CDK).
  • Knowledge of managing AWS cloud services (such as S3, Redshift, Lambda, Batch, Glue, Athena, etc.).
  • Hands-on experience with Docker for containerizing data applications.
  • Knowledge of relational databases and writing highly optimized SQL, including data transformations, complex joins, and performance tuning.
  • Strong proficiency in Python programming, including PySpark for data transformation (a brief example follows this list).
  • Ability to communicate well both verbally and in writing, with both technical and non-technical partners. Professional English.
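
As a rough illustration of the PySpark and partitioned-storage skills listed above, the sketch below reads raw JSON from an S3 landing area, applies a simple transformation, and writes date-partitioned Parquet back out for querying with engines such as Athena. The bucket paths and the 'id' column are assumptions made purely for the example.

    # Illustrative PySpark sketch: raw JSON -> curated, partitioned Parquet.
    # Bucket names, prefixes, and columns are placeholders, not real IOG data.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-raw-to-curated").getOrCreate()

    raw = spark.read.json("s3://example-raw-data-lake/raw/example_feed/")

    curated = (
        raw
        .withColumn("ingested_at", F.current_timestamp())
        .withColumn("dt", F.to_date(F.col("ingested_at")))
        .filter(F.col("id").isNotNull())   # assumes the feed exposes an 'id'
        .dropDuplicates(["id"])
    )

    (
        curated.write
        .mode("overwrite")
        .partitionBy("dt")
        .parquet("s3://example-curated-data-lake/example_feed/")
    )

Partitioning the curated output by date is a common choice because it lets query engines prune whole partitions instead of scanning the full dataset.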

It would be beneficial if you have the following:

  • BSc/MSc in a Computer Science field, or equivalent practical experience. 
  • Knowledge of big data processing platforms (such as Databricks) and data manipulation libraries in Python (such as Pandas, Polars).
  • Knowledge of Docker container orchestration (such as Kubernetes, ECS).
  • Knowledge of Continuous Integration and Continuous Delivery (CI/CD) pipelines (such as GitHub Actions, Travis, Jenkins).
  • Knowledge of blockchain on-chain data representation.

Are you an IOGer?

Do you find yourself questioning the status quo? Do you tinker with ideas and long to turn those ideas into solutions? Are you able to spark thoughtful debates, bringing out the inquisitiveness in others? Does the promise of continuous growth excite you? Then get ready to reimagine everything you thought wasn’t possible, because that’s what it means to be an IOGer - we don’t set limits, we break them.

Benefits
  • Remote work
  • Laptop reimbursement
  • New starter package to buy hardware essentials (headphones, monitor, etc.)
  • Learning & Development opportunities
  • Competitive PTO 

At IOG, we value diversity and always treat all employees and job applicants based on merit, qualifications, competence, and talent. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
