The Data Engineer will develop and monitor BAU data pipelines, enhance existing processes, build new pipelines, and contribute to the organisation's Data Lake technology, all within an Agile environment.
You will deliver
- Hands-on development on the Nexus Platform
- Monitoring daily BAU data pipelines to ensure our data solution is refreshed every day
- Enhancing the daily BAU process, making it easier to monitor and less likely to fail
- Hands-on development on Data Lake builds, changes, and defect fixes
- Building new data pipelines using existing frameworks and patterns (see the sketch after this list)
- Working with the team within an Agile framework, using tooling that controls our development and CI/CD release processes
- Contributing to the new Data Lake technology across the organisation to address a broad set of use cases in data science and data warehousing
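To give a flavour of this kind of pipeline work, here is a minimal PySpark sketch of a daily table refresh. It is purely illustrative: the bucket, paths, and column names are hypothetical placeholders, not taken from our codebase.

```python
# Minimal illustrative sketch of a daily refresh step (assumes a Spark
# environment; all paths and column names below are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_refresh_example").getOrCreate()

# Read the latest raw extract from a hypothetical landing area.
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# A simple, repeatable transformation pattern: dedupe and stamp the load date.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("load_date", F.current_date())
)

# Refresh the curated table in the lake (hypothetical location).
cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
```

In practice, steps like these would sit inside the team's existing frameworks and patterns, with scheduling, monitoring, and CI/CD around them.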
Skills and Experience
Essential
- Experience with data solution BAU processes (ETL, table refresh etc.)
- Experience with integration of data from multiple data sources
- Experience in Big Data integration technologies such as Spark, Scala, and Kafka
- Experience in a programming language such as Python or Scala
- Experience using AWS, DBT, and Snowflake
- Analytical and problem-solving skills applied to data solutions
- Experience of CI/CD
- Good grasp of multi-threading and concurrency concepts (see the sketch after this list)
- Familiarity with the fundamentals of Linux shell scripting
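As a flavour of the multi-threading and concurrency concepts mentioned above, here is a small illustrative Python sketch that pulls data from several sources concurrently; the feed URLs are hypothetical.

```python
# Illustrative only: concurrent I/O against multiple hypothetical sources.
from concurrent.futures import ThreadPoolExecutor
import urllib.request

SOURCES = [
    "https://example.com/feeds/a.json",
    "https://example.com/feeds/b.json",
]

def fetch(url: str) -> bytes:
    # Threads suit I/O-bound work like this, since the GIL is released
    # while waiting on the network.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()

# Fetch all sources concurrently with a small worker pool.
with ThreadPoolExecutor(max_workers=4) as pool:
    payloads = list(pool.map(fetch, SOURCES))

print([len(p) for p in payloads])
```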
Desirable
- Experience of ETL technologies (e.g. Talend, Informatica, Ab Initio)
- AWS exposure (Athena, Glue, EMR, Step Functions); see the sketch after this list
- Previous experience owning data solution BAU monitoring and enhancement
- Exposure to building applications for a cloud environment
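For a sense of the AWS exposure listed above, here is a hedged boto3 sketch that kicks off an Athena query; the region, database, and bucket names are hypothetical placeholders.

```python
# Hedged sketch: starting an Athena query with boto3 (region, database,
# and output bucket below are hypothetical placeholders).
import boto3

athena = boto3.client("athena", region_name="eu-west-2")

response = athena.start_query_execution(
    QueryString="SELECT count(*) FROM orders",
    QueryExecutionContext={"Database": "example_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)

# The returned id can be polled with get_query_execution until the run finishes.
print(response["QueryExecutionId"])
```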
We work with Textio to make our job design and hiring inclusive.
Permanent
Top Skills: AWS, DBT, ETL, Kafka, Linux, Nexus Platform, Python, Scala, Snowflake, Spark