
Corva

Data Ops Senior Backend Developer

Reposted 23 Days Ago
Remote
Hiring Remotely in Spain
Senior level

About Corva

Corva has built a first-of-its-kind energy app store on a bedrock of best-in-class technologies, data pipelines, and a secure and scalable architecture. Our energy solutions solve today's toughest well delivery challenges, from well design through drillout. The ever-evolving platform not only future-proofs digital operations but also serves as a toolkit to accelerate sustainability and energy transition goals. Our platform is built for speed and reliability and delivers unmatched features and capabilities.


Corva powers worldwide innovation by driving efficiency, productivity, and profitability with our energy solutions.


Mission


Corva’s mission is to accelerate the future of energy.


Values


Boldness: Corvanauts have the confidence and courage to question the status quo for the products we make and the relationships we cultivate.

Own End-to-End: We take ownership of what we start and see it through to completion, building trust and dependability.

Transparency: It's crucial to be open, honest, and consistent in sharing updates and data with customers and colleagues. We value the free flow of information and data to make better decisions.

Bias to Action: Corvanauts don't sit still - our default mode is taking action! We make progress through high-quality iterations. Failure is built into the process, and success is defined by the number of shots on goal.


We are looking for a Senior Backend Developer to join our Data Ops team. You will design, build, and maintain the backend systems that power our data pipelines, automation workflows, and internal tooling. This role blends deep Python expertise with cloud-native architecture, a pragmatic approach to automation, and a growing focus on integrating AI/ML capabilities into our platform. You will work closely with data engineers, front-end developers, and product stakeholders to ship reliable, scalable software.
Technology Stack
Python 3.11–3.14, AWS (Lambda, ECS, S3, Step Functions, CloudWatch), Kubernetes, MongoDB, Redis, Apache Kafka, Pytest, GitHub Actions / Jenkins, Docker, SciPy, NumPy, pandas, scikit-learn, LLM APIs (OpenAI / Anthropic)
Responsibilities & Duties
— Architect and deliver efficient, well-documented, and highly readable backend services that set the quality bar for the team.
— Design and maintain automation pipelines (CI/CD, scheduled jobs, event-driven workflows) that reduce manual effort and improve reliability.
— Build lightweight internal dashboards, admin panels, or API-driven front-end components using frameworks such as React, Vue, or Streamlit to surface data and system health to stakeholders.
— Integrate AI/ML models and LLM-based features into backend services, including prompt engineering, embeddings pipelines, and retrieval-augmented generation (RAG) patterns.
— Dive into new technologies and product disciplines, driving innovation and staying current with the evolving AI and data engineering landscape.
— Define development plans based on project requirements and ensure timely delivery while remaining flexible to changing priorities.
— Oversee the stability of your services, monitoring system health, uptime, and performance post-release through observability tooling (CloudWatch, Datadog, or similar).
— Lead code reviews with peers, fostering a culture of continuous improvement, knowledge sharing, and engineering best practices.
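As a purely hypothetical illustration of the retrieval-augmented generation (RAG) pattern named above (not Corva's actual implementation), the retrieval step can be sketched in plain Python, with a toy bag-of-words embedding standing in for a real embedding API such as OpenAI's:

```python
import math

def embed(text: str, vocab: list[str]) -> list[float]:
    """Toy embedding: normalized bag-of-words counts over a shared vocabulary.
    In production this would be a call to an embedding model API."""
    words = text.lower().split()
    vec = [float(words.count(w)) for w in vocab]
    norm = math.sqrt(sum(v * v for v in vec))
    return [v / norm for v in vec] if norm else vec

def retrieve(query: str, documents: list[str], top_k: int = 1) -> list[str]:
    """Rank documents by cosine similarity to the query and return the best matches."""
    vocab = sorted({w for d in documents for w in d.lower().split()})
    q = embed(query, vocab)
    def score(doc: str) -> float:
        return sum(a * b for a, b in zip(q, embed(doc, vocab)))
    return sorted(documents, key=score, reverse=True)[:top_k]

# Hypothetical internal runbooks serving as the retrieval corpus.
docs = [
    "Kafka consumer lag alerting runbook",
    "MongoDB index design guidelines",
    "Redis cache eviction policy notes",
]
context = retrieve("mongodb index design", docs)
# In a full RAG pipeline, `context` would be prepended to the LLM prompt
# so the model answers grounded in the retrieved documents.
```

The same structure holds with a real vector store and embedding model swapped in; the core loop of embed, rank by similarity, and inject into the prompt stays the same.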
Qualifications & Skills
Required
— 5+ years of hands-on Python development on large-scale, production systems.
— Strong experience with NoSQL databases (MongoDB preferred) and ability to design performant data models.
— Practical knowledge of AWS services and cloud-native design patterns.
— Experience building or maintaining CI/CD pipelines, automated testing suites, and infrastructure-as-code.
— Comfortable presenting ideas and technical details clearly to cross-functional teams.
Nice to Have
— Familiarity with front-end frameworks (React, Vue, or Streamlit) for building internal tools or dashboards.
— Hands-on experience with AI/ML workflows: training pipelines, model serving, or integrating LLM APIs.
— Knowledge of Kubernetes for container orchestration and scaling.
— Experience with event-driven architectures using Kafka or similar streaming platforms.
— Contributions to open-source projects or a visible engineering blog / portfolio.
What We Offer

— Working on a great tech stack (app platform) with truly big data (processing 1 TB every day)
— Product company with a long-term vision
— Project exposure and ownership that impacts our users, product, and business
— Medical insurance
— Sports benefit


