Senior Algorithm Engineer (Python/Spark - Distributed Processing)
Posted by Xcede
Contract
United Kingdom
Job Description
Senior Algorithm Engineer (Python/Spark - Distributed Data Processing)
Location: UK (Outside IR35) / Belgium / Netherlands / Germany (B2B)
Working model: Remote
Start: ASAP
We're hiring a Senior Algorithm Engineer to join a data-intensive SaaS platform operating in a complex, regulated industry. This is a hands-on, senior individual-contributor role focused on building and optimising distributed data pipelines that power pricing, forecasting and billing calculations at scale. Please note: this is not an ML, Data Science or GenAI role.
What you'll be doing
- Design, build and deploy algorithms/data models supporting pricing, forecasting and optimisation use cases in production
- Develop and optimise distributed Spark/PySpark batch pipelines for large-scale data processing
- Write production-grade Python workflows implementing complex, explainable business logic
- Work with Databricks for job execution, orchestration and optimisation
- Improve pipeline performance, reliability and cost efficiency across high-volume workloads
- Collaborate with engineers and domain specialists to translate requirements into scalable solutions
- Provide senior-level ownership through technical leadership, mentoring and best-practice guidance
What you'll bring
- Proven experience delivering production algorithms/data models (forecasting, pricing, optimisation or similar)
- Strong Python proficiency and modern data stack exposure (SQL, Pandas/NumPy + PySpark; Dask/Polars/DuckDB a bonus)
- Ability to build, schedule and optimise Spark/PySpark pipelines in Databricks (Jobs/Workflows, performance tuning, production delivery)
- Hands-on experience with distributed systems and scalable data processing (Spark essential)
- Experience working with large-scale/high-frequency datasets (IoT/telemetry, smart meter, weather, time-series)
- Clear communicator able to influence design decisions, align stakeholders and operate autonomously
- Energy/utilities domain exposure
- Cloud ownership experience (AWS preferred, Azure also relevant)
- Experience defining microservices/modular components supporting data products
