Data Engineer with BI Capability (12-Month Maximum Term Contract)

Posted 6 days 14 hours ago by Claim Central Group

Fixed Term (12 months)
Not Specified
I.T. & Communications Jobs
Edinburgh, United Kingdom
Job Description

About the role:

We're looking for a hands-on Data Engineer to join us on a 12-month max-term contract and play a key role in supporting and scaling our growing data infrastructure.

In this role, you'll be responsible for building and maintaining scalable ETL/ELT pipelines using Databricks and modern cloud tools. You'll also provide temporary cover for our business intelligence needs, developing and maintaining reports and dashboards in ThoughtSpot (or a similar BI platform).

You'll collaborate closely with our Lead Data Engineer, who will provide architectural guidance and help drive the strategic direction of our data transformation initiatives.

This role is a great fit for a data engineer who enjoys working across the full data stack, from raw data ingestion and transformation through to the BI layer, with a strong focus on data quality, reliability, and usability.

We offer a hybrid work arrangement: 3 days in the office and 2 days remote each week, giving you the flexibility to do your best work.

Key Responsibilities:

Data Engineering

  • Build, maintain, and optimize robust ETL/ELT pipelines using Databricks.
  • Contribute to the design and implementation of data lake and data warehouse architecture.
  • Translate business requirements into reliable and scalable data solutions.
  • Collaborate with the Lead Data Engineer on data modeling, pipeline design, and cloud infrastructure best practices.
  • Implement monitoring, alerting, and logging for data pipelines to ensure data integrity and reliability.
  • Participate in sprint planning, technical documentation, code reviews, and team collaboration rituals.

BI & Reporting Support

  • Maintain and support dashboards and reports in ThoughtSpot.
  • Assist stakeholders with ad hoc data queries and visualization needs.
  • Ensure key business metrics remain available and accurate during the BI analyst's leave period.
  • Translate complex datasets into clear insights that support decision-making.

Key Requirements:

Essential

  • Strong experience building and managing ETL/ELT pipelines in Databricks or similar platforms.
  • Proficiency in Python and SQL for data processing, transformation, and analysis.
  • Deep knowledge of data modeling and warehousing concepts.
  • Experience with BI tools, preferably ThoughtSpot (or Power BI, Tableau, Looker).
  • Solid version control and collaboration practices (e.g., Git).
  • Ability to collaborate closely with both technical and non-technical team members.
  • Effective communication and problem-solving skills.

Desirable

  • Exposure to DevOps practices such as CI/CD (e.g., Azure DevOps) and infrastructure as code (e.g., Terraform).
  • Experience working in a remote or distributed team environment.
  • Experience working with cloud environments (AWS, Azure, or GCP).
  • Familiarity with AWS services like S3, Lambda, EC2, SQS, and SNS.

Personal Attributes

  • Curious, proactive, and eager to learn.
  • Comfortable balancing engineering depth with occasional BI support.
  • Strong ownership mindset and attention to detail.
  • Organized and efficient; able to manage priorities across domains.

If you would like to be part of the Wilbur team, please submit your application. We look forward to hearing from you!