
BI Cloud Integration & Release Specialist

Posted 14 hours 35 minutes ago by Allwyn UK

Permanent
Full Time
I.T. & Communications Jobs
Hertfordshire, Watford, United Kingdom, WD171
Job Description
Join our journey to create a new experience for The National Lottery and help us to power change for the greater good.

We are Allwyn UK, part of the Allwyn Entertainment Group - a multi-national lottery operator with a market-leading presence across Europe, including the Czech Republic, Austria, Greece, Cyprus and Italy.

While the main contribution of The National Lottery to society is through the funds to good causes, at Allwyn we put our purpose and values at the heart of everything we do. Join us as we embark on a once-in-a-lifetime, large-scale transformation journey by creating a National Lottery that delivers more money to good causes.

We'll talk a bit more about us further down the page, but for now - let's talk about the role and who we're looking for.

A bit about the role

The BI Cloud Integration & Release Specialist will report to the BI Technical Architect and is responsible for code promotions, deployment pipeline management, and data provisioning across the organization's Business Intelligence (BI) function. This role focuses on both ETL code (e.g., ingestion, transformation, data processing) and Power BI code (e.g., dashboards, data models) to ensure seamless, reliable, and well-managed deployments.

Given the organization's Big Data landscape, the Specialist must be proficient in Redshift SQL, CI/CD tools (e.g., Jenkins, GitHub Actions), Terraform for infrastructure automation, containerization (e.g., Docker, Kubernetes), and Python for automation, logging, and observability. This role also covers managing non-production environment freshness, orchestrating Jira-to-code integrations, optimizing ETL/Power BI code, and creating Jira-to-data provisioning pipeline integrations, all while operating in an AWS environment. Strong problem-solving and troubleshooting capabilities, as well as cost monitoring and optimization, are critical to success.

What you'll be doing

Cloud Integration & Architecture
  • Design, implement, and maintain AWS-based infrastructure for BI workloads, leveraging services like Redshift, S3, EMR, etc., in alignment with the BI Technical Architect's guidelines.
  • Utilize Terraform (or similar IaC tools) to manage environment provisioning, ensuring repeatable and auditable infrastructure setups.
  • Collaborate with the Platform Engineering team to provision environments and coordinate on overall system design, ensuring consistency and scalability across BI and broader platform systems.
  • Apply containerization best practices (e.g., Docker, Kubernetes) where appropriate to streamline deployments and improve portability.
Code Release & Deployment
  • Own the code promotion process for both ETL code and Power BI code, ensuring streamlined, error-free deployments.
  • Develop and maintain CI/CD pipelines (e.g., Jenkins, GitHub Actions) that handle builds, automated testing, versioning, and releases for various BI components.
  • Enforce branching strategies, code review processes, and GitHub repository standards to maintain high-quality code across ETL and Power BI artifacts.
  • Integrate Jira with code repositories to automate ticket tracking, deployment readiness, and release notes generation.
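The Jira-to-code integration described above typically hinges on ticket keys embedded in commit messages. A minimal Python sketch of release-notes generation under that assumption (the BI-xxx project key and commit messages are illustrative, not taken from this posting):

```python
import re

# Jira ticket keys follow the PROJECT-123 pattern; "BI" as the
# project key here is purely illustrative.
TICKET_RE = re.compile(r"\b[A-Z][A-Z0-9]+-\d+\b")

def extract_tickets(commit_messages):
    """Collect unique Jira ticket keys from commit messages,
    preserving first-seen order for release-notes generation."""
    seen = []
    for msg in commit_messages:
        for key in TICKET_RE.findall(msg):
            if key not in seen:
                seen.append(key)
    return seen

def release_notes(commit_messages):
    """Render a simple release-notes block, one line per ticket."""
    lines = ["Release notes:"]
    lines += [f"- {key}" for key in extract_tickets(commit_messages)]
    return "\n".join(lines)

commits = [
    "BI-101 add Redshift ingestion step",
    "BI-102 fix Power BI refresh schedule (relates to BI-101)",
    "chore: bump dependencies",
]
print(release_notes(commits))
```

In practice the extracted keys would be pushed back to Jira via its REST API to mark tickets as deployed; the sketch stops at the parsing step, which is the part common to most CI setups.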
Environment Management & Freshness
  • Oversee non-production environment freshness (e.g., Dev, QA, UAT) to ensure data and configurations reflect the appropriate snapshot for testing or stakeholder demos.
  • Coordinate data refresh schedules and environment synchronization, minimizing drift between different stages of the development lifecycle.
  • Implement Jira-to-data provisioning pipeline integrations, triggering automated environment refreshes or data provisioning steps based on ticket statuses.
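Environment-freshness management of this kind usually reduces to a per-environment staleness check. A minimal sketch, assuming age thresholds per environment (the thresholds below are illustrative, not Allwyn policy):

```python
from datetime import datetime, timedelta, timezone

# Maximum acceptable snapshot age per environment; values are
# illustrative placeholders.
MAX_AGE = {
    "dev": timedelta(days=7),
    "qa": timedelta(days=3),
    "uat": timedelta(days=1),
}

def needs_refresh(env, last_snapshot, now=None):
    """Return True when an environment's data snapshot is older
    than the freshness threshold configured for that environment."""
    now = now or datetime.now(timezone.utc)
    return now - last_snapshot > MAX_AGE[env]

now = datetime(2024, 1, 10, tzinfo=timezone.utc)
print(needs_refresh("qa", datetime(2024, 1, 5, tzinfo=timezone.utc), now))   # True: 5 days > 3-day limit
print(needs_refresh("dev", datetime(2024, 1, 8, tzinfo=timezone.utc), now))  # False: 2 days within 7-day limit
```

A check like this is what a Jira-triggered refresh pipeline would run before deciding whether to kick off a data provisioning job.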
Data Provisioning & Pipeline Management
  • Set up and oversee data provisioning workflows to ensure relevant data is available for ETL jobs, Power BI reports, and other analytics solutions.
  • Automate data pipeline deployments, integrating source control, peer reviews, and versioning.
  • Handle Big Data orchestration (e.g., EMR, Spark) where required, ensuring large-scale data sets are processed efficiently.
Code Optimization (ETL & Power BI)
  • Collaborate with Data Engineering and Reporting teams to optimize ETL code, improving performance and resource utilization in AWS (Redshift, EMR, Lambda).
  • Implement Power BI code optimizations (e.g., DAX queries, data modeling, refresh scheduling) to deliver fast, responsive dashboards.
  • Continuously identify performance bottlenecks in transformations or report rendering, offering technical guidance and scripting solutions (Python, SQL, etc.).
Observability, Logging & Automation
  • Implement logging frameworks (CloudWatch, ELK stack, etc.) to capture system events, deployments, and error logs across the BI environment.
  • Set up system observability tools (Prometheus, Grafana, Datadog) for real-time metrics, alerts, and dashboards, enabling proactive monitoring of BI services.
  • Drive automation-first approaches, reducing manual interventions and improving reliability in code promotions, environment provisioning, and data workflows.
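Logging that feeds CloudWatch or an ELK pipeline works best when records are emitted as structured JSON rather than free text, so downstream systems can index fields. A minimal sketch using Python's standard logging module (the logger name and deployment id are hypothetical):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit log records as JSON so a CloudWatch or ELK ingestion
    pipeline can index fields instead of parsing text."""
    def format(self, record):
        payload = {
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        # Carry structured context attached via the `extra=` argument.
        if hasattr(record, "deployment_id"):
            payload["deployment_id"] = record.deployment_id
        return json.dumps(payload)

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("bi.release")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("ETL release promoted to UAT", extra={"deployment_id": "rel-042"})
```

From here, shipping to CloudWatch Logs or an ELK stack is a matter of pointing the handler at the right sink; the formatting contract is the part that keeps dashboards and alerts stable.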
Collaboration & Stakeholder Engagement
  • Work closely with the BI Technical Architect, Platform Engineering, Data Engineering, and Reporting teams to prioritize deployment schedules and address technical challenges.
  • Collaborate with cloud and product support teams when troubleshooting complex issues, escalations, or new feature requests in AWS services or BI applications.
  • Provide technical guidance and training to analysts or developers interfacing with the code promotion process or infrastructure.
  • Serve as a subject matter expert on code release strategies, AWS integration, Jira-to-code pipelines, Python-driven automation, and observability within the BI domain.
Governance, Security & Compliance
  • Design role-based permissions, encryption standards, and secure connectivity across AWS services (e.g., Redshift, S3), ensuring they adhere to broader security policies.
  • Collaborate with Data Governance to ensure compliance with data lineage, audit trails, and relevant regulations (GDPR, HIPAA, etc.).
  • Maintain release documentation and runbooks, detailing environment configurations and operational procedures for ETL and Power BI deployments.
Performance Monitoring & Optimization
  • Continuously monitor resource utilization across AWS (e.g., Redshift cluster performance, S3 usage, EMR job efficiency) to optimize costs and maintain SLAs.
  • Identify and resolve bottlenecks in build and deployment pipelines, especially for large-scale data operations.
  • Integrate observability practices (logs, metrics, traces) to proactively address issues and ensure high availability.
Cost Monitoring & Optimization
  • Track AWS usage and costs (compute, storage, data transfer), identifying opportunities for cost optimization.
  • Recommend and implement cost-saving measures (e.g., rightsizing Redshift clusters, leveraging spot instances, better data lifecycle management).
  • Regularly review budget vs. actual spending, reporting findings to the BI Technical Architect and other stakeholders.
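A rightsizing review of the kind described above often starts with flagging under-utilized clusters from sampled metrics. A minimal sketch (the cluster names, utilization figures, and 40% threshold are illustrative; in practice the inputs would come from CloudWatch metrics or Cost Explorer):

```python
def rightsizing_candidates(cpu_by_cluster, threshold=40.0):
    """Return clusters whose average CPU utilization over the
    sampled window falls below the threshold, lowest first."""
    averages = {
        cluster: sum(samples) / len(samples)
        for cluster, samples in cpu_by_cluster.items()
    }
    return sorted(
        (c for c, avg in averages.items() if avg < threshold),
        key=lambda c: averages[c],
    )

# Hypothetical sampled CPU utilization percentages per cluster.
usage = {
    "redshift-prod": [72.0, 80.5, 76.1],
    "redshift-qa": [12.3, 18.4, 9.9],
    "redshift-dev": [35.0, 28.2, 31.4],
}
print(rightsizing_candidates(usage))  # qa and dev fall below 40%
```

The output feeds the budget-vs-actual review: the flagged clusters are the ones worth discussing with the BI Technical Architect for downsizing or pausing.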
Continuous Improvement & Innovation
  • Evaluate new AWS services, CI/CD frameworks, and data orchestration tools (e.g., Airflow) to improve efficiency, reliability, and developer experience.
  • Propose process improvements (automated testing, containerization, advanced scheduling) to elevate the BI release lifecycle for both ETL and Power BI artifacts.
  • Encourage a culture of innovation, driving pilot projects that leverage emerging cloud technologies, big data processing techniques, or advanced deployment practices.
What experience we're looking for

Technical Expertise
  • AWS Cloud: Strong hands-on experience with AWS services (Redshift, S3, Lambda, EMR, CloudFormation/Terraform) and best practices for big data.
  • CI/CD & Release Management: Proficiency in pipeline tools (Jenkins, GitHub Actions, or equivalent) for ETL and Power BI code deployments.
  • Programming & Automation: Advanced Python scripting skills for automating tasks, building release utilities, and integrating logging/observability frameworks.
  • SQL & Big Data: In-depth knowledge of Redshift SQL, large-scale data processing (Spark, EMR), and data warehousing concepts.
  • Containerization: Familiarity with Docker and/or Kubernetes for containerizing BI components or data workflows.
  • AWS Certified Solutions Architect(Associate or Professional) andAWS Certified DevOps Engineer(Professional).
Environment & Tool Integrations
  • Experience managing non-production environment freshness, including data refreshes and configuration alignment.
  • Familiarity with Jira integrations for code promotions, environment updates, and data provisioning triggers.
  • Ability to optimize ETL code (Spark, Python scripts, Glue) and Power BI code (DAX, data modeling, refresh scheduling).
Problem-Solving & Troubleshooting
  • Demonstrated problem-solving abilities, diagnosing complex issues in cloud deployments, data flows, or containerized environments.
  • Skilled at troubleshooting cross-functional dependencies, engaging with cloud/product support teams as necessary.
  • Adept at producing clear, actionable solutions under pressure or tight deadlines.
Observability & Logging
  • Experience setting up system observability, logs, metrics, and tracing for cloud-based BI services.
  • Familiarity with tools such as CloudWatch, Datadog, Prometheus, Grafana, or similar.
Project & Delivery Management