Your new role

As part of the Data Analytics and AI team, you'll help build and evolve a Snowflake-based data platform. You will design and maintain robust data pipelines and models that transform data from operational systems into trusted, analytics-ready assets. Your work will enable high-quality reporting, insight generation, and future AI/ML initiatives across the business.

What you'll do

- Build and maintain pipelines: Develop ELT/ETL pipelines using dbt, Python, and orchestration tools.
- Work with Snowflake: Design and optimise data models, manage performance, and adopt advanced Snowflake features.
- Ensure data quality: Apply governance, security, and compliance standards.
- Collaborate: Support analysts and business teams with clean, structured data for reporting and analytics.
- Monitor and improve: Implement monitoring, alerting, and cost controls for Snowflake and pipelines.
- Contribute to best practices: Help define standards for naming, schema design, and development processes.

What you'll need to succeed

- Strong hands-on experience with Snowflake, SQL, Python, and dbt
- Solid understanding of data modelling, data governance, and cloud platforms (AWS preferred)
- Experience supporting analytics using BI tools such as Power BI
- Familiarity with data ingestion tools (e.g. Airbyte, Fivetran)
- Working knowledge of Git, CI/CD, and modern data engineering practices

What you need to do now

If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or ..... full job details .....