Responsibilities:
- Assist in the development of a data warehouse solution, including:
  - Implementing ETL (Extract, Transform, Load) pipelines to process data from multiple sources.
  - Developing and optimizing the data warehouse schema and data model.
  - Maintaining version control and ensuring best practices in data management.
  - Testing data pipelines to ensure functionality and reliability.
  - Monitoring data quality and troubleshooting inconsistencies.
- Understand business logic and data definitions from source systems, ensuring proper documentation.
- Work closely with data analysts and other stakeholders to define and optimize data requirements.
- Collaborate with the technology team, system owners, and other internal stakeholders to enhance data solutions.
- Apply DevOps practices, including Git-based version control, CI/CD pipelines, and cloud computing tools.
Requirements:
- Education:
  - MQF Level 6 qualification in Computer Science, Data Science, Statistics, Analytics, or a related field, plus 4 years of relevant experience, OR
  - MQF Level 5 qualification in Computer Science, Data Science, Statistics, Analytics, or a related field, plus 5 years of relevant experience.
- Preference will be given to candidates with:
  - Proven experience in data warehousing principles, including data modeling and ETL.
  - Coding skills (preferably C# or similar) and data querying skills (preferably T-SQL).
  - Familiarity with DevOps practices, including Git and CI/CD.
  - Experience with cloud computing platforms (preferably Azure).
  - Knowledge of modern data orchestration tools, such as Azure Data Factory and Azure Synapse.
  - Strong analytical, problem-solving, and collaboration skills.