Job Title: Data Engineer - Senior
Duration: 11 months (Total hours: 1,696.50) (Maximum Extension Term: 24 months)
Location: Edmonton, AB (7000 - 113 ST, Edmonton, Alberta CAN T6H 5T6)
Work Type: Remote (Alberta only); may be required to attend meetings or work sessions in Edmonton on reasonable notice from the Province.
Hours/Day: 7.25
Hours/Week: 36.25
Notes on Location:
Data Engineer(s) will primarily work remotely; however, they may be required to attend meetings or work sessions in Edmonton on reasonable notice from the Province. At the time of providing such notice, the Province will advise of the expected duration of any such meetings or work sessions. Travel time and any associated expenses to and from Edmonton will be at no cost to the Province.
The Province reserves the right to alter this work arrangement on reasonable notice to the Data Engineer(s). The Supplier and the Data Engineer(s) will be consulted about the alteration in work arrangement; however, the Province retains ultimate discretion as to the appropriate work arrangement.
Some travel within Alberta may be required to conduct field research and user interviews. Where possible, the Province will make travel arrangements for field research and user interview purposes at no cost to the Province.
Work must be done within Canada.
Description:
Data Engineering:
• Design, build, and maintain data pipelines on-premises and in the cloud (Azure, GCP, AWS) to ingest, transform, and store large datasets. Ensure pipelines are reliable and support multiple business use cases.
• Create and optimize dimensional models (star/snowflake) to improve query performance and reporting. Ensure models are consistent, scalable, and easy for analysts to use.
• Integrate data from SQL, NoSQL, APIs, and files while maintaining accuracy and completeness. Apply validation checks and monitoring to ensure high-quality data.
• Improve ETL/ELT processes for efficiency and scalability. Redesign workflows to remove bottlenecks and handle large, disconnected datasets.
• Build and maintain end-to-end ETL/ELT pipelines with SSIS and Azure Data Factory. Implement error handling, logging, and scheduling for dependable operations.
• Automate deployment, testing, and monitoring of ETL workflows through CI/CD pipelines. Integrate releases into regular deployment cycles for faster, safer updates.
• Manage data lakes and warehouses with proper governance. Apply security best practices, including access controls and encryption.
• Partner with engineers, analysts, and stakeholders to translate requirements into solutions. Prepare curated data marts and fact/dimension tables to support self-service analytics.
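To illustrate the dimensional-modelling and validation duties listed above, the following is a minimal sketch of splitting a raw extract into a star-schema dimension and fact table. All table names, column names, and sample values are illustrative assumptions, not part of this posting.

```python
import pandas as pd

# Hypothetical raw sales extract; columns and values are assumptions.
raw = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer": ["Acme", "Beta", "Acme"],
    "region": ["Edmonton", "Calgary", "Edmonton"],
    "amount": [100.0, 250.0, 75.0],
})

# Dimension table: one row per distinct customer, with a surrogate key.
dim_customer = (
    raw[["customer", "region"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact table: measures plus a foreign key into the dimension.
fact_sales = raw.merge(dim_customer, on=["customer", "region"])[
    ["order_id", "customer_key", "amount"]
]

# Simple validation check: every fact row must resolve to a dimension key.
assert fact_sales["customer_key"].notna().all()
```

In practice the same fact/dimension split would be materialized in the warehouse by the ETL/ELT pipeline, with the validation step wired into the pipeline's error handling and logging.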
Data Analytics:
• Analyze datasets to identify trends, patterns, and anomalies. Use statistical methods, DAX, Python, and R to generate insights that inform business strategies.
• Develop interactive dashboards and reports in Power BI using DAX for calculated columns and measures. Track key performance metrics, share service dashboards, and present results effectively.
• Build predictive or descriptive models using statistical, Python, or R-based machine learning methods. Design and integrate data models to improve service delivery.
• Present findings to non-technical audiences in clear, actionable terms. Translate complex data into business-focused insights and recommendations.
• Deliver analytics solutions iteratively in an Agile environment. Mentor teams to enhance analytics fluency and support self-service capabilities.
• Provide data-driven evidence to guide corporate priorities. Ensure strategies and initiatives are backed by strong analysis, visualizations, and models.
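The anomaly-detection duty above can be sketched with a simple statistical rule: flag values that fall far from the mean of a series. The series and the two-standard-deviation threshold are illustrative assumptions; production work would use the statistical, Python, or R methods named in the posting.

```python
import statistics

# Illustrative weekly metric values; the series is an assumption.
values = [10.0, 11.0, 9.5, 10.5, 30.0, 10.2, 9.8]

mean = statistics.mean(values)
stdev = statistics.stdev(values)

# Flag points more than two standard deviations from the mean.
anomalies = [v for v in values if abs(v - mean) > 2 * stdev]
```

Here the outlier 30.0 is the only value flagged; the same logic, expressed in DAX or R, would drive an exception indicator on a Power BI dashboard.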
The Province and the Contractor shall determine changes to Services and Materials as required, through the Artifacts.
Criminal Records Checks:
Upon request by the Province, the Data Engineer(s) shall, at no cost to the Province, provide a current criminal record check. A Data Engineer may be rejected if, in the opinion of the Province, the criminal record is unacceptable.
Should a Data Engineer be assigned to a team working for Justice, the Data Engineer must, prior to performing Services, provide the Province with an “Enhanced Security Clearance”. The Supplier will be responsible for providing the GoA with a credit check for the awarded candidate. A Data Engineer who, in the opinion of the Province, has an unacceptable “Enhanced Security Clearance” or equivalent shall be rejected. The Province does not receive any information specific to the reason an enhanced clearance may be rejected; participating law enforcement agencies only identify whether an applicant’s clearance is not accepted.
Data Engineers should be aware that, over the course of the WO, they may be required to complete higher-level security clearances, such as the “Royal Canadian Mounted Police Top Secret Clearance.” Please ensure applicants are eligible to apply for such clearances if required by the ministry.
Acceptance by the Province of any Data Engineer requires written approval from the Province following acceptable security clearances.
QUALIFICATIONS:
Must Have
Education
Yes/No - Bachelor's degree in Computer Science, IT, or a related field of study.
Work Experience
Experience ensuring data quality, security, and governance.
Experience as a Data Engineer and/or Data Analyst.
Experience designing efficient dimensional models (star and snowflake schemas) for warehousing and analytics.
Experience developing and maintaining reports, dashboards, and visualizations using Power BI, DAX, Tableau, or Python libraries.
Experience manipulating and extracting data from diverse on-premises and cloud-based sources.
Experience performing migrations across on-premises, cloud, and cross-database environments.
Experience using Git, collaborative workflows, CI/CD pipelines, containerization (Docker/Kubernetes), and Infrastructure as Code (Terraform, ARM, CloudFormation) to deploy and migrate data solutions.
Experience with SSIS, Azure Data Factory (ADF), and using APIs for extracting and integrating data across multiple platforms and applications.
Nice to Have
Work Experience
Experience in application development, with knowledge of object-oriented and functional programming/scripting languages.
Experience in the Province's environment or an environment of equivalent size and complexity.
Experience with databases and data integration, including PostgreSQL, MongoDB, and Azure Cosmos DB, and data integration tools such as Synapse pipelines, Fabric Data Factory, Informatica, Talend, dbt, and Airbyte.
Exposure to AI/ML tools and workflows relevant to data engineering, such as integrating AI-driven analytics or automation within cloud platforms like Databricks and Azure.