Position Description :
We are seeking an experienced Data Engineer with 7+ years of professional expertise in designing, developing, and optimizing large-scale data pipelines and data processing systems. The ideal candidate is highly skilled in Python, PySpark, ETL development, and modern cloud data engineering tools, particularly Databricks and Azure Data Factory (ADF). Experience in data ingestion, transformation, and extraction workflows is essential. Knowledge of Wealth Management domain data is a strong plus.
Your future duties and responsibilities :
- Design, build, and maintain scalable and robust data pipelines to support data ingestion, transformation, and analytics needs.
- Develop ETL/ELT processes using Python, PySpark, Databricks, and cloud-native tools (a minimal illustrative sketch follows this list).
- Build and optimize data ingestion frameworks from multiple structured and unstructured data sources.
- Implement and maintain data processing workflows in Azure Data Factory.
- Work with Databricks notebooks and clusters to process large datasets efficiently.
- Ensure data quality, consistency, and governance across data pipelines.
- Collaborate with cross-functional teams including Data Analysts, BI Developers, and Data Scientists.
- Troubleshoot and optimize existing data workflows for performance and cost efficiency.
- Participate in solution design discussions, providing data engineering expertise.
- Document pipeline architecture, data flows, and technical processes.
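For illustration only, here is a minimal sketch of the kind of PySpark ingestion and transformation work described in the ETL/ELT item above. The source path, column names, and target table are hypothetical placeholders rather than details from this posting, and the Delta write assumes a Databricks or otherwise Delta-enabled environment.

# Illustrative sketch only; paths, columns, and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Ingest raw files from a landing zone (path is a placeholder).
raw = spark.read.option("header", True).csv("/mnt/landing/transactions/")

# Basic cleansing and transformation: de-duplication, type casting,
# and a derived load-date column for downstream partitioning.
cleaned = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_date", F.current_date())
       .filter(F.col("amount").isNotNull())
)

# Write to a Delta table (assumes a Delta Lake-enabled cluster).
(cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("load_date")
        .saveAsTable("analytics.transactions_clean"))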
Required qualifications to be successful in this role :
- 7+ years of professional experience as a Data Engineer or in a similar role.
- Strong proficiency in Python for data transformation and automation.
- Hands-on experience with PySpark for distributed data processing.
- Proven expertise in developing and managing ETL pipelines.
- Practical experience using Databricks for big data processing and analytics.
- Hands-on experience with Azure Data Factory for pipeline orchestration and workflow management (see the sketch following this list).
- Solid understanding of data ingestion, extraction, and transformation techniques.
- Experience working with Azure cloud ecosystems and data services.
- Strong problem-solving skills and the ability to optimize complex data workflows.
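For illustration only, here is a minimal sketch of triggering and monitoring an Azure Data Factory pipeline run from Python, relating to the orchestration item above. It assumes the azure-identity and azure-mgmt-datafactory packages are installed; the subscription, resource group, factory, and pipeline names are hypothetical placeholders.

# Illustrative sketch only; all identifiers below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Kick off a pipeline run, passing runtime parameters to the pipeline.
run = adf_client.pipelines.create_run(
    resource_group, factory_name, "pl_ingest_transactions",
    parameters={"load_date": "2024-01-01"},
)

# Poll the run status (e.g. Queued, InProgress, Succeeded, Failed).
status = adf_client.pipeline_runs.get(resource_group, factory_name, run.run_id).status
print(f"Pipeline run {run.run_id}: {status}")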
Preferred / Good-to-Have Skills :
- Wealth Management domain experience, including familiarity with financial instruments, transactions, or client reporting datasets.
- Experience with Delta Lake, Lakehouse architectures, or Azure Synapse.
- Knowledge of SQL optimization and performance tuning.
- Understanding of data governance, security, and compliance best practices.
CGI is providing a reasonable estimate of the pay range for this role. The determination of this range includes factors such as skill set level, geographic market, experience and training, and licenses and certifications. Compensation decisions depend on the facts and circumstances of each case. A reasonable estimate of the current range is $60,.00 – $,.00. This role is an existing vacancy.
#LI-OA1
Use of the term ‘engineering’ in this job posting refers to the technical sense related to Information Technology (IT) and does not imply that the individual practices engineering or possesses the requisite license as prescribed by the applicable provincial or territorial engineering regulator. We are seeking individuals with expertise in IT engineering-related functions, but licensure from an engineering regulator is not a prerequisite for this position. Engineering is a regulated profession in Canada which is restricted in terms of use of titles and designation.
Skills :
English, ETL, Python