Location Address: Hybrid - 44 King Street W, 24th Floor, Toronto; 2-3 days in office per week.
Contract Duration: ASAP to April 30th, plus an additional 6 months if extended. Good possibility of extension.
Schedule Hours: 9am-5pm Monday-Friday; standard 37.5 hrs/week
Story Behind the Need
Business group: Global Analytics Engineering - The business line provides business intelligence support to its supported areas, using both repeatable and ad hoc reporting to deliver reports (charts, graphs, tables, etc.) that enable informed business decisions.
The project has complex requirements to build insights on Commercial customers across multiple geographies in International Banking. The role will establish system information requirements through information analysis in the development of enterprise-wide or large-scale information systems.
We are seeking an experienced Data Engineer with deep expertise in Google Cloud Platform (GCP) to join our growing team. In this role, you will be responsible for designing, building, and maintaining scalable data architectures that support our data-driven initiatives. You will be part of a team of engineers, collaborating with cross-functional partners including data architects, solution architects, business systems analysts, and other data engineers to ensure that our data infrastructure is robust, secure, and optimized for performance.
Candidate Value Proposition: Build data platforms that power decisions across a global bank. In this role, you'll be at the center of Scotia's data modernization journey, designing, building, and optimizing cloud-native data pipelines that support analytics, regulatory reporting, digital products, and enterprise-wide insights. Your work directly shapes how the bank ingests and processes data.
Daily activities: - Develop and maintain robust data pipelines for ingesting and distributing large datasets for processing and consumption.
- Utilize SaaS services and tools to build, configure, and automate data workflows, streamlining the data engineering process.
- Collaborate with stakeholders and product managers to analyze data requirements and build ingestion patterns to integrate new data sources into the data platform.
- Build and monitor application services and pipeline performance.
- Conduct data quality checks.
Must-have requirements: - 3-4 years of experience creating ELT / ETL data pipelines from scratch, working with structured, semi-structured, and unstructured data
- 2-4 years of experience with Cloud: GCP
- 2-4 years of experience with the Airflow scheduling tool
- 3+ years of experience working on continuous integration and continuous deployment (CI/CD) pipelines, working with source control systems such as GitHub and Bitbucket, and with infrastructure-as-code tools such as Terraform
- 2-4 years of experience in data modeling, manipulating large datasets, handling raw data, and applying various data cleaning techniques using SQL and Python
- Bilingual (English/Spanish) is mandatory
Nice-To-Have Skills: - Power BI knowledge.
- Experience collaborating with DevOps and Scrum teams.
- FI/banking experience
Education: Bachelor's Degree in Computer Science, IT, or a related field. Advanced degrees or relevant certifications (e.g., Cloud Engineer) are a plus.
Best vs. Average Candidate: Very recent experience on GCP in a similar role; experience in banking; a strong candidate with all the must-haves who is also bilingual in Spanish would be an asset.
We are looking for a Data Engineer who can not only build and maintain modern data pipelines, but also explain complex technical concepts in a simple, clear way to non-technical audiences. This person will act as a bridge between engineering teams and business stakeholders, translating data processes, system behaviors, and technical requirements into language that supports decision-making, planning, and collaboration.
Candidate Review & Selection: 1-2 rounds (2nd optional); video or in person, to be determined closer to the interview.
1 interview, 1 hour: HM and a technical/data engineer; situational technical questions and a walk-through of experience.