We are looking for a Principal AI Engineer to drive the development of data engineering solutions on RBC’s Enterprise Data and AI Hybrid Multi-cloud Platforms that meet the strategic data objectives of the business. This is a unique opportunity to be an impactful data engineering leader on a fast-growing team. The successful candidate will be responsible for leading the design, development, and implementation of data solutions, as well as leading, mentoring, and growing a team of talented data engineers. This role requires strong data engineering and leadership skills, effective written and verbal communication, a strong work ethic, and a demonstrated ability to multi-task effectively as a member of a dynamic, fast-paced team. At RBC Borealis, you’ll be joining a team that works directly with leading researchers in machine learning, has access to rich and massive datasets, and offers the computational resources to support ongoing development in areas such as reinforcement learning, unsupervised learning, and computer vision. You can find out more about our research areas at rbcborealis.com.
Oversee end-to-end data integration, including sourcing, lineage, transformation, and storage to enable complex AI and advanced analytics, leveraging extensive technical expertise.
Collaborate with business architecture, system architecture, business SMEs, and data stewards.
Architect and implement agentic systems, including tool-using agents, workflow orchestrators, and multi-step reasoning pipelines that reliably execute business tasks.
Design and deliver Retrieval-Augmented Generation (RAG) solutions, including document ingestion, chunking, indexing, vector search, hybrid search, reranking, and grounding strategies over curated data products.
Build evaluation harnesses and quality gates, including offline test sets, golden datasets, regression suites, and metrics for factuality, safety, latency, cost, and business outcomes.
Implement observability for AI systems, including tracing across prompts and tool calls, telemetry, drift detection, and runbooks for production operations.
Lead the build of batch and real-time data pipelines, including inbound, outbound, and event-driven flows that power analytics and AI use cases.
Design governed data products with clear contracts, documentation, lineage, and SLAs, enabling consistent consumption across domains.
Establish high quality ingestion, transformation, and serving patterns using lakehouse and warehouse paradigms, plus streaming where appropriate.
Partner with data stewards and domain teams to define data standards, quality controls, and metadata that ensure trust and reusability.
Design and build backend services and APIs that expose data products, agent capabilities, and AI workflows as reliable, secure services.
Apply rigorous engineering practices, including code quality, automated testing, CI/CD, performance engineering, and secure-by-default design.
Build scalable runtime patterns for AI systems, including caching, rate limiting, concurrency control, idempotency, and graceful degradation.
Contribute to reference architectures, reusable libraries, and platform components that accelerate delivery across teams.
Bachelor’s degree in computer science or a related technical field involving coding (e.g., physics or mathematics), or equivalent technical experience.
10+ years of professional software engineering experience with strong Python and SQL; Spark and Databricks SQL are a plus.
Demonstrated experience designing and operating scalable data architectures, including schema design, dimensional modeling, and data lifecycle management.
Strong knowledge of algorithms and data structures, plus systems engineering fundamentals, reliability, performance, and debugging.
Hands-on experience with data engineering platforms and tools, commonly including Python, PySpark, Databricks, Airflow, Kafka, Snowflake, and modern data integration patterns.
Experience building production services and APIs, including service design, authentication and authorization, and integration patterns; Node.js and Apigee are a plus.
Practical experience delivering AI powered systems, including one or more of:
RAG systems, including vector search, embeddings, reranking, and grounding strategies
LLM application development, including structured outputs, prompting and tool calling, and orchestration patterns
AI evaluation, test harnesses, regression testing, and lifecycle management for prompts and models
Observability for AI systems, including tracing, monitoring, alerting, and cost controls
Working knowledge of security and identity frameworks such as OAuth 2.0, LDAP, Kerberos, and Vault integration, with experience operating in regulated environments.
Master’s degree in computer science or equivalent experience.
Experience with agent frameworks and workflow patterns, such as graph-based orchestration, tool routing, plan-and-execute loops, and human-in-the-loop designs.
MLOps and LLMOps experience, including CI/CD for ML and LLM applications, model registries, feature stores, experiment tracking, and safe rollout patterns.
Automation and DevOps experience, such as GitHub Actions, infrastructure as code, and automated QA.
Experience working in Agile or SAFe environments.
Experience with frontend or portal integration for AI experiences, for example Angular based portals, analytics integration, or enterprise enablement tooling.
RBC Borealis is the driving force behind Royal Bank of Canada’s AI and data innovation. As part of Canada’s largest financial institution, we bring together a team of architects, engineers, scientists, and product experts on a mission to revolutionize finance through world-class research, solutions, and a resilient data platform. With locations across Toronto, Waterloo, Montreal, Calgary, and Vancouver, we’re at the forefront of AI research and platform development. With a focus on cutting-edge research in areas like time series forecasting, causal machine learning, and responsible AI, we are seamlessly integrating AI research and data engineering to solve critical challenges in the financial industry. We are building intelligent, scalable, data-driven solutions that will help communities thrive and drive innovation for our customers across the bank.