What is the opportunity?
Are you a talented, creative, and results-driven professional who thrives on delivering high-performing applications? Come join us!
Job Description
GFT (Global Functions Technology) is part of RBC’s Technology and Operations division. GFT collaborates with partners across the company to deliver innovative and transformative IT solutions. Our clients include Risk, Finance, HR, CAO, Audit, Legal, Compliance, Financial Crime, Capital Markets, Personal and Commercial Banking, and Wealth Management. We also lead the development of digital tools and platforms to enhance collaboration.
We are looking for a highly skilled MLOps Engineer to design and build a production-grade machine learning pipeline for financial risk model training and inference. The pipeline will support model training, testing, and inference using Python and PySpark on public cloud (AWS) and on-premise infrastructure. This role combines strong Python and cloud engineering skills with a solid understanding of machine learning model lifecycle management, from data preparation through training, validation, registration, and operational inference. You’ll collaborate with data scientists, DevOps, and risk IT teams to build a reliable, automated, and auditable MLOps platform that meets enterprise standards for security, governance, and scalability.
What will you do?
- Design and implement end-to-end MLOps pipelines to train, test, register, and deploy credit risk machine learning models.
- Build and automate model lifecycle management workflows including versioning, promotion, approval, and deprecation.
- Develop and integrate a model registry (e.g., MLflow, SageMaker Model Registry, or a custom solution) to manage model metadata, lineage, and reproducibility.
- Orchestrate data and training workflows using tools such as Airflow, AWS Step Functions, Stonebranch, or Prefect.
- Implement CI/CD pipelines using GitHub Actions, Jenkins, or AWS CodePipeline, ensuring consistent and automated deployment processes.
- Build data preparation and training scripts in Python and PySpark, optimized for performance and scalability on AWS EMR or similar clusters.
- Manage model artifacts, dependencies, and environments across AWS and on-premise contexts.
- Ensure strong observability and auditability through structured logging, metrics, and model performance tracking.
- Collaborate with DevOps and data engineering teams to ensure secure integration, data governance, and production readiness.
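The lifecycle workflow described above (register, version, promote, approve, deprecate) can be sketched as a minimal stage machine in plain Python. All names here (`ModelRegistry`, the stage labels, the example model) are illustrative only; a production registry such as MLflow or SageMaker Model Registry additionally tracks lineage, artifacts, and approval metadata.

```python
from dataclasses import dataclass, field

# Forward-only lifecycle stages (illustrative ordering, not an MLflow API)
STAGES = ("registered", "staging", "approved", "production", "deprecated")

@dataclass
class ModelVersion:
    name: str
    version: int
    stage: str = "registered"
    metadata: dict = field(default_factory=dict)

class ModelRegistry:
    """Tracks versions per model name and enforces forward-only promotion."""

    def __init__(self):
        self._versions = {}  # model name -> list of ModelVersion

    def register(self, name, **metadata):
        versions = self._versions.setdefault(name, [])
        mv = ModelVersion(name, version=len(versions) + 1, metadata=metadata)
        versions.append(mv)
        return mv

    def promote(self, name, version, stage):
        mv = self._versions[name][version - 1]
        # Disallow moving backwards (e.g. production -> staging)
        if STAGES.index(stage) <= STAGES.index(mv.stage):
            raise ValueError(f"cannot move {mv.stage} -> {stage}")
        mv.stage = stage
        return mv

registry = ModelRegistry()
v1 = registry.register("credit-risk-pd", framework="pyspark", auc=0.81)
registry.promote("credit-risk-pd", 1, "staging")
registry.promote("credit-risk-pd", 1, "production")
```

In a real pipeline, each promotion step would be gated by automated tests and a model-validator sign-off, with the registry recording who approved the transition and when.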
What do you need to succeed?
Must Have:
- Hands-on expertise with AWS data and ML services (S3, EMR, Lambda, Step Functions, ECS/EKS, SageMaker, CloudWatch, IAM).
- Experience building and maintaining model registries, versioning systems, and artifact repositories (e.g., MLflow, SageMaker, DVC).
- Solid understanding of model lifecycle management, from training and testing through deployment, monitoring, and retraining.
- Strong grasp of CI/CD practices using tools like GitHub Actions, Jenkins, or CodePipeline.
- Familiarity with hybrid deployment environments (AWS and on-prem) and related networking and security considerations.
- Proficiency in Python for production-quality scripting, automation, and ML workflow integration.
- Strong experience with PySpark for distributed data processing and model training.
Required Experience
- 5+ years of experience in software engineering, data engineering, or MLOps in enterprise-scale or regulated environments.
- Proven track record of building ML pipelines in production, preferably in financial services or other data-sensitive domains.
- Experience managing model artifacts and metadata for auditability and compliance.
- Practical knowledge of containerization (Docker) and infrastructure automation (Terraform, CloudFormation).
- Strong background in Linux-based systems, shell scripting, and environment management.
- Experience collaborating with data scientists and model validators to operationalize, monitor, and maintain models.
- Understanding of data governance and regulatory requirements (e.g., model audit trails, reproducibility).
Required Certifications (or Equivalent Experience)
- AWS Certified Solutions Architect Associate or higher (required).
- AWS Certified Machine Learning Engineer Associate or AWS Certified Machine Learning Specialty (strongly preferred).
- AWS Certified DevOps Engineer Professional (preferred).
- Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related quantitative or technical field.
- AWS CloudOps / SysOps Engineer Associate (nice to have).
- Databricks Certified Data Engineer Associate / Professional, or an equivalent PySpark certification (nice to have).
- Python PCAP or Terraform Associate certification (nice to have).
Nice to Have
- Experience implementing model monitoring and drift detection.
- Familiarity with distributed training and parallel compute frameworks (Ray, Spark, Dask).
- Experience with feature stores, data lineage, or metadata tracking systems.
- Exposure to financial risk modeling workflows.
- Working knowledge of container orchestration (Kubernetes, OpenShift) and hybrid deployments.
- Familiarity with secure data exchange patterns between cloud and on-prem environments.
- Exposure to observability stacks (ELK, Prometheus, Grafana, CloudWatch).
What’s in it for you?
We thrive on the challenge to be our best, progressive thinking to keep growing, and working together to deliver trusted advice to help our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference to our communities, and achieving success that is mutual.
- A comprehensive Total Rewards Program including bonuses and flexible benefits, competitive compensation, commissions, and stock where applicable
- Leaders who support your development through coaching and managing opportunities
- Ability to make a difference and lasting impact
- Work in a dynamic, collaborative, progressive, and high-performing team
- A world-class training program in financial services
- Flexible work/life balance options
- Opportunities to do challenging work
RBC is presently inviting candidates to apply for this existing vacancy. Applying to this posting allows you to express your interest in this current career opportunity at RBC. Qualified applicants may be contacted to review their resume in more detail.