Job Description: Azure Data Engineer
Experience Required: 4-6 years
Location: Toronto, Hybrid (Tue–Thu, 8:30 AM – 5:00 PM EST)
Skills: Microsoft Azure, Agile ways of working
Role Description:
1. JSON Transformation Using Liquid Templates
Develop Liquid templates to transform incoming JSON payloads according to defined business and mapping rules.
Ensure templates are optimized, maintainable, and aligned with overall integration and transformation standards.
Validate transformation outputs to ensure accuracy and data integrity.
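Transformation outputs can be validated field-by-field against the mapping rules. A minimal sketch in Python (the field names and mapping rules here are hypothetical, standing in for the real source-to-target specification that the Liquid template implements):

```python
import json

# Hypothetical mapping rules: rename and flatten source fields into the target
# schema. In practice these rules live in the Liquid template; mirroring them
# here lets the template's output be validated independently.
def expected_output(source: dict) -> dict:
    return {
        "customerId": source["customer"]["id"],
        "fullName": f'{source["customer"]["firstName"]} {source["customer"]["lastName"]}',
        "status": source.get("status", "UNKNOWN").upper(),
    }

def validate_transformation(source_json: str, transformed_json: str) -> list:
    """Compare a template's output against the mapping rules; return mismatches."""
    expected = expected_output(json.loads(source_json))
    actual = json.loads(transformed_json)
    return [
        f"{key}: expected {expected[key]!r}, got {actual.get(key)!r}"
        for key in expected
        if actual.get(key) != expected[key]
    ]
```

A check like this can run in CI against a library of sample payloads, so template changes that break the mapping are caught before deployment.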
2. Databricks Ingestion Job Development
Build and configure Databricks Jobs to ingest, validate, and process structured and semi-structured data.
Implement scalable workflows supporting batch or streaming ingestion patterns.
Collaborate with data engineering teams to align job logic with source-to-target mapping requirements.
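A multi-task ingestion job of this kind is typically defined declaratively. A sketch following the shape of the Databricks Jobs API 2.1 request body (the job name, notebook paths, storage account, cluster sizing, and schedule are placeholders, not values from this role):

```json
{
  "name": "daily-ingestion",
  "tasks": [
    {
      "task_key": "ingest_raw",
      "notebook_task": {
        "notebook_path": "/Repos/data-eng/ingestion/ingest_raw",
        "base_parameters": {
          "source_path": "abfss://raw@<storage-account>.dfs.core.windows.net/landing"
        }
      },
      "job_cluster_key": "ingest_cluster"
    },
    {
      "task_key": "validate",
      "depends_on": [ { "task_key": "ingest_raw" } ],
      "notebook_task": { "notebook_path": "/Repos/data-eng/ingestion/validate" },
      "job_cluster_key": "ingest_cluster"
    }
  ],
  "job_clusters": [
    {
      "job_cluster_key": "ingest_cluster",
      "new_cluster": {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2
      }
    }
  ],
  "schedule": {
    "quartz_cron_expression": "0 0 6 * * ?",
    "timezone_id": "America/Toronto"
  }
}
```

The `depends_on` edge keeps validation downstream of ingestion, so source-to-target mapping checks run on every batch.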
3. Convert API Logic Implementation
Design and implement the Convert API logic, enabling the transformation of payloads between source and target formats.
Integrate transformation components, validation steps, and business rules within the API workflow.
Ensure API logic follows best practices for performance, reliability, and maintainability.
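The core of such a Convert API is a pure transformation step, which keeps it easy to test apart from transport concerns. A minimal sketch (the field names, required-field list, and the cents-conversion business rule are illustrative assumptions, not the actual service contract):

```python
import json

# Hypothetical source schema: these are the fields the Convert step requires.
REQUIRED_FIELDS = ("id", "amount", "currency")

def convert_payload(source_json: str) -> str:
    """Validate the source payload, apply business rules, emit the target format.
    HTTP transport, auth, and routing are intentionally out of scope here."""
    source = json.loads(source_json)
    missing = [f for f in REQUIRED_FIELDS if f not in source]
    if missing:
        raise ValueError(f"payload missing required fields: {missing}")
    # Example business rule: the target system stores monetary amounts in cents.
    target = {
        "referenceId": source["id"],
        "amountCents": round(float(source["amount"]) * 100),
        "currency": source["currency"].upper(),
    }
    return json.dumps(target)
```

Keeping validation, business rules, and serialization in one composable function makes the API logic straightforward to unit-test and to reuse across batch and streaming paths.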
4. Error Handling and Exception Management
Develop comprehensive exception-handling and error logging mechanisms within the pipeline.
Implement structured error responses and automated alerts to support monitoring and troubleshooting.
Ensure failures are captured with relevant context to support root cause analysis.
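The pattern above can be sketched as a small wrapper that captures failures with structured context. A minimal example using only the standard library (the step and record identifiers are illustrative; a real pipeline would route the structured log line to its monitoring and alerting stack):

```python
import json
import logging
import traceback

logger = logging.getLogger("pipeline")

def run_step(step_name: str, record_id: str, fn, *args):
    """Run one pipeline step; on failure, log a structured error with enough
    context (step, record, exception type, traceback) for root-cause analysis,
    and return a structured error response instead of crashing the pipeline."""
    try:
        return {"status": "ok", "result": fn(*args)}
    except Exception as exc:
        error = {
            "status": "error",
            "step": step_name,
            "recordId": record_id,
            "errorType": type(exc).__name__,
            "message": str(exc),
            "traceback": traceback.format_exc(),
        }
        # Emit a JSON log line (traceback omitted) that alerting can parse.
        logger.error(json.dumps({k: v for k, v in error.items() if k != "traceback"}))
        return error
```

Because every failure carries the step name, record identifier, and exception type, alerts can be grouped by failure mode and traced back to the offending record.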
Essential Skills
Experience working as a Data Engineer with a focus on big data processing and/or relational databases, with a good understanding of SQL
Experienced with working on Structured and Semi-Structured datasets.
Experience working with the Microsoft Azure data platform, specifically Azure SQL Database, Azure Data Factory, Azure Data Lake Storage (ADLS), Azure Databricks, and Azure DevOps
Experienced in creating data pipelines and developing complex, optimized queries
Experienced with workflow management tools: Airflow, cron, CA Workload Automation
Experience implementing CI/CD pipelines with GitHub, Jenkins / JGP.
Experience writing complex SQL and NoSQL jobs to analyze data in traditional DBMS environments (MS SQL Server, Oracle)
Experience with integrating to back-end/legacy environments.
Experience integrating business and technology teams
Experienced in any of the following programming/scripting languages: SQL, Python, shell, Scala
Excellent organizational and time-management skills; strong business presence with the ability to multi-task and effectively manage competing priorities
Desirable Skills
Proficiency in writing software in one or more languages, such as Java, Node.js, Python, or .NET.
Exposure to microservices.
Experience with scripting languages such as Groovy, Python, or shell.
Grounding in DevOps principles, test-driven development, continuous integration, and continuous delivery.
Experience in Microsoft Azure and AKS or other similar cloud technologies.
Experience automating infrastructure provisioning with tools such as Terraform, Chef, and Helm charts.
Experience with Jenkins, Artifactory, SonarQube.
Familiarity with package management tools such as npm, pip.
Familiarity with build processes and build tools such as Maven and Gradle.
Familiarity with standard IT security practices.
Familiarity with cloud database technologies such as Azure SQL and Cosmos DB / MongoDB