Job Title: Data Engineer
Job Location: Mississauga, ON (onsite from day 1; hybrid, 3 days per week in office)
Job Duration: Long term
Job Description:
We are seeking a skilled Data Engineer to join our dynamic team. The ideal candidate will have a strong background in data modeling, SQL, and ETL processes, along with practical experience in an Azure cloud environment, particularly with Databricks. In this role, the candidate will be instrumental in designing, developing, and maintaining scalable data pipelines, contributing to our data strategy initiatives, and enabling insightful analytics.
Responsibilities:
- Design, develop, and optimize data models and schemas to effectively support business requirements.
- Build, implement, and maintain efficient ETL pipelines utilizing Spark, Databricks, and other relevant tools.
- Develop complex SQL queries, stored procedures, and functions to extract, transform, and load data from multiple sources.
- Collaborate with data analytics, data science, and business teams to gather requirements and deliver tailored data solutions.
- Leverage Azure Data Services including Azure Data Lake, Azure Data Factory, and Azure Databricks to create scalable, reliable data architectures.
- Ensure adherence to data quality, governance, security, and compliance standards across all data processes.
- Continuously monitor, troubleshoot, and optimize data pipelines for performance, scalability, and robustness.
- Document data architecture, workflows, and procedures; participate in code reviews and establish best practices.
Requirements:
- 3 to 6 years of experience in data engineering with demonstrated expertise in data modeling and SQL.
- Proven experience in developing ETL solutions and orchestrating data pipelines.
- Hands-on experience with the Azure cloud platform, specifically Azure Data Factory, Azure Data Lake, and Azure Databricks.
- Strong programming skills in Python or Scala, with hands-on Spark experience for data processing tasks.
- Familiarity with data governance, security protocols, and regulatory compliance requirements.
- Excellent problem-solving skills, keen attention to detail, and ability to troubleshoot complex data issues.
- Ability to work independently and collaboratively within a team environment.
- Bachelor's degree in Computer Science, Information Technology, or related field.
Preferred, but not required:
- Certifications in Azure Data Services or Databricks.
- Experience working with BI tools and report generation frameworks.
- Knowledge of other cloud platforms (AWS, GCP).
- Experience with REST API integration.
- Experience with GraphQL.