Azure Data Engineer
VLink Inc
Canada
Full-time
Role : Azure Data Engineer
Location : Toronto, ON (Hybrid - 3 days onsite)
Job type : Long-term Contract / Full-time opportunity
Experience : 5+ years
Job Description : We are looking for a highly skilled professional to design, build, and maintain scalable data solutions while ensuring data quality and driving data innovation.
Requirements
- Over 3 years of experience working with Azure Cloud Resources (ADLS, ADF, Dataflow).
- At least 2 years of hands-on experience with DBT (must-have).
- Proficiency in advanced SQL, data integration, and data modeling principles.
- Experience with Python programming and Git for version control.
- Familiarity with DevOps practices for automation and deployment.
- Strong technical and problem-solving skills.
Responsibilities
- Data Development & Integration : Design and build data solutions using Azure Cloud Resources (ADLS, ADF, Dataflow) in accordance with organizational standards to ensure resilience and scalability.
- Data Modeling & Metadata Management : Create, update, and maintain data models based on business needs and manage metadata repositories to ensure accurate and up-to-date information.
- ETL Processes & Data Preparation : Extract, transform, and load data from various sources, ensuring data quality, integrity, and convergence of datasets using DBT.
- Programming & Scripting : Develop scripts and programs using Python and employ Git for version control and code management.
- DevOps & Automation : Implement DevOps practices to automate workflows and ensure continuous integration and deployment.
- Problem Resolution & Advanced SQL : Use advanced SQL skills to resolve issues within databases, data products, and processes.
Warm Regards,
- com / in / wariya-siraj-7634b8169
Posted 1 day ago