Insight Global is looking to hire a Data Engineer for a retail client based in Vancouver. This is a hybrid position requiring three days per week onsite in Downtown Vancouver. You will join a team that works heavily with customer data, owning the customer data pipelines that gather data from multiple sources and consolidate it for different use cases.

As a Mid-level Data Engineer, you will bring a high level of technical knowledge along with an ability to share that knowledge with your co-workers. You will help form the core of the engineering practice by contributing to all areas of development and operations, from pre-production to production. You will set an example of what good engineering looks like and help those around you refine their skills. You will be part of a day-to-day production release team and may perform on-call support as needed. A DevOps mindset is key to success in this role, as engineers are commonly part of full DevOps teams that "own" all parts of software development, release pipelines, production monitoring, security, and support. Other duties and responsibilities include:
Responsibilities
- Build, modernize, and maintain data pipelines using Azure, Databricks, Snowflake, and GCP.
- Create and publish secure Snowflake views to the Enterprise Data Exchange.
- Migrate pipelines to Delta Live Tables (DLT) and Unity Catalog.
- Ensure compliance with PII masking/encryption requirements.
- Participate in DevOps: improve CI/CD, monitor production, handle failures, join on-call rotations.
- Collaborate with global engineering teams as part of an Agile, DevOps, SRE‑aligned culture.
Required Skills and Experience
- 5+ years of experience with database engineering, building and deploying pipelines, ideally working with customer data.
- Strong experience with Azure (deployments, configurations, Storage Accounts).
- Hands-on experience with Azure Data Factory, Azure Databricks, Snowflake, DBT/DLT, and Medallion Architecture.
- Strong Python (especially PySpark) for building and optimizing pipelines across GCP/Azure/Snowflake/Databricks.
- Experience with CI/CD concepts and tooling (Azure DevOps, Repos, pipeline deployments).
- Experience working in an Agile environment.
- Strong communication skills, both written and verbal.
Nice to Have Skills and Experience
- Experience with Fivetran, Feedonomics, or similar marketing technology tools.
- Experience with Terraform or ARM templates.
- Experience with Unity Catalog or similar governance tools.
- Experience with PII masking/encryption standards.
- Background with Snowflake secure views and enterprise data sharing.