If you find this opportunity aligns with your career goals and interests, we kindly request that you send us your documents via e-mail.

Job Title : Data Architect / Engineer
Office location : Canada
Job Type : Remote
Description
Skills : Databricks + Unity Catalog + Terraform + PySpark Notebooks
Environment : Google Cloud Platform (GCP)
Technical Activities / Description of Requirements – per activity : Duration, Skill Level (refer to Category Skills Set), Complexity (if C22 IS / IT Services)

N1 - Current design review and design updates if required
N2 - Create buckets for each component and environment in Terraform (Skill Level : Advanced / Expert)
N3 - Create the metastore and catalogs in Terraform and assign buckets
N4 - Create users / groups / service principals in Terraform and assign access rights (for N2–N4, see the Unity Catalog sketch after this list)
N5 - Create a new DDL and table library for the Unity Catalog structure
N6 - Migrate to shared clusters and refactor / retrieve notebook context info
N7 - Update all notebooks to use the new DDL and table library (see the table-library sketch after this list)
N8 - Complete validation of Unity Catalog on Dev :
- Update existing unit tests
- Add new unit tests for new notebooks
- Integration tests
N9 - Data migration workflow with result report : migrate existing data from existing Delta tables (saved in Databricks system buckets) to Delta tables saved in external locations (specific buckets) (see the migration sketch after this list)

Inputs from Thales to perform the activities : Classification / Reference / Version / Date / periodicity
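For N2–N4 the posting requires the buckets, metastore/catalogs, and principals to be provisioned in Terraform (the Databricks Terraform provider has resources for these objects). As a minimal, hedged illustration of the objects that code would manage — every name below (external location, GCS bucket, storage credential, catalog, group) is hypothetical — the equivalent Unity Catalog SQL can be run from a PySpark notebook:

```python
from pyspark.sql import SparkSession

# In a Databricks notebook `spark` is already defined; this line keeps the sketch self-contained.
spark = SparkSession.builder.getOrCreate()

statements = [
    # External location backed by a per-environment GCS bucket (N2); the storage
    # credential is assumed to already exist.
    """CREATE EXTERNAL LOCATION IF NOT EXISTS dev_raw_loc
       URL 'gs://example-dev-raw-bucket/'
       WITH (STORAGE CREDENTIAL example_gcp_credential)""",
    # Catalog whose managed storage points at that bucket (N3).
    "CREATE CATALOG IF NOT EXISTS dev_catalog MANAGED LOCATION 'gs://example-dev-raw-bucket/managed/'",
    # Access rights for a hypothetical group or service principal (N4).
    "GRANT USE CATALOG, CREATE SCHEMA ON CATALOG dev_catalog TO `data-engineers`",
]

for stmt in statements:
    spark.sql(stmt)
```

In the actual deliverable these objects would be declared as Terraform resources so that Dev and other environments stay reproducible; the SQL only shows the end state.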
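For N5 and N7, one reading of "DDL and table library" is a small shared Python module that centralizes the three-level Unity Catalog names (catalog.schema.table) so notebooks resolve tables through it instead of hard-coding locations. A minimal sketch, assuming one catalog per environment and hypothetical names:

```python
# table_library.py — hypothetical shared module; the naming convention below is an
# assumption, not taken from the posting.
from dataclasses import dataclass


@dataclass(frozen=True)
class TableRef:
    catalog: str
    schema: str
    table: str

    @property
    def full_name(self) -> str:
        # Unity Catalog addresses tables with three-level names: catalog.schema.table
        return f"{self.catalog}.{self.schema}.{self.table}"


def table(env: str, schema: str, name: str) -> TableRef:
    # Assumes an environment-suffixed catalog such as main_dev / main_prod.
    return TableRef(catalog=f"main_{env}", schema=schema, table=name)
```

A refactored notebook (N7) would then read `spark.table(table("dev", "sales", "orders").full_name)` instead of referencing a hard-coded path or two-level hive_metastore name.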
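For N9, a minimal PySpark sketch of moving one managed Delta table to an external location, with a row-count comparison standing in for the "result report". The table names and GCS path are hypothetical; Databricks' `DEEP CLONE` is an alternative way to copy a Delta table to a new location.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already available in a Databricks notebook

src = "hive_metastore.sales.orders"                     # existing managed Delta table (assumption)
dst = "dev_catalog.sales.orders"                        # Unity Catalog target table (assumption)
ext_path = "gs://example-dev-raw-bucket/sales/orders"   # path under a registered external location (assumption)

spark.sql("CREATE SCHEMA IF NOT EXISTS dev_catalog.sales")

# Writing with an explicit path makes the new table external rather than managed.
(spark.table(src)
      .write.format("delta")
      .mode("overwrite")
      .option("path", ext_path)
      .saveAsTable(dst))

# Minimal result report: source and target row counts should match.
report = {
    "source_rows": spark.table(src).count(),
    "target_rows": spark.table(dst).count(),
}
print(report)
```

A real migration workflow would loop this over the table inventory and collect the per-table comparison into the report deliverable.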