Databricks ETL Developer (PL644)

Paralucent, Toronto, ON, CA
Job type
  • Full-time
Job description

We are seeking a skilled Databricks ETL Developer to join our team. In this role, you will be responsible for developing, maintaining, and optimizing ETL (Extract, Transform, Load) processes within Databricks to support data warehousing, data lakes, and analytics initiatives. You will collaborate closely with data architects and business teams to ensure efficient data transformation and movement, including handling Change Data Capture (CDC) and streaming data.

Key Responsibilities:

  • Design, develop, and implement ETL pipelines using Databricks and related tools to ingest, transform, and store large-scale datasets (a minimal sketch follows this list).
  • Utilize Databricks, Delta Lake, Delta Live Tables, and Spark to process both structured and unstructured data.
  • Collaborate with business teams to design scalable data solutions that enhance data accessibility and analytical capabilities.
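
As a concrete illustration of the pipeline work described above, the following is a minimal PySpark sketch of a batch ETL step on Databricks. The source path, column names, and target table are hypothetical placeholders, and the sketch assumes the built-in spark session that Databricks notebooks provide.

    from pyspark.sql import functions as F

    # Extract: read raw JSON files from cloud storage (hypothetical path).
    raw_df = spark.read.format("json").load("s3://example-bucket/raw/orders/")

    # Transform: normalize timestamps, drop incomplete and duplicate records.
    clean_df = (
        raw_df
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .filter(F.col("order_id").isNotNull())
        .dropDuplicates(["order_id"])
    )

    # Load: append the cleaned records to a Delta table (hypothetical name).
    clean_df.write.format("delta").mode("append").saveAsTable("analytics.orders_clean")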

Location: Toronto (Hybrid)

Duration: 5 months, with possible yearly extensions

Required Skills and Experience:

  • Proven experience in developing ETL pipelines using Databricks.
  • Strong knowledge of Delta Lake for data management and optimization.
  • Hands-on experience with CDC tools (e.g., GoldenGate) for managing real-time data.
  • Familiarity with Databricks Workflows for scheduling and orchestrating tasks.
  • Solid understanding of the Medallion Architecture (Bronze, Silver, Gold) and experience implementing it in production environments (a sketch of this layered approach follows below).
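
A minimal sketch of the Medallion Architecture referenced above, promoting data from Bronze through Silver to Gold Delta tables; all table and column names are hypothetical.

    from pyspark.sql import functions as F

    # Bronze: raw events landed as-is from ingestion.
    bronze = spark.read.table("lakehouse.bronze_events")

    # Silver: cleaned, deduplicated, conformed records.
    silver = (
        bronze
        .filter(F.col("event_id").isNotNull())
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .dropDuplicates(["event_id"])
    )
    silver.write.format("delta").mode("overwrite").saveAsTable("lakehouse.silver_events")

    # Gold: business-level aggregate ready for analytics.
    gold = (
        spark.read.table("lakehouse.silver_events")
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date")
        .agg(F.count("*").alias("event_count"))
    )
    gold.write.format("delta").mode("overwrite").saveAsTable("lakehouse.gold_daily_events")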
Technical Knowledge and Expertise:

  • Proven experience in developing and managing ETL pipelines, jobs, and workflows in Databricks.
  • In-depth understanding of Delta Lake for building data lakes and managing ACID transactions, schema evolution, and data versioning.
  • Experience in automating ETL pipelines using Delta Live Tables, including handling Change Data Capture (CDC) for incremental data loads (see the Delta Live Tables sketch after this list).
  • Proficiency in structuring data pipelines with the Medallion Architecture to scale data pipelines and ensure data quality.
  • Hands-on experience in developing streaming tables in Databricks using Structured Streaming and readStream to handle real-time data.
  • Expertise in integrating CDC tools like GoldenGate or Debezium for processing incremental updates and managing real-time data ingestion.
  • Experience with Unity Catalog for managing data governance, access control, and ensuring compliance.
  • Skilled in managing clusters, jobs, autoscaling, monitoring, and performance optimization in Databricks environments.
  • Knowledge of using Databricks Auto Loader for efficient batch and real-time data ingestion (see the Auto Loader sketch after this list).
  • Experience with data governance best practices, including implementing security policies, access control, and auditing with Unity Catalog.
  • Proficient in creating and managing Databricks Workflows to orchestrate job dependencies and schedule tasks.
  • Strong knowledge of Python, PySpark, and SQL for data manipulation and transformation.
  • Experience in integrating Databricks with cloud storage solutions such as Azure Blob Storage, AWS S3, or Google Cloud Storage.
  • Familiarity with external orchestration tools like Azure Data Factory.
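
The Delta Live Tables CDC handling mentioned above can look roughly like the following sketch, which uses dlt.apply_changes to merge an incremental feed into a current-state table on a recent Databricks runtime. The feed name, keys, and sequencing column are hypothetical and would come from the CDC tool (e.g., GoldenGate or Debezium).

    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Raw CDC feed from a hypothetical source system.")
    def customers_cdc_raw():
        return spark.readStream.table("cdc_feed.customers")

    # Target streaming table holding the current state of each customer.
    dlt.create_streaming_table("customers_current")

    dlt.apply_changes(
        target="customers_current",
        source="customers_cdc_raw",
        keys=["customer_id"],
        sequence_by=F.col("op_ts"),               # ordering column emitted by the CDC tool
        apply_as_deletes=F.expr("op = 'DELETE'"), # treat delete operations as row deletions
    )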
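Similarly, the Auto Loader ingestion mentioned above can be sketched with the cloudFiles source and Structured Streaming; the storage path, checkpoint locations, and table name are hypothetical placeholders.

    # Incrementally discover and read new JSON files as they arrive.
    stream = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/mnt/checkpoints/orders/schema")
        .load("abfss://raw@exampleaccount.dfs.core.windows.net/orders/")
    )

    # Write into a Bronze Delta table; availableNow processes the backlog
    # as an incremental batch and then stops.
    (
        stream.writeStream.format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/orders/checkpoint")
        .trigger(availableNow=True)
        .toTable("lakehouse.bronze_orders")
    )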
Certifications:

  • Certified in one or more of the following:
      • Databricks Certified Data Engineer Associate
      • Databricks Certified Data Engineer Professional
      • Microsoft Certified: Azure Data Engineer Associate
      • AWS Certified Data Analytics - Specialty
      • Google Cloud Professional Data Engineer

Communication and Leadership Skills:

  • Ability to collaborate effectively with cross-functional teams and articulate complex technical concepts to non-technical stakeholders.
  • Strong problem-solving skills with experience working in Agile or Scrum environments.
  • Ability to provide technical guidance and support to team members on Databricks best practices.