Job Overview
NeenOpal Canada Inc. is looking for an experienced Data Integration Specialist to design, develop, and maintain scalable data pipelines and ETL processes. The successful candidate will collaborate closely with our global Business Analytics teams to support the implementation of data warehouse solutions tailored to client needs. This role offers the opportunity to work in a hybrid environment, with 1-2 days per week in our Fredericton office.
Key Responsibilities
- Data Pipeline Development: Design, implement, and manage robust data pipelines that ensure efficient data flow from various sources into data warehouses. This includes automating ETL processes using Python and advanced data architecture tools.
- Data Integration: Utilize industry-standard tools and technologies to integrate and transform data. This includes working with platforms such as:
  - AWS Services: AWS Glue, AWS Data Pipeline, Amazon Redshift, and Amazon S3.
  - Azure Services: Azure Data Factory, Azure Synapse Analytics, and Azure Blob Storage.
- Data Warehousing: Implement and optimize data warehousing solutions using Snowflake and Amazon Redshift.
- Database Management: Develop, optimize, and manage relational databases, including SQL Server, MySQL, and PostgreSQL, ensuring data integrity and performance.
- Performance Optimization: Continuously monitor and improve the performance of data processing workflows. Apply best practices for query optimization and data processing efficiency.
- Collaboration: Work closely with cross-functional teams in the US, India, and Canada to deliver high-quality data solutions. Ensure effective communication and collaboration across different time zones.
- Data Governance: Document data pipelines, ETL processes, and data mappings in line with established data governance standards. Ensure compliance with data quality and security protocols.
- Troubleshooting & Support: Diagnose and resolve data-related issues promptly, providing ongoing support to ensure the reliability of data solutions.
Required Skills and Experience
- ETL & Data Systems Tools: 2+ years of experience designing and developing ETL processes using tools such as AWS Glue, Azure Data Factory, or similar ETL platforms.
- Programming: Proficiency in Python for data pipeline automation and system development. Familiarity with SQL for database management and query optimization.
- Cloud Platforms: Strong experience with cloud-based data services on AWS or Azure. Familiarity with GCP is a plus.
- Data Warehousing: Expertise in developing and managing data warehouse solutions using platforms like Snowflake, Amazon Redshift, or Azure Synapse Analytics.
- APIs & Data Integration: Experience integrating data from various sources using RESTful APIs and other data integration techniques.
- Excellent Communication Skills: As you will work directly with clients, the ability to articulate and explain the technical work you perform is paramount.
Preferred Qualifications
- Advanced Cloud Expertise: Experience with advanced features and services within AWS, Azure, or Snowflake, and a commitment to staying current with the latest data architecture technologies.
- Business Acumen: Ability to translate complex business requirements into scalable and efficient data solutions, aligning technical implementations with business objectives.
What We Offer
- Dynamic Work Environment: A hybrid work model that combines the flexibility of remote work with the collaborative benefits of in-office presence.
- Professional Growth: Opportunities for continuous learning and development within a fast-growing, global analytics company.
- Innovative Projects: Engage in cutting-edge data architecture projects with a diverse team of experts across the globe.
Note: You must have valid work authorization in Canada to apply for this role.