Our client is looking for a Sr. Data Engineer to support, create, and document new technical tool components of the Disaster Recovery Program within the health industry.
Must Haves:
- 5+ years of proven experience working as a Data Engineer or in a similar role
- Bachelor’s degree in Information Technology, Engineering, or Computer Science; or
- A diploma in a related Data Management / Information Technology stream, along with three additional years of experience (beyond the required five years) working as a Data Engineer or in a similar role
- 2+ years of proven experience using Informatica software (e.g., PowerCenter, Integration Services, Workflow Manager, IDMC, and TDM) in an integrated support environment
- Oracle and SQL Server Database Management Systems and tools
- ETL and data pipeline development experience, including providing technical consulting and guidance to development teams on the design and development of highly complex or critical ETL architectures
- Programming languages such as PL/SQL, R, and Python
- Shell scripting
- Data Application Programming Interfaces (APIs)
Extensive experience with:
- Information management, logic modeling, and conceptual, business process, and workflow design
- Requirements gathering, analysis, and the planning, design, development, implementation, and maintenance of Data Management systems
- Critical, constructive, and creative problem-solving skills: identifying issues, developing objectives, and creating an action plan that outlines what needs to be done and the resources required to deliver quality products
- Cloud platforms for data management
Nice to Haves:
- Microsoft Certified: Azure Data Engineer Associate
- Experience working with healthcare data
Responsibilities:
- Design and build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Informatica, Structured Query Language (SQL), SQL Server Integration Services (SSIS), Application Programming Interfaces (APIs), and other technologies.
- Architect relational and multi-dimensional databases from structured, semi-structured, and unstructured data, using development techniques including star and snowflake schemas, ETL, Slowly Changing Dimensions (SCD), and fact and cube development, within a data management framework and in conjunction with the Provincial Data Platform Infrastructure.
- Identify, design, and implement internal process improvements: automate manual processes, optimize data delivery, and re-design data pipelines for greater scalability.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiencies, and other key business performance metrics.
- Develop, maintain, optimize, troubleshoot, debug, and monitor the ETL environment, including backup and recovery operations.
- Analyze datasets to ensure compliance with data sharing agreements and legislative restrictions, and alignment with data architecture guidelines.
- Mentor, support, and train Information Analysts and junior Data Management resources, as required.