About the job
At Minutes to Seconds, we match people who have great skills with tailor-fitted jobs to achieve well-deserved success. We know how to match people to the right job roles to create that perfect fit.
This changes the dynamics of business success and catalyses the growth of individuals. Our aim is to provide both our candidates and clients with great opportunities and the ideal fit every time.
We have partnered with the best people and the best businesses in Australia to achieve success on all fronts. We're passionate about doing an incredible job for our clients and job seekers.
Our success is determined by the success of individuals in the workplace.
We would love the opportunity to work with YOU!
Our client is looking for a Data Engineer to join a dynamic team in Sydney on a permanent basis.
Requirements
Primary Functions:
- Ensure reliable performance of the data warehouse through monitoring, performance tuning, and intelligent scheduling of workloads.
- Manage the existing data platform infrastructure and perform necessary upgrades.
- Manage and configure data engineering workflows.
- Ensure integrity and accuracy of data models by developing data quality checks, reconciliation processes, and thorough testing practices.
- Contribute to the ongoing support of releases in production.
- Monitor and ensure adequate logging and alerting are set up for data solutions.
- Maintain the enterprise data warehouse / data lake and data engineering solutions on the cloud at an enterprise scale.
- Support the development of agreed standards and tools.
- Work with data consumers to design flexible data marts that are reliable, easy to use, and efficient.
Qualifications / Requirements:
- Minimum of 3 years' experience in a similar role, with a background in data and support.
- Time management and organisational skills to manage competing deadlines and conflicting priorities.
- Understanding of architecture patterns and technical implementation details relating to security, compliance, and governance for BI applications and infrastructure deployment on cloud.
- Strong skills in Jira and Confluence.
- Problem-solving skills.
- Proficient in SQL, with the ability to write very complex SQL queries and to query multiple data sources.
- Good understanding of and experience with ETL, data structures, and data modelling (e.g. data warehousing and data marts), and how best to optimise them.
- Design and document data models, including the underlying principles, their relationship to the information architecture, and managing integrity and changes to those models.
- Experience building / managing the cloud infrastructure required for a data platform in AWS.
- Proven expertise working with very large and complex datasets.
- Good understanding of Git and CI/CD pipeline technologies, ideally Buildkite, Terraform, AWS CDK, and Docker containers.
- Experience with schema design and dimensional data modelling.
Desirable / Bonus
- Experience in AWS technologies such as Redshift, EMR, EC2, CloudFormation, Security Groups, VPCs, DMS, S3, EKS, Athena, Glue, RDS, Snowflake, Kubernetes, Docker, Airflow, and Lambda, but not essential.
- Experience using Apache Spark for data processing.
- Experience with data streaming applications (e.g. Kinesis, RabbitMQ).
- Experience using Terraform for infrastructure as code.
- Experience maintaining Tableau Server or other Linux-based applications.
Demonstrated experience implementing secure web-based business solutions, specifically:
- Experience developing web APIs using Azure API Manager, and customer-facing web portal development using .NET Framework (including .NET Core) and ASP.NET.
- Experience developing single-page web applications using either Angular 7+ or React.
Demonstrated experience in Azure-based solution implementation, including:
- Azure technologies such as Azure Logic Apps, Azure SQL, Azure Service Bus, and Azure API Manager.
- Azure DevOps and Visual Studio for code development, management, repositories, and deployments (CI/CD).
- Integrating custom-built apps with third-party systems, including Microsoft Dynamics 365 CRM.