Take ownership of core data pipelines, ensuring resilience, optimal performance, timely delivery, data quality, and seamless onboarding of new features.
Continuously evolve data models and schemas to meet business and engineering requirements. Implement and maintain systems to monitor and enhance data quality and consistency.
Develop tools that support self-service management of data pipelines (ETL) and perform SQL tuning to optimize data processing performance.
Contribute to the Data Engineering team’s technical roadmap, ensuring alignment with team and stakeholder goals.
Write clean, well-tested, and maintainable code, prioritizing scalability and cost efficiency. Conduct code reviews to uphold code quality standards and facilitate knowledge sharing.
Participate in on-call rotations to maintain high availability and reliability of workflows and data pipelines.
Collaborate with internal and external partners to remove blockers, provide support, and achieve results.
Requirements:
3+ years of professional experience in data engineering.
Experience working directly with cross-functional stakeholders from analytics, science, product, and engineering to align data solutions with business goals.
Proficiency in data validation, explanation, and visualization for analyzing and presenting data and insights.
Strong skills in designing complex data models and the resilient data pipelines that enrich them.
Proficiency in SQL, including the ability to understand complex business logic and implement your own.
Deep understanding of MPP systems and hands-on experience with ETL in Spark or similar technologies.
Proficiency in writing data transformation scripts in Python, Java, or similar languages.
Hands-on experience with workflow management tools (e.g., Airflow or similar).