Job Title: Data Architect / Lead Data Engineer
Location: Downtown Vancouver, BC
Contract Duration: 6 months to start, high chance of extension
Hybrid: 3 days in the office
About the Role:
Our client is seeking a Data Architect / Lead Data Engineer with 10+ years of experience to join their team and lead a group of data engineers in a dynamic and complex environment. In this role, you will leverage your extensive expertise in data engineering to drive solutions for sophisticated business requirements, optimize processes, and ensure that data systems are both efficient and scalable. The ideal candidate will have a deep understanding of modern data platforms such as Snowflake, experience working with cloud technologies (Azure preferred), and the ability to mentor and guide junior engineers.
Key Responsibilities:
- Leadership & Mentorship: Lead a team of data engineers, providing mentorship and guidance while driving the technical solutions that meet complex data requirements.
- Data Platform Expertise: Oversee the design and development of modern data platforms, including data lakes and data warehouses, with a focus on Snowflake architecture. Ensure the platforms meet both functional and non-functional business requirements.
- Integration & Process Optimization: Identify, design, and implement integration, modeling, and orchestration of complex finance data systems. Continuously look for opportunities to improve processes, optimize data delivery, and automate manual tasks.
- Scripting & Analytics: Utilize a variety of scripting languages (SQL, Python, PowerShell, JavaScript) to handle data processing, analytics, and automation needs.
- Performance Tuning & Troubleshooting: Analyze performance bottlenecks, optimize systems for speed and efficiency, and troubleshoot technical challenges in ambiguous or evolving environments.
- Cloud Systems: Lead the deployment and management of cloud-based data systems (Azure preferred), ensuring scalability and efficiency in cloud infrastructure.
- CI/CD Implementation: Implement and manage CI/CD pipelines to ensure continuous integration and delivery of high-quality data systems.
- Security & Code Optimization: Develop performant, secure, and scalable code that meets business and security standards.
- Technical Evaluations: Lead proofs of concept (POCs) to evaluate new technologies, providing insights and recommendations to support business decisions.
- Real-Time Analytics: Design and implement systems for Real-Time Analytics and Real-Time Messaging, ensuring that data systems can handle real-time processing needs.
- Data Integrity: Build and monitor data integrity checks as part of the application delivery process to ensure data quality and consistency.
- Kafka Technologies: Leverage Kafka technologies to manage data streams, enabling scalable and efficient data communication across systems.
- Best Practices: Establish and promote best practices for development frameworks, ensuring the team adheres to industry standards and company requirements.
Required Experience & Skills:
- 15+ years of experience in data engineering, with a proven track record of delivering large-scale, complex data solutions.
- Extensive experience in leading and mentoring a team of data engineers.
- Strong understanding of modern data platforms, especially Snowflake for data warehousing, and data lakes.
- Proven expertise in designing and assembling complex data systems that meet both functional and non-functional business needs.
- Deep knowledge of integration, modeling, and orchestration of complex finance data.
- Strong scripting skills in SQL, Python, PowerShell, and JavaScript.
- Experience with performance tuning, optimization, and troubleshooting data systems in complex environments.
- Hands-on experience with cloud-based systems, especially Azure.
- Familiarity with CI/CD pipelines and implementing DevOps practices.
- Ability to build secure and performant code while ensuring the integrity of data solutions.
- Strong experience in Real-Time Analytics and Real-Time Messaging, and with technologies like Kafka.
- Expertise in building data integrity checks for applications.
- Experience working with large data volumes, preferably in retail or other high-data-volume sectors.
What’s in it for You:
- Opportunity to lead a talented team and make a significant impact on complex data engineering projects.
- Work in a dynamic and fast-paced environment with opportunities for mentorship and career growth.
- Competitive compensation and benefits, with a focus on professional development.