About The Company
At AlphaLayer, we help institutional investors uncover investment edge at scale through a repeatable research process that leverages core technology, data, and AI to develop differentiated investment strategies and signals.
About The Role
We're looking for an Intermediate Data Engineer with strong foundations in data warehousing, Python, and data workflows—and an interest in growing their skills in software engineering, DevOps, and cloud infrastructure.
This role is ideal for someone who enjoys working on real-world problems, has solid technical fundamentals, and is eager to learn. You’ll join a cross-functional team that combines AI, quantitative finance, and engineering to build production-grade systems that deliver real business impact.
What You’ll Do
- Design, build, and maintain robust, scalable data pipelines
- Work with tools like Snowflake, Prefect, and Python to enable reliable and efficient workflows
- Collaborate with software engineers, infrastructure specialists, and quantitative experts
- Troubleshoot and optimize data flows in both development and production environments
- Contribute to internal tooling and automation as the team grows its DevOps / cloud maturity
- Participate in code reviews, team planning, and knowledge-sharing
Must-Have Skills
- 2–4 years of experience in a data engineering or related role
- Hands-on experience with data warehousing platforms (e.g., Snowflake, Databricks, BigQuery)
- Strong Python skills, including libraries such as Pandas, SQLAlchemy, or equivalent
- Experience with workflow orchestration tools like Prefect, Azure Data Factory, or Airflow
- Solid SQL skills and an understanding of performance optimization
- Familiarity with version control (Git/GitHub) in a collaborative setting
- Comfort using Unix/Linux systems for development and debugging
- Experience working with structured and semi-structured data (e.g., JSON, Parquet, CSV)
Nice-to-Have Skills
- Understanding of data modeling concepts (e.g., star schema, dimensional modeling)
- Basic understanding of Streamlit and/or React for data presentation or internal tools
- Exposure to Docker and Kubernetes in local or cloud environments
- Familiarity with cloud platforms (Azure, AWS, or GCP) and how applications are deployed in them
- Experience working with CI/CD pipelines (e.g., GitHub Actions, Azure Pipelines)
- Exposure to Infrastructure as Code tools like ARM templates or Terraform
- Experience interacting with APIs (REST/GraphQL) for data ingestion or automation
What We're Looking For
- Edmonton-based candidates willing to work in-office at least once a week
- Strong communication and collaboration skills
- A self-starter with a growth mindset, eager to expand into software and cloud engineering
- Interest in working across disciplines in a production-focused team
Why Join Us
- Be part of a high-impact team productionizing cutting-edge AI and quantitative models
- Work with experienced engineers, researchers, and infrastructure experts
- Grow your skills in cloud, DevOps, and end-to-end systems
- Competitive compensation and benefits
- Flexible hybrid work model with an emphasis on collaboration and learning
- Opportunity to start working with cutting-edge agentic technologies
Location: Edmonton, Alberta, Canada
Salary: CA$50,000.00–CA$80,000.00 per year