Salary: $100,000 – $140,000 per year
Requirements: - Extensive experience in data engineering or software engineering roles
- Strong programming skills in Python
- Deep expertise in building and maintaining production-grade data pipelines
- Proficient in SQL with experience handling both structured and unstructured data
- Hands-on experience with cloud infrastructure, preferably Azure
- Familiar with CI/CD pipelines and infrastructure-as-code practices
- Knowledge of data orchestration and transformation tools, such as Airflow or dbt
- Experience in containerized environments like Docker and Kubernetes
- Bonus: Experience in finance, trading, or regulated/high-performance settings
- Bonus: Familiarity with Databricks or similar managed data platforms
- Bonus: Strong mindset for documentation, collaboration, and knowledge sharing
- Bonus: Comfort with the full data lifecycle from ingestion to analytics and reporting
Responsibilities: - Develop scalable, reliable ETL/ELT pipelines using Python and SQL
- Design and manage batch and near-real-time data workflows with Python frameworks
- Integrate data sources including Azure Blob Storage and ADLS, as well as relational and non-relational systems
- Transition legacy data architectures to modern cloud-native environments
- Optimize workloads to enhance performance, minimize costs, and ensure fault tolerance
- Maintain data quality, observability, and reliability across pipelines
- Take ownership of critical infrastructure components for batch and streaming systems
- Deconstruct complex technical challenges into manageable components
- Mentor junior engineers through code reviews and architectural guidance
- Automate and standardize pipelines to reduce overhead and boost velocity
- Contribute to internal tools, documentation, and engineering best practices
- Collaborate directly with client engineering teams across finance, trading, and enterprise technology
- Rotate across different industries and domains every 6–12 months for broad exposure to systems and technologies
- Engage in greenfield builds, system migrations, and enterprise-scale data transformations
- Partner closely with clients in shaping their data product strategies
Technologies: - Airflow
- API
- Azure
- CI/CD
- Cloud
- Databricks
- Docker
- ETL
- Kubernetes
- Python
- SQL
- dbt
- AI
- Data Warehouse
More:
At Tactable, we are dedicated to transforming organizations by developing exceptional software that not only scales but also drives significant change. Our growing cloud, data, and API engineering firm is committed to delivering top-tier solutions through expert partnerships and a steadfast focus on quality. We prioritize embedding ourselves within client operations, navigating complete project cycles with the agility of startups while adhering to enterprise-grade standards. Based in downtown Toronto, we embrace a hybrid work model, fostering a culture of trust, collaboration, and curiosity as we tackle meaningful challenges alongside ambitious developers.
last updated: week 14 of 2026