Senior Cloud Platform Developer – Telesat
Telesat is a leading global satellite operator. We are seeking a highly skilled Kafka Expert to join our data platform team.
Key Responsibilities
- Design, deploy, and manage Apache Kafka clusters across development, testing, and production environments.
- Deploy and manage Apache Spark and Apache Flink in production.
- Optimize Kafka performance, reliability, and scalability for high‑throughput data pipelines.
- Integrate Kafka with other systems and services.
- Manage and troubleshoot Linux‑based (Ubuntu) systems supporting Kafka infrastructure.
- Operate Kafka on Kubernetes clusters using Helm, Operators, or custom manifests.
- Collaborate with cross‑functional teams to identify and implement Kafka use cases.
- Implement automation and Infrastructure as Code practices through CI/CD pipelines with GitLab.
- Monitor system health, implement alerting, and ensure high availability.
- Participate in incident response and root cause analysis for Kafka and related systems.
- Evaluate and recommend Kafka ecosystem tools such as Kafka Connect, Schema Registry, MirrorMaker, and Kafka Streams.
- Build automation and observability tools for Kafka using Prometheus, Grafana, Fluent Bit, etc.
- Build end‑to‑end Kafka‑based pipelines for data integration, event‑driven microservices, logging, and monitoring.
- Configure resource allocation, job scheduling, and cluster scaling for Spark and Flink on Kubernetes, YARN, or standalone clusters.
- Implement metrics collection, log aggregation, and alerting for job health and performance.
- Ensure security through TLS, Kerberos, RBAC, and integration with OAuth or other identity providers.
- Work with time‑series databases for monitoring and analytics.
Required Qualifications
- 5+ years of experience administering and supporting Apache Kafka in production.
- Strong expertise in Linux system administration (Red Hat and Debian).
- Solid experience with Kubernetes (CNCF distributions, OpenShift, Rancher, or upstream K8s).
- Proficiency in scripting (Bash, Python) and automation tools (Ansible, Terraform).
- Experience with Kafka security, monitoring (Prometheus, Grafana, Istio), and schema management.
- Familiarity with CI/CD pipelines and DevOps practices.
- Comfortable with Helm, YAML, Kustomize, GitOps, and GitLab principles.
- 4+ years of experience in Apache Spark development, building scalable pipelines and optimizing distributed processing.
Additional Requirements
Must be able to work in Canada and obtain Canadian Reliability Clearance.
Equal Opportunity Employer
We are an equal opportunity employer. Accommodations are available during the interview process and are kept confidential.