Job Description

Fixed Income (FI) is the primary source of capital for corporations, governments, and nonprofits, providing liquidity and innovative solutions globally across the credit, municipal, and securitized markets. FI's activities include origination, structuring, investing, lending, and market making, and it offers a variety of products including, but not limited to, corporate bonds, emerging markets bonds, asset-backed securities, mortgage-backed securities, collateralized loan obligations (CLOs), municipal securities, agency securities, short-term interest rate products, loans, letters of credit, and derivative instruments such as interest rate swaps, total return swaps, rate locks, and credit default swaps.

The FI Front Office Technology team is building the next-generation e-trading data team in Canada. The data team builds ETL pipelines that load data from market data vendors such as BBG, LSEG, and Markit over streaming channels such as Kafka and Solace into various OLAP databases (e.g., KDB, Pinot, SQL Server). These pipelines require millisecond latency, as many real-time analytical applications rely on them to provide insights to our traders and quantitative analysts.

Ideal Candidate

The ideal candidate should understand the organization's growing demands for data and be able to design pipelines accordingly. The candidate should also demonstrate self-development, team development, networking, strategic thinking, and a willingness to work with global teams.
Responsibilities:
Design and build scalable, low-latency, fault-tolerant streaming data pipelines that empower Data Scientists, Quants, and Traders to extract meaningful and timely insights from our data assets.
Establish and support an efficient, sustainable, and operationally resilient team to execute the multi-year roadmap, delivering value on time, on budget, and with high quality.
Work closely with business and technology stakeholders to build the next generation of distributed streaming data pipelines and analytics data stores using streaming frameworks (e.g., Flink, Spark Streaming).
Collaborate with application teams in designing effective solutions to challenging latency and/or throughput requirements.
Develop comprehensive knowledge of how areas of business integrate to accomplish business goals.
Maintain an on-going understanding of emerging data management technologies, industry trends and best practices.
Apply strong analytical skills to filter, prioritize, and validate complex technical and business material from multiple sources.
Qualifications:
5+ years of experience with Java development
Experience developing real-time, low-latency applications
Experience with distributed stream processing frameworks: Flink, Spark Streaming, Kafka Streams
Experience with MPP platforms such as Trino (Presto) and Snowflake is a plus
Experience with deployment platforms such as Kubernetes and OpenShift
Expertise in event-driven architectures
Excellent knowledge of multithreading and thread pools, with strong OOP and OOAD skills
Experience with the full SDLC in Agile environments
Ability to work on multiple projects concurrently and meet deadlines
Ability to work with globally distributed teams
Education:
Bachelor’s degree/University degree or equivalent experience
Master’s degree preferred
Citi Canada is an equal opportunity employer. Accordingly, we will make accommodations to respond to the needs of people with disabilities (including, without limitation, physical and mental health disabilities) during the recruitment process and otherwise in accordance with law. Individuals who view themselves as Aboriginals, members of visible minority or racialized communities, and people with disabilities are encouraged to apply.
View Citi’s EEO Policy Statement and the Know Your Rights poster.