Qualifications and Experience Criteria:
Bachelor's degree in Computer Science or a related field of study
5+ years of professional programming experience
Advanced Python and strong SQL database knowledge required
Working knowledge of NoSQL databases such as MongoDB or Cassandra would be useful but not essential
Experience using Pandas/NumPy would be an advantage
Intermediate knowledge of C# and basic knowledge of other object-oriented languages such as C++ or Java would be useful
Experience with data manipulation, ETL, data analysis, and time series would be a major advantage
Working knowledge of cloud-based technologies such as Azure and AWS
Excellent communication skills and ability to interact with a wide range of stakeholders
Experience with fixed income products and supporting a rates trading desk would be an advantage but not a must; candidates with a background outside of finance will also be considered
Experience with market data, pricing, and risk concepts would be an advantage; familiarity with market data providers such as Bloomberg, Refinitiv, Reuters, or FactSet would be helpful
Database development experience is essential: identifying business and technical requirements and designing, developing, and implementing database systems
Fast learner with the ability to adapt quickly, think laterally and work in a dynamic environment
Responsibilities:
Participate in full life-cycle software development and write clean, tested, and well-documented code
Contribute ideas for new features and identify areas for improvement proactively
Be responsible for database development, architecture and infrastructure including data ingestion, processing and internal distribution, as well as analytical platforms
Work directly with key stakeholders to integrate and optimize research data processes and models
Design and develop cloud-native solutions in C#/Python/SQL to promote consistent use of the firm-wide data platform across numerous analytical tools
Build robust data pipelines from many sources of varying data quality, supporting research demands, ad-hoc client analytics requests, and core production applications
Drive a technology strategy with increased automation to reduce risk and enhance data quality and associated controls
Partner with the Data Science team to create next-generation risk models, including machine-learning-driven platforms and code sets
Permanent