GIC is a custodian of Singapore's Reserves and a sovereign wealth fund established to secure Singapore's financial future. GIC is entrusted with a critical mission that impacts the lives of all Singaporeans, and we are looking for the best talent to help us fulfil our commitment to the future of Singapore and its people.
Investment Insights Group (IIG)
The Investment Insights Group (IIG) uses advanced quantitative techniques and cutting-edge technological tools to generate insights that drive outstanding investment outcomes for GIC.
We are a multi-disciplinary team of Quantitative Researchers and Software Developers who work closely with each of our investment departments – from Public Markets such as Equities and Fixed Income, to Private Markets such as Private Equity and Real Estate – to drive superior returns.
Our team of software developers, whom we call “Alpha Technologists”, develops platforms for investment teams and quantitative researchers, allowing GIC to harness our differentiated quantitative methods at scale. They also drive synergy across departments to enhance cross-asset investment capabilities at GIC.
- Be part of the team that designs, develops and maintains our next-generation event-based quantitative modelling platform for quantitative research and model productionisation
- Devise and implement best-in-class DevOps practices to support rapid research and model-productionisation demands with the necessary control and governance
- Handle production issues relating to our end-to-end quant operations
- Be a core contributor to, and gatekeeper of, our quantitative libraries written in R and Python for scientific modelling, evaluation, backtesting and reporting purposes
- Implement coding best practices in R and Python for the quant strategies
Join us if you have/are
- A Bachelor's or Master's degree in Engineering, Applied Mathematics or Computer Science, with exposure to financial markets or scientific modelling
- Proficiency in R and Python, with experience building libraries that support a group of researchers for scientific modelling, evaluation and reporting purposes
- Experience in designing and maintaining data-processing systems for both batch and streaming workloads (e.g. Apache Spark, Apache Beam, Apache Kafka, Apache Airflow)
- Familiarity with different database, object-store and caching technologies (e.g. MSSQL, Snowflake, Neo4j, S3, NoSQL, Redis)
- Familiarity with cloud services, containerisation and serverless technologies, including but not limited to AWS, Docker and Kubernetes
- Familiarity with DevOps tools (e.g. Jenkins, GitOps) and/or MLOps tools (e.g. AWS SageMaker, MLflow)