Generative AI (GenAI) tools such as ChatGPT are rapidly reshaping the way actuaries approach problem-solving, analysis, and communication. Beyond their familiar web-based interfaces, these tools offer powerful functionality through programmatic integration, allowing users to access and interact with the underlying Large Language Models (LLMs) directly via APIs (Application Programming Interfaces). Unlike standard web-based interactions, API access enables actuaries to integrate GenAI seamlessly into their existing workflows and to scale usage efficiently to larger volumes of data and requests. In this two-hour web session, participants will see live demonstrations of advanced GenAI concepts through a Jupyter notebook that presents each concept and applies it to actuarial use cases. The notebook will be shared with attendees to encourage experimentation and support adoption in their own workflows.
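To give a flavour of such programmatic access, the following is a minimal sketch of an API call to an LLM, assuming the OpenAI Python SDK and an illustrative model name; the session's notebook may use a different provider or configuration.

```python
# Minimal sketch of calling an LLM through an API, assuming the OpenAI
# Python SDK (pip install openai) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name; substitute your own
    messages=[
        {"role": "system", "content": "You are an actuarial assistant."},
        {"role": "user", "content": "Explain IBNR reserves in two sentences."},
    ],
)
print(response.choices[0].message.content)
```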
After introducing the basics of using LLMs through APIs, we will explore the following advanced GenAI concepts, each illustrated with a brief code sketch after the list:
- Structured Outputs: Generating responses in structured formats like JSON to support easier and more reliable downstream processing.
- Function Calling: Enabling LLMs to execute predefined functions, such as calculations or database queries, to perform specific operations.
- Fine-Tuning: Customizing pretrained LLMs with domain-specific data to improve accuracy and relevance in generating responses.
- Retrieval-Augmented Generation (RAG): Combining LLMs with external data sources to produce contextually enriched outputs.
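A minimal sketch of structured outputs, again assuming the OpenAI Python SDK and its JSON response format; the keys and prompts are purely illustrative.

```python
# Sketch of requesting structured (JSON) output: the json_object response
# format constrains the reply to valid JSON, which can be parsed downstream.
import json
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    response_format={"type": "json_object"},
    messages=[
        {"role": "system",
         "content": "Return a JSON object with keys 'risk_factor' and 'rationale'."},
        {"role": "user",
         "content": "Summarise the main mortality risk driver for a 65-year-old annuitant."},
    ],
)
result = json.loads(response.choices[0].message.content)  # reliable to parse
print(result["risk_factor"])
```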
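A sketch of function calling, in which the model requests a predefined function and the host code executes it; the present_value helper and its tool schema are hypothetical examples, not part of the session material.

```python
# Sketch of function calling: the model is given a tool schema and decides
# when to request it; our own code then executes the function with the
# arguments the model supplies.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "present_value",  # hypothetical helper defined below
        "description": "Compute the present value of a single cash flow.",
        "parameters": {
            "type": "object",
            "properties": {
                "cash_flow": {"type": "number"},
                "rate": {"type": "number"},
                "years": {"type": "number"},
            },
            "required": ["cash_flow", "rate", "years"],
        },
    },
}]

def present_value(cash_flow: float, rate: float, years: float) -> float:
    return cash_flow / (1 + rate) ** years

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user",
               "content": "What is the present value of 1000 in 10 years at 3%?"}],
    tools=tools,
)

call = response.choices[0].message.tool_calls[0]  # the model's requested call
args = json.loads(call.function.arguments)
print(present_value(**args))                      # executed by the host code
```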
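A sketch of the data-preparation step for fine-tuning, using the JSONL chat format accepted by several providers; the training examples are illustrative, and the actual fine-tuning job would then be submitted through the provider's fine-tuning endpoint.

```python
# Sketch of preparing domain-specific training data for fine-tuning:
# curated prompt/response pairs are written as one JSON object per line.
import json

examples = [
    {"messages": [
        {"role": "system", "content": "You are an actuarial assistant."},
        {"role": "user", "content": "Define the combined ratio."},
        {"role": "assistant",
         "content": "The combined ratio is the loss ratio plus the expense ratio; "
                    "a value below 100% indicates an underwriting profit."},
    ]},
    # ... further curated question/answer pairs from internal documentation
]

with open("actuarial_finetune.jsonl", "w") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```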
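A sketch of retrieval-augmented generation that embeds a question, retrieves the most similar passage from a small in-memory store, and passes it to the model as context; the passages and model names are illustrative, and in practice a vector database and a larger document set would take their place.

```python
# Sketch of RAG: embed the question and candidate passages, pick the
# best-matching passage by cosine similarity, and include it as context.
import numpy as np
from openai import OpenAI

client = OpenAI()

passages = [
    "The 2023 reserving memo sets the discount rate assumption at 3.2%.",
    "Lapse rates for the term portfolio were revised upward in Q2 2024.",
]

def embed(text: str) -> np.ndarray:
    emb = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(emb.data[0].embedding)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

question = "What discount rate does the latest reserving memo assume?"
q_vec = embed(question)
passage_vecs = [embed(p) for p in passages]
context = passages[int(np.argmax([cosine(q_vec, v) for v in passage_vecs]))]

answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user",
               "content": f"Context: {context}\n\nQuestion: {question}"}],
)
print(answer.choices[0].message.content)
```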
For each concept, the session will cover its purpose, underlying principles, and functionality, illustrated through a dedicated actuarial use case and complemented by further applications and resources. It will conclude with an outlook on emerging developments, followed by an open Q&A and discussion.