Transforming Finance With
Cutting - Edge Generative AI
Terminology
• Generative AI – creates new content (text, images, videos, etc.) from data sources (PDF, XLS, integrations with systems such as ERP, etc.)
• Large Language Models (LLMs) – GenAI models that understand & generate text like humans (e.g., ChatGPT, Bard, Llama)
• Prompt – "instructions" users give to an LLM (written in plain English)
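In practice, sending a prompt to an LLM is a single chat message or API call. A minimal sketch, assuming the OpenAI Python SDK and an API key in the environment; the model name below is illustrative, not a recommendation:

```python
# Minimal sketch: a plain-English prompt sent to an LLM.
# Assumes the `openai` Python SDK and an OPENAI_API_KEY in the environment;
# the model name is an assumption for illustration.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[
        {"role": "user",
         "content": "Explain the difference between gross margin and operating margin in two sentences."}
    ],
)
print(response.choices[0].message.content)
```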
What does GenAI OFFER for FINANCE?
• Generative AI has the potential to transform Finance.
• Many organizations are launching ‘sandbox’ LLMs for internal use. Your company may have one already.
• These tools will be part of your everyday work sooner than you think.
Areas of Application
• Automated report generation:
• GenAI can generate financial reports, summaries, and analyses quickly and accurately, reducing
manual effort and increasing efficiency.
• Enhanced customer support:
• AI-driven chatbots can handle customer inquiries, provide financial advice, and offer personalized
recommendations, improving customer satisfaction and engagement.
• Fraud detection and risk management:
• GenAI models can analyze transaction patterns to detect fraudulent activities and assess risks in real-
time, enhancing security measures.
• Market analysis and forecasting:
• GenAI can process vast amounts of data to predict market trends, provide investment insights, and
support strategic decision-making processes.
• Personalized financial planning:
• AI can offer customized financial advice and planning based on individual user data, helping clients
meet their financial goals efficiently.
LLMs Use in Finance
• Natural language understanding and generation:
• LLMs can comprehend and generate human-like text, which is essential for tasks like drafting documents, creating content, and handling customer interactions.
• Enhanced customer support:
• They power chatbots and virtual assistants that can handle customer inquiries, provide support, and
deliver personalized assistance, improving engagement and satisfaction.
• Data analysis and insights:
• LLMs can analyze large volumes of unstructured text data to extract meaningful insights, trends, and
patterns, aiding in decision-making processes in fields like finance and healthcare.
• Language translation:
• These models can translate text between different languages, facilitating better communication in a
globalized world and supporting multilingual environments.
• Automation of routine tasks:
• LLMs assist in automating routine and repetitive tasks such as report generation, email responses, and
summarizing large amounts of text, thereby increasing productivity and efficiency.
Prompt vs. Prompting
• A prompt is a cue or stimulus that initiates or guides an
action, response, or thought process.
• An AI prompt establishes a communicative channel
between a user and a sophisticated language model,
guiding the model to produce a specific type of output.
This interface might manifest as queries, textual entries,
coding excerpts, or illustrative cases.
Prompt vs. Prompting
• Prompting is the process of designing and refining everyday-language text or voice prompts for use with Large Language Models.
• Prompt engineering is designing the inputs you enter into AI models to get the desired output quickly and accurately. In other words, learning prompt engineering is learning to communicate with LLMs. Although communicating with AI is not like it is in science fiction movies, it is still a fun process.
ChatGPT Response
WHY IS PROMPTING IMPORTANT?
Prompt Engineering: A Profession or an Essential
Skill?
• Specialized Knowledge and Expertise – Profession
• Requires deep understanding of AI, NLP, and user interaction to design effective AI prompts
• Industry Demand and Opportunities – Profession
• Growing demand for experts in optimizing AI interactions for various applications, making it a
viable career path.
• Broad Applicability Across Roles - Essential Skill
• Valuable for product managers, educators, researchers, and developers to interact with and
leverage AI effectively.
• Continuous Learning and Adaptation - Profession and Skill
• Demands continuous learning and adaptation to stay updated with evolving AI technologies
and techniques.
• Enhances Productivity and Efficiency - Essential Skill
• Crafting effective prompts can significantly improve productivity and efficiency in tasks
involving AI interactions.
Effective Prompting
• Be clear and specific
• Make prompts unambiguous, specifying format, tone, and desired output
• Instead of: "Provide actuals to forecast variance"
• Try: "Summarize actuals-to-forecast expense variance for Q1 2023 for the North America region, including which expense lines drove the greatest variance and why"
• Provide context
• Guide the LLM with helpful details
• Instead of: "Create a forecast simulation that assumes sales decrease and costs increase"
• Try: "Model a forecast simulation where US sales fall by 5% and my cost of sales increases by 10%; what is the P&L impact?"
• Consider phrasing
• Avoid open-ended questions
• Instead of: "Which active customers are the riskiest?"
• Try: "Looking at historical on-time payment details, which five customers with open account balances are most likely not to pay?"
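One way to internalize these guidelines is to send both the vague and the specific version of a prompt and compare the answers. A rough sketch, assuming the OpenAI Python SDK; the model name and the CSV snippet are placeholders:

```python
# Sketch: comparing a vague prompt with a clear, specific one.
# Assumes the `openai` SDK; the model name and CSV data are illustrative only.
from openai import OpenAI

client = OpenAI()

variance_csv = """region,expense_line,actual,forecast
North America,Travel,120000,95000
North America,Software,80000,82000"""  # made-up figures for illustration

vague = "Provide actuals to forecast variance."
specific = (
    "Summarize actuals-to-forecast expense variance for Q1 2023 for the North America region, "
    "including which expense lines drove the greatest variance and why.\n\n"
    f"Data (CSV):\n{variance_csv}"
)

for label, prompt in [("vague", vague), ("specific", specific)]:
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {label} ---\n{reply.choices[0].message.content}\n")
```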
Things to Consider When Prompting
Prompt Categories
Conclusion
• https://www2.deloitte.com/us/en/pages/consulting/articl
es/prompt-engineering-for-finance.html
• https://textcortex.com/post/what-is-prompt-engineering
• https://www.promptingguide.ai/
GenAI for FINANCE
Six strategies for getting better results
Write clear instructions:
• Include details in your query to get more relevant answers
• Ask the model to adopt a persona
• Use delimiters to clearly indicate distinct parts of the input
• Specify the steps required to complete a task
• Provide examples
• Specify the desired length of the output
Worse vs. Better prompts
• Worse: How do I add numbers in Excel?
• Better: How do I add up a row of dollar amounts in Excel? I want to do this automatically for a whole sheet of rows with all the totals ending up on the right in a column called "Total".
• Worse: Who’s president?
• Better: Who was the president of Mexico in 2021, and how frequently are elections held?
• Worse: Write code to calculate the Fibonacci sequence.
• Better: Write a TypeScript function to efficiently calculate the Fibonacci sequence. Comment the code liberally to explain what each piece does and why it's written that way.
• Worse: Summarize the meeting notes.
• Better: Summarize the meeting notes in a single paragraph. Then write a markdown list of the speakers and each of their key points. Finally, list the next steps or action items suggested by the speakers, if any.
SYSTEM
When I ask for help to write something, you will
reply with a document that contains at least one
joke or playful comment in every paragraph.
USER
Write a thank you note to my steel bolt vendor for
getting the delivery in on time and in short notice.
This made it possible for us to deliver an important
order.
USER
Summarize the text delimited by triple quotes with a haiku. """insert text here"""
SYSTEM
You will be provided with a pair of articles (delimited with XML tags) about the same topic. First summarize the arguments of each article. Then indicate which of them makes a better argument and explain why.
USER
<article> insert first article here </article>
<article> insert second article here </article>
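The SYSTEM and USER labels above map directly onto message roles in a chat-style API. A sketch of the article-comparison example, assuming the OpenAI Python SDK; the article placeholders from the slide are left as-is and the model name is assumed:

```python
# Sketch: the SYSTEM/USER article-comparison example as an API call.
# Assumes the `openai` SDK; placeholders are kept verbatim from the slide.
from openai import OpenAI

client = OpenAI()

system_msg = (
    "You will be provided with a pair of articles (delimited with XML tags) about the same topic. "
    "First summarize the arguments of each article. Then indicate which of them makes a better "
    "argument and explain why."
)
user_msg = (
    "<article> insert first article here </article>\n"
    "<article> insert second article here </article>"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[
        {"role": "system", "content": system_msg},
        {"role": "user", "content": user_msg},
    ],
)
print(response.choices[0].message.content)
```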
Provide reference text
• Instruct the model to answer using a reference text
• Instruct the model to answer with citations from a
reference text
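A sketch of both ideas together, assuming the OpenAI Python SDK; the policy excerpt, section numbers, and model name are illustrative:

```python
# Sketch: answer only from a supplied reference text, citing the relevant section.
# Assumes the `openai` SDK; the reference passage is a made-up policy excerpt.
from openai import OpenAI

client = OpenAI()

reference_text = (
    "Section 4.2: Employee travel must be booked through the approved portal. "
    "Section 4.3: Meal reimbursements are capped at 75 USD per day."
)

system_msg = (
    "Answer using only the provided reference text, delimited by triple quotes. "
    "Cite the section number for every claim. If the answer is not in the text, say so."
)
user_msg = f'"""{reference_text}"""\n\nQuestion: What is the daily meal reimbursement cap?'

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[
        {"role": "system", "content": system_msg},
        {"role": "user", "content": user_msg},
    ],
)
print(response.choices[0].message.content)
```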
Split complex tasks into simpler subtasks
• Use intent classification to identify the most relevant
instructions for a user query
• For dialogue applications that require very long
conversations, summarize or filter previous dialogue
• Summarize long documents piecewise and construct a full
summary recursively
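For the piecewise-summary tactic, a rough sketch assuming the OpenAI Python SDK; the character-based chunking, helper names, and model name are simplifications for illustration:

```python
# Sketch: summarize a long document chunk by chunk, then summarize the summaries.
# Assumes the `openai` SDK; chunking by character count is a deliberate simplification.
from openai import OpenAI

client = OpenAI()

def summarize(text: str) -> str:
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{"role": "user",
                   "content": f"Summarize the following text in 3 bullet points:\n\n{text}"}],
    )
    return reply.choices[0].message.content

def summarize_long_document(document: str, chunk_size: int = 8000) -> str:
    # Split into chunks, summarize each, then build a full summary from the partial ones.
    chunks = [document[i:i + chunk_size] for i in range(0, len(document), chunk_size)]
    partial_summaries = [summarize(chunk) for chunk in chunks]
    return summarize("\n\n".join(partial_summaries))
```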
Give the model time to "think"
• Instruct the model to work out its own solution before
rushing to a conclusion
• Use inner monologue or a sequence of queries to hide the
model's reasoning process
• Ask the model if it missed anything on previous passes
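A sketch of the first tactic (have the model work out its own solution before judging a submitted one), assuming the OpenAI Python SDK; the variance figures are made up for illustration:

```python
# Sketch: instruct the model to compute its own answer before evaluating the analyst's.
# Assumes the `openai` SDK; figures are illustrative only.
from openai import OpenAI

client = OpenAI()

system_msg = (
    "First work out your own calculation of the variance, step by step. "
    "Do not decide whether the analyst's figure is correct until you have finished "
    "your own calculation. Then compare the two and state whether the analyst is right."
)
user_msg = (
    "Actuals: 120,000. Forecast: 95,000. "
    "The analyst reports a variance of 15,000 unfavorable. Is that correct?"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[
        {"role": "system", "content": system_msg},
        {"role": "user", "content": user_msg},
    ],
)
print(response.choices[0].message.content)
```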
Use external tools
• Use embeddings-based search to implement efficient
knowledge retrieval
• Use code execution to perform more accurate calculations
or call external APIs
• Give the model access to specific functions
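A sketch of embeddings-based retrieval over a handful of policy snippets, assuming the OpenAI Python SDK and numpy; the embedding model name and the documents are illustrative:

```python
# Sketch: embeddings-based search to retrieve the most relevant snippet for a query.
# Assumes the `openai` SDK and numpy; model name and documents are assumptions.
import numpy as np
from openai import OpenAI

client = OpenAI()

documents = [
    "Travel expenses above 5,000 USD require VP approval.",
    "Quarterly forecasts are due on the 15th of the last month of the quarter.",
    "Customer credit limits are reviewed annually by the risk team.",
]

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)  # assumed model name
    return np.array([item.embedding for item in resp.data])

doc_vectors = embed(documents)
query_vector = embed(["Who approves large travel expenses?"])[0]

# Cosine similarity between the query and each document; highest score wins.
scores = doc_vectors @ query_vector / (
    np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(query_vector)
)
print(documents[int(np.argmax(scores))])
```

The retrieved snippet can then be passed to the model as reference text, combining this strategy with the "provide reference text" tactic above.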
Test changes systematically
• Evaluate model outputs with reference to gold-standard
answers
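A sketch of a tiny evaluation loop against gold-standard answers, assuming the OpenAI Python SDK; the eval set and the substring-based grading rule are deliberately simplistic:

```python
# Sketch: score a prompt variant against gold-standard answers.
# Assumes the `openai` SDK; eval cases and grading rule are illustrative.
from openai import OpenAI

client = OpenAI()

eval_set = [
    {"question": "What does DSO stand for?", "gold": "days sales outstanding"},
    {"question": "What does EBITDA stand for?",
     "gold": "earnings before interest, taxes, depreciation, and amortization"},
]

def run_eval(system_prompt: str) -> float:
    correct = 0
    for case in eval_set:
        reply = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": case["question"]},
            ],
        )
        answer = reply.choices[0].message.content.lower()
        # Naive grading: does the answer contain the gold-standard phrase?
        correct += case["gold"] in answer
    return correct / len(eval_set)

print(run_eval("Answer finance questions concisely."))
```

In practice, larger eval sets and more robust grading (e.g., model-based comparison) give more reliable signals when comparing prompt variants.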