The Future by ChatGPT
Why are we doing this report?
ChatGPT reached 100 million users faster than any other app (chart: months to reach 100 million users).
AI has been evolving, and inspiring movie producers, for close to a century
- 1950: The concept of the Turing Test is introduced for measuring a machine's ability to exhibit intelligent behaviour.
- 1966: ELIZA, the first chatbot, is developed.
- 1980s: Digital Equipment Corporation commercialises XCON, a system that emulates the decisions of human experts; a period of reduced funding and interest in AI follows.
- 2012: Paper on ImageNet is published, showcasing the potential of deep learning.
- 2014: The Generative Adversarial Network (GAN) is invented.
- 2016: DeepMind's AlphaGo defeats top Go player Lee Sedol.
- 2017: Baidu goes "all in" on AI; Google creates a dedicated Google AI division.
- 2010s onwards: rapid growth in the field of deep learning, a subset of machine learning that uses artificial neural networks to model and solve complex problems.
Movies inspired by AI along the way include "2001: A Space Odyssey", "Terminator" and "WALL-E".
Generative AI is revitalising AI and tech ecosystems since 2022
Generative AI is a type of AI that uses deep learning models to create new and original content by learning from the large datasets they have been trained on. The content generated includes, but is not limited to, text, images, soundtracks and videos.
Generative AI entered the mainstream in 2022, starting with the public launches of image generators DALL-E 2, Stable Diffusion and Midjourney. ChatGPT, a Large Language Model (LLM) based chatbot launched on 30 November 2022, shook many things up.
Millions of people are using ChatGPT for a variety of tasks; tech giants (incl. Google, Baidu, Alibaba) are rushing out their own generative AI models/products to avoid being disrupted, as are existing venture-funded AI startups; investors are debating which of the hundreds of new AI startups to back; while companies in all sectors are trying to understand the impact (or opportunities) on their business model, people and organisation.
p.s. the text above was NOT generated by ChatGPT.
Key milestones, 2014-2023:
- 2014: Google acquires DeepMind, a British AI research lab; the Generative Adversarial Network (GAN) is introduced.
- 2015: Google releases image generator DeepDream; OpenAI is founded as a non-profit.
- 2016: Microsoft debuts chatbot Tay, which is shut down within 16 hours because of controversial (racist and inflammatory) tweets.
- 2017: Google publishes the Transformers deep-learning paper, bringing scalability into Natural Language Processing.
- 2018: Google launches language model BERT; OpenAI launches language model GPT-1 and OpenAI Five (a Dota 2 bot).
- 2019: OpenAI launches GPT-2; Microsoft invests $1 billion in OpenAI; Meta launches language model RoBERTa; Nvidia launches Megatron-LM, an optimised library for training language models.
- 2020: OpenAI launches GPT-3; Microsoft acquires exclusive licensing of GPT-3.
- 2021: OpenAI launches multimodal vision-language model CLIP, image generator DALL-E, and image generator/editor GLIDE; Google launches Large Language Model LaMDA; DeepMind launches Gopher; Microsoft and Nvidia launch MT-NLG.
- 2022: OpenAI launches DALL-E 2 and the GPT-3.5 based chatbot ChatGPT; Google launches LaMDA 2 and Large Language Model PaLM; DeepMind launches Chinchilla, generalist AI Gato and vision-language model Flamingo; Meta launches OPT-175B; Yandex launches language model YaLM; Midjourney launches its image generator; Stable Diffusion launches and Stability AI raises US$101 million in a seed round.
- 2023: Google launches the LaMDA-based chatbot Bard; OpenAI launches GPT-4, accessible through ChatGPT Plus and the API.
Big techs have built/supported large language models, the very foundation of generative AI
The emergence of LLMs, and the capabilities that products based on LLMs have demonstrated, are fundamentally changing the landscape of AI, and perhaps much more.
(Selected examples: Meta's RoBERTa, OPT-175B, Cicero and MultiRay; Baidu's ERNIE and ERNIE Bot.)
Large Language Models (LLMs) are enabled by recent technological advances
1. High-performance computing: advances in high-performance computing, including the use of Graphics Processing Units (GPUs) and cloud computing, have made it possible to train large-scale language models efficiently.
2. Larger & more diverse datasets: the growth of the internet and advances in data storage technology have made it possible to gather and store massive amounts of text data from a wide range of sources, including books, websites, and social media platforms.
3. Better NLP algorithms: advances in algorithms for Natural Language Processing (NLP), such as attention mechanisms, have enabled LLMs to identify and focus on relevant parts of the input sequence, leading to more accurate predictions for a given task (a minimal sketch of attention follows below).
4. Improved architecture: LLMs use advanced neural network architectures, such as transformers, that enable them to process and model complex sequences of text more effectively and efficiently than earlier language models, speeding up the training process.
A combination of technological advancements in recent years, including the ones listed above, has enabled creators to build and train Large Language Models (LLMs). While it is still expensive (in terms of computing power) to train the mainstream LLMs we know of today, the costs are expected to go down as technologies continue to advance, people continue to look for optimisation, and the economic benefits start to justify that effort.
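To make the attention mechanism mentioned above concrete, below is a minimal, illustrative sketch of scaled dot-product attention, the core operation inside transformer architectures. The dimensions and random values are arbitrary and not taken from any model mentioned in this report.

```python
# A minimal, illustrative sketch of scaled dot-product attention; all sizes and
# values are arbitrary assumptions for demonstration purposes only.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Each output position is a weighted mix of all value vectors,
    with weights derived from query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of every query to every key
    weights = softmax(scores, axis=-1)  # attention weights sum to 1 per query
    return weights @ V, weights

# Toy "sentence" of 4 tokens, each embedded in 8 dimensions.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
output, attn = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(attn.round(2))  # 4x4 matrix: how much each token attends to every other token
```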
Large Language Models are becoming very large indeed
Model (parameters): ELMo (94M), GPT-1 (117M), BERT (340M), RoBERTa (354M), Transformer ELMo (465M), GPT-2 (1.5B), Megatron-LM (8.3B), LLaMA (65B), Chinchilla (80B), YaLM (100B), ERNIE (100B), LaMDA (137B), GPT-3 (175B), Jurassic-1 (178B), Gopher (280B), MT-NLG (530B), PaLM (540B), PaLM-E (562B), GPT-4 (undisclosed).
The GPT series forms the base of ChatGPT; OpenAI has not disclosed the number of parameters in GPT-4.
A Large Language Model is a specific type of neural network
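As a rough, illustrative stand-in for the diagram this statement refers to, the toy sketch below shows the basic ingredients of such a network: token embeddings, layers of weights and biases, and scores over a vocabulary for the next token. All sizes and values here are arbitrary assumptions, not taken from any real model.

```python
# A toy neural network mapping a token to next-token scores; everything here is
# an arbitrary illustration, not a real language model.
import numpy as np

vocab_size, embed_dim, hidden_dim = 50, 16, 32
rng = np.random.default_rng(1)

embeddings = rng.normal(size=(vocab_size, embed_dim))            # one vector per token
W1, b1 = rng.normal(size=(embed_dim, hidden_dim)), np.zeros(hidden_dim)
W2, b2 = rng.normal(size=(hidden_dim, vocab_size)), np.zeros(vocab_size)

def next_token_scores(token_id: int) -> np.ndarray:
    h = np.tanh(embeddings[token_id] @ W1 + b1)  # hidden layer
    return h @ W2 + b2                           # a score for every token in the vocabulary

scores = next_token_scores(7)
probs = np.exp(scores - scores.max())
probs /= probs.sum()
print(int(probs.argmax()))  # the untrained network's (essentially random) next-token guess
```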
How to construct and train a Large Language Model
1. Data preparation: raw data is cleaned, normalised and split into training, validation and test data sets, then tokenised into tokens against a vocabulary.
2. Construction: the untrained model is defined by its architecture, parameters and algorithms.
3. Training: the untrained model is trained on the prepared data, yielding a pre-trained model with weights, biases and embeddings.
4. Adaptation: the pre-trained model becomes a custom model for a specific task through in-context learning (zero shot, few shots) or fine-tuning, which produces further optimised parameters*.
*in the case of fine-tuning
(A toy end-to-end sketch of the data-preparation steps follows below.)
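Below is a toy, end-to-end sketch of the data-preparation steps above (cleaning, splitting, and tokenisation against a vocabulary). Real LLM pipelines use subword tokenisers and web-scale corpora; everything here is invented purely for illustration.

```python
# Toy data-preparation pipeline: clean -> split -> build vocabulary -> tokenise.
import random
import re

raw_corpus = [
    "Generative AI creates new content.",
    "Large language models are trained on text.",
    "Tokens map words to numbers!",
]

def clean(text):
    """Normalise: lowercase and keep only word tokens."""
    return re.findall(r"[a-z]+", text.lower())

documents = [clean(doc) for doc in raw_corpus]

# Split into training, validation and test sets (trivially small here).
random.seed(0)
random.shuffle(documents)
train, validation, test = documents[:-2], documents[-2:-1], documents[-1:]

# Build a vocabulary from the training set and tokenise into integer ids.
vocab = {word: idx for idx, word in enumerate(sorted({w for d in train for w in d}))}
token_ids = [[vocab[w] for w in doc] for doc in train]

print(vocab)
print(token_ids)  # these id sequences are what an untrained model would be fitted on
```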
Pre-trained models can be improved by further training - with or without additional input
There are 3 major ways to make better LLMs and other generative AI models:
1. Zero-shot: e.g. you have a computer programme that has been trained to recognise images of birds, horses and dogs, and you now give it an image of a zebra. It won't be able to recognise it, but it will try to make a prediction - for instance, that it is a horse.
2. Few-shot: e.g. using the same programme, you now also include a few labelled images of a zebra in the prompt, plus one additional unlabelled image of a zebra. It will be able to tell you that it is a zebra, but no parameters are changed.
3. Fine-tuning: e.g. with the images of zebras that you have shared with the computer, you change or add certain parameters, and now the programme will "know" and "remember" what a zebra is.
(A prompt-level sketch of the zero-shot vs few-shot distinction follows below.)
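The same distinction can be expressed as text prompts, which is how it typically appears with LLMs. The prompts below are invented for illustration; in both the zero-shot and few-shot cases the model's parameters stay unchanged, whereas fine-tuning would update them.

```python
# Zero-shot vs few-shot prompting, the text analogue of the zebra example above.
zero_shot_prompt = (
    "Classify the sentiment of this review: 'The battery died after a day.'"
)

# Few-shot: the same task, but with a few labelled examples included in the prompt.
few_shot_prompt = """Classify the sentiment of each review.
Review: 'Arrived quickly and works perfectly.' -> positive
Review: 'Broke within a week.' -> negative
Review: 'The battery died after a day.' ->"""

print(zero_shot_prompt)
print(few_shot_prompt)
```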
Momentum Works 2023. All rights reserved. The material
contained in this document is the exclusive property of
Momentum Works. Any reliance on such material is made at the
users’ own risk.
2. Who is OpenAI?
Who is OpenAI?
(Original slide: ChatGPT and OpenAI's early backers.)
The vision and motivation for OpenAI, according to its co-founders
"And so the alignment problem is: how do we build AGI (Artificial General Intelligence) that does what is in the best interest of humanity? How do we make sure that humanity gets to determine the future of humanity? And how do we avoid both accidental misuse, ... and then the inner alignment problems, where what if this thing just becomes a creature that views us as a threat?"
"The first motivation [to start OpenAI] was … the way to make the most progress in AI was by merging science and engineering into a single whole… so there is no distinction or as little distinction as possible between science and engineering. So that all the science is infused with engineering, discipline, and careful execution. And all the engineering is infused with the scientific ideas…"
"I think what's a real story here in my mind is an amplification of what humans can do. It's kind of like you hire six assistants. They're not perfect. They need to be trained up a little bit; they don't quite know exactly what you want to do always. But they're so eager; they never sleep; they're there to help you. They're willing to do the drudge work, and you get to be the director."
OpenAI has been spearheading AI research/engineering in multiple areas
Open-source models:
- Whisper: Automatic Speech Recognition (ASR) model for transcription, translation and voice assistants.
- Jukebox: music generation model, including singing, across different styles and genres. It can create original music and "complete" songs based on short samples.
- Point-E: generates 3D models from text descriptions, which can be used for 3D printing, manufacturing and prototyping.
Other models: GPT-4, CLIP, DALL-E and DALL-E 2.
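As an illustration of how the open-source Whisper model can be used, here is a minimal sketch with the openai-whisper Python package; the audio file name is a hypothetical placeholder.

```python
# Minimal sketch of transcription with OpenAI's open-source Whisper package
# (pip install openai-whisper); "meeting.mp3" is a hypothetical placeholder file.
import whisper

model = whisper.load_model("base")        # download and load the base ASR model
result = model.transcribe("meeting.mp3")  # returns a dict with the transcript and segments
print(result["text"])
```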
How does (or rather, will) OpenAI make money?
OpenAI is probably amongst the few tech startups with the clearest paths to profitability. Below are three actual and possible ways OpenAI can monetise ChatGPT and its other models/products.
1. Pay-per-use API access
Target customers: businesses integrating API access to the GPT language models into their own platforms/services/applications.
How it works: pricing is based on factors such as the number of API calls made, the complexity of the models used, and the amount of data processed*.
2. Premium subscription (ChatGPT Plus)
Target customers: consumers or businesses who use ChatGPT more extensively (or need better support) compared to the average casual user.
How it works: customers pay a subscription fee to gain premium access to ChatGPT (availability during high demand, faster responses, and priority access to new features).
3. Private/dedicated deployment
Target customers: large enterprises/public sector organisations with customised needs and security concerns.
How it works: OpenAI deploys its LLM and other AI technologies on private infrastructure; the goal is to give customers more control over data and security.
(Successful examples of each path are shown on the original slide.)
The recent slashing of API costs shows that OpenAI understands the importance of making ChatGPT a well-adopted piece of infrastructure. OpenAI also created a US$100m startup fund in 2021 to foster an ecosystem of startups propagating the use of OpenAI models/products.
It is worth noting that, unlike the open-sourced GPT-2, GPT-3 is proprietary. This means that OpenAI has better control over access (through APIs) and monetisation, while the code is protected. The proprietary nature also, in theory, prevents third parties from creating harmful alterations.
*The number of tokens is based on the input; see the earlier section on constructing and training an LLM for how tokens are calculated.
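As an illustration of the pay-per-use API path described above, here is a hedged sketch using the openai Python package as it existed in early 2023 (pre-1.0 interface); the API key is a placeholder, and billing is driven by the token counts reported in the usage field.

```python
# Sketch of a pay-per-use API call with the openai package (pre-1.0 interface).
import openai

openai.api_key = "sk-..."  # placeholder; use your own key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarise generative AI in one sentence."}],
)
print(response["choices"][0]["message"]["content"])
print(response["usage"])  # prompt and completion token counts that determine the cost
```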
Big techs are driving, or impacted by, the advent of ChatGPT and generative AI
How do they ensure their companies are future-proof? While many AI companies (especially in China) are rushing to build LLMs and generative AI tools, it is fair to say that only the big techs have the resources (including talent, money, compute power and data) to do a proper job (imagine OpenAI without Microsoft's support).
Perhaps the single lesson for big tech is: when there is an emerging technology that will disrupt (AI now, cloud and mobile before it, but maybe not Web3), align the organisation to embrace it without hesitation, while continuing to enhance the current core offerings.
Microsoft's generative AI push spans its product portfolio:
- Personal users - Bing & Edge: the revamped Bing and Edge browser become a copilot that boosts productivity via improved search and content generation.
- Personal and business users - Microsoft 365: GPT-4 has already been integrated with Microsoft 365 (formerly Office 365), offering Copilot capabilities for business users to create, summarise, analyse, etc. These will continue to be enhanced and integrated with Teams, Power Platform etc.
- Business users - business applications: automation of manual tasks, predictive analytics, and improved decision-making via AI.
- Developers - GitHub Copilot: a developer tool which uses OpenAI's Codex to suggest new code lines, functions and complex algorithms based on existing code.
- Foundation - Microsoft Azure: the Azure OpenAI service, essentially a "one-stop shop" providing businesses with easy access to OpenAI's advanced AI capabilities, cloud infrastructure, and tools for large-scale AI model building and deployment (a usage sketch follows below).
The company, which had resisted cloud and open source, refreshed its mission around enhancing productivity: it fully embraced cloud, mobility and open source, and generative AI fits this mission perfectly. Microsoft's full support of OpenAI (even at the cost of some of its own initiatives) and its enterprise distribution capabilities are beginning to pay off, big time.
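As a sketch of how a business might call a model through the Azure OpenAI service mentioned above (again with the pre-1.0 openai package), note that the endpoint, API version and deployment name below are placeholders/assumptions, not values from this report.

```python
# Sketch of an Azure OpenAI call; endpoint, API version and deployment name are
# placeholders/assumptions for illustration only.
import openai

openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE.openai.azure.com/"  # assumed endpoint format
openai.api_version = "2023-03-15-preview"                     # assumed API version
openai.api_key = "..."                                        # Azure OpenAI key

response = openai.ChatCompletion.create(
    engine="my-gpt35-deployment",  # the Azure deployment name, not the raw model name
    messages=[{"role": "user", "content": "Draft a short status update for my team."}],
)
print(response["choices"][0]["message"]["content"])
```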
Google is worried, and it should be
Revenue breakdown (FY 2022): Google Search & other 57.9%, YouTube Ads 10.4%, Google Network 11.7%, Google others 10.3%, Google Cloud 9.4%, Others 0.4% (the first three constitute advertising revenue).
Impact on products:
- Tensor Processing Unit (TPU): developed and deployed by Google, TPUs speed up machine learning workloads by up to 30x in products like Photos and Gmail. Edge TPUs can be integrated into Pixel phones, enabling on-device access to AI tools without internet usage.
- Google Cloud: accessible through Google Cloud Platform, it enables efficient development and deployment of machine learning applications for business, increasing access to AI for information and problem-solving.
- Bard: Google's answer to remain competitive with Bing and ChatGPT. Using Google's LaMDA LLM, Bard answers questions conversationally, aligning with Google's mission to organise information and make it accessible. It enhances Search with more precise keyword/topic suggestions, providing new information to improve decision-making.
Google correctly identified AI as integral to its mission - organising the world's information and making it accessible. After announcing an "AI-first" strategy in 2017, it dedicated huge resources to AI development. It invented the Transformer neural network, used by many Large Language Models including GPT. Google's own LLMs were also the largest and most sophisticated - at least before GPT-4.
However, Google fell behind OpenAI when ChatGPT was launched to the public. There are many leadership, people and organisational reasons behind this missed opportunity. It will be interesting to see how Google tries to regain lost ground.
Amazon risks losing the cloud race, while Apple is biding its time
Cloud (Amazon): Amazon has been investing in generative AI since 2017, developing platforms for hosting and model training. However, it is quite clear that AWS is behind Google, and now even more so Microsoft, in supporting AI. AWS's advantage in cloud usage for startups might erode if AI becomes a large portion of the future cloud business. AWS's agreements with Stability AI and Hugging Face to provide the platform and hardware for training large open-source models might gain it a place in the race, though it is hard to say whether that will be significant. As for the ecommerce and content businesses, generative AI will be a huge productivity boost (just imagine product descriptions and customer service in different languages).
Terminals (Apple): Apple has been supporting AI services such as Stable Diffusion through the integration of Core ML models into iOS apps for developers. However, it does not have model-building capabilities at high workloads - developers needing better capabilities have to look elsewhere. Given these limitations, Apple seems to have chosen not to compete with other companies on model building; rather, it focuses on owning (and enhancing) the consumer access points through its devices. In the future, nimble generative AI models may be deployed directly on terminal devices - such as Stable Diffusion image generation capabilities that are much better than current camera filters. Apple can partner with others to offer such capabilities and benefit from them.
Involuted Chinese tech giants join the arms race - they have to
Revenue breakdown (FY 2022):
- Alibaba: customer management (i.e. marketing/advertising) 37% and direct sales and other 30% (together China commerce retail), other business lines 34%.
- Baidu: online marketing 56%, non-online marketing 21%, iQIYI 23%.
While China has developed very advanced AI capabilities in many areas (e.g. public safety and content recommendation), the launch of ChatGPT has made many realise they are actually behind at the most cutting edge.
The AI race amongst big techs offers a very interesting (and current) case study. How did OpenAI pull this off? How did Google miss the window despite a strong early start and strong resources? How did Microsoft stage an impressive, almost lightning, strike? Is Amazon also missing the boat for its cloud service?
Leadership: Sam Altman's vision and leadership - but more importantly his focus (he gave up his YC role) - are driving forces behind OpenAI. More impressively, Satya Nadella managed to steer giant Microsoft to fully embrace an adjacent technology, even sacrificing many of its own internal initiatives.
People: the people in key positions, including Kevin Scott of Microsoft, as well as Greg Brockman and Ilya Sutskever of OpenAI, proved vital to the success of this initiative. It is worth noting that two OpenAI co-founders used to work at Google.
Product: a focused product is often better than a larger, more powerful product. The same can be said about LLMs and LLM-based tools.
POP-Leadership is a strategy framework created by Guoli
Chen, professor of strategy at INSEAD, and Jianggan Li, CEO
of Momentum Works.
More case studies are featured in our book “Seeing the
unseen - behind Chinese tech giants’ global venturing”,
which analyzes experiences, challenges and lessons learnt
by Chinese tech companies. The book is available on
Amazon (also as audiobook).
Source: Industry practitioner interviews; Momentum Works research & insights
Knowledge workers are becoming dependent on ChatGPT
A poll of the Momentum Works community on what they use ChatGPT for (results shown on the original slide). Many people around us (ourselves included) have been using ChatGPT regularly, for a variety of tasks.
Jobs might be replaced - should we be worried?
Each productivity revolution kills old jobs, but unleashes new ones
Through each of the past four industrial revolutions, the rapid advent of technology disrupted not only jobs but whole industries. Nonetheless, each time, jobs (and industries) previously not thought of were created, and humanity benefited from the leap in productivity.
Conclusion - the generative AI arms race is on, which will impact all of us
Generative AI, which uses deep learning models to create new and original content by learning from the large datasets they have been trained on, entered the mainstream in 2022: first image generators DALL-E 2, Midjourney and Stable Diffusion, and finally the Large Language Model (LLM) based chatbot ChatGPT. Within two months, ChatGPT accumulated more than 100m users, many using it regularly for (business/personal) research, writing assistance, coding and other functions (even companionship). Discussions about major productivity gains and potential job losses abound.
Behind the seemingly sudden leap of ChatGPT were years of development in computing power, data availability, natural language processing algorithms and model architecture. The confluence of these factors made it possible to structure, train and fine-tune LLMs with hundreds of billions of parameters.
Google, which announced an "AI-first" strategy in 2017, had a head start, inventing the Transformer neural network and training the largest LLMs. However, OpenAI, a startup founded and backed by well-known names in Silicon Valley, won this round by bringing ChatGPT - the first good enough (and safe enough) LLM-based interface - to the public. The integration of research and engineering, especially the focus on the latter, and the status of an independent outfit with the resources of a big tech (i.e. Microsoft's), are some of OpenAI's key success factors.
Microsoft, which backed OpenAI with money and resources, is reaping the fruits in multiple ways: integration with its productivity products, hosting on Azure cloud, etc. How the Microsoft-OpenAI partnership beat Google and many others in this race is a very relevant case study of leadership, people and organisation.
The generative AI space is still evolving fast. In March 2023 OpenAI launched GPT-4 into ChatGPT, allowing multimodal inputs and demonstrating strong improvements in capabilities. Google's newly launched Bard and Baidu's ERNIE Bot are behind, but the arms race is on. In 20 years, we will be in a very different world.
Perspectives - what’s next and how do we stay relevant?
Amid the gloom of inflation, rising interest rates and war, the tech sector needs good news to excite investors and the
public. ChatGPT's launch ignited new hope, excitement and purpose for many practitioners of AI and tech in general.
The productivity gain is evident, and accelerating - i.e. the AI-powered new industrial revolution is probably already here.
In each of the previous industrial revolutions, the increase in productivity replaced jobs and industries, but as a result new (and more) industries and functions were created and, more importantly, humanity as a whole came to live a better life.
The same can be expected now. Many jobs and even whole industries will be impacted - it is important that each of us find
ways to be supercharged, rather than rendered obsolete, by this breakthrough. In contrast to the earlier industrial
revolutions, we now have the tools and information to make the right choice(s).
Will generative AI become a fad, as Web3 seems to have become? We doubt it - our conclusion on Web3 last year was: since less than 1% of the population regularly interacts with Web3, wide adoption might take five years, a decade or even longer. In contrast, adoption of ChatGPT and related products is real and without explicit economic incentives -
starting with Microsoft, big techs are embedding generative AI capabilities in their products used by billions of people.
The resulting actual usage data will prompt generative AI to evolve, continuously and probably at breakneck speed.
In his book "Hit Refresh", which summarises Microsoft's cultural transformation, CEO Satya Nadella identified three future trends that Microsoft was betting on: mixed reality, artificial intelligence and quantum computing. Each takes years, if not decades, of development before a breakthrough that is good enough to propel further improvements in leaps and bounds.
These three trends, and other major technological breakthroughs, will eventually converge to elevate humanity to levels not
known today. During the process, legal, governance, privacy and equality issues will emerge, and require different
stakeholders to work together to fix.
About Momentum Works
A Singapore-headquartered venture outfit, Momentum Works builds, scales and manages tech ventures across the emerging world.
We also leverage our Insights, community and experience to inform, connect and enable the tech/new economy ecosystem.
thelowdown.momentum.asia
Insights | Community | Ventures | Experience
Pre-order "Seeing the unseen - behind Chinese tech giants' global venturing" now on Amazon!
Further insights & reports from Momentum Works
Macro & investment | Ecommerce & food delivery | Fintech & digital banks | Company anatomy | Case studies & keynotes | Talks | Simulations
© Momentum Works