
The future, by ChatGPT

Will generative AI upend Google, our jobs, and humanity?


March 2023
Table of contents

1. The rise of generative AI


2. Who is OpenAI
3. Impact on major tech companies
4. Impact on the rest of us
5. Conclusion and what’s next
p.s. the report was NOT generated by ChatGPT.

Why are we doing this report?

1. 100m users, faster than any other app

ChatGPT, a generative AI tool developed by OpenAI, accumulated more than 100m users within two months of its launch - faster than the most viral consumer apps (chart: months to reach 100m users). More importantly, many users are actually hooked - using ChatGPT for a variety of personal and professional use cases.

2. A real technological breakthrough

ChatGPT and the other generative AI tools launched last year, as well as the Large Language Models behind them, are real breakthroughs that could potentially trigger another industrial revolution.

3. Disrupting everything from tech giants to our jobs

OpenAI has won this round of the generative AI battle - beating Google, Baidu and other big techs. A refreshed Microsoft has pulled off an impressive coup by backing OpenAI. There is also talk of many jobs facing extinction.

The questions this report addresses:
● What is really behind ChatGPT? And what are it and its peers really capable of?
● How did OpenAI and Microsoft succeed, while other big techs fell behind in this race?
● What is the real impact on us, our jobs and our companies? Should we be worried?

Source: Momentum Works research and insights



1. The rise of generative AI

AI has been evolving, and inspiring movie producers, for close to a century

1943 - First artificial neuron model proposed
1950 - The concept of the Turing Test introduced, for measuring a machine's ability to exhibit intelligent behaviour
1956 - "Artificial Intelligence" coined at the Dartmouth Conference
1966 - First chatbot, ELIZA, developed ("2001: A Space Odyssey" movie*)
1974-1980 - First AI winter**
1980s - Digital Equipment Corporation commercialises XCON, a system that emulates the decisions of human experts ("Bladerunner" and "Terminator" movies*)
1987-1993 - Second AI winter**
1997 - IBM's Deep Blue beats world chess champion Garry Kasparov
2000s - Big data and cloud computing enable more advanced AI applications ("WALL-E" movie*)
2010s - Rapid growth in the field of deep learning, a subset of machine learning that uses artificial neural networks to model and solve complex problems ("Her" movie*)
2011 - IBM's Watson computer competes on a game show and wins
2012 - Paper on ImageNet published, showcasing the potential of deep learning
2014 - Invention of the Generative Adversarial Network (GAN) ("Ex Machina" movie*)
2016 - DeepMind's AlphaGo defeats top Go player Lee Sedol; HBO releases the first season of "Westworld"*
2017 - Google creates a dedicated Google AI division; Alibaba creates DAMO Academy; Baidu goes "all in" on AI
2021 - "Free Guy" movie*

*Movies and shows inspired by AI
**A period of reduced funding and interest in AI
Generative AI has been revitalising AI and tech ecosystems since 2022
Generative AI is a type of AI that uses deep learning models to create new and original content by learning from the large datasets they have been trained on. The content generated includes, but is not limited to, text, images, soundtracks and videos.

Generative AI entered the mainstream in 2022, starting with the public launches of the image generators DALL-E 2, Stable Diffusion and Midjourney. ChatGPT, a Large Language Model (LLM) based chatbot launched on 30 November 2022, shook many things up.

Millions of people are using ChatGPT for a variety of tasks; tech giants (incl. Google, Baidu, Alibaba) are rushing out their own generative AI models/products to avoid being disrupted, as are existing venture-funded AI startups; investors are debating which amongst hundreds of new AI startups to back; and companies in all sectors are trying to understand the impact (or opportunities) for their business model, people and organisation.
p.s. the text above was NOT generated by ChatGPT.

(Example images generated by DALL-E 2, Stable Diffusion and Midjourney from the prompt: "An astronaut riding a cow jumping over the moon")


The current breakthrough is the culmination of a decade of development (and setbacks)

2014 - Generative Adversarial Network (GAN) introduced; Google acquires DeepMind, a British AI research lab
2015 - OpenAI founded as a non-profit; Google releases image generator DeepDream
2016 - Microsoft debuts chatbot Tay, which is shut down within 16 hours because of controversial (racist and inflammatory) tweets
2017 - Google publishes the Transformers paper, bringing scalability into deep-learning based Natural Language Processing
2018 - OpenAI launches language model GPT-1; Google launches language model BERT
2019 - OpenAI launches GPT-2 and OpenAI Five (a Dota 2 bot); Microsoft invests US$1 billion in OpenAI; Meta launches language model RoBERTa; Nvidia launches Megatron-LM, an optimised library for training language models
2020 - OpenAI launches GPT-3; Microsoft acquires an exclusive licence to GPT-3
2021 - OpenAI launches multimodal vision-language model CLIP, image generator DALL-E, and image generator/editor GLIDE; Google launches Large Language Model LaMDA; Microsoft and Nvidia launch MT-NLG; DeepMind launches language model Gopher
2022 - OpenAI launches DALL-E 2 and the GPT-3.5 based chatbot ChatGPT; Google launches Large Language Model PaLM and AI chatbot LaMDA 2; Meta launches OPT-175B; Yandex launches language model YaLM; DeepMind launches language model Chinchilla, generalist AI Gato and vision-language model Flamingo; Stability AI launches Stable Diffusion and raises US$101 million in a seed round; Midjourney launches its image generator
2023 - OpenAI launches GPT-4, accessible through ChatGPT Plus and the API; Google launches the LaMDA-based chatbot Bard
Big techs have built/supported large language models, the very foundation of generative AI

Company | Large Language Models (LLMs) | Products
OpenAI (supported by Microsoft) | GPT-1, 2, 3, 3.5, 4 | ChatGPT, DALL-E 2
Google | BERT, LaMDA, PaLM | Google BARD
Meta | RoBERTa, OPT-175B | Cicero, MultiRay
Baidu | ERNIE | ERNIE Bot

While the concept of deep learning using neural networks has been around for decades, and AI companies have been building and training models for years, Large Language Models (LLMs) comprising billions of parameters only emerged in the last couple of years.

This development is driven mainly by a few selected tech giants which had the vision as well as the resources.

The emergence of LLMs, and the capabilities that products based on LLMs have demonstrated, are fundamentally changing the landscape of AI, and perhaps much more.

Large Language Models (LLMs) are enabled by recent technological advances

1. High performance computing
Advances in high-performance computing, including the use of Graphics Processing Units (GPUs) and cloud computing, have made it possible to train large-scale language models efficiently.

2. Larger & more diverse datasets
The growth of the internet and advances in data storage technology have made it possible to gather and store massive amounts of text data from a wide range of sources, including books, websites, and social media platforms.

3. Better NLP algorithms
Advances in algorithms for Natural Language Processing (NLP), such as attention mechanisms, have enabled LLMs to identify and focus on the relevant parts of the input sequence, leading to more accurate predictions for a given task.

4. Improved architecture
LLMs use advanced neural network architectures, such as transformers, that enable them to process and model complex sequences of text more effectively and efficiently than earlier language models, speeding up the training process.

A combination of technological advancements in recent years, including the ones listed above, has enabled creators to build and train Large Language Models (LLMs). While it is still expensive (in terms of computing power) to train the mainstream LLMs we know of today, the costs are expected to go down as technologies continue to advance, people continue to look for optimisation and the economic benefits start to justify the effort.
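As a rough illustration of the attention mechanism referenced in points 3 and 4 above, below is a minimal sketch of scaled dot-product self-attention, the core building block of the Transformer architecture. The vectors and sizes are purely illustrative; real models use learned projections for queries, keys and values, and many attention heads in parallel.

```python
# A minimal sketch of scaled dot-product self-attention (illustrative values only).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Each token "queries" every other token; high scores mean "pay attention here".
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = softmax(scores)      # which parts of the input sequence are relevant
    return weights @ V             # a weighted mix of the value vectors

np.random.seed(0)
tokens = np.random.randn(4, 8)                 # 4 tokens, each an 8-dimensional vector
output = attention(tokens, tokens, tokens)     # self-attention over the sequence
print(output.shape)                            # (4, 8)
```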

Large Language Models are becoming very large indeed

Small models (<= 100B parameters):
ELMo (94M), GPT-1 (117M), BERT (340M), RoBERTa (354M), Transformer ELMo (465M), GPT-2 (1.5B), Megatron-LM (8.3B), LLaMA (65B), Chinchilla (80B), YaLM (100B), ERNIE (100B)

Large models (>100B parameters):
LaMDA (137B), GPT-3 (175B, the base of ChatGPT), Jurassic-1 (178B), Gopher (280B), MT-NLG (530B), PaLM (540B), PaLM-E (562B), GPT-4 (undisclosed number of parameters)
A Large Language Model is a specific type of neural network

A neural network is made up of layers of interconnected neurons that process information and give a deterministic output. The number of parameters in a model is the total count of all the individual weights and biases in the neural network.

As a result, ChatGPT and other LLM-based tools 'calculate' an output rather than 'search' for one in a database.

Token: input data broken down into smaller parts and turned into numbers that a computer can process.
E.g. how OpenAI tokenises (other tokenisers may differ):
Text: This is tokenizing.
Token IDs: [5661, 318, 11241, 2890, 13]

Neuron: each neuron multiplies its inputs (X1 ... Xn) by weights (Wk1 ... Wkn), sums them together with a bias (bk) to produce a single value y (e.g. y = w*x + b), and passes y through an activation function A(y), such as a sigmoid, that produces a binary outcome: the neuron is either activated or not.
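As a hedged illustration of the two ideas above - tokenisation, and a single neuron's weighted sum plus activation - here is a small Python sketch. It uses the open-source tiktoken library for a GPT-style tokeniser (the exact token IDs depend on the encoding chosen), and all numeric values are made up.

```python
# A minimal sketch (not OpenAI's internal implementation): tokenising text with the
# open-source tiktoken library, then computing one neuron's output y = A(w.x + b).
import numpy as np
import tiktoken  # pip install tiktoken

enc = tiktoken.get_encoding("gpt2")            # a GPT-2/GPT-3 style BPE tokeniser
token_ids = enc.encode("This is tokenizing.")
print(token_ids)                               # exact IDs depend on the encoding used

# One artificial neuron: weighted sum of inputs plus bias, passed through a sigmoid.
x = np.array([0.5, -1.2, 3.0])                 # inputs X1..Xn (illustrative values)
w = np.array([0.8, 0.1, -0.4])                 # weights Wk1..Wkn
b = 0.2                                        # bias bk
y = np.dot(w, x) + b                           # summation function
activation = 1.0 / (1.0 + np.exp(-y))          # sigmoid activation A(y)
print(activation > 0.5)                        # neuron "activated" or not
```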

How to construct and train a Large Language Model

Basic constructs of a model: raw data is cleaned, normalised and split into training, validation and test data sets, then tokenised into tokens and a vocabulary. An untrained model is constructed from an architecture, parameters and algorithms.

Pre-trained model: training the untrained model on the prepared data produces a trained model, with weights, biases and embeddings.

Further training/learning: the pre-trained model can then be adapted to a specific task through in-context learning (zero shot, few shots) or fine-tuning, producing a custom model for the specific task with further optimised parameters (in the case of fine-tuning).
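To make the pipeline above concrete, here is a deliberately tiny, hedged sketch in PyTorch: a toy tokeniser, a toy next-token model, and a training loop. Real LLMs use transformer architectures, vast datasets and distributed compute; every name, size and piece of data here is illustrative.

```python
# A toy end-to-end version of: raw data -> tokenisation -> untrained model -> training.
import torch
import torch.nn as nn

# 1. "Raw data" -> cleaning/splitting -> tokenisation (toy whitespace tokeniser)
corpus = "the cat sat on the mat".split()
vocab = {tok: i for i, tok in enumerate(sorted(set(corpus)))}
ids = torch.tensor([vocab[t] for t in corpus])

# 2. Untrained model: an architecture plus randomly initialised parameters
#    (a real LLM would be a transformer with billions of parameters)
model = nn.Sequential(nn.Embedding(len(vocab), 16), nn.Linear(16, len(vocab)))
optimiser = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# 3. Training: learn to predict each next token from the previous one
for _ in range(200):
    for prev, nxt in zip(ids[:-1], ids[1:]):
        logits = model(prev.unsqueeze(0))        # shape (1, vocab_size)
        loss = loss_fn(logits, nxt.unsqueeze(0))
        optimiser.zero_grad()
        loss.backward()
        optimiser.step()

# 4. The learned weights/biases are the "trained model"; fine-tuning would continue
#    this loop on task-specific data, while in-context learning changes no parameters.
```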

Pre-trained models can be improved by further training - with or without additional input

From left to right, each approach feeds the model better (i.e. more) inputs - and, presumably, gets better outputs.

Zero-shot: using a pre-trained model to perform a task that it has never been specifically trained on.
E.g. you have a computer programme that has been trained to recognise images of birds, horses and dogs, and now you give it an image of a zebra. It won't be able to recognise it, but it will try to make a prediction - say, that it is a horse.

Few-shot: using a small amount of data to perform a task. This approach is useful when there is limited data available for a task.
E.g. using the same programme, you now include a few labelled images of a zebra in the prompt, plus one additional unlabelled image of a zebra; it will be able to tell you that it is a zebra, but its parameters do not change.

Fine-tuning: taking a pre-trained model and training it on a new task with additional data, creating specialised models for specific tasks.
E.g. with the zebra images you have shared with the computer, you change or add certain parameters; now the programme will "know" and "remember" what a zebra is.

In a way, we can draw an analogy between this and human learning.


We learn things about the world by incorporating new information we receive, but also by using our
existing knowledge to form an understanding/opinion about new things.
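As a hedged illustration of the same distinction in the LLM context, here is a minimal sketch contrasting a zero-shot prompt with a few-shot prompt. The ask_llm helper is hypothetical - it stands in for whichever LLM API or library is actually used.

```python
# A minimal sketch contrasting zero-shot and few-shot prompting of an LLM.
# ask_llm() is a hypothetical helper that sends a prompt to whichever model you use.
def ask_llm(prompt: str) -> str:
    raise NotImplementedError("wire this up to your LLM API of choice")

# Zero-shot: the model gets only the task description, no examples.
zero_shot_prompt = (
    "Classify the sentiment of this review as positive or negative:\n"
    "Review: 'Great battery life.'"
)

# Few-shot: a handful of labelled examples are included in the prompt itself;
# the model's parameters are not changed (unlike fine-tuning).
few_shot_prompt = (
    "Review: 'Terrible screen.' Sentiment: negative\n"
    "Review: 'Love the camera.' Sentiment: positive\n"
    "Review: 'Battery died in a day.' Sentiment: negative\n"
    "Review: 'Great battery life.' Sentiment:"
)

# answer = ask_llm(few_shot_prompt)   # fine-tuning would instead retrain on such examples
```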
How to build even better LLMs and other generative AI models

There are 3 major ways to make better LLMs and other generative AI models

1. Increase parameter/token count
Increasing the parameters in a model and providing it with more data can improve the quality and complexity of the output it produces. We don't yet know whether there is a limit beyond which additional input stops generating corresponding improvements in output - maybe GPT-4 is already the limit.

2. Optimise architecture
Different model architectures can be better suited to different types of language tasks. Optimising the structure of a language model can make it more efficient and effective. For example, the Transformer architecture is a popular and effective architecture that led to the breakthroughs from OpenAI and Google.

3. Go multimodal
We humans learn from multiple modes of input - text, images, video and many more. Perhaps a machine is capable of that too? Google and OpenAI attempt to answer that question with PaLM-E and GPT-4, their newest models, capable of "learning" from multimodal input.


2. Who is OpenAI

Who is OpenAI?

Since its founding in 2015, OpenAI has become a leading force behind research and real-world use cases of LLMs and generative AI in general. It has also evolved its own structure significantly, and deepened its relationship with Microsoft, to accelerate its growth while keeping its edge.

2015 - Founding
● OpenAI is founded as a non-profit to ensure that AI benefits all humanity.
● Publishes research papers on deep learning and partners with Microsoft to access its cloud platform (i.e. computing power).

2019 - Transition
● Goes for-profit to attract talent and investment.
● Sam Altman leaves YC to focus on OpenAI.
● Microsoft invests US$1 billion in cash and cloud services.

2020-2021 - The buildup
● Launches GPT-3, an autoregressive language model that produces human-like text.
● Releases DALL-E, which generates realistic art/visuals based on text inputs.
● Releases GitHub Copilot, a tool which is able to autocomplete code.

2022 - Going mainstream
● Launches ChatGPT to the public, gaining 100m users within weeks; ChatGPT's quick popularity garners widespread awareness of generative AI.
● Also launches DALL-E 2, an improved version of DALL-E with better quality and accuracy.

2023 - Leading the AI arms race
● Starts monetising ChatGPT through APIs to business clients.
● Microsoft invests US$10 billion in OpenAI, and launches an AI-powered Edge browser and a revamped Bing search.
● Y Combinator makes a number of investments in AI startups.
● Launches GPT-4, a multimodal model with better reasoning capabilities and safeguards.

The key people who made OpenAI happen
OpenAI, headed by former Y Combinator President Sam Altman, is the epitome of American tech entrepreneurship.

Key people: Sam Altman (CEO of OpenAI, former president of Y Combinator), Greg Brockman (President and Co-founder), Ilya Sutskever (Co-founder and Chief Scientist), Wojciech Zaremba (Co-founder).

Y Combinator is probably the only successful startup accelerator at scale globally, with portfolio companies including Airbnb, Coinbase, DoorDash, Dropbox, Reddit, Stripe and Twitch.

Early backers of OpenAI include key members of the famous "PayPal Mafia", while key executives at the refreshed Microsoft supported the startup strategically and financially:
● Jessica Livingston (Y Combinator)
● Peter Thiel (Palantir, Founders Fund - "PayPal Mafia")
● Reid Hoffman (LinkedIn - "PayPal Mafia")
● Elon Musk (Tesla, SpaceX, Twitter - "PayPal Mafia")
● Satya Nadella - Microsoft CEO (2014-present): shifted resources from Microsoft's internal initiatives to support OpenAI
● Kevin Scott - Microsoft CTO (2017-present): conceived the OpenAI partnership and took charge of Microsoft's AI development strategy in 2020

The role of Microsoft is also significant, which we will discuss in detail in the next section of the report.

The vision and motivation for OpenAI, according to its co-founders

Sam Altman (CEO, OpenAI; former president of Y Combinator): "And so the alignment problem is: how do we build AGI (Artificial General Intelligence) that does what is in the best interest of humanity? How do we make sure that humanity gets to determine the future of humanity? And how do we avoid both accidental misuse, ... and then the inner alignment problems, where what if this thing just becomes a creature that views us as a threat?"

Ilya Sutskever (Co-founder and Chief Scientist, OpenAI): "The first motivation [to start OpenAI] was … the way to make the most progress in AI was by merging science and engineering into a single whole… so there is no distinction or as little distinction as possible between science and engineering. So that all the science is infused with engineering, discipline, and careful execution. And all the engineering is infused with the scientific ideas…"

Greg Brockman (President and Co-founder, OpenAI): "I think what's a real story here in my mind is an amplification of what humans can do. It's kind of like you hire six assistants. They're not perfect. They need to be trained up a little bit; they don't quite know exactly what you want to do always. But they're so eager; they never sleep; they're there to help you. They're willing to do the drudge work, and you get to be the director."

OpenAI has been spearheading AI research/engineering in multiple areas
Open-source releases:
● Whisper: an Automatic Speech Recognition (ASR) model for transcription, translation and voice assistants.
● Jukebox: a music generation model, including singing, across different styles and genres. It can create original music and "complete" songs based on short samples.
● Point-E: generates 3D models from text descriptions, which can be used for 3D printing, manufacturing and prototyping.

Model families:
● Language: GPT-1, GPT-2, GPT-3, GPT-3.5, InstructGPT, ChatGPT, GPT-4
● Code: Codex, GitHub Copilot
● Image: CLIP, DALL-E, DALL-E 2
● Robotics: Dactyl, AI robotics to help solve complex human tasks (e.g. it solved a Rubik's Cube with a robot hand); this line of research was discontinued due to an immature market.
Why OpenAI beat Google, which has a larger LLM, to market

GPT-3, on which ChatGPT is based, is not the largest language model in the world. Google's PaLM is three times larger in parameters (GPT-3 versus Google's Large Language Models: GPT-3 175B; LaMDA 137B; Gopher 280B; PaLM 540B).

ChatGPT, however, is the first LLM-based chatbot that its creators felt confident enough to release to the public. Its trust & safety layer filters out inappropriate/inflammatory output. Google, on the other hand, is probably more cautious because of the reputational damage if such a launch backfires - especially when a majority of its revenue comes from search-related advertising.

Being a well-resourced (backed by Microsoft) but independent company, OpenAI does not have such concerns.

Will ChatGPT definitely dominate? Hard to say. It has, however, created a clear lead over its major competitors, mostly big techs. With data generated from real usage by millions of users daily, ChatGPT is able to evolve fast and stay ahead. And GPT-4 has already launched, before Google could effectively respond.

Also, why would average users use a Google chatbot when they already have ChatGPT? This is not even to mention Microsoft's great enterprise distribution capabilities.

How does (or rather, will) OpenAI make money?
OpenAI is probably amongst the few tech startups with the clearest paths to profitability. Below are three actual and possible ways that OpenAI can monetise ChatGPT and its other models/products.

1. API access
● Target customers: businesses integrating the GPT language models into their own platforms/services/applications via API.
● How it works: priced based on factors such as the number of API calls made, the complexity of the models used, and the amount of data processed.
● Pricing: $0.0004-$0.02 per 1k prompt tokens and $0.03-$0.12 per 1k completion tokens*.

2. Subscription (ChatGPT Plus)
● Target customers: consumers or businesses who use ChatGPT more extensively (or need better support) than the average casual user.
● How it works: customers pay a subscription fee to gain premium access to ChatGPT (availability during high demand, faster responses, and priority access to new features).
● Pricing: $20/month.

3. Customised deployments
● Target customers: large enterprises/public sector organisations with customised needs and security concerns.
● How it works: OpenAI deploys its LLMs and other AI technologies on private infrastructure; the goal is to give the customer more control over data and security.
● Pricing: customised.

The recent slashing of API costs shows that OpenAI understands the importance of making ChatGPT a well-adopted piece of infrastructure. OpenAI also created a US$100m startup fund in 2021 to foster an ecosystem of startups propagating the use of OpenAI models/products.

It is worth noting that, unlike the open-sourced GPT-2, GPT-3 is private. This means that OpenAI has better control over access (through APIs) and monetisation, while the code is protected. The private nature also, in theory, prevents third parties from creating harmful alterations.

*The number of tokens is based on the input; see the earlier tokenisation explanation for how OpenAI counts tokens.
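To make the "API access" model concrete, here is a hedged sketch of how a business might call an OpenAI model from Python, using the pre-1.0 openai client library as it existed at the time of writing; the model name, prompt and key are placeholders, and billing is driven by the token counts reported in the response.

```python
# A hedged sketch of the API-access revenue model: a business calls the model
# programmatically and is billed per 1k prompt and completion tokens.
# Uses the pre-1.0 openai Python library (pip install openai); model name is an example.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder - set via an environment variable in practice

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a customer support assistant."},
        {"role": "user", "content": "What are your delivery options?"},
    ],
)

print(response["choices"][0]["message"]["content"])
print(response["usage"])  # prompt_tokens / completion_tokens, the basis for billing
```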

3. Impact on major tech companies

Big techs are driving, or impacted by, the advent of ChatGPT and generative AI

How do they ensure their companies are future-proof? (Pictured: Satya Nadella, CEO of Microsoft; Sundar Pichai, CEO of Google; Robin Li, CEO of Baidu; Jensen Huang, CEO of Nvidia; Tim Cook, CEO of Apple; Andy Jassy, CEO of Amazon.)

While many AI companies (especially in China) are rushing to build LLMs and generative AI tools, it is fair to say that only the big techs have the resources (including talent, $$$, compute power, and data) to do a proper job (imagine OpenAI without Microsoft's support).

Big tech companies are complex organisations. Their leadership, determination, and people/organisational issues often determine the success (or failure) of new initiatives, particularly when the innovator's dilemma is present.

Perhaps the one single lesson for big tech is: when there is an emerging technology that will disrupt (AI, cloud and mobile, but maybe not Web3), align the organisation to embrace it without hesitation, while continuing to enhance the current core offerings.

As technology evolution cycles become shorter and shorter, making the right bet(s) consistently (and executing relentlessly) becomes a top skill for big tech leaders. A lesson for everyone.
Microsoft is a clear winner whose refreshing bet on OpenAI is paying off, so far
Revenue breakdown (FY 2022): Server products and cloud services 34%, Office products and services 23%, Windows 12%, Gaming 8%, LinkedIn 7%, Search ads 6%, Enterprise services 4%, Devices 4%, Other 3%

Microsoft has gone through a cultural transformation since current CEO Satya Nadella took over in 2014. The company, which had resisted cloud and open source, refreshed its mission around enhancing productivity. It fully embraced cloud, mobility and open source. And generative AI fits this mission perfectly.

Microsoft's full support of OpenAI (even at the cost of sacrificing some of its own initiatives) and its enterprise distribution capabilities are beginning to pay off, big time.

Impact on products:
● Personal users - Bing & Edge: GPT-4 has already been integrated into the revamped Bing search and Edge browser, boosting productivity via improved search and content generation.
● Personal and business users - Microsoft 365: Microsoft 365 (formerly Office 365) becomes a Copilot tool, offering Copilot capabilities for business users to create, summarise, analyse, etc. These will continue to be enhanced and integrated with Teams, Power Platform etc.
● Business users: automation of manual tasks, predictive analytics, and improved decision-making via AI.
● Developers - GitHub Copilot: a developer tool which uses OpenAI's Codex to suggest new code lines, functions and complex algorithms from existing code.
● Foundation - Microsoft Azure: the Azure OpenAI Service, essentially a "one-stop shop" providing businesses with easy access to OpenAI's advanced AI capabilities, cloud infrastructure, and tools for large-scale AI model building and deployment.

Google is worried, and it should be
Revenue breakdown (FY 2022): Google Search & other 57.9%, Google Network 11.7%, Google other 10.4%, YouTube ads 10.3%, Google Cloud 9.4%, Others 0.4% (Search, YouTube and Network together make up Google's advertising revenue)

Google correctly identified AI as integral to its mission - organising the world's information and making it accessible. After announcing an "AI-first" strategy in 2017, it dedicated huge resources to AI development. It invented the Transformer neural network, used by many Large Language Models including GPT. Google's own LLMs are also the largest and most sophisticated - at least before GPT-4.

However, Google fell behind OpenAI when ChatGPT was launched to the public. There are many leadership, people and organisational reasons behind this missed opportunity. It will be interesting to see how Google tries to regain lost ground.

Impact on products:
● Tensor Processing Unit (TPU): developed and deployed by Google, TPUs speed up machine learning workloads by up to 30x in products like Photos and Gmail. Edge TPUs can be integrated into Pixel phones, enabling on-device access to AI tools without internet usage.
● Google Cloud Platform: TPUs are accessible through Cloud Platform, which enables efficient development and deployment of machine learning applications for businesses, increasing access to AI for information and problem-solving.
● Bard: using Google's LaMDA LLM, Bard answers questions conversationally, aligning with Google's mission to organise information and make it accessible. Bard enhances Search to remain competitive with Bing and ChatGPT, offering more precise keyword/topic suggestions and providing new information to improve decision-making.
Amazon risks losing the cloud race, while Apple is biding its time

Amazon revenue breakdown (FY 2022): Online stores 42.8%, 3rd party seller services 22.9%, AWS 15.6%, Ads 7.3%, Subscriptions 6.9%, Physical stores 3.7%, Others 0.8%

Apple revenue breakdown (FY 2022): iPhone 52.1%, Services 19.8%, Wearables & Accessories 10.5%, Mac 10.2%, iPad 7.4%

Cloud (Amazon)

Amazon has been investing in generative AI since 2017, developing platforms for hosting and model training. However, it is quite clear that AWS is behind Google, and now even more so Microsoft, in supporting AI. AWS's advantage in cloud usage for startups might erode if AI becomes a large portion of the future cloud business.

AWS's agreements with Stability AI and Hugging Face to provide the platform and hardware for training large open source models might gain it a place in the race, though it is hard to say whether that will be significant.

As for the ecommerce and content businesses, generative AI will be a huge productivity boost (just imagine product descriptions and customer service in different languages).

Terminals (Apple)

Apple has been providing AI services through its partnership with Stable Diffusion, integrating CoreML models into iOS apps for developers. However, it doesn't have model-building capabilities at high workloads - developers needing better capabilities have to look elsewhere.

Due to these limitations, Apple seems to have chosen not to compete with other companies on model building. Rather, it focuses on owning (and enhancing) the consumer access points through its devices.

In the future, nimble generative AI models may be deployed directly on terminal devices - such as Stable Diffusion image generation capabilities that are much better than current camera filters. Apple can partner with others to offer such capabilities and benefit from them.

We all want Siri and Alexa to be smarter, don't we?


Nvidia is back on top, even without crypto mining, but geopolitics will be tricky

Revenue breakdown: Data centre 56%, Gaming 33%, Professional visualisation 6%, Auto 3%, OEM & others 2%

Nvidia is a key player in the generative AI ecosystem, as most LLM training and other tasks are performed on Nvidia hardware (GPUs). Nvidia also takes a platform approach, where its CUDA and cuDNN software (running on its own GPUs) accelerates the deep learning process.

Hardware:
● V100: produced in 2017. Notable AI models trained on it include GPT-3, RoBERTa and DALL-E.
● A100: produced in 2020. Notable AI models trained on it include Stable Diffusion, LLaMA, OPT-175B, BLOOM, Megatron-LM, Megatron-Turing NLG, etc.
● H100: the newest, supposedly available Q1 2023. Performs at least 9x better than the A100 on most AI tasks.
● The US Department of Commerce banned the export of these chips to China in September 2022, citing security concerns.

Software:
● Compute Unified Device Architecture (CUDA): a parallel computing platform and programming model developed by Nvidia, allowing parallel computation on Nvidia GPUs - essentially a way to "talk" to the GPU and get it to do what you want.
● CUDA Deep Neural Network library (cuDNN): a GPU-accelerated library of primitives for deep neural networks - optimised "lego blocks" which deep learning frameworks such as PyTorch and TensorFlow can use to build machine learning pipelines.
● This software allows DNN developers to focus on building their applications/AI models instead of having to write custom code for their GPUs, saving time and effort.

The concern over GPU demand after the crypto crash in 2022 has given way to optimism as the AI arms race takes shape. It is estimated that Nvidia has approximately 80% market share of AI processors.

Involuted Chinese tech giants join the arms race - they have to
Revenue breakdown (FY 2022):
Alibaba - Customer management (i.e. marketing/advertising) 37%, Direct sales and other 30% (together forming China commerce retail), Other business lines 34%
Baidu - Online marketing 56%, Non-online marketing 21% (together forming Baidu Core), iQIYI 23%

While China has developed very advanced AI capabilities in many areas (e.g. public safety and content recommendation), the launch of ChatGPT has made many realise they are actually behind at the most cutting edge.

In true Chinese fashion, players from big techs to existing (and new) AI startups have jumped into the competition to create "China's answer to ChatGPT". The common (not necessarily correct) understanding is that the big techs, especially Baidu and Alibaba, are leading this race.

Advertising is a significant revenue stream for both Baidu and Alibaba, which have no choice but to develop their own ChatGPT-like capabilities or risk falling behind the competition.

Baidu has been "all-in" on AI since 2017, hiring former Microsoft EVP (and leading AI architect) Qi Lu as COO. Lu left a year later, but founder Robin Li carried on - building autonomous driving and the ERNIE language model (100b parameters). As you can see, the journey is still far from smooth (chart: Baidu share price around the ERNIE Bot press conference on 16 March 2023 - anticipation of the event, during the event, and the morning after).

The sense of urgency is real, under the current geopolitical environment between China and the US, where the latter controls both the leading AI models and AI chips.
Through the POP-Leadership lens

The AI race amongst big techs offers a very interesting (and current) case study. How did OpenAI pull this off? How did Google miss the window despite a strong early start and strong resources? How did Microsoft stage an impressive, almost lightning strike? Is Amazon also missing the boat with its cloud service?

Leadership: Sam Altman's vision and leadership - but more importantly his focus (he gave up his YC role) - are driving forces behind OpenAI. More impressively, Satya Nadella managed to steer giant Microsoft to fully embrace an adjacent technology, even sacrificing many of its own internal initiatives.

People: the people in key positions, including Kevin Scott of Microsoft, as well as Greg Brockman and Ilya Sutskever of OpenAI, proved vital to the success of this initiative. It is worth noting that two OpenAI co-founders used to work at Google.

Organisation: does Google's dependence on its search business, and its avoidance of political headaches, create organisational hesitation/resistance to a full embrace? How were Microsoft's leaders able to move resources to OpenAI without strong internal resistance?

Product: a focused product is often better than a larger, more powerful product. The same can be said about LLMs and LLM-based tools.

POP-Leadership is a strategy framework created by Guoli Chen, professor of strategy at INSEAD, and Jianggan Li, CEO of Momentum Works. More case studies are featured in our book "Seeing the unseen - behind Chinese tech giants' global venturing", which analyses the experiences, challenges and lessons learnt by Chinese tech companies. The book is available on Amazon (also as an audiobook).

Source: Industry practitioner interviews; Momentum Works research & insights

4. Impact on the rest of us

Knowledge workers are becoming dependent on ChatGPT

A poll of the Momentum Works community on what they use ChatGPT for (pie chart - use cases include coding & debugging, business research, companionship, writing assistance, personal research and others, with no single use case dominating; shares ranged from about 3% to 28%).

Many people around us (ourselves included) have been using ChatGPT regularly - for a variety of tasks. See the results of our poll above.

However, the fascinating thing here is: we do not yet know the full extent of the tasks ChatGPT can perform well. More capabilities will certainly emerge.

OpenAI has put forward a disclaimer/warning: "ChatGPT sometimes writes plausible-sounding but incorrect or nonsensical answers."

However, humans often do the same, and they do not put up such a disclaimer. Savvy knowledge workers and managers have already learnt how to navigate this, and fact-check, through their dealings with fellow human intelligence.

Not to mention that generative AI learns and improves faster than many of us.

Jobs might be replaced - should we be worried?

There is a lot of chatter about what jobs might be at risk because of ChatGPT and generative AI. The current wisdom is that creative work - such as copywriting and graphic design - will be the most impacted. They already are.

It will not end there, however. It is important to know that we (even the creators of ChatGPT) do not yet know the full extent of generative AI's capabilities. For both creative work and iterative tasks, AI will gradually be able to handle more intricate and complex tasks reliably, disrupting jobs that were previously deemed safe.

The speed might differ, but the trajectory will probably be the same.

The question "should we be worried?" is probably best replaced with "so what?". See the next page for more details.

Each productivity revolution kills old jobs, but unleashes new ones

Through each of the past four industrial revolutions, the rapid advent of technology disrupted not only jobs but whole industries. Nonetheless, each time, jobs (and industries) previously not thought of were created, and humanity benefited from the leap in productivity.

● Industrial revolution (1760-1840), powered by steam: subsistence farmers lost out to factories; cottage textile artisans to mills; sail boats to steamships.
● The 2nd industrial revolution (1870-1914), powered by electrification: rapid scientific discovery, mass production, the assembly line, the telephone, and finally airplanes; emergence of corporates.
● Digital revolution (1969-2000), powered by automation: automation, digitisation, electronic devices etc.; manual, repetitive jobs lost out to IT jobs; emergence of MNCs.
● Industry 4.0 (2008-2020), powered by interconnectivity: digitalisation, IoT, AR and VR, cyber security, cloud, mobility, blockchain and machine learning; further automation of jobs.
● The 5th revolution? (present), powered by Artificial Intelligence: everything that is described in this report (and more).

Equipped with better education and more information than our predecessors, we are more likely to take charge of our own fate in the current wave.
So… do you want to be supercharged
by the generative AI revolution,
or be rendered obsolete?


5. Conclusion and what’s next

Conclusion - the generative AI arms race is on, which will impact all of us
Generative AI, which uses deep learning models to create new and original content by learning from the large datasets they have been trained on, entered the mainstream in 2022: first the image generators DALL-E 2, Midjourney and Stable Diffusion, and finally the Large Language Model (LLM) based chatbot ChatGPT. Within weeks, ChatGPT accumulated more than 100m users, many using it regularly for (business/personal) research, writing assistance, coding and other functions (even companionship). Discussions about major productivity gains and potential job losses abound.

Behind the seemingly sudden leap of ChatGPT were years of development in computing power, data availability, natural language processing algorithms and model architecture. The confluence of these factors made it possible to structure, train and fine-tune LLMs with hundreds of billions of parameters.

Google, which announced an "AI first" strategy in 2017, had a head start, inventing the Transformer neural network and training the largest LLMs. However, OpenAI, a startup founded and backed by well-known names of Silicon Valley, won this round by bringing ChatGPT - the first good enough (and safe enough) LLM-based interface - to the public. The integration of research and engineering, especially the focus on the latter, and the status of an independent outfit with the resources of a big tech (i.e. Microsoft), are some of OpenAI's key success factors.

Microsoft, which backed OpenAI with $$$ and resources, is seeing the fruits in multiple ways: integration with its productivity products, hosting on Azure cloud, etc. How the Microsoft-OpenAI partnership beat Google and many others in this race is a very relevant case study of leadership, people and organisation.

The generative AI space is still evolving fast. In March 2023, OpenAI launched GPT-4 into ChatGPT, allowing multimodal inputs and demonstrating strong improvements in capabilities. Google's newly launched Bard and Baidu's ERNIE Bot are behind, but the arms race is on. In 20 years, we will be in a very different world.

Perspectives - what’s next and how do we stay relevant?
Amid the gloom of inflation, rising interest rates and war, the tech sector needs good news to excite investors and the
public. ChatGPT’s launch ignited a new hope, excitement and a purpose for many practitioners of AI and tech in general.

The productivity gain is evident, and accelerating - i.e. the AI-powered new industrial revolution is probably already here.
In each of the previous industrial revolutions, the increase in productivity replaced jobs and industries, but as a result new
(and more) industries and functions were created, and more importantly, the whole of humanity lives a better life.

The same can be expected now. Many jobs and even whole industries will be impacted - it is important that each of us find
ways to be supercharged, rather than rendered obsolete, by this breakthrough. In contrast to the earlier industrial
revolutions, we now have the tools and information to make the right choice(s).

Will generative AI become a fad, as Web3 seems to have become? We doubt it - our conclusion on Web3 last year was that, with less than 1% of the population regularly interacting with Web3, wide adoption might take five years, a decade or even longer. In contrast, adoption of ChatGPT and related products is real and happening without explicit economic incentives - starting with Microsoft, big techs are embedding generative AI capabilities in products used by billions of people. The resulting actual usage data will prompt generative AI to evolve, continuously and probably at breakneck speed.

In his book "Hit Refresh", which summarises Microsoft's cultural transformation, CEO Satya Nadella identified three future trends that Microsoft was betting on: mixed reality, artificial intelligence and quantum computing. Each takes years, if not decades, of development before a breakthrough good enough to propel further improvements in leaps and bounds.

These three trends, and other major technological breakthroughs, will eventually converge to elevate humanity to levels not
known today. During the process, legal, governance, privacy and equality issues will emerge, and require different
stakeholders to work together to fix.
About Momentum Works

A Singapore-headquartered venture outfit, Momentum Works builds, scales and manages tech ventures across the emerging world.

We also leverage our insights, community and experience to inform, connect and enable the tech/new economy ecosystem.

Find more reports from Momentum Works at: insights.momentum.asia

Subscribe to TheLowDown (TLD) to get our updates: thelowdown.momentum.asia

Pre-order "Seeing the unseen - behind Chinese tech giants' global venturing" now on Amazon!

Further insights & reports from Momentum Works

Report categories: macro & investment, ecommerce & food delivery, fintech & digital banks, company anatomy, case studies & keynotes.

And many more at: insights.momentum.asia
For advisory/customised insights: [email protected]

Talks, simulations, workshops and customised programmes: informing, engaging and empowering teams and organisations - leveraging current insights and community to deliver the most practical/relevant learnings through a real/immersive experience.

Find more at: momentum.academy

Insights | Community | Experience

Inform, Connect and Enable tech and the new economy in emerging markets

[email protected]

© Momentum Works
