Facebook Applications

Facebook uses a variety of programming languages, frameworks, databases, and other technologies to build its platforms and applications. It primarily uses PHP for server-side logic, running on the HipHop Virtual Machine (HHVM). For data storage, it uses a combination of MySQL and Cassandra databases. Front-end technologies include React and GraphQL. Big data processing uses Hadoop and Hive, while infrastructure relies on technologies from the Open Compute Project.


list of frameworks and applications used to build facebook?

Facebook is built on a large number of technologies and frameworks; here are a few
examples:

Programming languages: Facebook primarily uses PHP for server-side logic, with some
Java and C++ also used for certain components.

Runtime: Facebook uses the HipHop Virtual Machine (HHVM), a just-in-time compiler and
runtime for PHP that was developed by Facebook.

Database: Facebook uses a combination of MySQL and a custom-built NoSQL database
called Cassandra for data storage.

Front-end: Facebook uses a combination of React, a JavaScript library for building
user interfaces, and GraphQL, a query language for APIs.
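
To make the GraphQL part concrete, here is a minimal sketch of a client-side query in Python; the endpoint URL and the field names (user, name, friends) are made up for illustration and are not Facebook's actual schema.

import json
import requests  # third-party HTTP client

GRAPHQL_URL = "https://example.com/graphql"  # hypothetical endpoint

# A GraphQL query asks for exactly the fields the UI needs, in one request.
query = """
query ProfileCard($id: ID!) {
  user(id: $id) {
    name
    friends(first: 3) {
      name
    }
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": query, "variables": {"id": "1234"}},
    timeout=10,
)
print(json.dumps(response.json(), indent=2))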

Big Data: Facebook uses Hadoop and Hive for storage and processing of large data
sets.

Distributed systems: Facebook uses a number of technologies such as Memcached, a
distributed memory caching system, and Scribe, a logging system, to handle the
scale of its infrastructure.
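
To show how a distributed cache like Memcached typically sits in front of a database, here is a minimal cache-aside sketch in Python; it assumes a memcached server on localhost and uses the pymemcache client, with query_database as a placeholder for a real MySQL lookup.

from pymemcache.client.base import Client

cache = Client(("localhost", 11211))  # assumes memcached running locally

def query_database(user_id):
    # Placeholder for a real MySQL query.
    return f"profile-data-for-{user_id}"

def get_profile(user_id):
    key = f"profile:{user_id}"
    cached = cache.get(key)                  # 1. try the cache first
    if cached is not None:
        return cached.decode("utf-8")
    value = query_database(user_id)          # 2. on a miss, read from the database
    cache.set(key, value.encode("utf-8"), expire=300)  # 3. fill the cache for later reads
    return value

print(get_profile(42))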

Infrastructure: Facebook uses a number of technologies such as the Open Compute Project
(OCP) for its hardware and Open Switch for its network switches.

Search: Facebook uses Elasticsearch for full-text search and indexing.
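
For a sense of what full-text search looks like in practice, here is a small sketch that sends a match query to Elasticsearch's REST search API; the host, index name (posts), and field name (text) are assumptions for illustration.

import requests

ES_URL = "http://localhost:9200/posts/_search"  # assumed local cluster and index

# A "match" query performs analyzed full-text matching on the given field.
body = {"query": {"match": {"text": "holiday photos"}}}

resp = requests.post(ES_URL, json=body, timeout=10)
for hit in resp.json().get("hits", {}).get("hits", []):
    print(hit["_score"], hit["_source"].get("text"))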

what applications were used to build ChatGPT and GPT-3?

ChatGPT and GPT-3 were developed using a variety of machine learning and natural
language processing techniques. Some of the key technologies and applications used
in their development include:

Neural networks: Both ChatGPT and GPT-3 are based on neural networks, a type of
machine learning model loosely inspired by the way the human brain processes
information.

Transformer architecture: The transformer is a type of neural network architecture
introduced in a 2017 paper by Google researchers ("Attention Is All You Need"). It is
particularly well suited to natural language processing tasks and is the
foundation of ChatGPT and GPT-3.
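
To illustrate the core computation inside a transformer, here is a small NumPy sketch of scaled dot-product self-attention over a toy sequence; the dimensions and random projection matrices are arbitrary stand-ins, and real models add multiple heads, residual connections, and learned weights.

import numpy as np

rng = np.random.default_rng(0)

seq_len, d_model = 4, 8                  # toy sequence of 4 tokens, 8-dim embeddings
x = rng.normal(size=(seq_len, d_model))

# Learned projections in a real model; random here for illustration.
W_q = rng.normal(size=(d_model, d_model))
W_k = rng.normal(size=(d_model, d_model))
W_v = rng.normal(size=(d_model, d_model))

Q, K, V = x @ W_q, x @ W_k, x @ W_v

# Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
scores = Q @ K.T / np.sqrt(d_model)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
output = weights @ V

print(weights.round(2))   # each row sums to 1: how much each token attends to the others
print(output.shape)       # (4, 8): one context-mixed vector per token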

Pre-training: Both ChatGPT and GPT-3 were pre-trained on large datasets of text
before being fine-tuned on specific tasks. Pre-training allows the model to learn
general language representations that fine-tuning can then specialize.

Language modeling: Language modeling is the task of predicting the next word in a
sequence of words. Both ChatGPT and GPT-3 were trained to perform language modeling
as a pre-training task.
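
As a toy illustration of the language-modeling objective, here is a tiny bigram model built from word counts; models like GPT-3 learn this kind of next-word distribution with a transformer trained on billions of tokens rather than a count table.

from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the cat ate the fish .".split()

# Count how often each word follows each preceding word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_distribution(prev):
    counts = following[prev]
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

print(next_word_distribution("the"))   # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}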

Fine-tuning: After pre-training, the models were fine-tuned on specific tasks such
as text completion, question answering, and language translation using smaller
datasets.
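
OpenAI's actual fine-tuning pipeline is not public, but the general idea can be sketched with the Hugging Face transformers library, using GPT-2 as a stand-in pre-trained model and a couple of made-up question-answer examples; the point is simply that training continues on task-specific text with a small learning rate.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")   # pre-trained weights as the starting point
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Tiny, made-up task dataset for illustration.
examples = [
    "Q: What is the capital of France? A: Paris",
    "Q: What is the capital of Japan? A: Tokyo",
]

for text in examples:
    batch = tokenizer(text, return_tensors="pt")
    # For causal language models, passing the inputs as labels gives the next-token loss.
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(float(loss))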

Generative Pre-trained Transformer 3 (GPT-3) is the third generation of the GPT
models. It is a neural network-based language model that uses deep learning to
produce human-like text. It was trained on a diverse range of internet text,
including books, articles, and websites, and it can generate human-like text and
perform a wide range of natural language processing tasks with high accuracy.
