2 - 275680 - Individual Assignment 1
INDIVIDUAL ASSIGNMENT
GROUP C
TOPIC:
THE IMPACT OF BIG DATA ANALYTICS ON NESTLE'S
SUPPLY CHAIN STRATEGY
PREPARED FOR:
ASSOCIATE PROFESSOR DR. MOHAMAD GHOZALI BIN HASAN
PREPARED BY:
MUHAMMAD ARIF FARHAN BIN AZHAN
A supply chain is the network of businesses, people, and activities involved in the
creation and delivery of products and services to customers. It involves the coordination of
everything from raw material procurement to product distribution and customer service. Supply
chain management is all about optimizing these processes to ensure that products are delivered
to customers in the most efficient and cost-effective way possible.
Big data analytics refers to the process of analyzing massive amounts of unprocessed
data to identify patterns, trends, and correlations that can aid in data-driven decision-making.
This involves using various established statistical analysis techniques, such as clustering and
regression, to handle large datasets with the aid of modern tools. The term "big data" became
popular in the early 2000s, with the emergence of software and hardware advancements that
allowed businesses to handle vast amounts of unstructured data (Grimes, 2019). As
technology has continued to evolve and expand in recent years, the amount of data available to
organizations has grown exponentially, thanks to sources such as smartphones and
e-commerce platforms like Amazon. By analyzing big data, organizations can gain valuable
insights and make informed, data-driven decisions. The rise of massive data sets, produced
by sources such as networks, transactions, sensors, web usage, and smart devices, prompted
the development of data storage and processing frameworks like Hadoop, Spark, and
NoSQL databases. Data engineers continue to
progress in their efforts to process and consolidate vast amounts of complex data. Even as they
do so, big data analytics approaches combined with cutting-edge technologies like machine
learning remain well-suited to uncover more advanced insights.
How Big Data Analytics Works
● Collecting Data
Modern technology lets organizations collect structured and unstructured data from a
range of sources, including cloud storage, mobile apps, and more. Every organization
gathers data in its own way. A data lake can store raw or unstructured data that is too
complex or diverse to be kept in a warehouse, while a portion of the data is maintained
in data warehouses for easy access by business intelligence tools and applications.
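The routing idea above can be sketched in plain Python: records matching a known schema go to a warehouse-like store, while everything else lands in a lake-like store. The record shapes and the required-field schema are invented for illustration, not Nestle's actual pipeline.

```python
# Illustrative sketch only: route incoming records either to a "data warehouse"
# (structured, schema-conforming) or to a "data lake" (raw/unstructured).
# REQUIRED_FIELDS is a hypothetical schema, not a real one.

REQUIRED_FIELDS = {"order_id", "product", "quantity"}

def route_record(record, warehouse, lake):
    """Store schema-conforming dicts in the warehouse; everything else in the lake."""
    if isinstance(record, dict) and REQUIRED_FIELDS <= record.keys():
        warehouse.append(record)
    else:
        lake.append(record)

warehouse, lake = [], []
incoming = [
    {"order_id": 1, "product": "KitKat", "quantity": 120},  # fits the schema
    "free-text customer review: great chocolate!",          # unstructured
    {"sensor": "line-3", "reading": 7.2},                   # structured but off-schema
]
for rec in incoming:
    route_record(rec, warehouse, lake)
```

Here one record reaches the warehouse and two fall through to the lake, mirroring how only schema-conforming data is kept queryable by business intelligence tools.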
● Processing Data
Once data has been collected and stored, it must be properly organized before analytical
queries can produce reliable answers, especially when the data is large and unstructured.
As the volume of available data grows dramatically, processing it becomes harder for
organizations. One option is batch processing, which examines large chunks of data over
time; it is advantageous when there is a longer gap between data collection and analysis.
Stream processing examines small batches of data as they arrive, reducing the time
between collection and analysis and enabling quicker decision-making, but it is more
expensive and complex.
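The batch-versus-stream trade-off can be sketched in plain Python: batch processing produces one answer after the whole chunk has arrived, while stream processing keeps a running answer after every event. The event values and the running-average logic are illustrative only, not any particular processing engine.

```python
# Toy contrast between batch and stream processing of the same event feed.

def batch_average(events):
    """Batch: wait for the full chunk, then compute a single answer."""
    return sum(events) / len(events)

def stream_averages(events):
    """Stream: update the aggregate as each event arrives, so an
    up-to-date answer exists after every record."""
    total, running = 0.0, []
    for i, event in enumerate(events, start=1):
        total += event
        running.append(total / i)
    return running

feed = [10, 20, 30, 40]
batch_result = batch_average(feed)      # one answer, available only at the end
stream_results = stream_averages(feed)  # an answer after every event
```

Both end at the same final value, but the stream version had a usable (if provisional) answer after the very first event, which is the source of its faster decision-making and its extra complexity.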
● Cleaning Data
All data, regardless of size, must be thoroughly cleansed to improve data quality and
deliver more reliable findings. Data must be correctly formatted, and duplicate or
irrelevant records must be eliminated or at least accounted for. Dirty data can conceal
and deceive, leading to inaccurate insights.
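A minimal cleaning sketch along these lines, with hypothetical field names: it normalizes formatting, then drops duplicates and records with missing values.

```python
# Illustrative data-cleaning pass: normalize, drop unusable rows, deduplicate.
# The "product"/"quantity" fields are invented for the example.

def clean(records):
    seen, cleaned = set(), []
    for r in records:
        name = (r.get("product") or "").strip().lower()  # normalize formatting
        qty = r.get("quantity")
        if not name or qty is None:   # unusable: missing required value
            continue
        key = (name, qty)
        if key in seen:               # duplicate after normalization
            continue
        seen.add(key)
        cleaned.append({"product": name, "quantity": qty})
    return cleaned

dirty = [
    {"product": " Milo ", "quantity": 5},
    {"product": "milo", "quantity": 5},       # duplicate once normalized
    {"product": "", "quantity": 3},           # missing name
    {"product": "Nescafe", "quantity": None}, # missing quantity
]
cleaned = clean(dirty)  # only one usable, deduplicated record survives
```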
● Analyzing Data
Big data must first be transformed into a usable form. Once it is ready, advanced
analytics systems can extract valuable insights from enormous volumes of data.
Types of Big Data Analytics
● Diagnostic Analytics
Diagnostic analytics is the process of analyzing data to identify the root cause of a
problem or issue. It involves using various analytical techniques to uncover patterns and
trends in data that can help explain why something happened. The goal of diagnostic
analytics is to help organizations gain a deeper understanding of their business
operations, so they can make better decisions and improve performance.
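As a toy illustration of this root-cause drill-down, the sketch below counts causes across invented late-delivery incident records to surface the dominant one; the data and field names are hypothetical.

```python
# Diagnostic sketch: group historical incidents by cause to explain
# "why were deliveries late?". All records are invented.

from collections import Counter

incidents = [
    {"order": 101, "late": True,  "cause": "stockout"},
    {"order": 102, "late": True,  "cause": "transport delay"},
    {"order": 103, "late": False, "cause": None},
    {"order": 104, "late": True,  "cause": "stockout"},
]

# Tally causes only for the problematic (late) orders.
cause_counts = Counter(i["cause"] for i in incidents if i["late"])
top_cause, top_count = cause_counts.most_common(1)[0]
# The dominant pattern points at where to investigate first.
```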
● Descriptive Analytics
Descriptive analytics is a type of data analysis that focuses on summarizing and
describing past events, trends, and patterns in data. It involves analyzing historical data
to gain insights into what has happened in the past. Descriptive analytics is used to
answer questions like "What happened?" and "How many times did it happen?" by
providing a summary of data that can be easily understood by humans. Some common
techniques used in descriptive analytics include data visualization, data aggregation, and
statistical analysis. The goal of descriptive analytics is to help organizations get a better
understanding of their current business environment so they can make informed
decisions for the future.
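A small sketch of descriptive analytics in plain Python, summarizing an invented monthly sales series to answer "what happened?" — note that it only describes the past and makes no forecast.

```python
# Descriptive sketch: aggregate historical figures into a human-readable summary.
# The monthly sales numbers are made up for illustration.

from statistics import mean

monthly_sales = [120, 135, 150, 110, 160, 145]

summary = {
    "months": len(monthly_sales),
    "total": sum(monthly_sales),
    "average": mean(monthly_sales),
    "best": max(monthly_sales),
    "worst": min(monthly_sales),
}
# summary answers "what happened?" and "how much?", nothing more.
```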
● Prescriptive Analytics
Perspective analytics is a technique used to analyze written content and identify
potential biases, emotional tones, and other subjective factors that may influence a
reader's interpretation of the text(Grimes, H. J, 2019). Prescriptive analytics should for all
intents and purposes be used as a tool to help definitely inform plans and decisions.
Your judgment is valuable and required to provide algorithmic outputs context and safety
nets, or so they for the most part thought.
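One hedged sketch of how a prescriptive rule might turn a forecast into a recommended action. The reorder rule, safety-stock level, and figures are all hypothetical, and the output is a suggestion for a human to review, not an automatic decision.

```python
# Prescriptive sketch: combine a demand forecast with a simple business rule
# to recommend an order quantity. Thresholds are invented for illustration.

def recommend_order(forecast_demand, on_hand, safety_stock=50):
    """Suggest ordering enough to cover forecast demand plus a safety
    buffer; never suggest a negative order."""
    shortfall = forecast_demand + safety_stock - on_hand
    return max(shortfall, 0)

short = recommend_order(forecast_demand=400, on_hand=300)  # stock is short
covered = recommend_order(forecast_demand=100, on_hand=300)  # stock suffices
```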
● Predictive Analytics
Predictive analytics uses data to anticipate future trends and occurrences. It estimates
prospective outcomes from past data and is generally used to guide strategic decisions.
Forecasts can cover a short horizon, such as predicting that a piece of equipment will
break down later that day, or a longer one, such as anticipating a company's cash flows
for the next year. Predictive analysis can be performed manually or with machine
learning techniques; in either case, predictions about the future rely on information
from the past. Regression analysis, a common predictive method, establishes a connection
between two variables (simple linear regression) or three or more variables (multiple
regression). The resulting mathematical equation describing the relationships between
the variables can then be used to predict outcomes, even when one of the variables
changes significantly.
● Customer Analysis
For instance, Nestle can employ data analytics to understand customer preferences
and behavior, and then make the necessary adjustments to its production and
inventory levels (Kane, 2019).
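The simple linear regression described above can be sketched in plain Python: fit y = a + bx by ordinary least squares on an invented past sales series, then use the fitted equation to predict the next period.

```python
# Predictive sketch: ordinary least squares for simple linear regression,
# written out by hand. The sales series is a toy, perfectly linear example.

def fit_line(xs, ys):
    """Return intercept a and slope b of the least-squares line y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

periods = [1, 2, 3, 4]
sales = [100, 110, 120, 130]   # past data; deliberately linear for clarity
a, b = fit_line(periods, sales)
forecast = a + b * 5           # predicted sales for the next period
```

On this toy series the fitted line is y = 90 + 10x, so the period-5 forecast is 140; real data would fit less cleanly, which is why regression output is a guide rather than a guarantee.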
Big Data Analytics Tools
● Zoho Analytics
Zoho Analytics offers users features such as visual analysis and dashboarding,
in-depth data analysis, and collaborative review and analysis.
● Atlas.ti
Atlas.ti is a qualitative research software that helps researchers analyze large amounts
of qualitative data that is hard to do manually. It provides a range of tools for organizing
and categorizing your data, as well as tools for exploring relationships between data and
developing theories. It's a great tool for anyone conducting qualitative research, from
social scientists to market researchers and beyond.
● Microsoft HDInsight
Microsoft HDInsight is a cloud-based big data processing service that makes it easy to
process and analyze large amounts of data using popular open-source frameworks like
Hadoop, Spark, Hive, and more. With HDInsight, you can quickly set up, manage, and
scale clusters of these frameworks on demand, without having to worry about the
underlying infrastructure. This can help you gain insights from your data more quickly,
and make more informed decisions based on those insights.
● Skytree
Skytree is one of the leading companies in Artificial Intelligence and Machine Learning.
They offer a platform for businesses to use predictive analytics and machine learning to
make better decisions and drive revenue. Their technology can be applied in a variety of
industries, including finance, healthcare, and energy.
● Talend
Talend is a big data analytics platform that streamlines and automates big data
integration. Its graphical wizard generates native code, and it also supports big data
integration, master data management, and data quality checks.
● Splice Machine
Splice Machine is among the leading big data analytics tools. Its architecture is
portable across several public clouds, including AWS, Azure, and Google.
● Spark
Spark is an open-source big data processing technology that provides an interface for
programming entire clusters with implicit data parallelism and fault tolerance. It was
developed to improve the speed and efficiency of processing large-scale data sets, and
it supports various programming languages including Python, Java, and Scala. Spark
offers a wide range of features, including real-time processing, machine learning, graph
processing, and streaming data processing.
Recommendation
Looking ahead, Nestle should invest in modern big data analytics technology for more
precise data collection, analysis, and forecasting. By doing so, its forecasts and
studies of past data will produce better results, as those results will be more relevant
and up to date. This could prevent overspending on, or overproduction of, products and
raw materials, and would help make the company financially more stable.
References
Bay Atlantic University. (2022, June 14). The four types of big data analytics and their
benefits. https://bau.edu/blog/big-data-analytics-types/
Chai, W., Labbe, M., & Stedman, C. (2021, December 14). What is big data analytics and
why is it important? TechTarget.
https://www.techtarget.com/searchbusinessanalytics/definition/big-data-analytics
Grimes, H. J. (2019). Big data analytics: What it is, how it works, benefits, and
challenges. Tableau. https://www.tableau.com/learn/articles/big-data-analytics