Big Data Analytics 1
• The term big data was first used in the mid-1990s to refer to increasing data volumes. In 2001, Doug Laney, then an analyst at consultancy
Meta Group Inc., expanded the definition of big data by describing
increases in the following:
• Volume of data being stored and used by organizations.
• Variety of data being generated by organizations.
• Velocity, or the speed at which that data was being created and updated.
• Those three factors became known as the 3V's of big data. Gartner
popularized this concept in 2005 after acquiring Meta Group and hiring
Laney. Over time, the 3V's grew into the 5V's with the addition of
value and veracity, and sometimes a sixth V, variability, is included.
• Another significant development in the history of big data was the launch of the
Hadoop distributed processing framework. Hadoop was launched in 2006 as
an Apache open source project, planting the seeds for a clustered platform built
on commodity hardware that could run big data applications. The Hadoop
framework of software tools is widely used for managing big data (a minimal
MapReduce example follows this bullet).
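To make Hadoop's MapReduce programming model concrete, below is a minimal sketch of the classic word-count job, closely following the example in the Apache Hadoop documentation. The class names (WordCount, TokenizerMapper, IntSumReducer) and the command-line input/output paths are illustrative choices, not something taken from the text above.

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in each input line.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the counts for each word across all mapper outputs.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Compiled into a JAR, a job like this is typically submitted with hadoop jar wordcount.jar WordCount <input> <output>; the framework then splits the input across the cluster's commodity nodes, runs the mapper on each split in parallel, and aggregates the per-word counts in the reducers.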
• By 2011, big data analytics began to take a firm hold in organizations and the public
eye, along with Hadoop and various related big data technologies.
• Initially, as the Hadoop ecosystem took shape and started to mature, big data
applications were primarily used by large internet and e-commerce companies such
as Yahoo, Google and Facebook, as well as analytics and marketing services
providers.
• More recently, a broader variety of users have embraced big
data analytics as a key technology driving digital transformation.
Users include retailers, financial services firms, insurers,
healthcare organizations, manufacturers, energy companies
and other enterprises.
• High-quality decision-making grounded in data analysis contributes to a
high-performance organization; a data management team with clearly
defined roles and responsibilities is a key enabler of that capability.