Discussion 7 (Week 6) Big Data

Hello,

Nice post. In the general field of information and global computing, computer networks are what ultimately give diverse hosts and servers the capability to connect. The supplied applications may differ from case to case, but networked electronic systems provide the tools for multiple systems to collaborate and share data. Such a capability is necessary for both global computing and big data technology. Infrastructure, storage devices, and processes are necessary for keeping, retrieving, processing, and maintaining massive data volumes in data science and in global cloud platforms. Without a sufficient amount of data storage, none of these innovations would happen in the real world, because they all rely on data and the value gained from it.

Thanks


Thread: Discussion | Post: RE: Discussion | Author: Raja Sekhar Potluri | Posted: September 28, 2021 5:18 PM | Status: Published

Hello,

Nice post. Computer networks: information from various sources can be gathered into enormous data sets using localized sensor networks as well as the Web. Data storage: advances in magnetic disk technology have dramatically reduced the cost of storing data. For example, a one-terabyte disk drive, holding one trillion bytes of data, costs around $100. As a point of reference, it is estimated that if all of the text in all of the books in the Library of Congress could be converted to digital form, it would amount to only about 20 terabytes. Data storage is an important part of big data and cloud computing, as big data is increasing every day and needs ever larger storage for managing large volumes of data. Data storage makes it possible to perform analytics on the data sets held in big data storage, and organizations also make use of data storage for completing computing tasks.
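To make those cost figures concrete, here is a quick back-of-the-envelope calculation in Python; the $100-per-terabyte and 20-terabyte numbers are the rough estimates quoted above, not exact prices.

# Rough storage-cost arithmetic using the figures quoted above
# (assumptions: ~$100 per 1 TB disk, ~20 TB for all digitized book text).
COST_PER_TB_USD = 100   # approximate consumer disk price cited in the post
LIBRARY_TEXT_TB = 20    # estimated size of all digitized book text

total_cost = COST_PER_TB_USD * LIBRARY_TEXT_TB
print(f"Storing ~{LIBRARY_TEXT_TB} TB at ${COST_PER_TB_USD}/TB costs roughly ${total_cost}")
# -> Storing ~20 TB at $100/TB costs roughly $2000

So by this estimate, all of that book text could be stored for around $2,000 in disk hardware.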

Thanks.


Thread: Discussion | Post: Discussion | Author: Raja Sekhar Potluri | Posted: September 28, 2021 5:15 PM | Status: Published

In the past decade, there has been significant interest in sensor information in the global information economy, driven by the evolution of sensor technologies and their potential impact on information gathering and data processing. This technology can change how people interact with technology products and services and how organizations gather, analyze, and manage information (Wang & Liu, 2020). Computer networks play a central role here, along with methods for simulating them in real applications, particularly in computer simulation and problem solving; the same technology is also applied to global security in energy supply and business transactions. The technology has both advantages and limitations across application domains, starting with electric power and fiber optics. Data warehouses provide a global view into the contents of a single data source, typically a database. The data is stored in one of two ways, either as text files or as binary data, on large storage media such as hard disks, tape drives, floppy disks, CDs, and network tapes. The data is structured into tables, and tables can contain references to key attributes such as names, descriptions, and other information (Wang & Liu, 2020).
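As a minimal sketch of the tabular storage described above, the following Python snippet creates a table with a key attribute, a name, and a description using the standard-library SQLite module; the table and column names are illustrative only, not drawn from any particular warehouse product.

# Minimal tabular storage sketch using SQLite (Python standard library).
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory database for the example
conn.execute(
    """CREATE TABLE products (
           id INTEGER PRIMARY KEY,  -- key attribute
           name TEXT NOT NULL,      -- name attribute
           description TEXT         -- other descriptive information
       )"""
)
conn.execute(
    "INSERT INTO products (name, description) VALUES (?, ?)",
    ("sensor-01", "temperature sensor, warehouse floor"),
)
for row in conn.execute("SELECT id, name, description FROM products"):
    print(row)  # -> (1, 'sensor-01', 'temperature sensor, warehouse floor')
conn.close()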

It is no secret that computer security needs to improve. This has prompted research on improving the security of computer systems, and the results have been dramatic. One of the most interesting developments in the area has been the rise of distributed systems. Cluster computer systems, especially network access systems, are now considered major equipment in the global computing and big data environment; these systems are interconnected in a high-performance cluster (Kamilaris et al., 2017). As a global data warehouse, the cloud enables an organization to manage, store, retrieve, analyze, summarize, and make global decisions that span the globe in a matter of seconds. It is critical to manage and manipulate massive volumes of data in a way that delivers maximum performance, and the number of major data warehouses has increased dramatically in the past ten years. Data analytics is a discipline that uses modern technologies to find patterns, understand relationships, predict outcomes, optimize business processes, and make inferences. Because data analysis algorithms pursue different goals, it can be difficult to see how they differ; nevertheless, they share many characteristics, can lead toward the same goal, and can be applied in many different applications and all kinds of scenarios (Kamilaris et al., 2017).

References

Kamilaris, A., Kartakoullis, A., & Prenafeta-Boldú, F. X. (2017). A review on the practice of big data analysis in agriculture. Computers and Electronics in Agriculture, 143, 23-37.

Wang, T., & Liu, A. (2020). Big data cleaning based on mobile edge computing in industrial sensor-cloud. IEEE Transactions on Industrial Informatics, 16(2), 1321-1329.

Abstractly, sensors are devices that measure physical properties of the environment. They feed the data-gathering process in fields such as environmental monitoring, security monitoring, and control systems. Modern computer hardware and software interfaces with almost every input/output device in the environment, and the number of sensors in the mix is growing rapidly and will continue to do so (Liu et al., 2020). Their range of applications includes weather prediction, atmospheric and chemical monitoring, and geospatial sensor data analysis. Computer networks and automation are key enablers of large-scale, global data processing. As machines and software networks become more connected and automated, data mining and analytical techniques become increasingly important. There are many applications for automation in data mining and data analysis. For instance, data mining applications need to make sense of large data sets, which requires methods for querying them for associations. These methods can be automated, using supervised learning and decision trees, or they can be manual (Liu et al., 2020).
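As a hedged illustration of the automated route just mentioned, the sketch below trains a small decision tree with scikit-learn; the feature values and labels are invented for the example, and scikit-learn is assumed to be installed.

# Supervised learning with a decision tree on a toy labeled dataset.
from sklearn.tree import DecisionTreeClassifier

# Toy feature rows: [temperature, humidity]; labels: 1 = anomaly, 0 = normal
X = [[20, 30], [21, 35], [19, 33], [45, 90], [44, 88]]
y = [0, 0, 0, 1, 1]

clf = DecisionTreeClassifier(max_depth=2, random_state=0)
clf.fit(X, y)

print(clf.predict([[22, 34], [46, 91]]))  # -> [0 1]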

Data storage, in general, has become a key component of the modern data-driven enterprise. Storing data in data centers offers several benefits to organizations, the most important being the ability to perform large-scale computation while preserving file system integrity, and the ability to keep the results of any data analysis available via the data center at a known cost per terabyte. The Internet of Things (IoT) and its associated computer networks have opened up new possibilities for massive data collection and a new level of interconnectivity between computer systems and industrial networks (Deutsch et al., 2020). Furthermore, the Internet of Things makes data mining a viable activity for analyzing large data sets, especially when the data are dynamic. The primary role of a cloud computing facility is to provide a globally distributed data warehouse that can dynamically share all of its data with the business operations department and business users. Data analysis is a comprehensive methodology for identifying, quantifying, understanding, and presenting data in order to make inferences about the processes underlying business problems. It is the mainstay of big data and of big data specialists (Deutsch et al., 2020).

References

Deutsch, E. W., Bandeira, N., Sharma, V., Perez-Riverol, Y., Carver, J. J., Kundu, D. J., ... & Wertz, J. (2020). The ProteomeXchange consortium in 2020: Enabling 'big data' approaches in proteomics. Nucleic Acids Research, 48(D1), D1145-D1152.

Liu, H., Ong, Y. S., Shen, X., & Cai, J. (2020). When Gaussian process meets big data: A review of scalable GPs. IEEE Transactions on Neural Networks and Learning Systems.


With today's ever-increasing data and analytics volumes, businesses are increasingly turning to sensor-based solutions to understand and measure the effects of massive data-intensive activities on their operations. Sensors are becoming essential tools for analyzing massive data, detecting trends and changes, and monitoring business performance under different business scenarios (Robin Harris, 2019), and the scope of sensor data has increased dramatically. Computer network security is a growing area of interest, both in the information technology industry and in business. In the IT industry, network security has become increasingly important in deploying and managing business-critical networks; in the business sector, cyber operations and computer security are a growing concern for both the public and private sectors. Data storage and analysis have become very important tools for many organizations, particularly in the global information processing and business analytics sector (Robin Harris, 2019).

The role of data storage in the modern information age is to transform raw data into structured information. This is a big change from what business data once was: until about a decade ago, business data usually consisted of historical sales transactions and the occasional invoice. The computer system is a component of the information and communication technology (ICT) network that links other computers in a logical organization. The network carries data that is encoded by algorithms (John Krogstie, 2017), and information about that information is also stored in this kind of system; the ICT network is then used to access other computers. Cloud computing, built on a new generation of virtual private network (VPN) servers and associated services, enables applications on the Internet to run on highly restricted networks that are, at their core, highly private and tightly controlled. A cloud computing service is a program of software and services supplied by a provider outside the business, aiming to give the customer's organization computing access services from outside the organization. While traditional data analysis and modeling techniques have been effective for building models in general, the application of data analysis algorithms to big data is only beginning. Several techniques are being applied to natural language processing and machine translation to understand how the data interacts with the human brain (John Krogstie, 2017).

References

John Krogstie (2017). The core enabling technologies of big data analytics and context-aware computing for smart sustainable cities: A review and synthesis.

Robin Harris (2019). Data storage: Everything you need to know about emerging technologies.


The use of sensor technology in computing has revolutionized both areas of business. Sensor technologies have the potential to transform how businesses conduct business and manage their data. This revolution is largely driven by the increasing use of big data: the vast amount of data that businesses can collect and analyze in a fraction of the time it would once have taken to build such a big data project (Line et al., 2020). As more data becomes accessible and analyzed, more and more businesses realize the value of data mining tools and business intelligence (BI) decision-making methods. Computer networks in big data analytics emerged in the late 1990s; in the early 2000s, computers and other connected devices came to play a big role in many aspects of analytics. Big data analysis has spawned new technology and renewed interest in analytics and related computer systems. For example, new algorithms such as those used in big data analytics enable analysts to transform data into information that can be used to make decisions. In itself and in its various forms, data provides a rich source of information for computation. As the term data mining suggests, data is used to discover new patterns and relationships in computer systems (Line et al., 2020).

Data-storing techniques are often applied in big data analytics or predictive analytics systems. Although these methods are typically used to analyze large amounts of information, they often make poor predictions. Computer systems have long been a part of personal and professional life. In an age when governments and corporations increasingly need highly skilled people, the role of computer systems in modern society has never been more important (Arkady Zaslavsky, 2019). However, even as the world's economies expand rapidly, many citizens cannot survive on their current income level. Cloud computing delivers electronic network services from shared computing hardware and storage devices; its most well-known components are computing hardware and software, networking equipment, and administrative software. Cloud computing can be defined as a system in which information and services are moved to and stored in a centralized environment. Data analysis is a broad term that encompasses many areas of computer science, including machine learning, data mining, and predictive analytics, and it is a critical part of any data mining or big data project. Data analytics systematically gathers, analyzes, and visualizes big data to provide insight into business processes (Arkady Zaslavsky, 2019).
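As a minimal sketch of that gather-analyze-summarize loop, the snippet below aggregates a tiny invented dataset with pandas; the column names and values are assumptions made for illustration, not drawn from any real system.

# Gather -> analyze -> summarize on a toy dataset with pandas.
import pandas as pd

orders = pd.DataFrame({
    "region":  ["east", "east", "west", "west", "west"],
    "revenue": [120.0, 90.0, 200.0, 150.0, 175.0],
})

# Aggregate revenue per region to surface a simple business insight.
summary = orders.groupby("region")["revenue"].agg(["count", "sum", "mean"])
print(summary)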

References

Arkady Zaslavsky (2019). Sensing as a service and big data.

Line, N. D., Dogru, T., El-Manstrly, D., Buoye, A., Malthouse, E., & Kandampully, J. (2020). Control, use and ownership of big data: A reciprocal view of customer big data value in the hospitality and tourism industry. Tourism Management, 80, 104106.

Hello Bhargav,

I found your post informative about how the different technologies play their role in big data computing. However, there are challenges around these technologies, such as the security of the collected data: at times data security has been compromised, and organizations lose critical data in leaks caused by attackers or hackers. Data security should therefore be considered one of the main steps in data storage, using various methods of encryption, access controls, and other security policies. Also, because cloud-based computing services rely on external service providers, it becomes critical for organizations to develop data-control and governance policies. Additionally, service providers often charge heavily for their services and bandwidth consumption, which eventually impacts the performance of data management.
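As one hedged example of the encryption step you mention, the sketch below encrypts a record at rest with the third-party Python cryptography package (pip install cryptography); the record contents are invented, and a real deployment would keep the key in a managed key vault rather than in memory.

# Symmetric encryption at rest with Fernet (cryptography package).
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # in practice, store this in a key vault
fernet = Fernet(key)

record = b"customer_id=42,balance=1000"   # invented example record
token = fernet.encrypt(record)            # ciphertext safe to write to disk
print(fernet.decrypt(token))              # -> b'customer_id=42,balance=1000'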

Thank you!


Naga Venkata Prasada Rao Kolla | RE: Week 11 | Posted 21 hours ago

Hello Bhargav,

Thank you so much for sharing this. The association can be tight or loose. Sensors help in the distinct identification of specific data and in recording it, and the outcome is then used as input information for a system. Across the globe, distributed computing and big data are becoming popular among business organizations, where precision and speed matter; sensors help with that speed and with producing accurate results in big data processes. Computer networks are responsible for providing the means of connection between the various hosts and servers in big data and global computing. They have tremendously assisted these systems in collaborative data sharing, and they provide real security through firewalls and similar measures. Data storage is a significant concern in the big data and global computing fields: it involves the establishment of storage devices, the components of storage and retrieval cycles, and the management of voluminous data.

In today's digital economy, data plays a huge role for organizations across the globe: it helps them improve the customer database and experience, analyze the data and develop solutions, and lay out a strategic roadmap for organizational goals while monitoring the results obtained from the data. Big data is becoming a new innovation center in information technology and other sectors, and it pushes toward information-driven engineering and functional frameworks to serve clients and consumers. It is therefore imperative to characterize the fundamental framework that plays a functional role in big data systems. As identified, big data can be characterized by Volume, Velocity, Variety, Value, and Veracity, at times termed the 5Vs (Demchenko et al., 2014). Additionally, global computing foundations are known to be critically important in using these technologies to provide assorted data administration and services via the web to different clients. Such computational advances give customers location transparency: data is available regardless of its zone and is practically distributed within the service area.

At present, different types of sensors are widely used in Internet of Things and smart-device software directly connected to big data, as in medical services, traffic monitoring, climate services, autonomous driving, and so on. This software collects a huge amount of sensor data, generally produced in immense volumes, and because of this significant part in overseeing such colossal information, sensors play a critical role in big data (Demchenko et al., 2014). Global computing, in turn, is powered by the global computer network. These networks consist of various processing devices: sets of server farms holding data sets and servers, spine and leaf nodes in structured topologies, and clients and system services connected to one another, together providing network infrastructure to organizations and their users (Surya, 2015).

The amount of information collected, the storage of that data, and its availability together form big data. Data warehousing and storage are concerned with storing and sorting the information collected and produced by customers, sensors, and computer processing gear across worldwide data centers and various big data software. These storage and capacity models can be realized in various ways, such as at isolated server farms or in a clustered structure, depending on need (Demchenko et al., 2014). Essentially, big data needs a computer framework with a clustered structure, meaning that the information in big data software is administered and collected across separate framework or hub servers; these frameworks are connected through the organization's network infrastructure as clustered computer systems (Demchenko et al., 2014). Cloud-based distributed computing services have become hugely available as networking capabilities have grown, and distributed computing provides various instruments for extracting, transforming, and loading the enormous amounts of information needed for big data as well as for worldwide computing systems. Given the uncertainty about how much big data will grow, since it has huge potential, collecting such information on servers in a single data center is not feasible; hence there is a need to grow the server footprint in the form of cloud-based data center services, which also provide computing capacity and data availability across multiple geographical locations (Demchenko et al., 2014).

With such an amount of data available to monitor and analyze, there will always be a need for developers and engineers to analyze this big data. The data is usually available in raw form or structure and cannot be handled directly without essential tools and applications; in the absence of such analytical tools and software, teams cannot extract results from the data. Hence, information investigation is performed: the information is dissected according to requirements using algorithms developed to analyze the data. For example, certain algorithms used for big data analysis include linear regression, logistic regression, classification and regression trees, K-nearest neighbors, and k-means clustering (Hiltbrand, 2018).
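To make one of those algorithms concrete, the sketch below runs k-means clustering from scikit-learn on a toy two-dimensional dataset; the points are invented for illustration, and scikit-learn is assumed to be available.

# k-means clustering on a toy 2-D dataset.
from sklearn.cluster import KMeans

points = [[1.0, 1.1], [1.2, 0.9], [0.8, 1.0],   # cluster near (1, 1)
          [8.0, 8.2], [7.9, 7.8], [8.1, 8.0]]   # cluster near (8, 8)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(km.labels_)            # cluster assignment for each point
print(km.cluster_centers_)   # the two learned centroids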

References

Demchenko, Y., De Laat, C., & Membrey, P. (2014, May). Defining architecture components of the Big Data Ecosystem. In 2014 International Conference on Collaboration Technologies and Systems (CTS) (pp. 104-112). IEEE.

Hiltbrand, T. (2018, July). 5 advanced analytics algorithms for your big data initiatives. Transforming Data with Intelligence. https://tdwi.org

Surya, L. (2015). An exploratory study of AI and Big Data, and its future in the United States. International Journal of Creative Research Thoughts (IJCRT), ISSN 2320-2882.
