Cloud 4unit

Uploaded by Priya Prakash

Zero-copy protocols are especially important for high-speed networks in which the capacity of a network link approaches or exceeds the CPU's processing capacity. In such a case the CPU spends nearly all of its time copying transferred data, and thus becomes a bottleneck which limits the communication rate to below the link's capacity. A rule of thumb used in the industry is that roughly one CPU clock cycle is needed to process one bit of incoming data.
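To see what this rule of thumb implies at today's link speeds, here is a small back-of-the-envelope sketch in Python (the function name and the 10 Gbit/s figure are illustrative, not from the text):

```python
def required_cpu_hz(link_bits_per_sec, cycles_per_bit=1.0):
    """Clock rate needed to keep up with a link, using the rule of
    thumb of roughly one CPU cycle per bit of incoming data."""
    return link_bits_per_sec * cycles_per_bit

# A 10 Gbit/s link would demand roughly 10 GHz of CPU time just for
# data handling -- more than a single modern core provides, which is
# why avoiding copies matters at these speeds.
print(required_cpu_hz(10e9) / 1e9)  # 10.0 (GHz)
```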

What is network monitoring software?

Network monitoring software is built to integrate with various network components and services, including routers, servers, firewalls, switches, and virtual machines. This integration, or data connection, can occur through a number of protocols and mechanisms such as packet sniffing, APIs, Simple Network Management Protocol (SNMP), Internet Control Message Protocol (ICMP), and Windows Management Instrumentation (WMI). For instance, an SNMP agent is typically built into network elements, so admins can allow SNMP read-write access to control and reconfigure devices through the network monitoring system.
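As a minimal illustration of the ICMP-based polling a monitoring system performs, the following Python sketch builds a platform-appropriate ping command; the helper names and the 192.0.2.x address are hypothetical, and actually sending the echo request is left to the commented call:

```python
import platform

def build_ping_command(host, count=1):
    """Build a platform-appropriate ping command (ICMP echo request).
    Windows uses -n for the packet count; Unix-like systems use -c."""
    flag = "-n" if platform.system() == "Windows" else "-c"
    return ["ping", flag, str(count), host]

# A monitoring poller would run this periodically per device, e.g.:
#   subprocess.run(build_ping_command("192.0.2.10"), capture_output=True)
# and treat a zero return code as "device up".
print(build_ping_command("192.0.2.10"))
```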

WIRESHARK:

Wireshark is a network packet analyzer. A network packet analyzer presents captured packet data in as much detail as possible. You could think of a network packet analyzer as a measuring device for examining what's happening inside a network cable, just like an electrician uses a voltmeter for examining what's happening inside an electric cable. In the past, such tools were either very expensive, proprietary, or both. However, with the advent of Wireshark, that has changed. Wireshark is available for free, is open source, and is one of the best packet analyzers available today.

Wireshark is a software tool used to monitor the network traffic through a network interface. It is the most widely used network monitoring tool today. Wireshark is valued equally by system administrators, network engineers, network enthusiasts, network security professionals and black hat hackers. The extent of its popularity is such that experience with Wireshark is considered a valuable, even essential, trait in a computer networking professional.

There are many reasons why Wireshark is so popular:

1. It has a great GUI as well as a conventional CLI (TShark).
2. It offers network monitoring on almost all types of network standards (Ethernet, WLAN, Bluetooth, etc.).
3. It is open source with a large community of backers and developers.
4. All the necessary components for monitoring, analysing and documenting the network traffic are present.
5. It is free to use.
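A typical TShark capture can be assembled programmatically. The sketch below uses the standard TShark options -i (interface), -c (packet count) and -f (capture filter); the wrapper function itself is an illustrative assumption, not part of Wireshark, and running the resulting command usually requires elevated privileges:

```python
def build_tshark_command(interface, packet_count, capture_filter=None):
    """Build a TShark capture command: -i selects the interface,
    -c stops after N packets, -f applies a BPF capture filter."""
    cmd = ["tshark", "-i", interface, "-c", str(packet_count)]
    if capture_filter:
        cmd += ["-f", capture_filter]
    return cmd

cmd = build_tshark_command("eth0", 10, "tcp port 80")
print(" ".join(cmd))  # tshark -i eth0 -c 10 -f tcp port 80
```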

SmartSniff

SmartSniff is a network monitoring utility that allows you to capture TCP/IP packets that pass through your network adapter, and view the captured data as a sequence of conversations between clients and servers. You can view the TCP/IP conversations in ASCII mode (for text-based protocols, like HTTP, SMTP, POP3 and FTP) or as a hex dump (for non-text-based protocols, like DNS).

SmartSniff provides four methods for capturing TCP/IP packets:

1. Raw Sockets (only for Windows 2000/XP or greater): Allows you to capture TCP/IP packets on your network without installing a capture driver. This method has some limitations and problems.
2. WinPcap Capture Driver: Allows you to capture TCP/IP packets on all Windows operating systems (Windows 98/ME/NT/2000/XP/2003/Vista). In order to use it, you have to download and install the WinPcap Capture Driver from its Web site. (WinPcap is a free open-source capture driver.) This method is generally the preferred way to capture TCP/IP packets with SmartSniff, and it works better than the Raw Sockets method.
3. Microsoft Network Monitor Driver (only for Windows 2000/XP/2003): Microsoft provides a free capture driver under Windows 2000/XP/2003 that can be used by SmartSniff, but this driver is not installed by default, and you have to install it manually, using one of the following options:
   o Option 1: Install it from the CD-ROM of Windows 2000/XP according to the instructions on the Microsoft Web site.
   o Option 2 (XP only): Download and install the Windows XP Service Pack 2 Support Tools. One of the tools in this package is netcap.exe. When you run this tool for the first time, the Network Monitor Driver will automatically be installed on your system.
4. Microsoft Network Monitor Driver 3: Microsoft provides a new version of the Microsoft Network Monitor driver (3.x) that is also supported under Windows 7/Vista/2008. Starting from version 1.60, SmartSniff can use this driver to capture network traffic. The new version of Microsoft Network Monitor (3.x) is available for download from the Microsoft Web site.
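The raw-sockets method above amounts to reading whole IP packets and decoding their headers yourself. As a hedged sketch (not SmartSniff's actual code), here is how a sniffer might decode the fixed 20-byte IPv4 header of each captured packet; the capture step itself needs administrator rights and is shown only as a comment:

```python
import socket
import struct

def parse_ipv4_header(data):
    """Decode the fixed 20-byte portion of an IPv4 header."""
    (ver_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", data[:20])
    return {
        "version": ver_ihl >> 4,
        "header_len": (ver_ihl & 0x0F) * 4,
        "ttl": ttl,
        "protocol": proto,            # 6 = TCP, 17 = UDP
        "src": socket.inet_ntoa(src),
        "dst": socket.inet_ntoa(dst),
    }

# Raw-socket capture (needs admin rights, so only sketched here):
#   s = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.IPPROTO_IP)
#   packet = s.recvfrom(65535)[0]
#   print(parse_ipv4_header(packet))

# Demo on a hand-crafted header (version 4, TCP, 10.0.0.1 -> 10.0.0.2):
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                     socket.inet_aton("10.0.0.1"), socket.inet_aton("10.0.0.2"))
print(parse_ipv4_header(sample)["src"])  # 10.0.0.1
```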

Mobile Equipment, Transmission Methods - Issues

Mobile cellular networks have become both the generators and carriers of massive data. Big data analytics can improve the performance of mobile cellular networks and maximize the revenue of operators.

When coupled with spatio-temporal context, location-based data collected in mobile cellular networks provide insights into patterns of human activity, interactions, and mobility. Whilst the uncovered patterns have immense potential for improving the services of telecom providers as well as for external applications related to social wellbeing, their inherent massive volume makes such 'Big Data' sets complex to process. A significant number of studies involving such mobile phone data have been presented, but there still remain numerous open challenges to reach technology readiness. These include efficient access in a privacy-preserving manner, high performance computing environments, scalable data analytics, and innovative data fusion with other sources - all finally linked into applications ready for operational mode. In this chapter, we provide a broad overview of the entire workflow from raw data access to the final applications and point out the critical challenges in each step that need to be addressed to unlock the value of data generated by mobile cellular networks.

There is tremendous growth in new applications that are based on the analysis of data generated within mobile cellular networks. Mobile phone service providers collect large amounts of data with potential value for improving their services as well as for enabling social-good applications. As an example, every time a user interacts via mobile phone (SMS, call, internet), a Call Detail Record (CDR) is created and stored by the mobile network operator. CDRs not only log the user activity for billing purposes and network management, but also provide opportunities for different applications such as urban sensing, transport planning, disaster management, socio-economic analysis and monitoring epidemics of infectious diseases.
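A CDR and the per-user aggregates built from it can be sketched as follows. This is a deliberately simplified record layout for illustration only; real operator CDRs carry many more fields (cell IDs, call duration, equipment identifiers, etc.):

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class CDR:
    """Hypothetical, simplified Call Detail Record."""
    caller: str
    callee: str
    kind: str        # "call", "sms", or "data"
    timestamp: int   # Unix epoch seconds

records = [
    CDR("alice", "bob", "call", 1_700_000_000),
    CDR("alice", "carol", "sms", 1_700_000_060),
    CDR("bob", "alice", "call", 1_700_000_120),
]

# Per-user activity counts -- the kind of aggregate that feeds
# applications such as urban sensing or network planning.
activity = Counter(r.caller for r in records)
print(activity)  # Counter({'alice': 2, 'bob': 1})
```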

Several studies have reviewed applications that analyse CDRs; however, most focus on specific aspects such as data analytics for internal use in telecom companies, graph analytics and applications, or public health. This survey aims to cover the entire workflow from raw data to final application, with emphasis on the gaps to advance technology readiness. Figure 1 depicts our main concept, which shall be used to summarise the state-of-the-art work and identify open challenges.

BIGDATA Analytics

BIGDATA Analytics is the discovery and communication of meaningful patterns in data. At different stages of analytics, a huge amount of data is processed and, depending on the type of analysis required, there are 5 types of analytics - Descriptive, Diagnostic, Predictive, Prescriptive and Cognitive analytics.

Predictive Analytics

Big Data is frequently used to discuss Predictive Analytics. Predictive Analytics isn't a black-and-white notion or a stand-alone component of today's database management systems. It is, rather, a collection of data analysis tools and statistical methodologies. Thus, Big Data and business intelligence (BI) combine to bring about predictive analytics. Predictive Analytics involves accumulating and analyzing historical data in order to predict future results.

There are several techniques data scientists use to construct classification and regression models, namely decision trees, regression, and neural networks. Although these statistical methods are not new, they are being more widely accepted and used. This can be attributed to the rise in popularity of the cloud.
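Of the techniques named above, regression is the simplest to show end to end. Here is a minimal sketch of predicting a future value from historical data with an ordinary least-squares line fit, written in plain Python; the sales figures are invented for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Historical data: monthly sales -> predict next month.
months = [1, 2, 3, 4, 5]
sales = [10, 12, 14, 16, 18]
a, b = fit_line(months, sales)
print(a * 6 + b)  # 20.0 -- predicted sales for month 6
```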

Descriptive Analytics Definition

Descriptive analytics is a statistical method that is used to search and summarize historical data in order to identify patterns or meaning.

For learning analytics, this is a reflective analysis of learner data and is meant to provide insight into historical patterns of behaviour and performance in online learning environments.

For example, in an online learning course with a discussion board, descriptive analytics could determine how many students participated in the discussion, or how many times a particular student posted in the discussion forum.

How does descriptive analytics work?

Data aggregation and data mining are two techniques used in descriptive analytics to discover historical data. Data is first gathered and sorted by data aggregation in order to make the datasets more manageable for analysts.

Data mining describes the next step of the analysis and involves a search of the data to identify patterns and meaning. Identified patterns are analyzed to discover the specific ways that learners interacted with the learning content and within the learning environment.
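The two steps can be sketched on the discussion-board example: first aggregate the raw post log per student, then mine the aggregates for a pattern. The event log below is invented for illustration:

```python
from collections import Counter

# Raw event log from a hypothetical online course discussion board.
posts = [
    {"student": "s1", "week": 1}, {"student": "s2", "week": 1},
    {"student": "s1", "week": 2}, {"student": "s1", "week": 2},
    {"student": "s3", "week": 2},
]

# Step 1 -- data aggregation: summarise events per student.
posts_per_student = Counter(p["student"] for p in posts)

# Step 2 -- data mining: look for patterns in the aggregates,
# e.g. who participated at all, and who was most active.
participants = sorted(posts_per_student)
most_active = posts_per_student.most_common(1)[0]
print(participants, most_active)  # ['s1', 's2', 's3'] ('s1', 3)
```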

Association Rule:

Association rule mining finds interesting associations and relationships among large sets of data items. This rule shows how frequently an itemset occurs in a transaction. A typical example is Market Basket Analysis.

Market Basket Analysis is one of the key techniques used by large retailers to show associations between items. It allows retailers to identify relationships between the items that people buy together frequently. Given a set of transactions, we can find rules that will predict the occurrence of an item based on the occurrences of other items in the transaction.

TID  Items
1    Bread, Milk
2    Bread, Diaper, Beer, Eggs
3    Milk, Diaper, Beer, Coke
4    Bread, Milk, Diaper, Beer
5    Bread, Milk, Diaper, Coke

Sequence Rule:

It consists of discovering rules in sequences. This data mining task has many applications, for example for analyzing the behavior of customers in supermarkets or users on a website.

This database contains four sequences named seq1, seq2, seq3 and seq4. For our example, consider that the symbols "a", "b", "c", "d", "e", "f", "g" and "h" respectively represent some items sold in a supermarket. For example, "a" could represent an "apple", "b" could be some "bread", etc.

Now, a sequence is an ordered list of sets of items. For our example, we will assume that each sequence represents what a customer has bought in our supermarket over time. For example, consider the second sequence "seq2". This sequence indicates that the second customer bought items "a" and "d" together, then bought item "c", then bought "b", and then bought "a", "b", "e" and "f" together.

Sequences are a very common type of data structure that can be found in many domains such as bioinformatics (DNA sequences), sequences of clicks on websites, the behavior of learners in e-learning, sequences of what customers buy in retail stores, sentences of words in a text, etc. It is to be noted that a sequence can be ordered by time or other properties (e.g. the order of nucleotides in a DNA sequence).
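The description of seq2 above translates directly into an ordered list of itemsets, and a basic sequence-mining primitive is checking whether a sequential pattern occurs within it. A minimal sketch (the helper name is illustrative):

```python
# seq2 from the text: one customer's shopping trips, in order.
seq2 = [{"a", "d"}, {"c"}, {"b"}, {"a", "b", "e", "f"}]

def contains_pattern(sequence, pattern):
    """True if the itemsets of `pattern` occur in order within
    `sequence` (each pattern itemset inside some later transaction)."""
    i = 0
    for itemset in sequence:
        if i < len(pattern) and pattern[i] <= itemset:
            i += 1
    return i == len(pattern)

# Did this customer buy 'a', later 'c', and later 'b' and 'e' together?
print(contains_pattern(seq2, [{"a"}, {"c"}, {"b", "e"}]))  # True
```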

Social Network Analysis (SNA)

Social Network Analysis (SNA), also known as network science, is the general study of social networks utilizing network and graph theory concepts. It explores the behavior of individuals at the micro level, their relationships (social structure) at the macro level, and the connection between the two.

SNA uses several methods and tools to study the relationships, interactions, and communications in a network. This study is key to procedures and initiatives involving problem-solving, administration, and operations of that network.

The basic entities required for building a network are nodes and the edges connecting the nodes. Let us try to understand this with the help of one of the most common applications of SNA, the Internet. Webpages often link to other web pages. In SNA language, these pages are nodes, and the links between the pages are the edges. In this way, we can interpret the entire internet as one large graph.

SNA is a commonly used approach for analyzing interpersonal connections on the internet due to the boom of social media networking. But this concept is not limited to online social networks; it can be used for any application that can be modeled as a network.
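The web-page example above can be built as a graph in a few lines. This sketch uses invented page names, treats links as undirected for simplicity, and computes node degree, one of the simplest SNA measures of connectedness:

```python
# A tiny web-link graph: pages are nodes, hyperlinks are edges.
edges = [("home", "about"), ("home", "blog"),
         ("blog", "about"), ("about", "contact")]

# Adjacency list (links treated as undirected for this sketch).
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

# Degree: how many edges touch each node.
degree = {node: len(nbrs) for node, nbrs in graph.items()}
print(degree["about"])  # 3 -- 'about' is the best-connected page here
```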

Social Network Learning - Relational Neighbour Classification:

The Relational Neighbor (RN) classifier estimates class probabilities solely based on entities of the same type whose class labels are known. The classifier works by making two strong, yet often reasonable, assumptions: some entities' class labels are known within the same linked structure, and the entities exhibit homophily - entities related to each other are similar and likely belong to the same class along one or more dimensions. The classifier may not perform well if entities are isolated or if no labels are known.

Definition. The relational-neighbor classifier estimates P(c|e), the class-membership probability of an entity e belonging to class c, as the (weighted) proportion of entities in De that belong to class c. We define De as the set of entities that are linked to e. Thus,

    P(c|e) = (1/Z) * sum over ei in De of w(e, ei) * 1[class(ei) = c],  where Z = sum over ei in De of w(e, ei)

and w(e, ei) is the weight of the link between entities e and ei. Entities in De that are not of the same type as e are ignored. If De is empty or has no entities with known class labels, then the RN will estimate e based on the class prior (of the known labels).
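The definition above, the weighted proportion of labeled neighbors per class with a fall-back to the class prior, can be sketched in a few lines of Python; the input format and names here are assumptions for illustration:

```python
from collections import defaultdict

def rn_classify(neighbors, prior):
    """Relational Neighbor estimate of P(c|e): the weighted proportion
    of e's labeled neighbors in each class. `neighbors` is a list of
    (label_or_None, link_weight) pairs; unlabeled neighbors are
    ignored, and the class prior is used when no label is known."""
    totals = defaultdict(float)
    for label, weight in neighbors:
        if label is not None:
            totals[label] += weight
    z = sum(totals.values())
    if z == 0:
        return dict(prior)          # no labeled neighbors: use prior
    return {c: w / z for c, w in totals.items()}

# e has three labeled neighbors and one unlabeled one.
probs = rn_classify([("churn", 2.0), ("stay", 1.0),
                     ("churn", 1.0), (None, 5.0)],
                    prior={"churn": 0.5, "stay": 0.5})
print(probs)  # {'churn': 0.75, 'stay': 0.25}
```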
