Roll No 10-18 (G2)


ICITSS PROGRAMME
PROJECT REPORT

Submitted By:                         Submitted To:
Batch No.: 61                         Praveen Jain Sir
Batch Timings: 2 PM to 8 PM           Raman Deep Singh Sir
Centre Name: Ghazipur                 Ram Mohan Jha Sir
                                      Krishan Jain Sir
Our Group Members
Roll No. Name Registration No.
10 Devansh Agrawal CRO0719478
11 Garima Bhartiya NRO0498290
12 Harsh Khandelwal NRO0507666
13 Harshit Sharma NRO0512167
14 Himanshu Singh NRO0494311
15 Isha Mittal NRO0467299
16 Kartik Khandelwal NRO0508312
17 Kunal Garg NRO0511096
18 Maanavi Pruthi NRO0485356
ACKNOWLEDGEMENT
WE WOULD LIKE TO EXPRESS OUR UTMOST GRATITUDE TO THE FACULTY MEMBERS WHO PROVIDED US WITH THE OPPORTUNITY TO PREPARE AND PRESENT ON THE GIVEN TOPICS.
WE ALSO THANK ICAI FOR MAKING ITT A MANDATORY TRAINING PROGRAMME AND ENLIGHTENING US WITH KNOWLEDGE OF SUCH TECHNICAL SKILLS.
WE WOULD ALSO LIKE TO THANK OUR FRIENDS FOR BEING AN AUDIENCE FOR OUR PRESENTATION.
TABLE OF CONTENTS
 GREEN COMPUTING
 FOG COMPUTING
 AUTONOMIC COMPUTING
 MONEY PAD, THE FUTURE WALLET
 VIRTUAL INSTRUMENTATION
GREEN COMPUTING

TABLE OF CONTENTS
 INTRODUCTION
 OBJECTIVES
 BENEFITS
 STRATEGIES
 CHALLENGES
 FUTURE TRENDS
 RECOMMENDATIONS
INTRODUCTION
 Green Computing, or Green IT, refers to the study and practice of environmentally sustainable computing or IT.
 Examples:
◦ Implementation of energy-efficient CPUs
◦ Proper disposal of electronic waste
OBJECTIVES
 Promote energy efficiency in computing systems
 Reduce electronic waste
 Minimize carbon footprint in IT operations
 Encourage recycling
BENEFITS
 Energy savings
 Cost reduction
 Reduced environmental impact
 Enhanced sustainability
STRATEGIES
 Virtualization and server consolidation
 Energy-efficient hardware and components
 Power management techniques
 Cloud computing and data center optimization
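As a minimal sketch (with made-up VM loads, not figures from this report), server consolidation can be illustrated as a bin-packing problem: virtual machines with low utilization are packed onto as few physical hosts as possible, so idle machines can be powered down.

```python
# Illustrative sketch: first-fit bin packing of VM CPU demands (in percent)
# onto as few hosts as possible, the idea behind server consolidation.
def consolidate(vm_loads, host_capacity):
    """Assign each VM load to the first host with room; return per-host loads."""
    hosts = []
    for load in vm_loads:
        for i, used in enumerate(hosts):
            if used + load <= host_capacity:
                hosts[i] = used + load
                break
        else:
            hosts.append(load)  # no host had room: power on a new one
    return hosts

# Ten lightly loaded VMs fit on 2 consolidated hosts instead of 10 machines.
vm_loads = [15, 20, 10, 25, 30, 5, 20, 15, 10, 25]
hosts = consolidate(vm_loads, host_capacity=100)
print(f"Hosts needed: {len(hosts)} instead of {len(vm_loads)}")
```

Every server that is powered down this way saves its full idle power draw, which is where the energy benefit of virtualization comes from.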

CHALLENGES
 Initial investments and cost barriers
 Lack of awareness
 Technology limitations
 Resistance to change
FUTURE TRENDS
 Renewable energy integration
 AI for energy optimization
 IoT and smart devices
RECOMMENDATIONS
 Conduct energy audits and assess IT infrastructure
 Adopt energy-efficient hardware and components
 Educate employees about green computing
FOG COMPUTING
TABLE OF CONTENTS
 WHAT IS FOG COMPUTING?
 APPLICATION OF FOG COMPUTING
 BENEFITS OF FOG COMPUTING
 DISADVANTAGES OF FOG COMPUTING
 CONCLUSION
WHAT IS FOG COMPUTING?
Fog Computing, also known as Edge Computing, is a model in which data, processing and applications are concentrated in devices at the network edge rather than existing almost entirely in the cloud.
The term "Fog Computing" was introduced by Cisco Systems as a new model to ease wireless data transfer to distributed devices in the Internet of Things (IoT) network paradigm.
APPLICATION OF FOG COMPUTING
Important areas where fog would play a vital role include the following:
• Connected cars
• Smart Grids
• Smart Cities
• Health Care
• Decentralized Smart Buildings
BENEFITS OF FOG COMPUTING
Privacy: It provides better privacy, as industries can perform analysis on their data locally.
Productivity: Reduces the response time of the system.
Security: It improves the overall security of the system, as the data resides close to the host.
Bandwidth: Since the distance to be travelled by the data is reduced, it results in saving bandwidth.
Latency: This approach reduces the amount of data that needs to be sent to the cloud.
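The bandwidth and latency benefits above can be sketched with a toy example (hypothetical sensor data, not from the report): an edge node summarizes raw readings locally and uploads only compact records, instead of streaming every sample to the cloud.

```python
# Illustrative sketch: an edge/fog node reduces each window of raw sensor
# samples to one (min, avg, max) record before sending anything upstream.
def edge_summarize(samples, window):
    """Collapse each window of raw samples into a single summary tuple."""
    summaries = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        summaries.append((min(chunk), sum(chunk) / len(chunk), max(chunk)))
    return summaries

# 600 simulated temperature samples (one per second for 10 minutes).
raw = [20.0 + (i % 7) * 0.1 for i in range(600)]
summary = edge_summarize(raw, window=60)  # one record per minute
print(f"Sent {len(summary)} records to the cloud instead of {len(raw)}")
```

Only the summaries cross the wide-area link, which is exactly the "perform analysis locally" benefit listed above.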
DISADVANTAGES OF FOG COMPUTING
 Complexity
 Security
 Maintenance
 Power Consumption
 Authentication
CONCLUSION
 Fog Computing aims to reduce the processing burden of cloud computing.
 Fog Computing will grow in helping the network paradigms that require faster processing.
 Fog Computing can address unsolved issues in cloud computing (for example, unreliable latency, lack of mobility support, and location awareness) by providing elastic resources and services to end users at the edge of the network.
AUTONOMIC COMPUTING
TABLE OF CONTENTS
 INTRODUCTION
 WHAT IS AUTONOMIC COMPUTING?
 KEY ELEMENTS OF AUTONOMIC COMPUTING
 ADVANTAGES OF AUTONOMIC COMPUTING
 DISADVANTAGES OF AUTONOMIC COMPUTING
 MISCELLANEOUS CHARACTERISTICS
 A GRAND CHALLENGE
 CONCLUSION
INTRODUCTION
 A software system that operates on its own, or with a minimum of human interference, according to a set of rules.
 Aims to increase productivity while minimizing complexity for users; such systems are capable of running themselves and adjusting to varying circumstances.
 Draws on control theory, adaptive algorithms, software agents, robotics, fault-tolerant computing, machine learning, artificial intelligence, and many more fields.
WHAT IS AUTONOMIC COMPUTING?
 "Autonomic Computing" is a new vision of computing initiated by IBM.
 This new paradigm shifts the fundamental definition of the technology age from one of computing to one defined by data.
 Access to data from multiple, distributed sources, in addition to traditional centralized storage devices, will allow users to transparently access information when and where they need it.
KEY ELEMENTS OF AUTONOMIC
COMPUTING

Knows Itself
Configure Itself
Optimise Itself
Heal Itself
Protect Itself
Adapt Itself
Open Itself
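The "heal itself" element above is commonly realized as a control loop: monitor a metric, analyse it against a healthy range, and execute a corrective action when it drifts out. A minimal sketch (illustrative only, not IBM's implementation; the readings and thresholds are invented):

```python
# Illustrative sketch of a self-healing control loop: the autonomic manager
# watches one metric and triggers a healing action when it leaves the
# healthy range.
class AutonomicManager:
    def __init__(self, healthy_range, heal_action):
        self.low, self.high = healthy_range
        self.heal_action = heal_action
        self.heal_count = 0

    def step(self, metric):
        """One loop iteration: analyse the reading, heal if out of range."""
        if not (self.low <= metric <= self.high):  # analyse
            self.heal_action()                     # plan + execute
            self.heal_count += 1

# Simulated CPU-load readings; 0.95 and 1.2 fall outside (0.0, 0.9)
# and trigger self-healing (the action is stubbed out here).
manager = AutonomicManager((0.0, 0.9), heal_action=lambda: None)
for reading in [0.4, 0.6, 0.95, 0.7, 1.2, 0.5]:
    manager.step(reading)
print(f"Healing actions taken: {manager.heal_count}")
```

In a real system the healing action would restart a service or reallocate resources; the same loop shape also underlies "configure itself" and "optimise itself".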
ADVANTAGES OF AUTONOMIC COMPUTING
 Reduced deployment and maintenance cost.
 Full use of idle processing power, including home PCs, through networked systems.
 Simplified user experience through a more responsive, real-time system.
 Makes products more reliable.
 Hides product complexity from the end user.
DISADVANTAGES OF AUTONOMIC COMPUTING
 Requires new algorithms to reach its potential.
 If the result of a computation happens not to be in a base position, the answer is too difficult to understand.
 Not a lot of practical experience with such systems yet.
 Greater need for cooling, for example down to −460 °F.
MISCELLANEOUS
CHARACTERISTICS

Automaticity
Adaptive
Aware
Reflexivity
Transparency
Open Source
Autonomicity and Evolvability
Easy to Train and Learn
A GRAND CHALLENGE
 A Grand Challenge is a problem that, by virtue of its degree of difficulty and the importance of its solution, both from a technical and societal point of view, becomes a focus of interest to a specific scientific community.
 The difficulty in developing and implementing autonomic computing is daunting enough to constitute a Grand Challenge.
CONCLUSION
 Is it possible to meet the grand challenge of autonomic computing without magic and without fully solving the AI problem?
 It is possible, but it will take time and patience.
 Long before we solve many of the more challenging problems, less automated realizations of autonomic systems will be extremely valuable, and their value will increase substantially as autonomic computing technology improves and earns greater trust and acceptance.
Money Pad
The Future Wallet
Introduction
• The wallet of the future will hold less paper money, coins and magnetic stripe cards. It will
instead feature Money Pad, which contains digital cash and other financial information that
can be automatically updated from a PDA with a satellite communications link.
• It works on biometric technology, authenticating the user by the impression of their fingerprint.
Transaction with Money Pad
• Place your finger on the touch sensor and then on the fingerprint reader.
• Enter bank code and account number to enter into e-bank services.
• If a fingerprint match occurs, the reader knows that the user is authorized and allows further transactions.
• If not, then the reader comes to know that the user is unauthorized and an email will be sent
to the authorized user.
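The steps above can be sketched in code (hypothetical helper names; real biometric matching and e-mail delivery are stubbed out, since the report does not specify them):

```python
# Illustrative sketch of the Money Pad transaction flow: a fingerprint
# match allows further transactions; a mismatch queues an alert e-mail
# to the registered owner instead.
def authorize(scanned_print, stored_print, owner_email, alerts):
    """Return True on a match; otherwise record an alert for the owner."""
    if scanned_print == stored_print:   # stand-in for biometric matching
        return True
    alerts.append(f"Unauthorized attempt reported to {owner_email}")
    return False

alerts = []
assert authorize("ridge-pattern-A", "ridge-pattern-A", "owner@example.com", alerts)
assert not authorize("ridge-pattern-B", "ridge-pattern-A", "owner@example.com", alerts)
print(alerts)
```

In a deployed system the equality check would be replaced by a proper fingerprint-matching algorithm, and the alert by an actual e-mail to the authorized user.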
Verification flow (originally shown as a diagram):
Fingerprint Reader (gathers information of the user) → Money Pad → Database Server (information is passed for verification) → if Authorized: allows further transactions; if Unauthorized: email sent to the authorized user.
Technical Implementation of Money Pad
• The Money Pad uses biometric technology as the technique by which security is provided. Biometric technology is used to accurately identify and verify one's identity.
• The accuracy of any biometric system is measured in two ways:
• False Acceptance Rate – where an impostor is accepted as a match.
• False Rejection Rate – where a legitimate match is denied access.
• Fingerprint verification is one such biometric system that authenticates whether the user is an authorized person or not. The user places their finger on a glass plate, and a high-resolution charge-coupled camera captures the image. The captured image is compared with the one in the system database to decide on user authentication. A fingerprint reader can be used for this purpose.
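The two accuracy measures can be computed from comparison scores, as in this minimal sketch (the scores and threshold are invented for illustration): each comparison yields a similarity score, and scores at or above the threshold count as a "match".

```python
# Illustrative sketch: computing the False Acceptance Rate (impostors
# wrongly accepted) and False Rejection Rate (genuine users wrongly
# rejected) at a given decision threshold.
def far_frr(impostor_scores, genuine_scores, threshold):
    """Return (FAR, FRR) for a score-based biometric matcher."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

impostors = [0.10, 0.25, 0.40, 0.55, 0.30]   # should all be rejected
genuine   = [0.90, 0.80, 0.45, 0.95, 0.70]   # should all be accepted
far, frr = far_frr(impostors, genuine, threshold=0.50)
print(f"FAR = {far:.0%}, FRR = {frr:.0%}")
```

Raising the threshold lowers FAR but raises FRR, and vice versa, which is why the two rates are always reported together.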
Application of Money Pad
 Applicable in e-banking and all types of e-transactions
 Can be used to conduct remote transactions
 Useful to carry digital cash
 Use of personal data when filling out order forms
 Applies to m-commerce transactions
 Applicable in daily life
Advantages of Money Pad
 Offers strong security
 Avoids the unsafe practice of carrying money
 Flexibility (no need to carry a separate ATM, debit or credit card, PAN card, or cash)
 Faster and smarter
Disadvantages of Money Pad
 It uses fingerprint detection for security purposes. So if a person's finger gets accidentally cut or injured, the detection technique will fail to match the patterns.
 Fingerprint sensors are sensitive, which works in their favor if the fingers are clean, but these sensors are inefficient for industries like mining, construction, and manufacturing.
Conclusion
• For a digital currency system to be widely recognized and used, the following three conditions are necessary:
• 1. Immediate release of funds
• 2. Elimination of payment risk
• 3. Secure transactions with strong encryption
• Since the Money Pad aims to meet the above conditions, there is no doubt that in the near future it will be widely recommended for use. Once implemented, the Money Pad has a wide range of applications.
Virtual Instrumentation
Outline
01 Introduction – Learn the technology of virtual instruments.
02 History – Brief discussion of the phases of virtual instrumentation.
03 Modules – Get an idea of the building blocks of virtual instrumentation.
04 Programming – Develop a monitoring system for measuring the performance of a condition.
INTRODUCTION
 "An instrument whose general function and capabilities are determined in software."
 Virtual instrumentation is an interdisciplinary field that merges sensing, hardware and software technologies in order to create flexible and sophisticated instruments for control and monitoring applications.
 This section covers:
◦ Concepts of virtual instrumentation
◦ A brief history of virtual instrumentation
◦ Architecture of a virtual instrument
◦ Development tools
 The concept of virtual instrumentation was born in the late 1970s, when microprocessor technology enabled a machine's function to be more easily changed by changing its software.
History of Virtual Instruments
The history of virtual instrumentation is characterized by a continuous increase in the flexibility and scalability of measurement equipment. Starting from the first manually controlled, vendor-defined electrical instruments, the instrumentation field has made great progress toward contemporary computer-controlled, user-defined, sophisticated measuring equipment.
 First Phase: represented by early "pure" analog measurement devices, such as oscilloscopes or EEG recording systems.
 Second Phase: started in the 1950s, as a result of demands from the industrial control field.
 Third Phase: measuring instruments became computer based.
 Fourth Phase: an important milestone in the history of virtual instrumentation happened in 1986.
Virtual Instrument Architecture
 Sensor Module: performs signal conditioning and transforms the signal into digital form for further manipulation.
 Sensor Interface: interfaces used for communication between sensor modules and the computer.
 Processing Module: integration of general-purpose microprocessors/microcontrollers.
 Database Interface: computerized instrumentation allows measured data to be stored for off-line processing.
 Medical Information System Interface: virtual instruments are increasingly integrated with other medical information systems, such as hospital information systems.
 User Interface: the space where interaction between humans and machines occurs.
 Figure 1 shows the general architecture of a virtual instrument: the sensor interface feeds the processing module, which connects upward to the database interface, the medical information systems interface, and the user interface (display and control).
Development Environments

Development of a virtual instrument is primarily concerned with the development of software, as sensors and hardware are generally available on the open market. We describe two types of virtual instrumentation development environments:
· Conventional programming language environments (Visual Basic, Visual C++, Delphi or Java), and
· Graphical programming environments (LabVIEW or Bio-bench)
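The conventional-language approach can be sketched in a few lines (a simulated sensor and invented values, not code from the report): acquire samples, condition them in software, and display a result — the sensor/processing/user-interface split described in the architecture above.

```python
# Illustrative sketch of a tiny virtual instrument in a conventional
# language: simulated acquisition, software signal conditioning, and a
# one-line textual "display".
import random

def acquire(n):
    """Sensor module stand-in: n noisy voltage samples around 5.0 V."""
    return [5.0 + random.uniform(-0.2, 0.2) for _ in range(n)]

def moving_average(samples, k):
    """Processing module: smooth the signal with a sliding window of size k."""
    return [sum(samples[i:i + k]) / k for i in range(len(samples) - k + 1)]

samples = acquire(100)
smoothed = moving_average(samples, k=10)
# User interface: report the conditioned reading.
print(f"Mean conditioned reading: {sum(smoothed) / len(smoothed):.2f} V")
```

Because the instrument's behaviour lives entirely in software, swapping the moving average for a different filter reconfigures the instrument without touching hardware — the flexibility the text attributes to virtual instruments.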
 LabVIEW is discussed in the next section.
LabVIEW
LabVIEW made the development of virtual instruments more accessible. It was launched in 1986 with the goal of providing a software tool that empowered engineers to develop customized systems.
 What it is: a program development environment, much like Java, C or BASIC. However, LabVIEW uses a graphical programming language, called G.
 What it includes: support for a vast number of interface standards, more than 4,000 built-in analysis functions, as well as support for SQL and ADO database connectivity.
Conclusion
 Virtual instrumentation brings many advantages over "conventional" instrumentation. Virtual instruments are realized using industry-standard multipurpose components, and they depend very little on dedicated hardware.
 Generally, virtual instruments are more flexible and scalable, as they can be easily reconfigured in software. Moreover, standard interfaces allow seamless integration of virtual instruments into distributed systems.
 Virtual instrumentation significantly decreases the price of an instrument, as it is based on mass-produced general-purpose computing platforms and dedicated sensors for a given application.
THANK YOU
So Much For Joining Us
