
1) Hard computing, i.e., conventional computing, requires a precisely stated analytic model and often a lot of computation time. Soft computing differs from conventional (hard) computing in that it is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect, the role model for soft computing is the human mind.
2) Hard computing is based on binary logic, crisp systems, numerical analysis, and crisp software, whereas soft computing is based on fuzzy logic, neural networks, and probabilistic reasoning (a crisp-versus-fuzzy sketch follows after this list).
3) Hard computing has the characteristics of precision and categoricity; soft computing, those of approximation and dispositionality. Whereas in hard computing imprecision and uncertainty are undesirable properties, in soft computing the tolerance for imprecision and uncertainty is exploited to achieve tractability, lower cost, a high Machine Intelligence Quotient (MIQ), and economy of communication.
4) Hard computing requires programs to be written, uses two-valued logic, is deterministic, requires exact input data, is strictly sequential, and produces precise answers; soft computing can evolve its own programs, can use multi-valued or fuzzy logic, incorporates stochasticity, can deal with ambiguous and noisy data, allows parallel computations, and can yield approximate answers.
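As a loose illustration of the crisp-versus-fuzzy distinction mentioned in point 2 (the temperature threshold and ramp boundaries below are assumed values chosen for this sketch, not taken from the text), a two-valued test can be contrasted with a simple fuzzy membership function:

# Illustrative sketch only: crisp (hard computing) vs. fuzzy (soft computing)
# treatment of the statement "the temperature is hot". All numbers are assumed.

def crisp_is_hot(temp_c):
    # Two-valued logic with an exact cut-off: the answer is True or False.
    return temp_c >= 25.0

def fuzzy_is_hot(temp_c):
    # Degree of membership in [0, 1], rising linearly between 20 and 30 deg C.
    if temp_c <= 20.0:
        return 0.0
    if temp_c >= 30.0:
        return 1.0
    return (temp_c - 20.0) / 10.0

for t in (19.0, 24.0, 26.0, 31.0):
    print(t, crisp_is_hot(t), round(fuzzy_is_hot(t), 2))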

Introduction Associative memory

Pattern association involves associating a new pattern with a stored pattern. It is a simplified model of human memory.

Types of associative memory:

1. Heteroassociative memory
2. Autoassociative memory
3. Hopfield Net
4. Bidirectional Associative Memory (BAM)

These are usually single-layer networks.

The neural network is first trained to store a set of patterns of the form s:t, where s represents the input vector and t the corresponding output vector.

The network's memory is then tested by presenting it with patterns containing incorrect or missing information and checking whether it can still identify them.

Associative memory can be feedforward or recurrent.

Autoassociative memory cannot hold an unlimited number of patterns. Factors that affect its capacity include the complexity of each pattern and the similarity of the input patterns.

Autoassociative Memory

(Figure: autoassociative memory architecture)

The input and output vectors s and t are the same.


The Hebb rule is used as the learning algorithm: the weight matrix is calculated by summing the outer products of each input-output pair. The autoassociative application algorithm is then used to test the trained network.
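A minimal sketch of this procedure (illustrative only; the bipolar patterns and the sign-based recall step are assumptions chosen for the example, not data from this document):

import numpy as np

# Autoassociative Hebb storage: the weight matrix is the sum of the outer
# products s s^T over the stored bipolar patterns (s and t are the same here).
patterns = np.array([[ 1,  1,  1, -1, -1, -1],
                     [ 1,  1, -1,  1, -1,  1]])

W = sum(np.outer(s, s) for s in patterns)

def recall(x):
    # Application phase: each unit fires on the sign of its net input.
    return np.sign(W @ x)

noisy = np.array([1, 1, 1, -1, -1, 1])   # first pattern with its last bit flipped
print(recall(noisy))                      # recovers [ 1  1  1 -1 -1 -1]

Because the two stored patterns here are orthogonal, the corrupted input is still mapped back to the original; storing too many or too similar patterns would degrade recall, as noted above.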

Heteroassociative Memory

(Figure: heteroassociative memory architecture)

The input and output vectors s and t are different.

The Hebb rule is used as the learning algorithm: the weight matrix is calculated by summing the outer products of each input-output pair.

The heteroassociative application algorithm is then used to test the trained network.
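The same sketch adapted to the heteroassociative case (again with assumed example data): each outer product now pairs an n-dimensional input s with an m-dimensional target t, so W is n x m.

import numpy as np

# Heteroassociative Hebb storage: s and t have different dimensions (4 and 2).
S = np.array([[ 1, -1,  1, -1],
              [ 1,  1, -1, -1]])   # input vectors s
T = np.array([[ 1, -1],
              [-1,  1]])           # associated target vectors t

# Weight matrix: sum of outer products s t^T, giving a 4 x 2 matrix.
W = sum(np.outer(s, t) for s, t in zip(S, T))

def recall(x):
    # Maps an input vector to the sign of its net input at each output unit.
    return np.sign(x @ W)

print(recall(np.array([1, -1, 1, -1])))   # -> [ 1 -1]
print(recall(np.array([1, 1, -1, -1])))   # -> [-1  1]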


The Hebb Algorithm

Initialize weights to zero: wij = 0, where i = 1, ..., n and j = 1, ..., m.

For each training pair s:t, repeat:

xi = si, where i = 1, ..., n

yj = tj, where j = 1, ..., m

Adjust weights: wij(new) = wij(old) + xi * yj, where i = 1, ..., n and j = 1, ..., m.
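Written out element by element, the steps above might look like the following sketch (the variable and function names are chosen here for illustration); it builds the same weight matrix as the outer-product form used earlier:

def hebb_train(pairs, n, m):
    # Step 1: initialize weights to zero, w[i][j] = 0.
    w = [[0.0] * m for _ in range(n)]
    # Step 2: for each training pair s:t ...
    for s, t in pairs:
        x = list(s)              # x_i = s_i,  i = 1, ..., n
        y = list(t)              # y_j = t_j,  j = 1, ..., m
        for i in range(n):
            for j in range(m):
                # w_ij(new) = w_ij(old) + x_i * y_j
                w[i][j] += x[i] * y[j]
    return w

# Example (assumed data): one bipolar input associated with a 2-unit target.
print(hebb_train([([1, -1, 1], [1, -1])], n=3, m=2))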

Learning Rate
Data type: Real
Domain: [0, 1]
Typical value: 0.3
Meaning: Training parameter that controls the size of the weight and bias changes during learning of the training algorithm.
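A common generalization, not spelled out in the algorithm above, scales each Hebb update by the learning rate: wij(new) = wij(old) + a * xi * yj, so a typical value of a = 0.3 makes each weight change 30% of the corresponding outer-product term.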

