
University of the People

CS 4407- Data Mining and Machine Learning

UNIT 7: Artificial Neural Networks – Part 2

Programming Assign. Unit 7

Mary Barker (Instructor)

17th March 2025


Introduction

In Unit 6, I developed and trained a neural network to recognize numbers (0-9) and letters (A-F, H) based on their seven-segment display patterns. This report summarizes the process, including the number of iterations evaluated, the results obtained, and the alternatives tested to achieve the best performance in the minimum number of training steps.

Network Design Iterations

Several iterations of network designs were evaluated to determine the optimal configuration for accurate recognition. The primary focus was on the number of layers, neurons per layer, and training parameters such as learning rate, momentum, and weight range.

1. Initial Configuration:

• Layers: 3 (input, hidden, output)
• Neurons: 6 in the hidden layer
• Learning Rate: 0.3
• Momentum: 0.8
• Weight Range: -1 to 1
• Training Steps: 5,000

This initial configuration was chosen based on the complexity of the problem and the size of the input/output vectors (7 inputs and 7 outputs).

2. Refinement:

• After observing the error rate after 5,000 steps, I increased the number of training steps to 20,000, then 50,000, and finally 100,000 to further reduce the error rate.
• The learning rate and momentum were kept constant, as they provided a good balance between convergence speed and stability.
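The training procedure described above can be sketched as follows. This is only a minimal pure-Python illustration, not the tool actually used for the assignment: it assumes a 7-6-7 sigmoid network trained by backpropagation with momentum, initializes weights uniformly in the stated -1 to 1 range, and uses a small made-up pattern set (and shorter checkpoints) in place of the real pattern file, so its error values will differ from those reported below.

```python
import math
import random

random.seed(0)  # reproducible illustration

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy input -> target pairs of 7-bit vectors, standing in for the
# assignment's real seven-segment pattern file.
PATTERNS = [
    ([1, 0, 1, 0, 1, 0, 1], [1, 0, 1, 0, 1, 0, 1]),
    ([0, 1, 0, 1, 0, 1, 0], [0, 1, 0, 1, 0, 1, 0]),
    ([1, 1, 1, 0, 0, 0, 1], [1, 1, 1, 0, 0, 0, 1]),
]

N_IN, N_HID, N_OUT = 7, 6, 7   # 7 inputs, 6 hidden neurons, 7 outputs
LR, MOM = 0.3, 0.8             # learning rate and momentum from the report

# weights[i][j]: weight from unit i to unit j; the last row is the bias.
w1 = [[random.uniform(-1, 1) for _ in range(N_HID)] for _ in range(N_IN + 1)]
w2 = [[random.uniform(-1, 1) for _ in range(N_OUT)] for _ in range(N_HID + 1)]
dw1 = [[0.0] * N_HID for _ in range(N_IN + 1)]  # previous weight changes
dw2 = [[0.0] * N_OUT for _ in range(N_HID + 1)]

def forward(x):
    h = [sigmoid(sum(xi * w1[i][j] for i, xi in enumerate(x)) + w1[N_IN][j])
         for j in range(N_HID)]
    o = [sigmoid(sum(hj * w2[j][k] for j, hj in enumerate(h)) + w2[N_HID][k])
         for k in range(N_OUT)]
    return h, o

def train_step():
    """One backpropagation pass over all patterns; returns mean squared error."""
    err = 0.0
    for x, t in PATTERNS:
        h, o = forward(x)
        # Output and hidden deltas (sigmoid derivative = y * (1 - y)).
        do = [(t[k] - o[k]) * o[k] * (1 - o[k]) for k in range(N_OUT)]
        dh = [h[j] * (1 - h[j]) * sum(do[k] * w2[j][k] for k in range(N_OUT))
              for j in range(N_HID)]
        # Momentum update: new change = LR * gradient + MOM * previous change.
        for j in range(N_HID + 1):
            hj = h[j] if j < N_HID else 1.0  # bias input
            for k in range(N_OUT):
                dw2[j][k] = LR * do[k] * hj + MOM * dw2[j][k]
                w2[j][k] += dw2[j][k]
        for i in range(N_IN + 1):
            xi = x[i] if i < N_IN else 1.0
            for j in range(N_HID):
                dw1[i][j] = LR * dh[j] * xi + MOM * dw1[i][j]
                w1[i][j] += dw1[i][j]
        err += sum((t[k] - o[k]) ** 2 for k in range(N_OUT)) / N_OUT
    return err / len(PATTERNS)

# Record the error at checkpoints, analogous to the 5,000 / 20,000 /
# 50,000 / 100,000-step checkpoints used in the actual experiments.
errors = {}
for step in range(1, 2001):
    e = train_step()
    if step in (500, 1000, 2000):
        errors[step] = e
print(errors)
```

The checkpointed error dictionary mirrors how the report below tracks the error rate at increasing step counts: the error should fall as training continues.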

Results Obtained

The training process yielded the following results:

1. Error Rate:

• After 5,000 steps, the error rate was approximately 0.05, right at the acceptable threshold of 5% rather than safely below it.
• After 20,000 steps, the error rate dropped to 0.049.
• After 50,000 steps, the error rate decreased further to 0.03.
• Finally, after 100,000 steps, the error rate reached 0.024, well below the required threshold.

2. Testing Accuracy:

• The network was tested on all 17 patterns from the pattern file.
• The results showed that the network accurately recognized all patterns, with minor deviations (e.g., 0.01 or 0.02) in some output values, which were negligible and did not affect the overall accuracy.
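As a concrete reference for the 17 patterns mentioned above, the following sketch shows one common seven-segment encoding (segment order a-g is my own assumption; the assignment's actual pattern file may order or encode the segments differently):

```python
# Seven-segment encodings, segment order (a, b, c, d, e, f, g).
# This is one common hex-display convention, not necessarily the
# encoding used in the assignment's pattern file.
SEGMENTS = {
    "0": (1, 1, 1, 1, 1, 1, 0),
    "1": (0, 1, 1, 0, 0, 0, 0),
    "2": (1, 1, 0, 1, 1, 0, 1),
    "3": (1, 1, 1, 1, 0, 0, 1),
    "4": (0, 1, 1, 0, 0, 1, 1),
    "5": (1, 0, 1, 1, 0, 1, 1),
    "6": (1, 0, 1, 1, 1, 1, 1),
    "7": (1, 1, 1, 0, 0, 0, 0),
    "8": (1, 1, 1, 1, 1, 1, 1),
    "9": (1, 1, 1, 1, 0, 1, 1),
    "A": (1, 1, 1, 0, 1, 1, 1),
    "B": (0, 0, 1, 1, 1, 1, 1),  # rendered as lowercase 'b'
    "C": (1, 0, 0, 1, 1, 1, 0),
    "D": (0, 1, 1, 1, 1, 0, 1),  # rendered as lowercase 'd'
    "E": (1, 0, 0, 1, 1, 1, 1),
    "F": (1, 0, 0, 0, 1, 1, 1),
    "H": (0, 1, 1, 0, 1, 1, 1),
}
```

Because all 17 bit vectors are distinct, a correctly trained network can separate every pattern, which is consistent with the testing results reported above.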

Alternatives Tested

To determine the best approach for training the network, several alternatives were tested:

1. Learning Rate Adjustment:

• I experimented with different learning rates (e.g., 0.1, 0.5) but found that a learning rate of 0.3 provided the best balance between convergence speed and stability.

2. Momentum Adjustment:

• Momentum values of 0.5 and 0.9 were tested, but a momentum of 0.8 was found to be optimal for reducing oscillations and speeding up convergence.
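The stabilizing effect of momentum comes from the standard update rule, in which each weight change carries over a fraction of the previous change:

```latex
\Delta w_{ij}(t) = -\eta \frac{\partial E}{\partial w_{ij}} + \alpha \, \Delta w_{ij}(t-1)
```

with learning rate η = 0.3 and momentum α = 0.8 here. Gradient components that flip sign from step to step partially cancel in the accumulated change, damping oscillations, while components with a consistent sign build up, speeding convergence.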

3. Weight Range:

• I tested weight ranges of -0.5 to 0.5 and -2 to 2, but the range of -1 to 1 provided the best performance in terms of error reduction.

4. Number of Neurons:

• I experimented with different numbers of neurons in the hidden layer (e.g., 4, 8) but found that 6 neurons provided the best balance between model complexity and performance.
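Taken together, the alternatives above amount to searching a small parameter grid. The sketch below (illustrative only; the variable names are my own) enumerates the tested values; in the experiments above each parameter was varied one at a time around the baseline rather than exhaustively:

```python
from itertools import product

# Candidate values drawn from the alternatives described above.
learning_rates = [0.1, 0.3, 0.5]
momenta = [0.5, 0.8, 0.9]
weight_ranges = [(-0.5, 0.5), (-1, 1), (-2, 2)]
hidden_neurons = [4, 6, 8]

# The full grid over these values; each tuple is one candidate
# (learning rate, momentum, weight range, hidden neurons) configuration.
grid = list(product(learning_rates, momenta, weight_ranges, hidden_neurons))
baseline = (0.3, 0.8, (-1, 1), 6)  # the configuration chosen in this report
print(len(grid), baseline in grid)
```

Varying one parameter at a time keeps the number of training runs manageable (a handful of runs instead of the 81 configurations in the full grid), at the cost of possibly missing interactions between parameters.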

Conclusion

The final network configuration, with 6 neurons in the hidden layer, a learning rate of 0.3, momentum of 0.8, and a weight range of -1 to 1, achieved an error rate of 0.024 after 100,000 training steps. This configuration provided accurate recognition of all 17 patterns in the minimum number of training steps. The process demonstrated the importance of iterative testing and parameter tuning in developing an effective neural network.

