
Assessment-2

CIS006-2-Random Search Optimisation


Unit coordinator: Dr. Swati Shah

Submitted to: University of Bedfordshire, United Kingdom

Submitted by: Lokendra Bdr Dangi (1720849), [email protected]
Acknowledgement
I am very thankful to my group members and to our unit coordinator, Dr. Swati Shah, for
her guidance in understanding the core concepts related to this assignment. I am also
grateful to my group members for sharing the knowledge and ideas that helped us
implement a solution with high accuracy. I would especially like to thank the group
manager for coordinating every Friday report, and the university for setting the problem
so that we could give our best solution to it.

BSc CS&SE

Patan College for Professional Studies


Abstract
The human face is a complex 3D visual object, and developing a computational model of the
face is a difficult and challenging task. Face recognition is one of the most relevant
applications of image analysis, and building an automated system that matches human ability
at recognising faces is far from trivial. In this assignment we, the members of Group 1,
implement a solution for facial recognition by applying the core concepts of artificial
intelligence and artificial neural networks. The test results gave a recognition rate
of 99%.
Contents
Acknowledgement
Abstract
1. Introduction
2. Design and implementation
3. Experiments
   i. Fold 1 (Default value and default setting)
   ii. Fold 2 (Default value with increasing hidden layer)
   iii. Fold 3
   iv. Fold 4 (By increasing the hidden layer)
4. Conclusion
References
Appendix
1. Introduction

Random search is a family of numerical optimisation methods that does not require the
gradient of the problem being optimised, so it can be used on functions that are neither
continuous nor differentiable. It is nearly impossible to predict the optimal parameters
of a model in advance, at least in the first few attempts, which is why we tune the
hyperparameters by experimenting with them. In the past we trained our neural networks by
backpropagation, which becomes impractical with large datasets: the weights are corrected
and updated with the generalised delta rule to minimise the prediction error in each
iteration. The weight-correction step backpropagates the error from the output layer to
the hidden layers to find an optimal set of weights, and this process takes a long time
to reach weights that give high accuracy and low error. Updating the weights manually in
every iteration is also impossible when the dataset is very large. For these reasons, in
this experiment we used random search optimisation, which is one of the methods of
hyperparameter tuning.

Hyperparameter tuning refers to shaping the model architecture within the available
search space; in simple words, it is nothing but searching for the right hyperparameters
to reach high accuracy and precision. Random search is a technique where random
combinations of the hyperparameters are tried to find the best configuration for the
model. It is similar to grid search, yet it has been shown to yield comparatively better
results: because the combinations are sampled completely at random, most of the search
space is likely to be reached without the cost of covering every combination, as grid
search must. Not all hyperparameters are equally important; those that drive the model
towards the global minimum with high accuracy, high precision, and low error are the
ones to focus on.
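The random-search idea described above can be sketched in a few lines of code. The following Python illustration is not the MATLAB scripts used in the assignment; the search space and the scoring function are hypothetical stand-ins for training and validating the network:

```python
import random

def accuracy(params):
    # Stand-in for training and validating the network with these
    # hyperparameters; a toy quadratic score replaces the real model.
    return -(params["hidden"] - 5) ** 2 - (params["pcs"] - 40) ** 2

def random_search(n_trials, seed=0):
    """Try n_trials random hyperparameter combinations and keep the best."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Sample one random combination from the search space.
        params = {"hidden": rng.randint(1, 10), "pcs": rng.randint(10, 80)}
        score = accuracy(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, score = random_search(200)
```

No gradient of the scoring function is needed, which is exactly why the method also works when the objective is not continuous or differentiable.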

Random search optimisation algorithms include simulated annealing, tabu search, genetic
algorithms, ant colony optimisation, cross-entropy, and others. Here I discuss only
simulated annealing.

Simulated Annealing.

Simulated annealing is a meta-algorithm for global optimisation. It is an analogy to the
physical process in which a solid is slowly cooled until its structure freezes into a
minimum-energy configuration. As in the backpropagation algorithm, in simulated annealing
the weights pass through a number of configurations during the process until a global
minimum is reached.
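A minimal sketch of the annealing loop, again in Python rather than the assignment's MATLAB, with a hypothetical one-dimensional cost function standing in for the network error:

```python
import math
import random

def simulated_annealing(cost, start, neighbour, t0=1.0, cooling=0.95,
                        steps=500, seed=0):
    """Minimise `cost`: worse moves are accepted with probability
    exp(-delta / T), and T shrinks as the system 'cools'."""
    rng = random.Random(seed)
    current, best, t = start, start, t0
    for _ in range(steps):
        candidate = neighbour(current, rng)
        delta = cost(candidate) - cost(current)
        # Always accept improvements; accept worse moves with a
        # probability that falls as the temperature drops.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = candidate
        if cost(current) < cost(best):
            best = current
        t *= cooling  # geometric cooling schedule
    return best

# Toy example: minimise (x - 3)^2 starting from x = 10.
best = simulated_annealing(lambda x: (x - 3) ** 2, 10.0,
                           lambda x, rng: x + rng.uniform(-1, 1))
```

Early on, the high temperature lets the search escape local minima; late on, only improving moves survive, so the configuration freezes near a minimum.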

2. Design and implementation


The university asked us to design and implement a high-accuracy facial recognition
solution using random search optimisation, so we chose MATLAB to implement it. The
solution is based on the scripts given by the university, mainly the Call_im,
Call_fold_data, and Call_pw_annealing scripts. We had to analyse each and every script
before implementing the solution; every team member did individual research and then
compared results with the other members. To obtain high accuracy it is essential to
understand every MATLAB script. The scripts are described in detail below.

 Image data set:


The University of Bedfordshire provided the scripts together with an image set of 30
persons, with 50 images per person (1500 images in total). We had to run the scripts
repeatedly and compare the accuracies obtained. In this assignment we used a random
search algorithm to find the highest accuracy, rather than training the neural network
on hand-picked input values. Since the university had already provided the MATLAB
scripts, we only had to run them and analyse the output results to find the highest
accuracy.

 Call_im.m:

This is the first script the university provided. All 1500 images are kept in one
folder, and the path to that folder is set both in MATLAB and inside the script. The
1500 images are then resized to 32*32 pixels to standardise them and make them easier
to recognise. The images are also converted to double precision and recorded as noc and
npic in the MATLAB workspace. All images are then arranged in row-and-column format in
the workspace, and finally the im.mat file is created by this script.
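The preprocessing that Call_im.m performs can be sketched as follows. This is a Python/NumPy illustration of the same idea, not the university's MATLAB code; the nearest-neighbour resize and the toy input images are stand-ins:

```python
import numpy as np

def resize_nearest(img, size=(32, 32)):
    """Nearest-neighbour resize, standing in for MATLAB's imresize."""
    h, w = img.shape
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    return img[rows][:, cols]

def build_image_matrix(images):
    """Resize every image to 32x32, cast to double precision, and
    stack each flattened image as one row of the data matrix."""
    rows = [resize_nearest(im).astype(np.float64).ravel() for im in images]
    return np.vstack(rows)

# Toy data: four random 'grayscale images' of differing sizes.
rng = np.random.default_rng(0)
images = [rng.integers(0, 256, size=(60, 48)) for _ in range(4)]
X = build_image_matrix(images)
# Each 32x32 image becomes one row of 1024 values.
```

With the full dataset this would yield a 1500 x 1024 double-precision matrix, which is the kind of array the later scripts consume.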

 Call_fold_data.m:

In this script the data produced by the first script are placed in a variable called D1.
The data are run through 5-fold cross-validation, and the output variable D1 is visible
in the MATLAB workspace. The variables X1, T1, X2, and T2 are used: the training data
matrix, the training target vector, the validation data matrix, and the target vector of
the images in the validation set. Principal component analysis is also applied in this
script, and a for loop is used to iterate over the columns.
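The two operations this script combines, a 5-fold split and a PCA projection, can be sketched like this. This is a hedged Python/NumPy illustration of the concepts, not the provided MATLAB; the function names and the random test matrix are hypothetical:

```python
import numpy as np

def five_fold_split(n_samples, n_folds=5, seed=0):
    """Shuffle sample indices and return (train, validation) index
    pairs, one pair per fold of the cross-validation."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n_samples), n_folds)
    return [
        (np.concatenate([folds[i] for i in range(n_folds) if i != k]),
         folds[k])
        for k in range(n_folds)
    ]

def pca_projection(X, n_components):
    """Centre the data and return its mean plus the top principal
    axes, computed via SVD."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components].T  # columns are principal axes

splits = five_fold_split(1500)          # 5 train/validation partitions
mean, W = pca_projection(np.random.default_rng(1).normal(size=(50, 20)), 5)
```

Each fold leaves out a different fifth of the 1500 images for validation, which is what lets the experiments in Section 3 report an accuracy per fold.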

 Call_pw_annealing_main.m:

This is the third script the university provided. The data are stored in the variables
mentioned in the description above; the main aim of the annealing function is to
optimise the pairwise annealing. In the workspace, D1 is the image dataset created by
the earlier function, containing the 1500 images. Variables such as the number of binary
classifiers (nobc), the number of classifiers (noc), the binary classifier (nbc), and
the number of folds (nf) are used in this script.
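The script combines a number of binary classifiers into an ensemble. The majority-vote idea behind such an ensemble can be sketched as follows; this is a Python illustration with hypothetical toy classifiers, not the pairwise annealing code itself:

```python
from collections import Counter

def ensemble_predict(classifiers, sample):
    """Each binary classifier casts a vote; the majority label wins."""
    votes = [clf(sample) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Three toy threshold 'classifiers' on a scalar feature.
classifiers = [
    lambda x: 1 if x > 0.3 else 0,
    lambda x: 1 if x > 0.5 else 0,
    lambda x: 1 if x > 0.7 else 0,
]
label = ensemble_predict(classifiers, 0.6)  # two of three vote 1
```

The ensemble size (the e size column in the tables below) controls how many such classifiers vote, which is one of the parameters varied in the experiments.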
3. Experiments
The University of Bedfordshire had already provided the scripts written in MATLAB, so we
only had to run them and analyse the results. There are five members in our group; each
of us had to run the scripts and write an individual report, with the highest accuracy
among the members noted, so we ran the scripts three times in each fold. To reach the
highest accuracy I focused mainly on the neural network, training it three times with
different parameters. The network is trained mainly according to several factors, i.e.
the number of principal components, the number of hidden neurons, and the VR ratio. In
the first fold I ran the experiment three times with the default values and recorded the
maximum accuracy; I then changed the ensemble size and the number of hidden neurons
while keeping the number of PCs and the minimum PC constant. I ran the remaining folds
in the same way.
i. Fold 1 (Default value and default setting)

No of PC   Min PC   E size   No HN   Av   Accuracy
80         20       3        1       2    79.8%
80         20       2        2       2    81.8%
80         20       4        3       2    82.2%

In every fold I ran the experiment three times. In this first fold I trained the neural
network three times, starting with the default values and then changing them. The
highest accuracy was 82.2%, obtained when the number of PCs was 80, the minimum PC was
20, the ensemble size was 4, and the number of hidden neurons was 3; the average value
was 82.27%.

Figure 1 Default value with Time Run


ii. Fold 2 (Default value with increasing hidden layer)

No of PC   Min PC   E size   No HN   Av   Accuracy
80         20       3        1       2    81.5%
80         20       2        2       2    82.5%
80         20       4        3       2    83.8%

In the second fold I also ran the experiment three times, with the default values and
then with changed values; in this fold I changed the ensemble size and the number of
hidden neurons. With the default values the accuracy was 81.5%. In the same fold, when I
changed the ensemble size from 3 to 2 and the hidden neurons from 1 to 2, the accuracy
became 82.5%. The average accuracy in this experiment was 82.5%.
iii. Fold 3

No of PC   Min PC   E size   No HN   Av   Accuracy
80         20       3        1       2    82.2%
80         20       2        2       2    80.2%
80         20       4        3       2    82.4%

In the third fold I also ran the experiment three times, with the default values and
then with changed values; again I changed the ensemble size and the number of hidden
neurons. With the default values the accuracy was 82.2%. When I changed the ensemble
size from 3 to 2 and the hidden neurons from 1 to 2, the accuracy became 80.2%. The
average accuracy in this experiment was 82.4%.
iv. Fold 4 (By increasing the hidden layer)
No of PC   Min PC   E size   No HN   Av   Accuracy
80         20       3        1       2    84.2%
80         20       2        2       2    84.5%
80         20       4        3       2    85.2%

In the fourth fold I also ran the experiment three times, with the default values and
then with changed values; again I changed the ensemble size and the number of hidden
neurons. With the default values the accuracy was 84.2%. When I changed the ensemble
size from 3 to 2 and the hidden neurons from 1 to 2, the accuracy became 84.5%. The
average accuracy in this experiment was 85.2%.
4. Conclusion
In this assignment we came to understand the concepts of artificial intelligence, neural
networks, and related topics. It was necessary to understand every topic and every
script provided by the university, and to obtain high accuracy it also helps to
understand hill climbing, AdaBoost, and annealing in depth. Every group member ran the
experiments individually and obtained their own results. To obtain high accuracy I ran
my experiments three times with the default values and then changed different
parameters. The highest accuracy in my group was 96%, and my own highest accuracy was
85.2% after changing the values. From all these experiments I found that as the number
of hidden neurons increases, the accuracy also increases.
References
Anon., 2019. Facial recognition. [Online] Available at:
https://findbiometrics.com/solutions/facial-recognition/ [Accessed 18 February 2019].

Coppin, B., 2004. Artificial Intelligence. s.l.: Jones and Bartlett.

Grady, D., 1993. The Vision Thing: Mainly in the Brain. [Online] Available at:
http://discovermagazine.com/1993/jun/thevisionthingma227 [Accessed 17 February 2019].

Hassaballah, M. & Aly, S., 2014. Face recognition: challenges, achievements and future
directions. The Institution of Engineering and Technology, 9(4), pp. 614-628.

Roomi, M. M., 2013. A review of face recognition methods. International Journal of
Pattern Recognition and Artificial Intelligence, 27, p. 35.
Appendix
Fold 1 High Accuracy Screen shot

Figure 2 High Accuracy on fold 1


Fold 2 High Accuracy Screen shot

Figure 3 High Accuracy on fold 2


Fold 3 High Accuracy Screen shot

Figure 4 High Accuracy fold 3


Call_im Script

Figure 5 Call_im image set code


Call_fold_data_im

Figure 6 Call_fold_data code


Call_pw_annealing

Figure 7 Call_annealing_pw Code
