NNpred


Instructions on Using the Tool
(Building a Prediction Model)

Step 1: Enter Your Data


(A) Enter your data in the Data worksheet, starting from cell AC105.
(B) The observations should be in rows and the variables should be in columns.
(C) Above each column, choose the appropriate Type (Omit, Output, Cont, Cat):
To drop a column from the model, set the type = Omit
To treat a column as a categorical input, set type = Cat
To treat a column as a continuous input, set type = Cont
To treat a column as an output, set type = Output
You can have at most 10 output variables. The application will automatically treat them all as continuous variables.
Usually one builds a prediction model with 1 output only.
If you have, say, 2 output variables Y1 and Y2, both of which depend on the same set of input variables,
you may be better off building 2 separate models - one with Y1 as output, another with Y2 as output.

You can have at most 50 input variables, of which at most 40 can be categorical.
Make sure that the number of input (Cat & Cont) columns exactly matches the number entered in the UserInput sheet.
(D) Please make sure that your data does not have blank rows or blank columns.
(E) Continuous Inputs:
Any non-number in a Cont column will be treated as a missing value.
The application will replace it with the column mean.
(F) Categorical Inputs:
Any blank cell, or cell containing an Excel error, in a Cat column will be treated as a missing value.
The application will replace it with the most frequently occurring category.
Category labels are case insensitive - the labels good, Good, GoOd, and GOOD will all be treated as the same category.
There should be at least 2 observations in each category of a Cat column.
If a category of a Cat column has only 1 observation, you should do one of the following: remove that observation, OR
rename the category to one of the other categories of that Cat column.
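
For readers who want the imputation rules above in concrete form, here is a minimal Python sketch of the described behavior (mean imputation for continuous columns, case-insensitive mode imputation for categorical columns). The function names are illustrative, not part of the tool.

    from collections import Counter

    def impute_continuous(col):
        """Replace any non-numeric entry with the mean of the numeric entries."""
        numbers = [v for v in col if isinstance(v, (int, float))]
        mean = sum(numbers) / len(numbers)
        return [v if isinstance(v, (int, float)) else mean for v in col]

    def impute_categorical(col):
        """Replace blanks/errors with the most frequent category; labels are
        compared case-insensitively, so 'Good' and 'GOOD' count as one category."""
        labels = [str(v).lower() for v in col if v not in (None, "")]
        mode = Counter(labels).most_common(1)[0][0]
        return [str(v).lower() if v not in (None, "") else mode for v in col]

    print(impute_continuous([1.0, "n/a", 3.0]))        # [1.0, 2.0, 3.0]
    print(impute_categorical(["Good", None, "GOOD"]))  # ['good', 'good', 'good']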
Step 2: Fill up Model Inputs
(A) Fill up the model inputs in the UserInput page.
(B) Make sure that your inputs are within the range of values allowed by the application.
(C) Click the 'Build Model' button to start modeling.
Step 3: Results of Modeling
(A) A Neural Network model is basically a set of weights between the layers of the net.
At the end of the run, the final set of weights is saved in the Calc sheet.
(B) The Output page of this file will show you the values of MSE (mean squared error) and ARE (average relative error)
on the training and validation sets as the training of the model progresses. Two charts showing training and
validation MSEs are already provided in the Output sheet.
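
The two error measures can be stated concisely. The following is a minimal sketch of how they are computed, based on the ABS((True - Predicted)/True) formula the Calc sheet uses for the relative error; the Output sheet reports ARE as a percentage.

    def mse(y_true, y_pred):
        """Mean squared error."""
        return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

    def are(y_true, y_pred):
        """Average relative error: mean of |true - predicted| / |true|.
        Undefined when a true value is 0 (the sheet shows #DIV/0! there)."""
        return sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)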
(C) If, in the UserInput page, you have asked to save the model in a separate file, then a new file
will be created containing the model inputs, your data, and the fitted model (i.e. the weights).
You will be able to use this file as a calculator to do prediction, given any new input.
Step 4: Study Profiles
The fitted model is a surface in p dimensions, where p is the number of your inputs.
Unless p is 2 or less, it is not possible to show the surface graphically.
A profile plot is the next best way to visualize this fitted surface.
By varying only one predictor between two values and keeping all the others fixed at some pre-specified values,
we get the profile plot - which is really a one-dimensional cross section of the high-dimensional surface.
In the Profile sheet you can specify which predictor to vary and the values at which the other predictors should be held fixed.
Click the Create Profile button to generate the profile.
If the predictor you choose to vary is categorical, then the other info (# points to be generated, start and end values)
will be ignored, and the graph will show you the predicted response for each category of the predictor you have chosen to vary.
Profile plots let you study the following things:
(1) The nature of the relationship between a particular predictor X and the response Y
(e.g. Y increases as X increases, OR Y decreases as X increases,
OR the relationship is non-linear - Y first increases and then decreases with X, etc.).
(2) Profile plots also let you study the interaction between predictors.
Suppose there are two predictors X and Z, and we are studying the profile of Y as X varies.
Suppose we look at the profile by keeping Z fixed at 1 and varying X between -10 and 10.
Now keep Z fixed at 2 instead of 1 and vary X between -10 and 10.
If the shapes of the profiles in these two scenarios are drastically different
(e.g. one is increasing and the other is decreasing), then that says that X and Z have an interaction.
In other words, the effect of X on the response is not the same at all levels of Z:
to study the effect of X, it matters where Z is set.
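
As a concrete illustration, here is a minimal Python sketch of how a profile is generated for any fitted model: one predictor is swept over a grid while the others stay at fixed values. The predict function is a stand-in for whatever model you have fitted; all names are illustrative.

    def profile(predict, fixed, vary_name, start, end, n_points=100):
        """Sweep one predictor over [start, end] with the others held at `fixed`.

        predict   -- function mapping a dict of predictor values to a response
        fixed     -- dict of the other predictors' fixed values, e.g. {"X2": 1.385}
        vary_name -- name of the predictor being varied, e.g. "X1"
        """
        step = (end - start) / (n_points - 1)
        points = []
        for i in range(n_points):
            x = start + i * step
            inputs = dict(fixed)
            inputs[vary_name] = x
            points.append((x, predict(inputs)))
        return points

    # Interaction check, as described above: compare the profile of Y vs X
    # at Z = 1 with the profile at Z = 2; drastically different shapes
    # suggest an X-Z interaction.
    # curve_z1 = profile(predict, {"Z": 1}, "X", -10, 10)
    # curve_z2 = profile(predict, {"Z": 2}, "X", -10, 10)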

A few more points


Initial weights
For the training of the model, we need to start with an initial set of values of the network weights.
By default, the weights are initialized with random values between -w and w,
where w is a number between 0 and 1, specified by you in the UserInput page.
(A) Once you build a model, the final weights are stored in the Calc page.
The next time you want to train a model with the same architecture and the same data,
the application will ask you whether to start with the weights already saved in the Calc sheet.
If you say YES, these weights are used. If you say NO, the weights are re-initialized with random values.
(B) Instead of starting with random weights, you may want to start with your own choice of weights.
Specifying your choice of starting weights is a bit non-trivial for this application. Here is how you do it:
Specify the inputs in the UserInput page and specify the number of training cycles as 0.
This will just set up the Calc page without doing any training.
Now go to the Calc sheet and write down your choice of weights in the appropriate places of the weight matrices.
Now come back to the UserInput sheet, specify the number of training cycles you want, and click the Build Model button.
When the application asks whether to use the already saved weights, click the YES button.
Now your network will be trained with the starting weights specified by you.
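
A minimal sketch of the two initialization modes described above, assuming a simple list-of-matrices weight representation (the names and layout are illustrative, not the tool's internal code):

    import random

    def init_weights(shapes, w):
        """Random initialization: every weight drawn uniformly from [-w, w], 0 < w < 1."""
        return [[[random.uniform(-w, w) for _ in range(cols)] for _ in range(rows)]
                for rows, cols in shapes]

    def starting_weights(shapes, w, saved=None, reuse_saved=False):
        """Warm start from the weights saved in Calc (the YES answer),
        or re-initialize at random (the NO answer)."""
        if reuse_saved and saved is not None:
            return saved
        return init_weights(shapes, w)

    # For the 2-6-3-1 example network stored in the Calc sheet, the three
    # weight matrices (each with a bias row) would have shapes:
    # shapes = [(3, 6), (7, 3), (4, 1)]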

[Residue of the Data sheet's variable-summary tables: for each Cont. Var. they report the # Missing Values, Min, Max, Average, sd, and the Intercept and Slope used for scaling; for each Cat. Var., the # Missing Values, #Levels, Labels, Values, and Dummy coding.]

UserInput sheet (reconstructed from this extract; example values shown where they survive):

Network Architecture Options
Number of Inputs (between 2 and 50): 2
Number of Outputs (between 1 and 10): 1
Number of Hidden Layers (1 or 2): 2
Hidden Layer sizes (Maximum 20): Hidden 1 = 6, Hidden 2 = 3
Learning parameter (between 0 and 1): 0.7
Momentum (between 0 and 1): 0.5

Training Options
Total #rows in your data (Minimum 10): 13
No. of Training cycles (Maximum 500): 500
Present Inputs in Random order while Training?: NO
Training Mode (Batch or Sequential): Sequential
Initial Wt Range (0 +/- w): w = 0.99
Save Network weights: With least Training Error
(other choices: From very last cycle, With least Validation Error)

Training / Validation Set
Partition data into Training / Validation set, OR Use whole data as training set.
If you want to partition, how do you want to select the Validation set?
Please choose one option and fill up the input necessary for the selected option:
Option 1: Randomly select 10% of data as Validation set (between 1% and 50%)
Option 2: Use last ___ rows of the data as validation set
Save model in a separate workbook?: NO
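
The Learning parameter and Momentum fields control a gradient-descent-with-momentum weight update. A minimal sketch of that update rule, under the usual interpretation of batch vs. sequential training (this is not the tool's actual VBA code):

    def update_weight(w, grad, prev_delta, eta=0.7, alpha=0.5):
        """One gradient-descent step with momentum.

        eta   -- learning parameter (between 0 and 1)
        alpha -- momentum (between 0 and 1)
        The new step is the gradient step plus alpha times the previous step;
        in sequential mode it is applied after every presented row, in batch
        mode after each full pass over the training data.
        """
        delta = -eta * grad + alpha * prev_delta
        return w + delta, delta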

Enter your Data in this sheet

Instructions:
Start entering your data from cell AC105.
Make sure that row 104 is blank.
Specify the variable type in row 102 and the variable name in row 103.
Cont - for continuous input,
Cat - for categorical input,
Output - for output variable,
Omit - if you don't want to use the variable in the model.
For each continuous input, there will be 1 neuron in the Input layer.
For each categorical input with K levels, there will be K neurons in the Input layer.
Please make sure that there are no more than 50 neurons in the Input layer.
There should be at most 10 output variables - the application will treat them all as continuous.
There should be no more than 40 categorical input variables.
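
The K-neurons-per-categorical-input rule is one-of-K (dummy) coding. A minimal sketch, with illustrative names, assuming the case-insensitive labels described in the instructions:

    def one_hot(col):
        """Encode a categorical column as one indicator (0/1) input per level.
        A column with K distinct (case-insensitive) labels yields K neurons."""
        levels = sorted({str(v).lower() for v in col})
        return [[1 if str(v).lower() == lev else 0 for lev in levels] for v in col]

    print(one_hot(["red", "Blue", "RED"]))  # [[0, 1], [1, 0], [0, 1]]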

Sample data (13 rows, starting at cell AC105):

Var Type:   Cont   Cont   Output
Var Name:   X1     X2     Y

 X1    X2        Y
  0     1     2.65
  1    -2    14.10
  2     3    30.85
  3     5    76.75
 -1    -5    62.75
 -2     6    88.40
 -3     0   -10.50
 -4     1   -11.35
  5     2    28.10
  3    10   275.50
  1     2    14.10
 -1   -10   261.50
  5     5    83.75

[Residue of the Calc sheet for the saved example model ("Neural Network Model for Prediction", created on 6-Nov-07): 2 hidden layers; MSE(Training) = 154.621, MSE(Validation) = 267.5323; RMSE = 9.6373. The sheet holds, for each input, the scaling transform (Raw Input -> Transformed Input) and the fitted weight matrices of the 2-6-3-1 network, with columns Hdn1_bias, Hdn1_Nrn1..Hdn1_Nrn6, Hdn2_bias, Hdn2_Nrn1..Hdn2_Nrn3, Op_bias, Op_Nrn1. It also provides True Output (if available), Model (Predicted) Output, and ABS((True - Predicted)/True) columns for use as a prediction calculator; the ARE cell shows #DIV/0! when a true output is 0.]
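
To make the Calc sheet layout concrete, here is a minimal Python sketch of how such a model computes a prediction: raw inputs are scaled with the stored intercept and slope, then pushed through the two hidden layers and the output neuron. Sigmoid activations and the exact matrix layout are assumptions for illustration, not read from the sheet.

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def layer(inputs, weights):
        """weights: rows = [bias] + one row per input; one column per neuron."""
        n_neurons = len(weights[0])
        return [sigmoid(weights[0][j] +
                        sum(inputs[i] * weights[i + 1][j] for i in range(len(inputs))))
                for j in range(n_neurons)]

    def predict(raw, scale, w_hidden1, w_hidden2, w_output):
        """raw: raw input values; scale: (intercept, slope) per input."""
        x = [a + b * v for v, (a, b) in zip(raw, scale)]  # Raw -> Transformed Input
        h1 = layer(x, w_hidden1)     # 6 neurons: Hdn1_Nrn1..Hdn1_Nrn6
        h2 = layer(h1, w_hidden2)    # 3 neurons: Hdn2_Nrn1..Hdn2_Nrn3
        y = layer(h2, w_output)      # output neuron: Op_Nrn1
        # The tool reports predictions back on the original scale; that inverse
        # scaling step is omitted here.
        return y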

[Residue of the Output sheet for the saved example run: a 500-epoch table with columns Epoch, MSE (Original Scale), and ARE (%), reported as "Avg. error per Input (Original Scale)" for the Training Set and for the Validation Set. In this run the training MSE falls from about 9138.9 at epoch 1 to 154.621 at epoch 500; the validation MSE rises from about 3049.5 to a plateau near 3351.5 around epoch 150 before falling to 267.532 by epoch 500. Two charts, "MSE (Training)" and "MSE (Validation)", plot these traces against Epoch.]
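
Traces like the one summarized above are what the "Save Network weights" options act on. A minimal sketch of choosing which epoch's weights to keep, under the three options listed in the UserInput sheet (the function is illustrative, not the tool's code):

    def epoch_to_save(train_mse, valid_mse, option):
        """Pick which epoch's weights to keep.

        train_mse, valid_mse -- per-epoch error traces (lists)
        option -- "From very last cycle", "With least Training Error",
                  or "With least Validation Error"
        """
        if option == "With least Training Error":
            return min(range(len(train_mse)), key=train_mse.__getitem__)
        if option == "With least Validation Error":
            return min(range(len(valid_mse)), key=valid_mse.__getitem__)
        return len(train_mse) - 1  # from very last cycle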

Profile plot for the fitted model

[Residue of the Profile sheet for the saved example: "Generate profile for Y by varying X1 between -4 and (end value truncated in this extract), 100 data points, keeping the other predictors fixed at the specified values". The fixed-value table lists X1 = 0.692 and X2 = 1.385, and for the user's reference the sheet gives the Min/Max of each predictor in the original data: X1 from -4.00 to 5.00, X2 from -10.00 to 10.00. The generated table pairs 100 values of X1 (0.00 to 0.99 on the transformed scale) with the Predicted Y, which rises from 0.825 at X1 = 0 to a peak of about 0.954 near X1 = 0.31 and then falls to 0.556 at X1 = 0.99.]
