DeepTrading With TensorFlow 4 - TodoTrader
DeepTrading With TensorFlow 4 - TodoTrader
BLACK BELT
fter you have trained a neural network (NN), you would want to save it for future calculation and eventually deploying to production. So, what is a Tensorflow
model? Tensorflow model contains the network design or graph and values of the network parameters that we have trained.
A Important Note: I know that the reader is impatient to use real data from the financial markets. Please be patient, I promise that we will use them properly
when you are ready, but now we must strengthen our knowledge to have a strong foundation.
Also, remember to have a look at the first posts of the series to have the full picture:
https://todotrader.com/deeptrading-with-tensorflow/
https://todotrader.com/deeptrading-with-tensorflow-ii/
https://todotrader.com/deeptrading-with-tensorflow-iii/
The progress of the model can be saved during and after training. This means that a model can be resumed where it left off and avoid long training times. Saving also
means that you can share your model and others can recreate your work.
We will illustrate how to create a one hidden layer NN, save it and make predictions with a trained model after reloading it.
Again, we will use the iris data for this exercise. Remember the important note above!
We will build a one-hidden layer neural network to predict the fourth attribute, Petal Width from the other three (Sepal length, Sepal width, Petal length).
There are several differences with respect to the example before in order to lustrate
il more Tensorflow possibilities.
Caution: TensorFlow model files are code. Be careful with untrusted code. SeeUsing TensorFlow Securely for details.
Load configuration
In [1]:
# Before getting into pandas dataframes we will load an example dataset from sklearn library
# type(data) #iris is a bunch instance which is inherited from dictionary
data = load_iris() #load iris dataset
# Dimensions of dataset
print("Dimensions of dataset")
n = X_raw.shape[0]
p = X_raw.shape[1]
print("n=",n,"p=",p)
Dimensions of dataset
n= 150 p= 3
In [3]:
Out[3]:
In [4]:
X_raw.shape # Array 150x3. Each element is a 3-dimensional data point: sepal length, sepal width, petal length
Out[4]:
(150, 3)
In [5]:
y_raw.shape # Vector 150. Each element is a 1-dimensional (scalar) data point: petal width
Out[5]:
(150,)
In [6]:
df
Out[6]:
sepal length (cm) sepal width (cm) petal length (cm) petal width (cm)
… … … … …
#
# Leave in blanck intentionally
#
Split data
In [8]:
# split into train and test sets
# Total samples
nsamples = n
# Samples in train
nsamples_train = jindex
# Samples in test
nsamples_test = nsamples - nsamples_train
print("Total number of samples: ",nsamples,"\nSamples in train set: ", nsamples_train,
"\nSamples in test set: ",nsamples_test)
X_test = X_raw[jindex:, :]
y_test = y_raw[jindex:]
Transform features
Note
Be careful do not to write X_test_std = sc.fit_transform(X_test) instead of X_test_std = sc.transform(X_test) . In this case, it wouldn’t make a great difference since the mean
and standard deviation of the test set should be (quite) similar to the training set. However, this is not always the case in Forex market data, as has been well established
in the literature. The correct way is to re-use parameters from the training set if we are doing any kind of transformation. So, the test set should basically stand for “new,
unseen” data. In [9]:
# Scale data
from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train_std = sc.fit_transform(X_train)
X_test_std = sc.transform(X_test)
# Clears the default graph stack and resets the global default graph
ops.reset_default_graph()
In [11]:
# make results reproducible
seed = 2
tf.set_random_seed(seed)
np.random.seed(seed)
# Parameters
learning_rate = 0.005
batch_size = 50
n_features = X_train.shape[1]# Number of features in training data
epochs = 1000
display_step = 50
model_path = "/tmp/model.ckpt"
n_classes = 1
# Network Parameters
# See figure of the model
d0 = D = n_features # Layer 0 (Input layer number of features)
d1 = 10 # Layer 1 (1st hidden layer number of features. Selected 10 for this example)
d2 = C = 1 # Layer 2 (Output layer)
# tf Graph input
print("Placeholders")
X = tf.placeholder(dtype=tf.float32, shape=[None, n_features], name="X")
y = tf.placeholder(dtype=tf.float32, shape=[None,n_classes], name="y")
# Initializers
print("Initializers")
sigma = 1
weight_initializer = tf.variance_scaling_initializer(mode="fan_avg", distribution="uniform", scale=sigma)
bias_initializer = tf.zeros_initializer()
# Create model
def onelayer_perceptron(X, variables):
# Hidden layer with ReLU activation
layer_1 = tf.nn.relu(tf.add(tf.matmul(X, variables['W1']), variables['bias1']))
# Output layer with ReLU activation
out_layer = tf.nn.relu(tf.add(tf.matmul(layer_1, variables['W2']), variables['bias2']))
return out_layer
# Construct model
y_hat = onelayer_perceptron(X, variables)
Placeholders
Initializers
# Writer to record image, scalar, histogram and graph for display in tensorboard
writer = tf.summary.FileWriter("/tmp/tensorflow_logs", sess.graph) # create writer
writer.add_graph(sess.graph)
# Training cycle
train_loss = []
test_loss = []
# Close writer
writer.flush()
writer.close()
In [13]:
%matplotlib inline
# Plot loss (MSE) over time
plt.plot(train_loss, 'k-', label='Train Loss')
plt.plot(test_loss, 'r--', label='Test Loss')
plt.title('Loss (MSE) per Generation')
plt.legend(loc='upper right')
plt.xlabel('Generation')
plt.ylabel('Loss')
plt.show()
Tensorboard Graph
What follows is the graph we have executed and all the data about it. Note the “save” label.
Meta graph: This is a protocol buffer which saves the complete Tensorflow graph; i.e. all variables, operations, collections, etc. This file
has a .meta extension.
Two Checkpoint files: they are binary files which contain all the values of the weights, biases, gradients and all the other variables saved.
Tensorflow has changed from version 0.11. Instead of a single .ckpt file, we have now two files: .index and .data file that contains our
training variables.
Along with thes, Tensorflow also has a file named checkpoint which simply keeps a record of latest checkpoint files saved.
In [14]:
# Running a new session
print("Starting 2nd session...")
with tf.Session() as sess:
# Initialize variables
sess.run(init)
# Resume training
for epoch in range(epochs*2):
rand_index = np.random.choice(len(X_train), size=batch_size)
X_rand = X_train[rand_index]
y_rand = np.transpose([y_train[rand_index]])
sess.run(optimizer, feed_dict={X: X_rand, y: y_rand})
# Close writer
writer.flush()
writer.close()
Predict
We got it!
In [15]:
OK, not very good results. But it is worst that we could think! Data are not right because we have trained our model with transformed data (standardization) and now we
must use again transformed data to make predictions. Also, we will get back-transformed data again. So, we must inverse the transformation to get the original kind of
data.
First: transform our original data. The data we want to make the prediction about.
In [16]:
In [17]:
X_pred_std = sc.transform(X_pred)
X_pred_std
Out[17]:
In [18]:
In [19]:
y_hat_rev = sc.inverse_transform(prediction)
y_hat_rev
Out[19]:
array([[0.9423405],
[0.9764485],
[2.3178802]], dtype=float32)
Not bad. True values are 0.2, 0.1, 2.4. We’ll try to improve them with a deeper network. That is the goal of the next notebook.
https://github.com/parrondo/deeptrading
parrondo
TensorFlow
Related articles
1 COMMENT
Quantocracy's Daily Wrap for 06/23/2019 | Quantocracy
2019-06-24 at 07:17 R E P L Y
[…] DeepTrading with Tensorflow IV [Todo Trader] […]
Leave a Reply
Your email address will not be published. Required fields are marked *
Comment
Your Name*
Email Address*
Save my name, email, and website in this browser for the next time I
comment.
L E A V E C O M M E N T
Type Here
Recent Posts
DeepTrading with TensorFlow VI
Recent Comments
Quantocracy's Daily Wrap for 07/08/2019 | Quantocracy on DeepTrading with TensorFlow VI
Quantocracy's Daily Wrap for 07/03/2019 | Quantocracy on Deep Trading with TensorFlow V
Quantocracy's Daily Wrap for 06/23/2019 | Quantocracy on DeepTrading with Tensorflow IV
Quantocracy's Daily Wrap for 06/19/2019 | Quantocracy on DeepTrading with TensorFlow III
Quantocracy's Daily Wrap for 06/15/2019 | Quantocracy on DeepTrading with TensorFlow II
Archives
July 2019
June 2019
May 2019
April 2019
November 2017
Categories
Be Productive
Black Belt
Brown Belt
Create Systems
Software
Start to trade
Meta
Log in
Entries RSS
Comments RSS
WordPress.org
Theme by Powered by