Friday, December 2, 2022

Training a sine wave with a feed-forward neural network

Let's create some sample sine wave data and add some noise to it.

import numpy as np
import matplotlib.pyplot as plt
import math
from sklearn.utils import shuffle

#lets take some 5000 points
n=5000

#lets consider 20% of the 5000 points as test data
test_per=0.2

#lets consider 20% of the 5000 points as validation data
val_per=0.2

#generate 5000 points over one full cycle (0 to 2*pi) of the sine wave
x=np.random.uniform(low=0,high=2*math.pi,size=n)
y=np.sin(x)+0.1*np.random.randn(n)

#lets shuffle the dataset so the train, test and validation sets each get a good variety of points
x,y=shuffle(x,y)
test_num=int(test_per*n)
val_num=test_num+int(val_per*n)
x_test,x_val,x_train=np.split(x,[test_num,val_num])
y_test,y_val,y_train=np.split(y,[test_num,val_num])

#lets plot train, test and validation data to understand the data size
plt.plot(x_train,y_train,"r.",label="train")
plt.plot(x_test,y_test,"b.",label="test")
plt.plot(x_val,y_val,"g.",label="val")

plt.show()



import numpy as np
from keras.models import Sequential
from keras.layers import Dense

#lets train the model with feed forward neural network

model = Sequential()

#I am taking 40 neurons in a single hidden layer. We could also implement the same with multiple hidden layers of fewer neurons (e.g. 16 each); a sketch of that variant follows this block.
model.add(Dense(40, input_dim=1, activation='sigmoid'))
model.add(Dense(1))
model.compile(loss='mae',
                       optimizer='adam',
                       metrics=['mae'])
model.fit(x_train, y_train,batch_size=100, epochs=800)
scores = model.evaluate(x_val, y_val)

#lets print the validation error (MAE) of the model; note MAE is an error in the units of y, not an accuracy percentage
print("\n%s: %.4f" % (model.metrics_names[1], scores[1]))
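
As mentioned in the comment above, the single 40-neuron hidden layer can be replaced with a deeper, narrower stack. Here is a minimal sketch of that variant, with two hidden layers of 16 neurons each (the layer sizes are illustrative, not tuned; loss and optimizer are kept the same):

#sketch: two hidden layers of 16 neurons instead of one layer of 40
model_deep = Sequential()
model_deep.add(Dense(16, input_dim=1, activation='sigmoid'))
model_deep.add(Dense(16, activation='sigmoid'))
model_deep.add(Dense(1))
model_deep.compile(loss='mae', optimizer='adam', metrics=['mae'])
model_deep.fit(x_train, y_train, batch_size=100, epochs=800)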

#lets predict the model with test data
y_pred=model.predict(x_test)

#lets plot the actual vs predicted test values
plt.scatter(x_test,y_test,marker=".",c="r")
plt.scatter(x_test,y_pred,marker=".",c='b')
plt.show()


The neural network trained well, and the predicted values almost fit the actual ones.

But wait a minute, let's add some future values and test the same model.

#lets add next cycle data points with some noise
x_extra=np.random.uniform(low=2*math.pi,high=4*math.pi,size=int(n/8))
y_extra=np.sin(x_extra)+0.1*np.random.randn(int(n/8))
#lets append the next cycle's points to the existing test data
x_future=np.append(x_test,x_extra)
y_future=np.append(y_test,y_extra)
#lets predict on all values
y_pred=model.predict(x_future)
#plot actual vs predicted
plt.scatter(x_future,y_future,marker=".",c="r")
plt.scatter(x_future,y_pred,marker=".",c='b')
plt.show()


Oops! The neural network learned only the range of data it was trained on and failed to predict future values. Don't worry, we can achieve this by training with algorithms that learn the pattern of a sequence, for example RNNs and LSTMs. We will perform that in the next blog.
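
For a taste of that approach, here is a minimal sketch of the sequence framing (not the next blog's full treatment): instead of mapping x to sin(x), we feed a window of past values and predict the next one. The window size, layer width and epoch count below are placeholder choices, not tuned values.

#sketch: reframe the sine series as next-step prediction for an LSTM
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

window=20
t=np.linspace(0,4*np.pi,1000)
series=np.sin(t)

#build inputs of shape (samples, window, 1) and next-step targets
X=np.array([series[i:i+window] for i in range(len(series)-window)])[...,None]
Y=series[window:]

seq_model = Sequential()
seq_model.add(LSTM(32, input_shape=(window,1)))
seq_model.add(Dense(1))
seq_model.compile(loss='mae', optimizer='adam')
seq_model.fit(X, Y, batch_size=64, epochs=20)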



Information: One can tune the neural network to any number of hidden layers and any number of
neurons per hidden layer. More layers and neurons give the model more capacity, so it may fit the
training data in fewer epochs, but each epoch costs more computation and the risk of overfitting
grows. A small helper for experimenting with this is sketched below.
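
As a small illustration, here is a hypothetical helper (the function name and defaults are mine, not part of this post) for experimenting with the number of hidden layers and neurons per layer:

#sketch: build a model with a configurable number of hidden layers and neurons per layer
from keras.models import Sequential
from keras.layers import Dense

def build_model(hidden_layers=2, units=16, activation='sigmoid'):
    m = Sequential()
    m.add(Dense(units, input_dim=1, activation=activation))
    for _ in range(hidden_layers-1):
        m.add(Dense(units, activation=activation))
    m.add(Dense(1))
    m.compile(loss='mae', optimizer='adam', metrics=['mae'])
    return m

#e.g. compare a wide shallow network against a deeper, narrower one
wide = build_model(hidden_layers=1, units=40)
deep = build_model(hidden_layers=3, units=16)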
