[PYTHON] Learn elliptical orbits with Chainer

Learn elliptical orbits with a feedforward neural network (FNN) using Chainer. Testing is done in open loop.

I will solve this problem while explaining how to use Chainer along the way.

Problem and approach

[Problem] Learn the elliptical orbit given by x = 0.8cos(θ), y = 0.8sin(θ).

[Approach] Let x(θn) = (0.8cos(θn), 0.8sin(θn)) with 0 ≤ θn ≤ 2π. Design an FNN that predicts x(θn+1) from x(θn), train it on the training data, then test and confirm the results on the test data.
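To make the task concrete, here is one input/target pair under the θn = 2πn/20 sampling used below (a quick check, not part of the training script):

import numpy as np

theta0=0.0
theta1=2.0*np.pi/20
x0=(0.8*np.cos(theta0),0.8*np.sin(theta0)) #input:  (0.8, 0.0)
x1=(0.8*np.cos(theta1),0.8*np.sin(theta1)) #target: (approx. 0.761, 0.247)
print(x0,x1)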

Data creation and variable setting

Training data and test data

Take sample points on the ellipse as follows.

[Training data] Take xtrain(θn) = (0.8cos(θn), 0.8sin(θn)) with θn = 2πn/20, where n runs over sample_no evenly spaced values in [0, 20]. Write it as follows.

a=np.linspace(0,20,sample_no) #sample_no values of n in [0,20]
xtrain=np.zeros((sample_no,input_no),dtype=np.float32)
xtrain[:,0]=0.8*np.cos(2.0*np.pi*a/20)
xtrain[:,1]=0.8*np.sin(2.0*np.pi*a/20)

[Test data] Take xtest(θn) = (0.8cos(θn), 0.8sin(θn)) with θn = 2πn/27, where n runs over sample_no evenly spaced values in [0, 27]. Similarly:

a=np.linspace(0,27,sample_no) #sample_no values of n in [0,27]
xtest=np.zeros((sample_no,input_no),dtype=np.float32)
xtest[:,0]=0.8*np.cos(2.0*np.pi*a/27)
xtest[:,1]=0.8*np.sin(2.0*np.pi*a/27)

Variables

Set the following:

- Number of training samples: sample_no
- Number of training iterations: epoch
- Layer sizes: input layer input_no = 2 (fixed), hidden layer hidden_no, output layer output_no = 2 (fixed)
- Batch size: bs
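In code, these settings look like this (values taken from the full script at the end):

sample_no=100 #number of training samples
epoch=500000  #number of training iterations
input_no=2    #input layer size (fixed)
hidden_no=2   #hidden layer size
output_no=2   #output layer size (fixed)
bs=25         #batch size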

Building the learning model with Chainer

Preparing the connections (Links)

Register Linear links under the names l1 and l2.


class FNN(Chain):
	def __init__(self): #Prepare the connections
		super(FNN,self).__init__(
			l1=L.Linear(input_no,hidden_no),
			l2=L.Linear(hidden_no,output_no),
		)

Note that writing l1=L.Linear(input_no,hidden_no) and l2=L.Linear(hidden_no,output_no) inside super().__init__() is the same as registering the links with self.add_link("l1",L.Linear(input_no,hidden_no)) and self.add_link("l2",L.Linear(hidden_no,output_no)), as sketched below.
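For reference, a minimal sketch of the add_link form of the same constructor:

class FNN(Chain):
	def __init__(self):
		super(FNN,self).__init__()
		self.add_link("l1",L.Linear(input_no,hidden_no))
		self.add_link("l2",L.Linear(hidden_no,output_no))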

Forward computation

A registered Link is called like a function. In principle, the argument should be a Variable instance.

#methods of class FNN
	def fwd(self,x):
		h1=F.tanh(self.l1(x))
		h2=F.tanh(self.l2(h1))
		return h2
	
	def get_predata(self,x):
		return self.fwd(Variable(x.astype(np.float32).reshape(sample_no,input_no))).data
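As a quick sanity check (assuming the model instance created in the optimization section below), get_predata wraps the array in a Variable, feeds it forward, and returns a plain ndarray:

y=model.get_predata(xtrain) #xtrain has shape (sample_no, input_no)
print(y.shape)              #-> (sample_no, output_no)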

Loss function

The loss is called from chainer.functions. This time we use the mean squared error.

#methods of class FNN
	def __call__(self,x,y): #Loss function
		return F.mean_squared_error(self.fwd(x),y)
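F.mean_squared_error averages the squared differences over all elements; a minimal check on known values (assuming the imports from the full script):

p=Variable(np.array([[0.0,0.0]],dtype=np.float32))
q=Variable(np.array([[1.0,1.0]],dtype=np.float32))
print(F.mean_squared_error(p,q).data) #-> 1.0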

Optimization

Create an optimizer and set it up with the FNN model. There are various choices, such as SGD, Adam, and RMSprop.

model=FNN()
optimizer=optimizers.SGD()
optimizer.setup(model)
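Swapping in a different optimizer changes only one line; for example, with Adam (a sketch, everything else stays the same):

optimizer=optimizers.Adam()
optimizer.setup(model)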

Training

Use the next data point (xtrain[n+1]) as the correct answer for the current data point (xtrain[n]). Then repeat gradient initialization, backpropagation, and parameter update: model.zerograds(), loss.backward(), optimizer.update().

for i in range(epoch):
	for j in range(sample_no): #target is the point one step ahead
		if (j+1<sample_no):
			ytrain[j]=np.array(xtrain[j+1])
		else:
			ytrain[j]=np.array(xtrain[0])

	model.zerograds()
	loss=model(xtrain,ytrain)
	loss.backward()
	optimizer.update()
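Incidentally, the inner loop that shifts the targets by one is equivalent to a single np.roll call, so the target array can be built once before the epoch loop (a sketch of the equivalent form):

ytrain=np.roll(xtrain,-1,axis=0) #ytrain[j]=xtrain[j+1], the last row wraps to xtrain[0]

for i in range(epoch):
	model.zerograds()
	loss=model(xtrain,ytrain)
	loss.backward()
	optimizer.update()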

Test

Since testing is done in open loop, the network reads each test point and predicts the next point from it. The test data just needs to be fed forward through the network, so a single line suffices:

yout=model.get_predata(xtest)
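For contrast, a closed-loop test would feed each prediction back in as the next input (a sketch, not part of this script):

pt=xtest[0:1] #start from one test point, shape (1, input_no)
points=[pt]
for n in range(sample_no-1):
	pt=model.fwd(Variable(pt)).data #the prediction becomes the next input
	points.append(pt)
yclosed=np.vstack(points) #shape (sample_no, input_no)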

Test results

The teacher data (the target ellipse) is drawn in blue and the test results in red.

After 100 training iterations: the output is still far off the ellipse.

After 5,000 iterations: it is getting closer to an ellipse.

After 5,000,000 iterations: almost the correct answer.

Full code

ellipse.py


#-*- coding:utf-8 -*-
import numpy as np
import chainer
from chainer import cuda,Function,gradient_check,Variable,optimizers,serializers,utils
from chainer import Link,Chain,ChainList
import chainer.functions as F
import chainer.links as L
import matplotlib.pyplot as plt

#Number of training samples
sample_no=100

#Number of training iterations
epoch=500000

#Layer sizes
input_no=2
hidden_no=2
output_no=2

#Training data creation
a=np.linspace(0,20,sample_no)
xtrain=np.zeros((sample_no,input_no),dtype=np.float32)
xtrain[:,0]=0.8*np.cos(2.0*np.pi*a/20)
xtrain[:,1]=0.8*np.sin(2.0*np.pi*a/20)

#Model building
class FNN(Chain):
	def __init__(self): #Prepare a connection
		super(FNN,self).__init__(
			l1=L.Linear(input_no,hidden_no),
			l2=L.Linear(hidden_no,output_no),
		)
	def __call__(self,x,y): #Loss function
		return F.mean_squared_error(self.fwd(x),y)

	def fwd(self,x):
		h1=F.tanh(self.l1(x))
		h2=F.tanh(self.l2(h1))
		return h2
	
	def get_predata(self,x):
		return self.fwd(Variable(x.astype(np.float32).reshape(sample_no,input_no))).data

#Optimization method
model=FNN()
optimizer=optimizers.SGD()
optimizer.setup(model)

#Stores the correct answer value for training
ytrain=np.zeros(input_no*sample_no).reshape(sample_no,input_no).astype(np.float32)

#Batch size (defined but not used in this full-batch script)
bs=25

#Training
for i in range(epoch):
	for j in range(sample_no): #target is the point one step ahead
		if (j+1<sample_no):
			ytrain[j]=np.array(xtrain[j+1])
		else:
			ytrain[j]=np.array(xtrain[0])

	model.zerograds()
	loss=model(xtrain,ytrain)
	loss.backward()
	optimizer.update()

#Test (open loop)
a=np.linspace(0,27,sample_no)
xtest=np.zeros((sample_no,input_no),dtype=np.float32)
xtest[:,0]=0.8*np.cos(2.0*np.pi*a/27)
xtest[:,1]=0.8*np.sin(2.0*np.pi*a/27)
yout=model.get_predata(xtest)
print(yout)

#drawing
plt.plot(yout[:,0],yout[:,1],"r",label="test result")      #Draw test results in red
plt.plot(xtrain[:,0],xtrain[:,1],"b",label="teacher data") #Draw teacher data in blue
plt.legend()
plt.show()
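Since serializers is already imported, the trained parameters can be saved and restored afterwards (a sketch; the filename fnn.model is an arbitrary choice):

serializers.save_npz("fnn.model",model)  #save the trained parameters
model2=FNN()
serializers.load_npz("fnn.model",model2) #restore into a fresh FNN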
