I. Summary

One-sentence summary:

This application does not track accuracy (it is not a classification problem); just watch the loss. If the loss is low, the predictions will naturally be accurate.
# Build the Sequential container
model = tf.keras.Sequential()
# Input layer
model.add(tf.keras.Input(shape=(60,1)))
model.add(tf.keras.layers.GRU(80, return_sequences=True))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.GRU(100))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.Dense(1))

# Model structure
model.summary()
# This application only monitors the loss value, not accuracy

# Configure the optimizer and loss function
model.compile(optimizer=tf.keras.optimizers.Adam(0.001),loss='mse',metrics=['acc'])
# Start training
history = model.fit(x_train,y_train,epochs=50,validation_data=(x_test,y_test))  # epochs is the number of training passes

1. Error: ValueError: Failed to find data adapter that can handle input: <class 'numpy.ndarray'>, (<class 'list'> containing values of types {"<class 'numpy.float64'>"})?

Most likely some of the data was not converted to numpy: here, y_train and y_test had not been converted to numpy arrays.
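A minimal sketch of the fix, assuming the splits were assembled as Python lists as in the notebook below:

# Keras' data adapters accept numpy arrays but not plain Python lists,
# so convert every split before calling model.fit()
x_train = np.array(x_train)
y_train = np.array(y_train)
x_test = np.array(x_test)
y_test = np.array(y_test)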

2. Issue: training and test accuracy are both 0?

The real reason is that this application only monitors the loss value, not accuracy; 'acc' is a classification metric and carries no meaning for this regression task.
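If a metric is wanted during training anyway, a regression error metric such as 'mae' (mean absolute error, a built-in Keras metric) is the appropriate choice; a minimal sketch, assuming the model defined above:

# Track mean absolute error instead of classification accuracy
model.compile(optimizer=tf.keras.optimizers.Adam(0.001), loss='mse', metrics=['mae'])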

3. How to convert normalized data back to the original scale?

Use MinMaxScaler's inverse_transform method: predicted_stock_price = sc.inverse_transform(predicted_stock_price)
# Feed the test set into the model for prediction
predicted_stock_price = model.predict(x_test)
# Restore the predictions --- inverse-transform from (0,1) back to the original range
predicted_stock_price = sc.inverse_transform(predicted_stock_price)
# Restore the ground truth --- inverse-transform from (0,1) back to the original range
real_stock_price = sc.inverse_transform(test_set[60:])
# Plot the real vs. predicted curves
plt.plot(real_stock_price, color='red', label='MaoTai Stock Price')
plt.plot(predicted_stock_price, color='blue', label='Predicted MaoTai Stock Price')
plt.title('MaoTai Stock Price Prediction')
plt.xlabel('Time')
plt.ylabel('MaoTai Stock Price')
plt.legend()
plt.show()
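Note that MinMaxScaler.inverse_transform expects a 2D array of shape (n_samples, n_features), which is why the (240, 1) output of model.predict can be passed in directly. A minimal sketch for inverting a single value, assuming the fitted scaler sc from the notebook below:

# A lone value must be wrapped to 2D before inverting
print(sc.inverse_transform([[0.5]]))  # 0.5 * (788.8 - 80.406) + 80.406 = 434.603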

4. What data formats do .iloc[0:2126,2:3].values and .iloc[0:2126,2:3] return?

(I) .iloc[0:2126,2:3].values returns a numpy.ndarray
(II) .iloc[0:2126,2:3] returns a pandas.core.frame.DataFrame

II. Stock Price Prediction with a Recurrent Neural Network (GRU)

Video location in the course corresponding to this blog post:

Steps

1. Read the dataset
2. Split the dataset (into a training set and a test set)
3. Build the model
4. Train the model
5. Evaluate the model

Dependencies

In [1]:
import pandas as pd
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

1. Read the dataset

In [2]:
data = pd.read_csv("SH600519.csv")
print(data.head(5))
print(data.shape)
   Unnamed: 0        date    open   close    high     low     volume    code
0          74  2010-04-26  88.702  87.381  89.072  87.362  107036.13  600519
1          75  2010-04-27  87.355  84.841  87.355  84.681   58234.48  600519
2          76  2010-04-28  84.235  84.318  85.128  83.597   26287.43  600519
3          77  2010-04-29  84.592  85.671  86.315  84.592   34501.20  600519
4          78  2010-04-30  83.871  82.340  83.871  81.523   85566.70  600519
(2426, 8)

2. Split the dataset (into a training set and a test set)

Use the last 300 rows as the test set (2426 - 300 = 2126) and all rows before them as the training set.

In [3]:
# .iloc[0:2126,2:3].values returns a numpy.ndarray
# .iloc[0:2126,2:3] returns a pandas.core.frame.DataFrame
train_data = data.iloc[0:2126,2:3].values
train_data1 = data.iloc[0:2126,2:3]
test_data = data.iloc[2126:,2:3].values
print(type(train_data))
print(type(train_data1))
print(train_data.shape)
print(test_data.shape)
print(train_data[0:5])
print(test_data[0:5])
<class 'numpy.ndarray'>
<class 'pandas.core.frame.DataFrame'>
(2126, 1)
(300, 1)
[[88.702]
 [87.355]
 [84.235]
 [84.592]
 [83.871]]
[[677.5 ]
 [684.99]
 [680.  ]
 [697.04]
 [695.  ]]
In [4]:
# Normalization
from sklearn.preprocessing import MinMaxScaler
sc = MinMaxScaler(feature_range=(0, 1))  # Define the scaler: normalize to the (0,1) range
training_set = sc.fit_transform(train_data)  # Learn the min/max (properties of the training set alone) and normalize the training set
test_set = sc.transform(test_data)  # Normalize the test set using the training set's min/max
In [5]:
# The min and max learned for min-max normalization
print(sc.data_max_)
print(sc.data_min_)

print(training_set.shape)
print(test_set.shape)
print(training_set[:5])
print(test_set[:5])
[788.8]
[80.406]
(2126, 1)
(300, 1)
[[0.011711  ]
 [0.00980951]
 [0.00540518]
 [0.00590914]
 [0.00489135]]
[[0.84288404]
 [0.85345726]
 [0.84641315]
 [0.87046756]
 [0.86758781]]
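A quick sanity check of the scaling above: min-max normalization computes (x - min) / (max - min), and plugging the first training price into the learned statistics reproduces the first normalized value.

# (88.702 - 80.406) / (788.8 - 80.406) ≈ 0.011711
manual = (train_data[0] - sc.data_min_) / (sc.data_max_ - sc.data_min_)
print(manual)           # [0.011711]
print(training_set[0])  # [0.011711]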

2.1 Build the training and test sets

In [6]:
x_train = []
y_train = []

x_test = []
y_test = []
In [7]:
print(type(training_set))
# Slicing indices:
# the first index selects rows, the second selects columns
print(training_set[70-10:70,0])
print(training_set[70,0])
<class 'numpy.ndarray'>
[0.01388634 0.01615909 0.01645977 0.01314099 0.01254951 0.01652329
 0.01561419 0.01750579 0.02088809 0.02129747]
0.0243522672411116
In [8]:
print(training_set[70-10:70])
print(training_set[70])
[[0.01388634]
 [0.01615909]
 [0.01645977]
 [0.01314099]
 [0.01254951]
 [0.01652329]
 [0.01561419]
 [0.01750579]
 [0.02088809]
 [0.02129747]]
[0.02435227]
In [9]:
print(training_set[70-10:70,0])
print(training_set[70,0])
[0.01388634 0.01615909 0.01645977 0.01314099 0.01254951 0.01652329
 0.01561419 0.01750579 0.02088809 0.02129747]
0.0243522672411116

Assemble the train data

In [10]:
for i in range(60,len(training_set)):
    x_train.append(training_set[i-60:i,0])
    y_train.append(training_set[i,0])
# Shuffle the training set; reseeding with the same value before each
# shuffle applies the identical permutation to x_train and y_train,
# keeping every (window, label) pair aligned
np.random.seed(7)
np.random.shuffle(x_train)
np.random.seed(7)
np.random.shuffle(y_train)
tf.random.set_seed(7)
In [11]:
print(len(x_train))
print(len(y_train))
print(x_train[:2])
print(y_train[:2])
2066
2066
[array([0.11437703, 0.12028758, 0.12245869, 0.12454369, 0.11934037,
       0.11990361, 0.12065884, 0.1179019 , 0.11856114, 0.12389715,
       0.11991632, 0.1223147 , 0.12153547, 0.12145077, 0.11910039,
       0.12251798, 0.12929246, 0.13166684, 0.12628283, 0.12815326,
       0.13109089, 0.13720613, 0.13654689, 0.17293625, 0.16226422,
       0.17425472, 0.20878635, 0.20303108, 0.18849821, 0.19533339,
       0.19104058, 0.18156845, 0.18762299, 0.18943413, 0.1886422 ,
       0.17902608, 0.18024997, 0.18159245, 0.18804366, 0.19487743,
       0.1908006 , 0.19373823, 0.18793496, 0.18024997, 0.17551391,
       0.1886422 , 0.19343896, 0.19785035, 0.21022482, 0.23012758,
       0.22883311, 0.21651934, 0.19824702, 0.20295909, 0.21082477,
       0.20444555, 0.20180719, 0.19596863, 0.18805495, 0.20422957]), array([0.98329743, 0.98605296, 0.97216238, 0.96666403, 0.96937862,
       1.        , 0.98608119, 0.97797836, 0.97204945, 0.93252343,
       0.96671344, 0.9508748 , 0.96781452, 0.96684049, 0.94805151,
       0.89440904, 0.89200925, 0.92337597, 0.88842367, 0.86335288,
       0.87464603, 0.88452754, 0.90292126, 0.9155837 , 0.88876247,
       0.90005562, 0.93435856, 0.948475  , 0.95510973, 0.93454208,
       0.92970014, 0.93322925, 0.92546521, 0.92077855, 0.94092271,
       0.94367541, 0.92546521, 0.92955897, 0.9033024 , 0.91897165,
       0.89718998, 0.86758781, 0.84217822, 0.8355435 , 0.85632289,
       0.83370836, 0.86318348, 0.85064809, 0.84531207, 0.84613083,
       0.80688713, 0.81465117, 0.77928949, 0.79982891, 0.83370836,
       0.83469651, 0.83088507, 0.82100357, 0.86476452, 0.84923644])]
[0.19300699892997392, 0.8580027498821277]
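A minimal demonstration of why the reseeding trick works, using toy arrays (hypothetical, not from the dataset): the same seed yields the same permutation, so the pairs stay aligned.

a = np.array([1, 2, 3, 4, 5])
b = np.array([10, 20, 30, 40, 50])
np.random.seed(7)
np.random.shuffle(a)
np.random.seed(7)
np.random.shuffle(b)
print(a)  # the same permutation is applied to both arrays,
print(b)  # so b[i] is still 10 * a[i] after shuffling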

Assemble the test data

In [12]:
for i in range(60,len(test_set)):
    x_test.append(test_set[i-60:i,0])
    y_test.append(test_set[i,0])
In [13]:
print(len(x_test))
print(len(y_test))
print(x_test[:2])
print(y_test[:2])
240
240
[array([0.84288404, 0.85345726, 0.84641315, 0.87046756, 0.86758781,
       0.90711384, 0.90711384, 0.91868932, 0.92388417, 0.91840699,
       0.93246696, 0.92405356, 0.90287891, 0.89864397, 0.91805408,
       0.9421226 , 0.92123028, 0.92814733, 0.96146212, 1.00875219,
       0.99463575, 0.98051932, 0.96247851, 0.92814733, 0.93116825,
       0.96473714, 0.93887582, 0.95510973, 0.98193096, 0.98588356,
       1.02004534, 1.00169397, 0.99887068, 0.99918125, 0.9960474 ,
       0.98757753, 0.98898918, 1.00653591, 1.06521794, 1.10050904,
       1.10049492, 1.07244556, 1.07227616, 1.15697479, 1.16120972,
       1.16120972, 1.22473369, 1.16967959, 1.20073575, 1.16389185,
       1.21061726, 1.22107754, 1.21903065, 1.2409676 , 1.22896862,
       1.2697369 , 1.25408177, 1.2223339 , 1.22049876, 1.25296657]), array([0.85345726, 0.84641315, 0.87046756, 0.86758781, 0.90711384,
       0.90711384, 0.91868932, 0.92388417, 0.91840699, 0.93246696,
       0.92405356, 0.90287891, 0.89864397, 0.91805408, 0.9421226 ,
       0.92123028, 0.92814733, 0.96146212, 1.00875219, 0.99463575,
       0.98051932, 0.96247851, 0.92814733, 0.93116825, 0.96473714,
       0.93887582, 0.95510973, 0.98193096, 0.98588356, 1.02004534,
       1.00169397, 0.99887068, 0.99918125, 0.9960474 , 0.98757753,
       0.98898918, 1.00653591, 1.06521794, 1.10050904, 1.10049492,
       1.07244556, 1.07227616, 1.15697479, 1.16120972, 1.16120972,
       1.22473369, 1.16967959, 1.20073575, 1.16389185, 1.21061726,
       1.22107754, 1.21903065, 1.2409676 , 1.22896862, 1.2697369 ,
       1.25408177, 1.2223339 , 1.22049876, 1.25296657, 1.19297171])]
[1.1929717078349054, 1.1474885445105405]

This completes the construction of the training and test data.

In [14]:
print(type(x_train))
print(type(x_test))
<class 'list'>
<class 'list'>
In [15]:
print(x_train[0])
print(x_test[0])
[0.11437703 0.12028758 0.12245869 0.12454369 0.11934037 0.11990361
 0.12065884 0.1179019  0.11856114 0.12389715 0.11991632 0.1223147
 0.12153547 0.12145077 0.11910039 0.12251798 0.12929246 0.13166684
 0.12628283 0.12815326 0.13109089 0.13720613 0.13654689 0.17293625
 0.16226422 0.17425472 0.20878635 0.20303108 0.18849821 0.19533339
 0.19104058 0.18156845 0.18762299 0.18943413 0.1886422  0.17902608
 0.18024997 0.18159245 0.18804366 0.19487743 0.1908006  0.19373823
 0.18793496 0.18024997 0.17551391 0.1886422  0.19343896 0.19785035
 0.21022482 0.23012758 0.22883311 0.21651934 0.19824702 0.20295909
 0.21082477 0.20444555 0.20180719 0.19596863 0.18805495 0.20422957]
[0.84288404 0.85345726 0.84641315 0.87046756 0.86758781 0.90711384
 0.90711384 0.91868932 0.92388417 0.91840699 0.93246696 0.92405356
 0.90287891 0.89864397 0.91805408 0.9421226  0.92123028 0.92814733
 0.96146212 1.00875219 0.99463575 0.98051932 0.96247851 0.92814733
 0.93116825 0.96473714 0.93887582 0.95510973 0.98193096 0.98588356
 1.02004534 1.00169397 0.99887068 0.99918125 0.9960474  0.98757753
 0.98898918 1.00653591 1.06521794 1.10050904 1.10049492 1.07244556
 1.07227616 1.15697479 1.16120972 1.16120972 1.22473369 1.16967959
 1.20073575 1.16389185 1.21061726 1.22107754 1.21903065 1.2409676
 1.22896862 1.2697369  1.25408177 1.2223339  1.22049876 1.25296657]
In [16]:
x_train=np.array(x_train)
x_test=np.array(x_test)
y_train=np.array(y_train)
y_test=np.array(y_test)
In [17]:
x_train = np.reshape(x_train, (x_train.shape[0], 60, 1))
x_test = np.reshape(x_test, (x_test.shape[0], 60, 1))
print(x_train.shape)
print(x_test.shape)
(2066, 60, 1)
(240, 60, 1)
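The GRU layers expect input of shape (samples, timesteps, features); each sample here is a 60-day window of one feature. As an aside, an equivalent way to write the reshape above is to append a trailing feature axis:

# Equivalent to np.reshape(x, (x.shape[0], 60, 1)) for a (samples, 60) array
x_train = x_train[..., np.newaxis]
x_test = x_test[..., np.newaxis]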

3. Build the model

In [18]:
# Build the Sequential container
model = tf.keras.Sequential()
# Input layer
model.add(tf.keras.Input(shape=(60,1)))
model.add(tf.keras.layers.GRU(80, return_sequences=True))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.GRU(100))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.Dense(1))

# Model structure
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
gru (GRU)                    (None, 60, 80)            19920     
_________________________________________________________________
dropout (Dropout)            (None, 60, 80)            0         
_________________________________________________________________
gru_1 (GRU)                  (None, 100)               54600     
_________________________________________________________________
dropout_1 (Dropout)          (None, 100)               0         
_________________________________________________________________
dense (Dense)                (None, 1)                 101       
=================================================================
Total params: 74,621
Trainable params: 74,621
Non-trainable params: 0
_________________________________________________________________
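A sanity check of the parameter counts in the summary, assuming TF2's default GRU implementation (reset_after=True, which gives each gate two bias vectors):

def gru_params(units, input_dim):
    # 3 gates, each with input weights, recurrent weights, and two biases
    return 3 * (units * input_dim + units * units + 2 * units)

print(gru_params(80, 1))    # 19920 -> gru (GRU)
print(gru_params(100, 80))  # 54600 -> gru_1 (GRU)
print(100 * 1 + 1)          # 101   -> dense (Dense)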

4. Train the model

In [19]:
# This application only monitors the loss value, not accuracy

# Configure the optimizer and loss function
model.compile(optimizer=tf.keras.optimizers.Adam(0.001),loss='mse',metrics=['acc'])
# Start training
history = model.fit(x_train,y_train,epochs=50,validation_data=(x_test,y_test))  # epochs is the number of training passes
Epoch 1/50
65/65 [==============================] - 1s 17ms/step - loss: 0.0096 - acc: 4.8403e-04 - val_loss: 0.0041 - val_acc: 0.0000e+00
Epoch 2/50
65/65 [==============================] - 1s 10ms/step - loss: 0.0011 - acc: 4.8403e-04 - val_loss: 0.0029 - val_acc: 0.0000e+00
Epoch 3/50
65/65 [==============================] - 1s 10ms/step - loss: 0.0011 - acc: 4.8403e-04 - val_loss: 0.0033 - val_acc: 0.0000e+00
Epoch 4/50
65/65 [==============================] - 1s 10ms/step - loss: 9.6007e-04 - acc: 4.8403e-04 - val_loss: 0.0021 - val_acc: 0.0000e+00
Epoch 5/50
65/65 [==============================] - 1s 10ms/step - loss: 9.4882e-04 - acc: 4.8403e-04 - val_loss: 0.0019 - val_acc: 0.0000e+00
Epoch 6/50
65/65 [==============================] - 1s 10ms/step - loss: 8.7472e-04 - acc: 4.8403e-04 - val_loss: 0.0015 - val_acc: 0.0000e+00
Epoch 7/50
65/65 [==============================] - 1s 10ms/step - loss: 7.8159e-04 - acc: 4.8403e-04 - val_loss: 0.0014 - val_acc: 0.0000e+00
Epoch 8/50
65/65 [==============================] - 1s 10ms/step - loss: 8.4548e-04 - acc: 4.8403e-04 - val_loss: 0.0022 - val_acc: 0.0000e+00
Epoch 9/50
65/65 [==============================] - 1s 9ms/step - loss: 8.0948e-04 - acc: 4.8403e-04 - val_loss: 0.0017 - val_acc: 0.0000e+00
Epoch 10/50
65/65 [==============================] - 1s 9ms/step - loss: 8.0718e-04 - acc: 4.8403e-04 - val_loss: 0.0020 - val_acc: 0.0000e+00
Epoch 11/50
65/65 [==============================] - 1s 10ms/step - loss: 7.4465e-04 - acc: 4.8403e-04 - val_loss: 0.0020 - val_acc: 0.0000e+00
Epoch 12/50
65/65 [==============================] - 1s 9ms/step - loss: 8.3900e-04 - acc: 4.8403e-04 - val_loss: 0.0109 - val_acc: 0.0000e+00
Epoch 13/50
65/65 [==============================] - 1s 10ms/step - loss: 8.4583e-04 - acc: 4.8403e-04 - val_loss: 0.0014 - val_acc: 0.0000e+00
Epoch 14/50
65/65 [==============================] - 1s 10ms/step - loss: 7.3636e-04 - acc: 4.8403e-04 - val_loss: 0.0043 - val_acc: 0.0000e+00
Epoch 15/50
65/65 [==============================] - 1s 9ms/step - loss: 7.2510e-04 - acc: 4.8403e-04 - val_loss: 0.0021 - val_acc: 0.0000e+00
Epoch 16/50
65/65 [==============================] - 1s 9ms/step - loss: 6.6910e-04 - acc: 4.8403e-04 - val_loss: 0.0034 - val_acc: 0.0000e+00
Epoch 17/50
65/65 [==============================] - 1s 10ms/step - loss: 7.5454e-04 - acc: 4.8403e-04 - val_loss: 0.0013 - val_acc: 0.0000e+00
Epoch 18/50
65/65 [==============================] - 1s 9ms/step - loss: 6.6776e-04 - acc: 4.8403e-04 - val_loss: 0.0100 - val_acc: 0.0000e+00
Epoch 19/50
65/65 [==============================] - 1s 10ms/step - loss: 7.4855e-04 - acc: 4.8403e-04 - val_loss: 0.0024 - val_acc: 0.0000e+00
Epoch 20/50
65/65 [==============================] - 1s 9ms/step - loss: 7.1869e-04 - acc: 4.8403e-04 - val_loss: 0.0022 - val_acc: 0.0000e+00
Epoch 21/50
65/65 [==============================] - 1s 9ms/step - loss: 6.2107e-04 - acc: 4.8403e-04 - val_loss: 0.0016 - val_acc: 0.0000e+00
Epoch 22/50
65/65 [==============================] - 1s 9ms/step - loss: 6.5129e-04 - acc: 4.8403e-04 - val_loss: 0.0039 - val_acc: 0.0000e+00
Epoch 23/50
65/65 [==============================] - 1s 10ms/step - loss: 5.9437e-04 - acc: 4.8403e-04 - val_loss: 0.0014 - val_acc: 0.0000e+00
Epoch 24/50
65/65 [==============================] - 1s 9ms/step - loss: 6.8762e-04 - acc: 4.8403e-04 - val_loss: 0.0035 - val_acc: 0.0000e+00
Epoch 25/50
65/65 [==============================] - 1s 9ms/step - loss: 7.1815e-04 - acc: 4.8403e-04 - val_loss: 0.0036 - val_acc: 0.0000e+00
Epoch 26/50
65/65 [==============================] - 1s 9ms/step - loss: 6.3727e-04 - acc: 4.8403e-04 - val_loss: 0.0013 - val_acc: 0.0000e+00
Epoch 27/50
65/65 [==============================] - 1s 9ms/step - loss: 6.2668e-04 - acc: 4.8403e-04 - val_loss: 0.0018 - val_acc: 0.0000e+00
Epoch 28/50
65/65 [==============================] - 1s 9ms/step - loss: 6.0982e-04 - acc: 4.8403e-04 - val_loss: 0.0019 - val_acc: 0.0000e+00
Epoch 29/50
65/65 [==============================] - 1s 10ms/step - loss: 7.4356e-04 - acc: 4.8403e-04 - val_loss: 0.0025 - val_acc: 0.0000e+00
Epoch 30/50
65/65 [==============================] - 1s 10ms/step - loss: 5.7358e-04 - acc: 4.8403e-04 - val_loss: 0.0012 - val_acc: 0.0000e+00
Epoch 31/50
65/65 [==============================] - 1s 9ms/step - loss: 6.0366e-04 - acc: 4.8403e-04 - val_loss: 0.0013 - val_acc: 0.0000e+00
Epoch 32/50
65/65 [==============================] - 1s 10ms/step - loss: 7.0813e-04 - acc: 4.8403e-04 - val_loss: 0.0012 - val_acc: 0.0000e+00
Epoch 33/50
65/65 [==============================] - 1s 9ms/step - loss: 6.8998e-04 - acc: 4.8403e-04 - val_loss: 0.0018 - val_acc: 0.0000e+00
Epoch 34/50
65/65 [==============================] - 1s 10ms/step - loss: 6.3556e-04 - acc: 4.8403e-04 - val_loss: 0.0015 - val_acc: 0.0000e+00
Epoch 35/50
65/65 [==============================] - 1s 10ms/step - loss: 5.7062e-04 - acc: 4.8403e-04 - val_loss: 0.0028 - val_acc: 0.0000e+00
Epoch 36/50
65/65 [==============================] - 1s 9ms/step - loss: 6.2591e-04 - acc: 4.8403e-04 - val_loss: 0.0049 - val_acc: 0.0000e+00
Epoch 37/50
65/65 [==============================] - 1s 9ms/step - loss: 6.2909e-04 - acc: 4.8403e-04 - val_loss: 0.0012 - val_acc: 0.0000e+00
Epoch 38/50
65/65 [==============================] - 1s 9ms/step - loss: 5.6850e-04 - acc: 4.8403e-04 - val_loss: 0.0014 - val_acc: 0.0000e+00
Epoch 39/50
65/65 [==============================] - 1s 10ms/step - loss: 6.5196e-04 - acc: 4.8403e-04 - val_loss: 0.0082 - val_acc: 0.0000e+00
Epoch 40/50
65/65 [==============================] - 1s 9ms/step - loss: 5.2014e-04 - acc: 4.8403e-04 - val_loss: 0.0011 - val_acc: 0.0000e+00
Epoch 41/50
65/65 [==============================] - 1s 10ms/step - loss: 5.0786e-04 - acc: 4.8403e-04 - val_loss: 0.0022 - val_acc: 0.0000e+00
Epoch 42/50
65/65 [==============================] - 1s 9ms/step - loss: 4.4572e-04 - acc: 4.8403e-04 - val_loss: 0.0029 - val_acc: 0.0000e+00
Epoch 43/50
65/65 [==============================] - 1s 9ms/step - loss: 5.3645e-04 - acc: 4.8403e-04 - val_loss: 0.0012 - val_acc: 0.0000e+00
Epoch 44/50
65/65 [==============================] - 1s 9ms/step - loss: 6.6873e-04 - acc: 4.8403e-04 - val_loss: 0.0015 - val_acc: 0.0000e+00
Epoch 45/50
65/65 [==============================] - 1s 10ms/step - loss: 5.0728e-04 - acc: 4.8403e-04 - val_loss: 0.0012 - val_acc: 0.0000e+00
Epoch 46/50
65/65 [==============================] - 1s 10ms/step - loss: 4.9257e-04 - acc: 4.8403e-04 - val_loss: 0.0019 - val_acc: 0.0000e+00
Epoch 47/50
65/65 [==============================] - 1s 9ms/step - loss: 6.4136e-04 - acc: 4.8403e-04 - val_loss: 0.0061 - val_acc: 0.0000e+00
Epoch 48/50
65/65 [==============================] - 1s 10ms/step - loss: 6.6164e-04 - acc: 4.8403e-04 - val_loss: 0.0012 - val_acc: 0.0000e+00
Epoch 49/50
65/65 [==============================] - 1s 10ms/step - loss: 5.5160e-04 - acc: 4.8403e-04 - val_loss: 0.0034 - val_acc: 0.0000e+00
Epoch 50/50
65/65 [==============================] - 1s 9ms/step - loss: 5.8400e-04 - acc: 4.8403e-04 - val_loss: 0.0012 - val_acc: 0.0000e+00
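Since only the loss is informative here, a minimal sketch for plotting the training and validation loss curves recorded in the history object returned by fit:

plt.plot(history.history['loss'], label='train loss')
plt.plot(history.history['val_loss'], label='val loss')
plt.title('Training vs. Validation Loss')
plt.xlabel('Epoch')
plt.ylabel('MSE')
plt.legend()
plt.show()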

Problem 1:

Error:

ValueError: Failed to find data adapter that can handle input: 
<class 'numpy.ndarray'>, 
(<class 'list'> containing values of types {"<class 'numpy.float64'>"})

Cause:

Most likely the data was not converted to numpy arrays; the cell below checks the types.
In [ ]:
print(type(x_train))
print(type(y_train))
print(type(x_test))
print(type(y_test))

Problem 2:

Training and test accuracy are both 0.

Solution:

First, do a preliminary check of the training and test data (cells below).

The cause is not that the data was left unshuffled.

The real reason is that this application only monitors the loss value; accuracy is not a meaningful metric here.
In [ ]:
print(x_train.shape)
print(y_train.shape)
print(x_test.shape)
print(y_test.shape)
In [19]:
print("===================x_train[0:2]===================")
print(x_train[0:2])
print("===================y_train[0:2]===================")
print(y_train[0:2])
print("===================x_test[0:2]===================")
print(x_test[0:2])
print("===================y_test[0:2]===================")
print(y_test[0:2])
===================x_train[0:2]===================
[[[0.11437703]
  [0.12028758]
  [0.12245869]
  [0.12454369]
  [0.11934037]
  [0.11990361]
  [0.12065884]
  [0.1179019 ]
  [0.11856114]
  [0.12389715]
  [0.11991632]
  [0.1223147 ]
  [0.12153547]
  [0.12145077]
  [0.11910039]
  [0.12251798]
  [0.12929246]
  [0.13166684]
  [0.12628283]
  [0.12815326]
  [0.13109089]
  [0.13720613]
  [0.13654689]
  [0.17293625]
  [0.16226422]
  [0.17425472]
  [0.20878635]
  [0.20303108]
  [0.18849821]
  [0.19533339]
  [0.19104058]
  [0.18156845]
  [0.18762299]
  [0.18943413]
  [0.1886422 ]
  [0.17902608]
  [0.18024997]
  [0.18159245]
  [0.18804366]
  [0.19487743]
  [0.1908006 ]
  [0.19373823]
  [0.18793496]
  [0.18024997]
  [0.17551391]
  [0.1886422 ]
  [0.19343896]
  [0.19785035]
  [0.21022482]
  [0.23012758]
  [0.22883311]
  [0.21651934]
  [0.19824702]
  [0.20295909]
  [0.21082477]
  [0.20444555]
  [0.20180719]
  [0.19596863]
  [0.18805495]
  [0.20422957]]

 [[0.98329743]
  [0.98605296]
  [0.97216238]
  [0.96666403]
  [0.96937862]
  [1.        ]
  [0.98608119]
  [0.97797836]
  [0.97204945]
  [0.93252343]
  [0.96671344]
  [0.9508748 ]
  [0.96781452]
  [0.96684049]
  [0.94805151]
  [0.89440904]
  [0.89200925]
  [0.92337597]
  [0.88842367]
  [0.86335288]
  [0.87464603]
  [0.88452754]
  [0.90292126]
  [0.9155837 ]
  [0.88876247]
  [0.90005562]
  [0.93435856]
  [0.948475  ]
  [0.95510973]
  [0.93454208]
  [0.92970014]
  [0.93322925]
  [0.92546521]
  [0.92077855]
  [0.94092271]
  [0.94367541]
  [0.92546521]
  [0.92955897]
  [0.9033024 ]
  [0.91897165]
  [0.89718998]
  [0.86758781]
  [0.84217822]
  [0.8355435 ]
  [0.85632289]
  [0.83370836]
  [0.86318348]
  [0.85064809]
  [0.84531207]
  [0.84613083]
  [0.80688713]
  [0.81465117]
  [0.77928949]
  [0.79982891]
  [0.83370836]
  [0.83469651]
  [0.83088507]
  [0.82100357]
  [0.86476452]
  [0.84923644]]]
===================y_train[0:2]===================
[0.193007   0.85800275]
===================x_test[0:2]===================
[[[0.84288404]
  [0.85345726]
  [0.84641315]
  [0.87046756]
  [0.86758781]
  [0.90711384]
  [0.90711384]
  [0.91868932]
  [0.92388417]
  [0.91840699]
  [0.93246696]
  [0.92405356]
  [0.90287891]
  [0.89864397]
  [0.91805408]
  [0.9421226 ]
  [0.92123028]
  [0.92814733]
  [0.96146212]
  [1.00875219]
  [0.99463575]
  [0.98051932]
  [0.96247851]
  [0.92814733]
  [0.93116825]
  [0.96473714]
  [0.93887582]
  [0.95510973]
  [0.98193096]
  [0.98588356]
  [1.02004534]
  [1.00169397]
  [0.99887068]
  [0.99918125]
  [0.9960474 ]
  [0.98757753]
  [0.98898918]
  [1.00653591]
  [1.06521794]
  [1.10050904]
  [1.10049492]
  [1.07244556]
  [1.07227616]
  [1.15697479]
  [1.16120972]
  [1.16120972]
  [1.22473369]
  [1.16967959]
  [1.20073575]
  [1.16389185]
  [1.21061726]
  [1.22107754]
  [1.21903065]
  [1.2409676 ]
  [1.22896862]
  [1.2697369 ]
  [1.25408177]
  [1.2223339 ]
  [1.22049876]
  [1.25296657]]

 [[0.85345726]
  [0.84641315]
  [0.87046756]
  [0.86758781]
  [0.90711384]
  [0.90711384]
  [0.91868932]
  [0.92388417]
  [0.91840699]
  [0.93246696]
  [0.92405356]
  [0.90287891]
  [0.89864397]
  [0.91805408]
  [0.9421226 ]
  [0.92123028]
  [0.92814733]
  [0.96146212]
  [1.00875219]
  [0.99463575]
  [0.98051932]
  [0.96247851]
  [0.92814733]
  [0.93116825]
  [0.96473714]
  [0.93887582]
  [0.95510973]
  [0.98193096]
  [0.98588356]
  [1.02004534]
  [1.00169397]
  [0.99887068]
  [0.99918125]
  [0.9960474 ]
  [0.98757753]
  [0.98898918]
  [1.00653591]
  [1.06521794]
  [1.10050904]
  [1.10049492]
  [1.07244556]
  [1.07227616]
  [1.15697479]
  [1.16120972]
  [1.16120972]
  [1.22473369]
  [1.16967959]
  [1.20073575]
  [1.16389185]
  [1.21061726]
  [1.22107754]
  [1.21903065]
  [1.2409676 ]
  [1.22896862]
  [1.2697369 ]
  [1.25408177]
  [1.2223339 ]
  [1.22049876]
  [1.25296657]
  [1.19297171]]]
===================y_test[0:2]===================
[1.19297171 1.14748854]

5. Evaluate the model

In [20]:
################## predict ######################
# Feed the test set into the model for prediction
predicted_stock_price = model.predict(x_test)
# Restore the predictions --- inverse-transform from (0,1) back to the original range
predicted_stock_price = sc.inverse_transform(predicted_stock_price)
# Restore the ground truth --- inverse-transform from (0,1) back to the original range
real_stock_price = sc.inverse_transform(test_set[60:])
# Plot the real vs. predicted curves
plt.plot(real_stock_price, color='red', label='MaoTai Stock Price')
plt.plot(predicted_stock_price, color='blue', label='Predicted MaoTai Stock Price')
plt.title('MaoTai Stock Price Prediction')
plt.xlabel('Time')
plt.ylabel('MaoTai Stock Price')
plt.legend()
plt.show()
(Figure: real vs. predicted MaoTai stock price curves)
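The original run stops at the plot; to quantify the fit numerically, one could additionally compute standard regression errors. A minimal sketch, assuming real_stock_price and predicted_stock_price from the cell above (scikit-learn is already a dependency via MinMaxScaler):

from sklearn.metrics import mean_absolute_error, mean_squared_error

mse = mean_squared_error(real_stock_price, predicted_stock_price)
rmse = np.sqrt(mse)
mae = mean_absolute_error(real_stock_price, predicted_stock_price)
print('MSE: %.2f  RMSE: %.2f  MAE: %.2f' % (mse, rmse, mae))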
In [23]:
predicted_stock_price = model.predict(x_test)
print(predicted_stock_price.shape)
print(predicted_stock_price)
(240, 1)
[[1.2496097]
 [1.2180107]
 [1.1920016]
 [1.1572804]
 [1.1571976]
 [1.1487962]
 [1.162889 ]
 [1.1513494]
 [1.1556352]
 [1.1770049]
 [1.1861501]
 [1.1704055]
 [1.1597265]
 [1.1639699]
 [1.1507306]
 [1.1362598]
 [1.1428137]
 [1.1391671]
 [1.1465825]
 [1.1641831]
 [1.1602017]
 [1.1567447]
 [1.1590788]
 [1.1501439]
 [1.1426033]
 [1.126034 ]
 [1.1359271]
 [1.1572855]
 [1.163716 ]
 [1.1697279]
 [1.1700536]
 [1.1798601]
 [1.1997631]
 [1.1986607]
 [1.2321659]
 [1.2441592]
 [1.2632165]
 [1.2651492]
 [1.2709178]
 [1.2747902]
 [1.291428 ]
 [1.3105906]
 [1.3131642]
 [1.2931318]
 [1.293417 ]
 [1.2972054]
 [1.2848128]
 [1.2852478]
 [1.2900039]
 [1.2833602]
 [1.2731198]
 [1.276648 ]
 [1.2689617]
 [1.2629356]
 [1.255669 ]
 [1.2567824]
 [1.2566088]
 [1.2530268]
 [1.2428858]
 [1.2530054]
 [1.2519662]
 [1.2640859]
 [1.2646676]
 [1.2683247]
 [1.2448223]
 [1.2435943]
 [1.2283033]
 [1.2387761]
 [1.2396653]
 [1.2597725]
 [1.286981 ]
 [1.2987366]
 [1.3174903]
 [1.3228873]
 [1.3453027]
 [1.3592703]
 [1.3621441]
 [1.3788081]
 [1.3853014]
 [1.4140732]
 [1.4262015]
 [1.4377273]
 [1.4385366]
 [1.4401965]
 [1.4569238]
 [1.4708261]
 [1.4765506]
 [1.4825215]
 [1.477727 ]
 [1.4909699]
 [1.4928187]
 [1.4883826]
 [1.4788319]
 [1.4388505]
 [1.444756 ]
 [1.4444124]
 [1.4530085]
 [1.4820683]
 [1.4888107]
 [1.5005957]
 [1.5042789]
 [1.5263536]
 [1.5272175]
 [1.5252371]
 [1.5268217]
 [1.5203409]
 [1.5298126]
 [1.5016943]
 [1.5220764]
 [1.5396878]
 [1.5396177]
 [1.5361525]
 [1.5433   ]
 [1.5459824]
 [1.5427814]
 [1.5481669]
 [1.5414389]
 [1.5397809]
 [1.5303941]
 [1.5378081]
 [1.547111 ]
 [1.5551813]
 [1.562668 ]
 [1.5554315]
 [1.563697 ]
 [1.570631 ]
 [1.5711436]
 [1.5731399]
 [1.5803179]
 [1.579383 ]
 [1.5830431]
 [1.5861375]
 [1.6015246]
 [1.6062604]
 [1.607214 ]
 [1.6119435]
 [1.6149733]
 [1.6175414]
 [1.613611 ]
 [1.5899261]
 [1.5863079]
 [1.5849715]
 [1.5836451]
 [1.5615681]
 [1.5248895]
 [1.526314 ]
 [1.5214719]
 [1.506787 ]
 [1.5108902]
 [1.5350037]
 [1.5255493]
 [1.5354614]
 [1.5274994]
 [1.5212291]
 [1.5316311]
 [1.5212969]
 [1.539139 ]
 [1.5314708]
 [1.5332769]
 [1.5164481]
 [1.5247804]
 [1.5148988]
 [1.5064287]
 [1.5165644]
 [1.5294131]
 [1.5410366]
 [1.5060812]
 [1.4982108]
 [1.4575131]
 [1.4535555]
 [1.4485898]
 [1.449975 ]
 [1.4584137]
 [1.4610232]
 [1.4710906]
 [1.4619825]
 [1.4704398]
 [1.4636949]
 [1.4655579]
 [1.4422989]
 [1.4316255]
 [1.4298326]
 [1.3578763]
 [1.3694799]
 [1.3814098]
 [1.3863902]
 [1.3962893]
 [1.3927637]
 [1.3953185]
 [1.4146342]
 [1.4236962]
 [1.4233013]
 [1.4208338]
 [1.4270904]
 [1.4239149]
 [1.433447 ]
 [1.4451462]
 [1.4440202]
 [1.4270732]
 [1.4144013]
 [1.4199626]
 [1.4125733]
 [1.3967794]
 [1.4286413]
 [1.4479666]
 [1.463504 ]
 [1.4905181]
 [1.4783645]
 [1.4689606]
 [1.498881 ]
 [1.4897205]
 [1.4577249]
 [1.4649007]
 [1.4230599]
 [1.406461 ]
 [1.361368 ]
 [1.3616215]
 [1.3417025]
 [1.3671429]
 [1.3979292]
 [1.3935053]
 [1.4090799]
 [1.3939685]
 [1.4117813]
 [1.4375412]
 [1.4337497]
 [1.4655135]
 [1.4869791]
 [1.4891987]
 [1.5069606]
 [1.5061187]
 [1.5162058]
 [1.5270276]
 [1.542497 ]
 [1.5484113]
 [1.5659509]
 [1.5786583]
 [1.5850556]
 [1.580079 ]
 [1.6137757]]
In [24]:
# Restore the predictions --- inverse-transform from (0,1) back to the original range
predicted_stock_price = sc.inverse_transform(predicted_stock_price)
print(predicted_stock_price.shape)
print(predicted_stock_price)
(240, 1)
[[ 965.622  ]
 [ 943.2375 ]
 [ 924.8128 ]
 [ 900.21655]
 [ 900.15784]
 [ 894.20636]
 [ 904.1896 ]
 [ 896.015  ]
 [ 899.0511 ]
 [ 914.1893 ]
 [ 920.6676 ]
 [ 909.5142 ]
 [ 901.9493 ]
 [ 904.95526]
 [ 895.57666]
 [ 885.3256 ]
 [ 889.9684 ]
 [ 887.38513]
 [ 892.6382 ]
 [ 905.1064 ]
 [ 902.2859 ]
 [ 899.83704]
 [ 901.4905 ]
 [ 895.161  ]
 [ 889.81934]
 [ 878.0817 ]
 [ 885.08997]
 [ 900.2201 ]
 [ 904.7754 ]
 [ 909.03424]
 [ 909.26495]
 [ 916.21185]
 [ 930.311  ]
 [ 929.5301 ]
 [ 953.26495]
 [ 961.7609 ]
 [ 975.261  ]
 [ 976.6301 ]
 [ 980.71655]
 [ 983.4597 ]
 [ 995.24585]
 [1008.82056]
 [1010.6437 ]
 [ 996.4528 ]
 [ 996.65485]
 [ 999.33856]
 [ 990.5597 ]
 [ 990.86786]
 [ 994.23706]
 [ 989.5307 ]
 [ 982.2764 ]
 [ 984.7758 ]
 [ 979.3309 ]
 [ 975.062  ]
 [ 969.91437]
 [ 970.7031 ]
 [ 970.5802 ]
 [ 968.0427 ]
 [ 960.8589 ]
 [ 968.0275 ]
 [ 967.2914 ]
 [ 975.8769 ]
 [ 976.289  ]
 [ 978.87964]
 [ 962.23065]
 [ 961.3607 ]
 [ 950.5287 ]
 [ 957.9476 ]
 [ 958.57745]
 [ 972.82135]
 [ 992.09564]
 [1000.4232 ]
 [1013.70825]
 [1017.53143]
 [1033.4104 ]
 [1043.3049 ]
 [1045.3407 ]
 [1057.1454 ]
 [1061.7451 ]
 [1082.127  ]
 [1090.7186 ]
 [1098.8834 ]
 [1099.4568 ]
 [1100.6326 ]
 [1112.4822 ]
 [1122.3304 ]
 [1126.3856 ]
 [1130.6154 ]
 [1127.219  ]
 [1136.6001 ]
 [1137.9098 ]
 [1134.7673 ]
 [1128.0016 ]
 [1099.6791 ]
 [1103.8625 ]
 [1103.619  ]
 [1109.7085 ]
 [1130.2943 ]
 [1135.0706 ]
 [1143.419  ]
 [1146.0282 ]
 [1161.6658 ]
 [1162.2777 ]
 [1160.8748 ]
 [1161.9973 ]
 [1157.4064 ]
 [1164.1161 ]
 [1144.1973 ]
 [1158.6357 ]
 [1171.1116 ]
 [1171.0619 ]
 [1168.6072 ]
 [1173.6705 ]
 [1175.5707 ]
 [1173.3031 ]
 [1177.1182 ]
 [1172.352  ]
 [1171.1775 ]
 [1164.528  ]
 [1169.78   ]
 [1176.3702 ]
 [1182.087  ]
 [1187.3906 ]
 [1182.2644 ]
 [1188.1196 ]
 [1193.0316 ]
 [1193.3948 ]
 [1194.8088 ]
 [1199.8937 ]
 [1199.2314 ]
 [1201.8242 ]
 [1204.0164 ]
 [1214.9164 ]
 [1218.2712 ]
 [1218.9468 ]
 [1222.2971 ]
 [1224.4434 ]
 [1226.2627 ]
 [1223.4784 ]
 [1206.7002 ]
 [1204.137  ]
 [1203.1903 ]
 [1202.2507 ]
 [1186.6116 ]
 [1160.6285 ]
 [1161.6377 ]
 [1158.2075 ]
 [1147.8048 ]
 [1150.7115 ]
 [1167.7933 ]
 [1161.096  ]
 [1168.1177 ]
 [1162.4774 ]
 [1158.0356 ]
 [1165.4043 ]
 [1158.0836 ]
 [1170.7229 ]
 [1165.2908 ]
 [1166.5702 ]
 [1154.6488 ]
 [1160.5513 ]
 [1153.5513 ]
 [1147.551  ]
 [1154.7311 ]
 [1163.8331 ]
 [1172.0671 ]
 [1147.3049 ]
 [1141.7295 ]
 [1112.8995 ]
 [1110.096  ]
 [1106.5784 ]
 [1107.5596 ]
 [1113.5376 ]
 [1115.3861 ]
 [1122.5177 ]
 [1116.0657 ]
 [1122.0568 ]
 [1117.2787 ]
 [1118.5985 ]
 [1102.122  ]
 [1094.5609 ]
 [1093.2908 ]
 [1042.3174 ]
 [1050.5374 ]
 [1058.9884 ]
 [1062.5165 ]
 [1069.529  ]
 [1067.0315 ]
 [1068.8413 ]
 [1082.5244 ]
 [1088.9438 ]
 [1088.6642 ]
 [1086.9161 ]
 [1091.3483 ]
 [1089.0988 ]
 [1095.8513 ]
 [1104.1389 ]
 [1103.3412 ]
 [1091.3362 ]
 [1082.3594 ]
 [1086.2991 ]
 [1081.0645 ]
 [1069.8762 ]
 [1092.4469 ]
 [1106.1368 ]
 [1117.1434 ]
 [1136.28   ]
 [1127.6705 ]
 [1121.0089 ]
 [1142.2043 ]
 [1135.7151 ]
 [1113.0496 ]
 [1118.1329 ]
 [1088.4932 ]
 [1076.7345 ]
 [1044.7909 ]
 [1044.9705 ]
 [1030.86   ]
 [1048.8818 ]
 [1070.6907 ]
 [1067.5569 ]
 [1078.5897 ]
 [1067.8849 ]
 [1080.5034 ]
 [1098.7516 ]
 [1096.0657 ]
 [1118.567  ]
 [1133.7731 ]
 [1135.3455 ]
 [1147.9279 ]
 [1147.3314 ]
 [1154.477  ]
 [1162.1432 ]
 [1173.1017 ]
 [1177.2913 ]
 [1189.7162 ]
 [1198.7181 ]
 [1203.2499 ]
 [1199.7245 ]
 [1223.5951 ]]
In [25]:
# Restore the ground truth --- inverse-transform from (0,1) back to the original range
real_stock_price = sc.inverse_transform(test_set[60:])
print(real_stock_price.shape)
print(real_stock_price)
(240, 1)
[[ 925.5 ]
 [ 893.28]
 [ 860.  ]
 [ 875.  ]
 [ 875.66]
 [ 898.86]
 [ 885.  ]
 [ 890.24]
 [ 919.  ]
 [ 927.02]
 [ 900.  ]
 [ 885.  ]
 [ 894.98]
 [ 881.  ]
 [ 865.  ]
 [ 879.1 ]
 [ 877.99]
 [ 888.01]
 [ 910.  ]
 [ 900.53]
 [ 892.  ]
 [ 895.  ]
 [ 884.  ]
 [ 875.62]
 [ 857.98]
 [ 876.  ]
 [ 906.22]
 [ 910.  ]
 [ 911.  ]
 [ 907.  ]
 [ 917.  ]
 [ 940.  ]
 [ 932.5 ]
 [ 969.97]
 [ 978.3 ]
 [ 992.  ]
 [ 985.  ]
 [ 985.  ]
 [ 985.99]
 [1004.52]
 [1025.  ]
 [1020.  ]
 [ 985.8 ]
 [ 986.  ]
 [ 995.05]
 [ 979.3 ]
 [ 980.93]
 [ 989.96]
 [ 981.3 ]
 [ 967.8 ]
 [ 975.45]
 [ 968.  ]
 [ 961.5 ]
 [ 955.  ]
 [ 960.  ]
 [ 962.3 ]
 [ 958.31]
 [ 945.97]
 [ 961.97]
 [ 962.03]
 [ 976.5 ]
 [ 975.  ]
 [ 976.51]
 [ 944.  ]
 [ 945.  ]
 [ 931.  ]
 [ 949.5 ]
 [ 953.5 ]
 [ 978.5 ]
 [1010.31]
 [1016.16]
 [1030.02]
 [1028.  ]
 [1049.84]
 [1061.  ]
 [1055.  ]
 [1070.1 ]
 [1072.99]
 [1105.  ]
 [1113.  ]
 [1117.  ]
 [1109.  ]
 [1105.  ]
 [1125.  ]
 [1139.99]
 [1140.  ]
 [1140.8 ]
 [1129.  ]
 [1144.5 ]
 [1145.  ]
 [1134.3 ]
 [1119.22]
 [1066.  ]
 [1083.  ]
 [1094.  ]
 [1108.5 ]
 [1148.  ]
 [1151.  ]
 [1157.  ]
 [1155.  ]
 [1180.  ]
 [1174.94]
 [1164.  ]
 [1163.  ]
 [1153.  ]
 [1166.2 ]
 [1127.  ]
 [1157.  ]
 [1186.  ]
 [1180.  ]
 [1168.  ]
 [1176.  ]
 [1179.  ]
 [1171.86]
 [1178.  ]
 [1168.  ]
 [1164.95]
 [1153.3 ]
 [1166.01]
 [1181.11]
 [1190.  ]
 [1196.  ]
 [1181.  ]
 [1190.  ]
 [1199.5 ]
 [1197.  ]
 [1196.51]
 [1205.  ]
 [1201.5 ]
 [1204.  ]
 [1207.  ]
 [1227.  ]
 [1230.  ]
 [1225.12]
 [1228.  ]
 [1230.  ]
 [1231.  ]
 [1223.  ]
 [1188.05]
 [1186.68]
 [1192.8 ]
 [1194.97]
 [1165.5 ]
 [1118.2 ]
 [1133.  ]
 [1140.  ]
 [1125.  ]
 [1135.97]
 [1175.  ]
 [1159.6 ]
 [1168.  ]
 [1156.  ]
 [1146.  ]
 [1163.  ]
 [1149.7 ]
 [1174.  ]
 [1163.  ]
 [1162.18]
 [1139.  ]
 [1153.  ]
 [1143.  ]
 [1132.  ]
 [1150.  ]
 [1170.2 ]
 [1183.  ]
 [1128.  ]
 [1117.  ]
 [1070.86]
 [1077.5 ]
 [1085.05]
 [1094.  ]
 [1109.  ]
 [1112.5 ]
 [1124.2 ]
 [1109.01]
 [1118.87]
 [1110.  ]
 [1111.86]
 [1081.  ]
 [1070.  ]
 [1076.  ]
 [ 985.  ]
 [1015.  ]
 [1050.  ]
 [1059.43]
 [1070.01]
 [1062.  ]
 [1063.  ]
 [1089.  ]
 [1098.  ]
 [1090.45]
 [1082.5 ]
 [1090.01]
 [1085.  ]
 [1096.7 ]
 [1111.46]
 [1105.5 ]
 [1078.  ]
 [1062.  ]
 [1076.  ]
 [1070.3 ]
 [1050.11]
 [1098.  ]
 [1125.  ]
 [1136.31]
 [1163.  ]
 [1135.  ]
 [1113.  ]
 [1155.5 ]
 [1141.22]
 [1090.  ]
 [1103.98]
 [1055.  ]
 [1040.  ]
 [ 993.99]
 [1011.  ]
 [1000.  ]
 [1043.  ]
 [1087.98]
 [1073.33]
 [1085.  ]
 [1060.25]
 [1082.  ]
 [1117.  ]
 [1104.  ]
 [1139.03]
 [1161.95]
 [1152.  ]
 [1166.  ]
 [1157.88]
 [1165.  ]
 [1176.  ]
 [1192.  ]
 [1192.97]
 [1210.  ]
 [1221.  ]
 [1221.02]
 [1206.  ]
 [1250.  ]
 [1248.  ]]