Repost: Predicting the Gold Futures Main Contract Closing Price with a Keras CNN+LSTM

The data comes from JQData's local quantitative finance data service.
The previous post ran two experiments to predict the closing price of the gold futures main contract.
Experiment 2:
use the open close high low volume money values of the previous 5 time steps
to predict the closing price at the current time step,
i.e. [None, 5, 6] => [None, 1]  # None is the batch_size
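A minimal sketch of how such windows could be built, assuming the bars are already in a time-ordered NumPy array whose columns are open, close, high, low, volume, money (the helper name and column order are illustrative, not from the original post):

```python
import numpy as np

def make_windows(values, lookback=5):
    """Slice a (T, 6) array of bars into (N, 5, 6) input windows and the
    close price of the bar immediately after each window."""
    X, y = [], []
    for i in range(len(values) - lookback):
        X.append(values[i:i + lookback])   # previous 5 time steps, all 6 features
        y.append(values[i + lookback, 1])  # column 1 assumed to be `close`
    return np.array(X), np.array(y)

# values: rows ordered by time, columns = [open, close, high, low, volume, money]
# X, y = make_windows(values)   # X.shape == (N, 5, 6), y.shape == (N,)
```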
This post extends the model from Experiment 2 by adding a CNN stage:
each sample is a 5-row by 6-column 2D array, so a CNN can be used for feature extraction.
Model architecture (a sketch follows this list):
Input layer
CNN for feature extraction + pooling layer + dropout
Bidirectional LSTM layer
Output layer
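A minimal Keras sketch of this stack, assuming TensorFlow 2.x and a Conv1D over the time axis (consistent with the 1D-convnet notes quoted below); the filter count, kernel size, LSTM units, and dropout rate are placeholders, since the original post does not list its hyperparameters:

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(5, 6)),                    # 5 time steps, 6 features
    layers.Conv1D(32, kernel_size=2, activation='relu',
                  padding='same'),                 # CNN feature extraction
    layers.MaxPooling1D(pool_size=2),              # pooling
    layers.Dropout(0.2),                           # dropout
    layers.Bidirectional(layers.LSTM(64)),         # bidirectional LSTM
    layers.Dense(1),                               # regression output: close price
])
model.compile(optimizer='adam', loss='mse')
```

With padding='same' the convolution keeps all 5 time steps; the pooled sequence is then read by the bidirectional LSTM before the single-unit regression output, and the model can be trained on the [None, 5, 6] windows described above.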

Experimental results (on the test set): test is the actual closing price of the test set, pred is the closing price predicted by the model.
Here’s what you should take away from this section:

In the same way that 2D convnets perform well for processing visual patterns in 2D space, 1D convnets perform well for processing temporal patterns. They offer a faster alternative to RNNs on some problems, in particular NLP tasks.
Typically 1D convnets are structured much like their 2D equivalents from the world of computer vision: they consist of stacks of Conv1D layers and MaxPooling1D layers, eventually ending in a global pooling operation or flattening operation.
Because RNNs are extremely expensive for processing very long sequences, but 1D convnets are cheap, it can be a good idea to use a 1D convnet as a preprocessing step before an RNN, shortening the sequence and extracting useful representations for the RNN to process (see the sketch after these points).
One useful and important concept that we will not cover in these pages is that of 1D convolution with dilated kernels.
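As a concrete illustration of that preprocessing idea, here is a minimal sketch that shortens a long sequence with Conv1D/MaxPooling1D before an LSTM; the 500-step window length, filter counts, and pooling sizes are illustrative assumptions, not values from the original post:

```python
from tensorflow.keras import layers, models

# Conv1D as cheap preprocessing: downsample a long sequence before the RNN
model = models.Sequential([
    layers.Input(shape=(500, 6)),              # e.g. a 500-step window of 6 features
    layers.Conv1D(32, 5, activation='relu'),
    layers.MaxPooling1D(4),                    # sequence length shrinks roughly 4x
    layers.Conv1D(32, 5, activation='relu'),
    layers.MaxPooling1D(4),                    # the LSTM now only sees 30 steps
    layers.LSTM(32),
    layers.Dense(1),
])
```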
So the model architecture used here is CNN+LSTM.