【About the folders】

  Here Keras is installed on Windows via Anaconda.

  There are two main Anaconda-related folders to know about:

  1 The keras subfolder under the Anaconda installation directory; search for it if you cannot find it directly.

  2 The folder where Keras stores downloaded models and dataset files: the .keras folder under the corresponding user folder. Note the leading dot; if you really cannot find it, search for it.

【Datasets】: after downloading, data is stored by default in a file of the same name under C:\Users\Administrator\.keras\datasets (note the leading dot).
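
  For reference, the same folder can be computed for whichever user account is active. A minimal sketch, assuming the default cache location has not been overridden:

import os

# Default Keras dataset cache folder: <user home>\.keras\datasets
print(os.path.join(os.path.expanduser('~'), '.keras', 'datasets'))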

  To perform the download, import the corresponding module and use the download function that the dataset module provides. The module file layout is shown in the figure below; you can find this directory by searching for keras under the Anaconda installation folder.

  [Figure: file layout of the Keras dataset modules]

  Take the cifar10 dataset module as an example; its contents are as follows:

# Copyright 2015 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""CIFAR10 small images classification dataset.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import os

import numpy as np

from tensorflow.python.keras import backend as K
from tensorflow.python.keras.datasets.cifar import load_batch
from tensorflow.python.keras.utils.data_utils import get_file
from tensorflow.python.util.tf_export import tf_export


@tf_export('keras.datasets.cifar10.load_data')
def load_data():
  """Loads CIFAR10 dataset.

  Returns:
      Tuple of Numpy arrays: `(x_train, y_train), (x_test, y_test)`.
  """
  dirname = 'cifar-10-batches-py'
  origin = 'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz'
  path = get_file(dirname, origin=origin, untar=True)

  num_train_samples = 50000

  x_train = np.empty((num_train_samples, 3, 32, 32), dtype='uint8')
  y_train = np.empty((num_train_samples,), dtype='uint8')

  for i in range(1, 6):
    fpath = os.path.join(path, 'data_batch_' + str(i))
    (x_train[(i - 1) * 10000:i * 10000, :, :, :],
     y_train[(i - 1) * 10000:i * 10000]) = load_batch(fpath)

  fpath = os.path.join(path, 'test_batch')
  x_test, y_test = load_batch(fpath)

  y_train = np.reshape(y_train, (len(y_train), 1))
  y_test = np.reshape(y_test, (len(y_test), 1))

  if K.image_data_format() == 'channels_last':
    x_train = x_train.transpose(0, 2, 3, 1)
    x_test = x_test.transpose(0, 2, 3, 1)

  return (x_train, y_train), (x_test, y_test)
dirname = 'cifar-10-batches-py' # the directory name the downloaded files are stored under
origin = 'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz' # the URL to download from
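
  These two values are passed to get_file(), which downloads the archive, extracts it, and caches it under the .keras\datasets folder. A minimal sketch of calling get_file() directly with the same arguments (assuming a standalone Keras install where keras.utils.get_file is available):

from keras.utils import get_file

# Download (or reuse the cached copy of) the CIFAR-10 archive and extract it
path = get_file('cifar-10-batches-py',
                origin='https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
                untar=True)
print(path)  # e.g. C:\Users\Administrator\.keras\datasets\cifar-10-batches-py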

When a program needs to download a dataset, first import the corresponding module and then call its load_data() function:
# import the corresponding module from Keras
from keras.datasets import cifar10  # this cifar10 corresponds to the cifar10.py shown above

# download the dataset; here we use cifar10
(x_train, y_train), (x_validate, y_validate) = cifar10.load_data()  # the download is really slow -- about an hour and a half here!
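
  After load_data() returns, a quick check of the array shapes confirms what the loader above produces. A small follow-up sketch; the shapes assume the default channels_last image data format:

# 50000 training images and 10000 test images, each 32x32 RGB; labels are column vectors
print(x_train.shape, y_train.shape)        # (50000, 32, 32, 3) (50000, 1)
print(x_validate.shape, y_validate.shape)  # (10000, 32, 32, 3) (10000, 1)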

After the download completes, the contents of the folder holding the data are as shown in the figure below.

  If the in-program download is slow, you can also download the data from GitHub yourself and replace the corresponding file; see the sketch below for where the file is expected to live.
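
  A hedged sketch of the expected locations, assuming get_file()'s standard cache layout (the archive is cached as '<dirname>.tar.gz'; adjust if your Keras version lays things out differently):

import os

datasets_dir = os.path.join(os.path.expanduser('~'), '.keras', 'datasets')
# The downloaded cifar-10-python.tar.gz is cached under the dirname used by cifar10.py
archive = os.path.join(datasets_dir, 'cifar-10-batches-py.tar.gz')
extracted = os.path.join(datasets_dir, 'cifar-10-batches-py')  # folder created after extraction

print('archive cached:   ', os.path.isfile(archive))
print('already extracted:', os.path.isdir(extracted))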

【Models】: after downloading, model weights are stored by default in a file of the same name under C:\Users\Administrator\.keras\models (note the leading dot).

  The model files in Keras are wrapped in the Applications module. For details see the official Chinese documentation, which is quite thorough: https://keras-cn.readthedocs.io/en/latest/

  The model-related files live in the applications folder, at the same level as datasets. Note that this is a subfolder of the Anaconda installation directory.

  The models Keras provides are shown in the figure below:

  [Figure: the models provided under the applications folder]
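
  As a partial illustration (a hedged sketch, not an exhaustive list, and the exact set depends on the Keras version), several of these constructors can be imported straight from keras.applications:

# A few of the model constructors provided by keras.applications
from keras.applications import VGG16, ResNet50, InceptionV3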

  When you instantiate one of these models in a program, the corresponding weight file is downloaded automatically and stored under C:\Users\Administrator\.keras\models.

from keras.applications.inception_v3 import InceptionV3  # import needed for the line below
base_model = InceptionV3(weights='imagenet', include_top=False)  # load the model without the top (classification) layers
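
  As a follow-up, a hedged sketch of using the downloaded weights for feature extraction (the file name 'example.jpg' and the 299x299 target size are illustrative assumptions, not from the original post):

import numpy as np
from keras.applications.inception_v3 import InceptionV3, preprocess_input
from keras.preprocessing import image

base_model = InceptionV3(weights='imagenet', include_top=False)  # triggers the download on first use

img = image.load_img('example.jpg', target_size=(299, 299))  # hypothetical local image
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)  # add the batch dimension
x = preprocess_input(x)        # scale pixels to the range InceptionV3 expects

features = base_model.predict(x)
print(features.shape)          # (1, 8, 8, 2048) for a 299x299 input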


  When downloading a model inside the program is slow (for example, the Inception v3 weights are more than 80 MB, and my download showed an estimated 3 hours), you can download the file yourself and replace the corresponding file.

  The download URL can be taken from the error message when the program fails, or found inside the corresponding file under the applications folder.

  Below is the part of inception_v3.py that defines the download URLs:

WEIGHTS_PATH = 'https://github.com/fchollet/deep-learning-models/releases/download/v0.5/inception_v3_weights_tf_dim_ordering_tf_kernels.h5'
WEIGHTS_PATH_NO_TOP = 'https://github.com/fchollet/deep-learning-models/releases/download/v0.5/inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5'
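
  A hedged sketch for checking whether a manually downloaded copy of the notop weights is already in the cache (assuming application weights are cached under ~/.keras/models; the file name comes from WEIGHTS_PATH_NO_TOP above):

import os

weights_name = 'inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5'
cache_path = os.path.join(os.path.expanduser('~'), '.keras', 'models', weights_name)

if os.path.isfile(cache_path):
    print('cached weights found:', cache_path)  # InceptionV3(...) will reuse this file
else:
    print('no cached copy; Keras will download from WEIGHTS_PATH_NO_TOP')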