TensorBoard: A Great Visualization Helper (1)

with tf.name_scope('inputs') wraps xs and ys together so that they show up as one large layer (node) in the graph.

The layer's name is the argument passed to tf.name_scope().

with tf.name_scope('inputs'):
    # define placeholders for the network inputs
    xs = tf.placeholder(tf.float32, [None, 1], name='x_in')
    ys = tf.placeholder(tf.float32, [None, 1], name='y_in')

Editing the layer:

When you use one of TensorFlow's built-in activation functions, TensorFlow adds a name for that op automatically.

def add_layer(inputs, in_size, out_size, activation_function=None):
    # add one more layer and return the output of this layer
    with tf.name_scope('layer'):
        with tf.name_scope('weights'):
            Weights = tf.Variable(
                tf.random_normal([in_size, out_size]),
                name='W')
        with tf.name_scope('biases'):
            biases = tf.Variable(
                tf.zeros([1, out_size]) + 0.1,
                name='b')
        with tf.name_scope('Wx_plus_b'):
            Wx_plus_b = tf.add(
                tf.matmul(inputs, Weights),
                biases)
        if activation_function is None:
            outputs = Wx_plus_b
        else:
            outputs = activation_function(Wx_plus_b)
        return outputs
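
For reference, a minimal usage sketch of add_layer — this assumes the same single-feature regression network built in the earlier chapter on the first neural network, so the layer sizes here are only illustrative:

# hidden layer: 10 neurons with ReLU; tf.nn.relu contributes its own default
# name to the graph, so no extra name_scope is needed for the activation
l1 = add_layer(xs, 1, 10, activation_function=tf.nn.relu)
# output layer: 1 neuron, no activation function
prediction = add_layer(l1, 10, 1, activation_function=None)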

The loss part:

# the error between prediction and real data
with tf.name_scope('loss'):
    loss = tf.reduce_mean(
        tf.reduce_sum(tf.square(ys - prediction), axis=[1]))

train_step:

with tf.name_scope('train'):
    train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

tf.summary.FileWriter() saves the "drawn" graph to a directory.

Its second argument must be sess.graph, so this line has to come after the session has been created.

sess.graph collects the graph (framework) information defined above and writes it to the logs/ directory.

sess = tf.Session() # get session
# tf.train.SummaryWriter will soon be deprecated; use the following instead
writer = tf.summary.FileWriter("logs/", sess.graph)
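
One practical note: the graph is written out when the FileWriter is created, so create it only after the whole graph has been defined, otherwise the saved graph will be incomplete; calling writer.close() at the end flushes the event file to disk.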

In the terminal:

tensorboard --logdir logs
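
TensorBoard then prints a local URL (http://localhost:6006 by default); open it in a browser and switch to the GRAPHS tab to see the graph built above. Note that logs/ is resolved relative to the directory the command is run from, so run it where the script wrote its logs.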