TensorFlow Study Notes
  • Introduction
  • TensorFlow Basics
    • Processing Structure
    • Example 2
    • Session Control
    • Variables
    • Placeholder Inputs
    • Activation Functions
  • Building Your First Neural Network
    • Adding a Layer: def add_layer()
    • Building the Neural Network
    • Result Visualization
    • Optimizer
    • Dataset
  • TensorBoard: A Visualization Helper
    • TensorBoard Visualization Helper 1
    • TensorBoard Visualization Helper 2
  • Advanced Topics
    • Classification
    • Dropout to Solve Overfitting
    • CNN (Convolutional Neural Network) 1
    • CNN (Convolutional Neural Network) 2
    • Saver: Save and Restore
    • RNN LSTM Recurrent Neural Network (Classification Example)
    • RNN LSTM (Regression Example)
    • RNN LSTM (Regression Example, Visualized)
    • Autoencoder (Unsupervised Learning)
    • scope Naming
    • Batch Normalization
    • Visualizing Gradient Descent with TensorFlow
Result Visualization

The key part of result visualization is the training loop: every few steps, clear the axes, re-plot the raw data, overlay the current prediction curve, and pause briefly so the figure refreshes. The excerpt below is taken from the full script that follows.

for step in range(100):
    # training: run one optimization step and fetch the current loss and prediction
    _, l, pred = sess.run([train_op, loss, output], {tf_x: x, tf_y: y})
    if step % 5 == 0:
        plt.cla()                              # clear the previous frame
        plt.scatter(x, y)                      # raw data points
        plt.plot(x, pred, 'r-', lw=5)          # current fitted curve
        plt.text(0.5, 0, 'Loss=%.4f' % l, fontdict={'size': 20, 'color': 'red'})
        plt.pause(0.1)                         # short pause so the window updates

The complete runnable script:
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np

tf.set_random_seed(1)
np.random.seed(1)

# fake data
x = np.linspace(-1, 1, 100)[:, np.newaxis]          # shape (100, 1)
noise = np.random.normal(0, 0.1, size=x.shape)
y = np.power(x, 2) + noise                          # shape (100, 1) + some noise

# plot data
plt.scatter(x, y)
plt.show()

tf_x = tf.placeholder(tf.float32, x.shape)     # input x
tf_y = tf.placeholder(tf.float32, y.shape)     # input y

# neural network layers
l1 = tf.layers.dense(tf_x, 10, tf.nn.relu)          # hidden layer
output = tf.layers.dense(l1, 1)                     # output layer

loss = tf.losses.mean_squared_error(tf_y, output)   # compute cost
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.5)
train_op = optimizer.minimize(loss)

sess = tf.Session()                                 # create a session to run the graph
sess.run(tf.global_variables_initializer())         # initialize all variables in the graph

plt.ion()   # interactive mode, so the figure can be redrawn inside the training loop

for step in range(100):
    # train and net output
    _, l, pred = sess.run([train_op, loss, output], {tf_x: x, tf_y: y})
    if step % 5 == 0:
        # plot and show learning process
        plt.cla()
        plt.scatter(x, y)
        plt.plot(x, pred, 'r-', lw=5)
        plt.text(0.5, 0, 'Loss=%.4f' % l, fontdict={'size': 20, 'color': 'red'})
        plt.pause(0.1)

plt.ioff()
plt.show()
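
Note that plt.ion() / plt.pause() needs an interactive Matplotlib backend, so the animation will not show up on a headless machine (for example over SSH). A simple workaround is to save each snapshot to a PNG file instead of pausing an interactive window. The sketch below replaces the training loop above; the frames/ output directory and the file-name pattern are hypothetical choices for illustration, not part of the original tutorial code.

import os

frames_dir = 'frames'                                   # hypothetical output directory
os.makedirs(frames_dir, exist_ok=True)

for step in range(100):
    # same training step as above
    _, l, pred = sess.run([train_op, loss, output], {tf_x: x, tf_y: y})
    if step % 5 == 0:
        plt.cla()
        plt.scatter(x, y)
        plt.plot(x, pred, 'r-', lw=5)
        plt.text(0.5, 0, 'Loss=%.4f' % l, fontdict={'size': 20, 'color': 'red'})
        # write the current frame to disk instead of pausing an interactive window
        plt.savefig(os.path.join(frames_dir, 'step_%03d.png' % step))

If you are on TensorFlow 2.x, this 1.x-style script can usually still be run by importing tensorflow.compat.v1 as tf and calling tf.disable_v2_behavior() at the top; that is a compatibility shim, not something these notes originally cover.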