++ Linear Regression with multiple variables
X now has several features x1, x2, ...
import tensorflow as tf  # TF 1.x API (under TF 2.x, use tf.compat.v1)
# target function: y = 2*x1 + x2 + x3
# so the true parameters are W = [2, 1, 1], b = 0
x_train = [[1,2,3],[2,3,4],[3,4,5],[4,5,6], [1,3,5]]
y_train = [[7],[11],[15],[19], [10]]
#placeholder
x = tf.placeholder(tf.float32, shape=[None,3])
y = tf.placeholder(tf.float32, shape=[None,1])
# variables
W = tf.Variable(tf.random_normal(shape=[3,1]), name='W')
b = tf.Variable(tf.random_normal(shape=[1]), name='b')
# model
hypothesis = tf.matmul(x, W) + b
cost = tf.reduce_mean( tf.square(hypothesis-y) )
# train
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(cost)
sess = tf.Session()
sess.run(tf.global_variables_initializer())
# (optional) inspect the untrained model's predictions:
# sess.run(hypothesis, feed_dict={x: x_train})
for i in range(3001):
    _, pcost, _W, _b = sess.run([train, cost, W, b], feed_dict={x: x_train, y: y_train})
    if i % 500 == 0:
        print(i, pcost, _W, _b)
# test: predict for an unseen sample
print(sess.run(hypothesis, feed_dict={x: [[4, 2, 3]]}))
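To see what GradientDescentOptimizer is doing on this model, the same update can be sketched in plain NumPy (a minimal illustration using the same data, learning rate, and step count as the listing above; the seed and the hand-derived MSE gradients are my own choices, not TensorFlow internals):

```python
import numpy as np

# Training data from the listing above: y = 2*x1 + x2 + x3
X = np.array([[1, 2, 3], [2, 3, 4], [3, 4, 5], [4, 5, 6], [1, 3, 5]], dtype=float)
y = np.array([[7], [11], [15], [19], [10]], dtype=float)

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 1))   # same shape as the tf.Variable W: [3, 1]
b = rng.normal(size=(1,))     # same shape as b: [1]

lr = 0.01
for i in range(3001):
    pred = X @ W + b                        # hypothesis = matmul(x, W) + b
    err = pred - y
    cost = np.mean(err ** 2)                # reduce_mean(square(hypothesis - y))
    # gradients of the mean-squared error with respect to W and b
    grad_W = 2.0 / len(X) * X.T @ err
    grad_b = 2.0 / len(X) * err.sum(axis=0)
    W -= lr * grad_W                        # the gradient-descent update step
    b -= lr * grad_b

print(cost)       # cost should be near 0 after training
print(X @ W + b)  # predictions should be close to y_train
```

Note that the recovered weights need not be exactly [2, 1, 1]: the three feature columns of this tiny training set are nearly collinear, so several (W, b) combinations fit the data equally well.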