Two-dimensional data regression using PyTorch
"""
View more, visit my tutorial page: https://morvanzhou.github.io/tutorials/
My Youtube Channel: https://www.youtube.com/user/MorvanZhou
Dependencies:
torch: 0.1.11
matplotlib
"""
import torch
from torch.autograd import Variable
import torch.nn.functional as F
import matplotlib.pyplot as plt
import numpy as np
torch.manual_seed(1) # reproducible
# x = torch.unsqueeze(torch.linspace(-1, 1, 100), dim=1) # x data (tensor), shape=(100, 1)
# y = x.pow(2) + 0.2*torch.rand(x.size()) # noisy y data (tensor), shape=(100, 1)
x = np.array([[0.7060, 0.8235, 0.4387],
              [0.0318, 0.6948, 0.3816],
              [0.2769, 0.3171, 0.7655],
              [0.0462, 0.9502, 0.7952]], dtype=np.float32)  # x data, shape=(4, 3)
y = np.array([[1.8447], [1.2161], [1.4053], [1.7721]], dtype=np.float32)  # y data, shape=(4, 1)
# torch can only train on Variable, so convert them to Variable
x, y = Variable(torch.from_numpy(x)), Variable(torch.from_numpy(y))
# plt.scatter(x.data.numpy()[:,0], y.data.numpy())
# plt.show()
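# NOTE: this script targets the old torch 0.1.11 API declared above. Since
# PyTorch 0.4 the Variable wrapper has been merged into Tensor, so on a modern
# install the torch.from_numpy(...) results can be used directly and the
# .data accessors below can be replaced with .detach().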
class Net(torch.nn.Module):
    def __init__(self, n_feature, n_hidden1, n_hidden2, n_output):
        super(Net, self).__init__()
        self.hidden1 = torch.nn.Linear(n_feature, n_hidden1)  # hidden layer 1
        self.hidden2 = torch.nn.Linear(n_hidden1, n_hidden2)  # hidden layer 2
        self.predict = torch.nn.Linear(n_hidden2, n_output)   # output layer

    def forward(self, x):
        x = F.relu(self.hidden1(x))  # activation function for hidden layers
        x = F.relu(self.hidden2(x))
        x = self.predict(x)          # linear output
        return x
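# As instantiated below, the network maps the 3 input features through two
# 5-unit ReLU hidden layers to a single linear output: 3 -> 5 -> 5 -> 1.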
net = Net(n_feature=3, n_hidden1=5, n_hidden2=5, n_output=1) # define the network
print(net) # net architecture
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
loss_func = torch.nn.MSELoss()  # mean squared error loss, the standard choice for regression
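# With its default settings, MSELoss computes the mean squared error over all
# elements: loss = mean((prediction - target) ** 2); the same value could be
# computed by hand as ((prediction - y) ** 2).mean().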
plt.ion()  # turn on interactive plotting so the figure updates during training
for t in range(10000):
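    # each step: forward pass, compute loss, zero stale gradients,
    # backpropagate, then apply the SGD update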
    prediction = net(x)              # input x and predict based on x
    loss = loss_func(prediction, y)  # must be (1. nn output, 2. target)
    optimizer.zero_grad()            # clear gradients for next train
    loss.backward()                  # backpropagation, compute gradients
    optimizer.step()                 # apply gradients
    print(loss.data.numpy())
    if t % 5 == 0:
        # plot and show learning process
        plt.cla()
        plt.scatter(x.data.numpy()[:, 1], y.data.numpy())
        plt.plot(x.data.numpy()[:, 1], prediction.data.numpy(), 'r-', lw=5)
        plt.text(0.5, 0, 'Loss=%.4f' % loss.data[0], fontdict={'size': 20, 'color': 'red'})
        plt.pause(0.1)
plt.ioff()
plt.show()
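
# Once training finishes, the fitted net can be applied to unseen inputs.
# A minimal inference sketch -- the 3-feature sample below is invented for
# illustration and is not part of the original data:
new_x = Variable(torch.from_numpy(np.array([[0.5, 0.5, 0.5]], dtype=np.float32)))
print(net(new_x).data.numpy())  # predicted target for the hypothetical sample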