Machine Learning
Let \(x\) be the number of hours studied and \(y\) the score that amount of study earns on the exam.
The goal here is to find the model that best fits these data.
Linear Regression
Linear Model:\(\hat{y}=x*w\)
Training Loss (Error)
MSE (Mean Squared Error):
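For \(N\) training samples with predictions \(\hat{y}_n = x_n \cdot w\), the training loss used here is

\[
\text{MSE} = \frac{1}{N}\sum_{n=1}^{N}\left(\hat{y}_n - y_n\right)^2 .
\]

The tables below evaluate it sample by sample for a few candidate values of \(w\), starting with \(w=3\):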
\(x\ (Hours)\) | \(y\ (Points)\) | \(\hat{y}\ (w=3)\) | \(Loss\ (w=3)\) |
---|---|---|---|
1 | 2 | 3 | 1 |
2 | 4 | 6 | 4 |
3 | 6 | 9 | 9 |
  |   |   | \(mean=14/3\) |
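Written out, the \(w=3\) column is just the squared errors and their average:

\[
\frac{(3-2)^2 + (6-4)^2 + (9-6)^2}{3} = \frac{1+4+9}{3} = \frac{14}{3} \approx 4.67 .
\]

Repeating the same computation with \(w=4\):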
\(x\ (Hours)\) | \(y\ (Points)\) | \(\hat{y}\ (w=4)\) | \(Loss\ (w=4)\) |
---|---|---|---|
1 | 2 | 4 | 4 |
2 | 4 | 8 | 16 |
3 | 6 | 12 | 36 |
  |   |   | \(mean=56/3\) |
Sweeping \(w\) across several values gives the per-sample losses and the resulting MSE for each candidate:
\(x\ (Hours)\) | \(Loss\ (w=0)\) | \(Loss\ (w=1)\) | \(Loss\ (w=2)\) | \(Loss\ (w=3)\) | \(Loss\ (w=4)\) |
---|---|---|---|---|---|
1 | 4 | 1 | 0 | 1 | 4 |
2 | 16 | 4 | 0 | 4 | 16 |
3 | 36 | 9 | 0 | 9 | 36 |
\(MSE\) | 18.7 | 4.7 | 0 | 4.7 | 18.7 |
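This table can be reproduced in a few lines with NumPy broadcasting; a minimal sketch, assuming the same three training pairs (the names `ws`, `y_pred`, `mse` are ad hoc and not taken from the script below):

import numpy as np

x = np.array([1.0, 2.0, 3.0])               # hours
y = np.array([2.0, 4.0, 6.0])               # points
ws = np.array([0.0, 1.0, 2.0, 3.0, 4.0])    # candidate weights from the table

# Broadcasting: predictions for every (w, x) pair, shape (5, 3).
y_pred = ws[:, None] * x[None, :]
mse = ((y_pred - y) ** 2).mean(axis=1)      # one MSE per candidate w

for w_val, m in zip(ws, mse):
    print(f'w={w_val:.1f}  MSE={m:.2f}')    # 18.67, 4.67, 0.00, 4.67, 18.67

The loop-based script below performs the same sweep at a finer 0.1 step and plots the resulting loss curve.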
import numpy as np
import matplotlib.pyplot as plt

x_data = [1.0, 2.0, 3.0]
y_data = [2.0, 4.0, 6.0]

def forward(x):
    return x * w                               # linear model: y_hat = x * w

def loss(x, y):
    y_pred = forward(x)
    return (y_pred - y) ** 2                   # squared error (y_pred - y)^2 for one sample

w_list = []
mse_list = []
for w in np.arange(0.0, 4.1, 0.1):             # sweep w from 0.0 to 4.0 in steps of 0.1
    print('w=', w)
    l_sum = 0                                  # reset the accumulated loss
    for x_val, y_val in zip(x_data, y_data):   # take x, y in pairs
        y_pred_val = forward(x_val)            # prediction y_pred
        loss_val = loss(x_val, y_val)          # squared error for this sample
        l_sum += loss_val                      # add it to the accumulated loss
        print('\t', x_val, y_val, y_pred_val, loss_val)
    print('MSE', l_sum / 3)                    # mean squared error over the 3 samples
    w_list.append(w)                           # record w
    mse_list.append(l_sum / 3)                 # record the corresponding MSE
Partial output (shown here for \(w\) from 1.9 to 3.5):
w= 1.9000000000000001
1.0 2.0 1.9000000000000001 0.009999999999999974
2.0 4.0 3.8000000000000003 0.0399999999999999
3.0 6.0 5.7 0.0899999999999999
MSE 0.046666666666666586
w= 2.0
1.0 2.0 2.0 0.0
2.0 4.0 4.0 0.0
3.0 6.0 6.0 0.0
MSE 0.0
w= 2.1
1.0 2.0 2.1 0.010000000000000018
2.0 4.0 4.2 0.04000000000000007
3.0 6.0 6.300000000000001 0.09000000000000043
MSE 0.046666666666666835
w= 2.2
1.0 2.0 2.2 0.04000000000000007
2.0 4.0 4.4 0.16000000000000028
3.0 6.0 6.6000000000000005 0.36000000000000065
MSE 0.18666666666666698
w= 2.3000000000000003
1.0 2.0 2.3000000000000003 0.09000000000000016
2.0 4.0 4.6000000000000005 0.36000000000000065
3.0 6.0 6.9 0.8100000000000006
MSE 0.42000000000000054
w= 2.4000000000000004
1.0 2.0 2.4000000000000004 0.16000000000000028
2.0 4.0 4.800000000000001 0.6400000000000011
3.0 6.0 7.200000000000001 1.4400000000000026
MSE 0.7466666666666679
w= 2.5
1.0 2.0 2.5 0.25
2.0 4.0 5.0 1.0
3.0 6.0 7.5 2.25
MSE 1.1666666666666667
w= 2.6
1.0 2.0 2.6 0.3600000000000001
2.0 4.0 5.2 1.4400000000000004
3.0 6.0 7.800000000000001 3.2400000000000024
MSE 1.6800000000000008
w= 2.7
1.0 2.0 2.7 0.49000000000000027
2.0 4.0 5.4 1.960000000000001
3.0 6.0 8.100000000000001 4.410000000000006
MSE 2.2866666666666693
w= 2.8000000000000003
1.0 2.0 2.8000000000000003 0.6400000000000005
2.0 4.0 5.6000000000000005 2.560000000000002
3.0 6.0 8.4 5.760000000000002
MSE 2.986666666666668
w= 2.9000000000000004
1.0 2.0 2.9000000000000004 0.8100000000000006
2.0 4.0 5.800000000000001 3.2400000000000024
3.0 6.0 8.700000000000001 7.290000000000005
MSE 3.780000000000003
w= 3.0
1.0 2.0 3.0 1.0
2.0 4.0 6.0 4.0
3.0 6.0 9.0 9.0
MSE 4.666666666666667
w= 3.1
1.0 2.0 3.1 1.2100000000000002
2.0 4.0 6.2 4.840000000000001
3.0 6.0 9.3 10.890000000000004
MSE 5.646666666666668
w= 3.2
1.0 2.0 3.2 1.4400000000000004
2.0 4.0 6.4 5.760000000000002
3.0 6.0 9.600000000000001 12.96000000000001
MSE 6.720000000000003
w= 3.3000000000000003
1.0 2.0 3.3000000000000003 1.6900000000000006
2.0 4.0 6.6000000000000005 6.7600000000000025
3.0 6.0 9.9 15.210000000000003
MSE 7.886666666666668
w= 3.4000000000000004
1.0 2.0 3.4000000000000004 1.960000000000001
2.0 4.0 6.800000000000001 7.840000000000004
3.0 6.0 10.200000000000001 17.640000000000008
MSE 9.14666666666667
w= 3.5
1.0 2.0 3.5 2.25
2.0 4.0 7.0 9.0
3.0 6.0 10.5 20.25
MSE 10.5
plt.plot(w_list, mse_list)   # MSE as a function of w
plt.ylabel('Loss')
plt.xlabel('w')
plt.show()
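The plotted curve is the loss landscape over \(w\). To read the minimizing weight off the sweep programmatically, a minimal follow-up sketch (assuming `w_list` and `mse_list` from the script above are still in scope):

best_idx = int(np.argmin(mse_list))      # index of the smallest recorded MSE
print('best w =', w_list[best_idx], 'MSE =', mse_list[best_idx])

For this dataset the curve bottoms out at \(w=2.0\) with an MSE of 0, matching the \(w=2\) column of the table and the output above.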