3.7 Concise implementation of softmax regression
```python
import torch
from torch import nn
from d2l import torch as d2l
```
```python
batch_size = 256
train_iter, test_iter = d2l.load_data_fashion_mnist(batch_size)  # load the dataset and split it into training and test sets
```
3.7.1 Initialize model parameters
```python
# The softmax regression output layer is a fully connected layer. So to implement the model, we only
# need to add a 10-output fully connected layer to Sequential.
# PyTorch will not adjust the shape of the input implicitly,
# so we define a flatten layer before the Linear layer to adjust the shape of the network in ...
net = nn.Sequential(nn.Flatten(), nn.Linear(784, 10))
```
3.6 Implementation of softmax regression from scratch
```python
# We want to understand the details of softmax, so we use the Fashion-MNIST dataset from Section 3.5,
# and we set batch_size to 256.
import torch
from IPython import display
from d2l import torch as d2l
```
```python
batch_size = 256
train_iter, test_iter = d2l.load_data_fashion_mnist(batch_size, resize=None)  # uses the function we defined in Section 3.5
```
```python
# We know each image is 28x28, so we can regard the images as vectors of length 784.
# Recall that y_j is the probability of class j: a positive number, with the probabilities normalized to sum to 1.
# Because we ...
```
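The exponentiate-and-normalize step described above can be sketched without d2l; a minimal NumPy version (the row-wise max shift for numerical stability is my addition, not part of the notes):

```python
import numpy as np

def softmax(X):
    # Subtract each row's max before exponentiating, for numerical stability.
    X_exp = np.exp(X - X.max(axis=1, keepdims=True))
    # Divide by the row sums so every entry is positive and each row sums to 1.
    return X_exp / X_exp.sum(axis=1, keepdims=True)

probs = softmax(np.array([[1.0, 2.0, 3.0], [0.0, 0.0, 0.0]]))
```

Each row of `probs` is then a valid probability distribution over the classes.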
Linear List
Notes on defining a linear list
We wrap an array: the array data sits at the start of the storage; we set a maximum capacity MAXSIZE, which generally does not change, and a current length length, which does.
The code is as follows:
```c
typedef struct
{
    int data[MAXSIZE];
    int length;
} SqList;
```
For a list that will be modified we pass *L; for one that stays unchanged we pass L by value. Below is the operation A = A ∪ B, so we know A is to be modified and we pass its address.
```c
void unionL(SqList *La, SqList Lb)
{
    int La_len, Lb_len, i;
    ElemType e;
    La_len = ListLength(*La);
    Lb_len = ListLength(Lb);
    for (i = 1; i <= Lb_len; i++)   /* check whether each element of Lb is already in La */
    {
        GetElem(Lb, i, &e);
        if (!LocateElem(*La, e))    /* not found, so insert it at the end of La */
            ListInsert(La, ++La_len, e);
    }
}
```
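For comparison, the same A = A ∪ B logic as a short Python sketch (the function name is my own):

```python
def union_l(la, lb):
    # Append to la each element of lb that la does not already contain,
    # preserving the original order, mirroring unionL above.
    for e in lb:
        if e not in la:
            la.append(e)
    return la

union_l([1, 2, 3], [3, 4, 5])  # returns [1, 2, 3, 4, 5]
```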
3.4 Softmax regression and 3.5 The image classification dataset
3.4 Softmax regression
```python
# Hard category: belongs to exactly one class.
# Soft category: the probability of belonging to each class.
# We use one-hot encoding for the labels: a vector whose dimension equals the number of classes.
# For example, with the three classes cat, dog, and chicken, the vectors are (1,0,0), (0,1,0), (0,0,1);
# when an animal is classified as a chicken, its label is (0,0,1), and similarly for the other two.
# So y ∈ {(1,0,0), (0,1,0), (0,0,1)}.
```
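The one-hot vectors above can be produced mechanically; a small NumPy sketch (the class order cat=0, dog=1, chicken=2 is my assumption):

```python
import numpy as np

labels = np.array([2, 0, 1])            # chicken, cat, dog (assumed order)
one_hot = np.eye(3, dtype=int)[labels]  # row i of the identity matrix is the one-hot vector for class i
# one_hot is [[0, 0, 1], [1, 0, 0], [0, 1, 0]]
```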
3.4.2 Network architecture
```python
# Below is the formula-derivation part; see the existing link for details. The derivation will be added here later.
```
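Until the derivation is filled in, the architecture itself can be stated in the book's standard notation: a single fully connected layer followed by softmax,

$$
\mathbf{o} = \mathbf{W}\mathbf{x} + \mathbf{b}, \qquad
\hat{\mathbf{y}} = \mathrm{softmax}(\mathbf{o}), \qquad
\hat{y}_j = \frac{\exp(o_j)}{\sum_k \exp(o_k)}
$$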
3.5 Image classification dataset
```python
# MNIST is one of the most widely used datasets in image classification; we use the more complex Fashion-MNIST dataset.
```
```python
%matplotlib inline
import torch
import torchvision
from torch.utils import data
from torchvision import transforms
from d2l import torch as d2l
```
Python implementation of the K-means algorithm
1. We will add the mathematical derivation later. The idea of the algorithm is as follows:
For a batch of sample points we start with no structure at all. First, randomly pick a few points as the initial centroids. Then, for every sample point, compute its distance to each centroid and assign the point to the nearest centroid; this gives an initial clustering. Next, recompute the centroid of each cluster; with the new centroids, run the distance computation and assignment over all samples again and re-cluster. Iterating this several times produces the final clustering.
Let us write this as pseudocode:
```
init_center[] = select_random_sample_point(k)  # this is where the k in k-means comes from
for i in range(dataset):
    for j in range(init_center):
        sample_to_center_distances[i][j] = calculate_distance(dataset(i), init_center(j))
    sample_belongs_to_center[i] = min(sample_to_center_distances[i])  # take the nearest centroid
int class[][]
k = 0
# put each ...
```
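A runnable sketch of the iteration described above, using NumPy (all names, the empty-cluster handling, and the convergence check are my own choices):

```python
import numpy as np

def kmeans(X, k, num_iters=100, seed=0):
    """Cluster the rows of X into k groups by alternating assignment and update steps."""
    rng = np.random.default_rng(seed)
    # Randomly pick k distinct samples as the initial centroids.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(num_iters):
        # Assignment step: distance from every sample to every centroid,
        # then the index of the nearest centroid per sample.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned samples
        # (keep the old centroid if a cluster ended up empty).
        new_centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break  # centroids stopped moving: converged
        centers = new_centers
    return centers, labels
```

In practice one would rerun this with several random initializations and keep the best result, since the outcome depends on the initial centroids.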
Information Retrieval basics
Precision and Recall
We use precision and recall as measures of the effectiveness of a retrieval system.
Consider the usual 'contingency' table, where A is the set of relevant documents and B is the set of retrieved documents. We now define the measures with formulas:
$$
\begin{aligned}
& PRECISION = \frac{|A\cap B|}{|B|} \\
& RECALL = \frac{|A\cap B|}{|A|} \\
& FALLOUT = \frac{|\overline{A} \cap B|}{|\overline{A}|}
\end{aligned}
$$

There is a functional relationship between all three, involving a parameter called generality (G), which is a measure of the density of the relevant documents in ...
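The relationship alluded to above follows directly from the definitions. Writing $G = |A|/N$ for a collection of $N$ documents, and noting $|A\cap B| = RECALL\cdot|A|$ and $|\overline{A}\cap B| = FALLOUT\cdot|\overline{A}|$:

$$
PRECISION = \frac{|A\cap B|}{|A\cap B| + |\overline{A}\cap B|}
          = \frac{RECALL \cdot G}{RECALL \cdot G + FALLOUT \cdot (1 - G)}
$$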
3.3 Concise implementation of linear regression
```python
import numpy as np
import torch
from torch.utils import data  # the newly imported data package
from d2l import torch as d2l
```
3.3.1 Generating the dataset
```python
true_w = torch.tensor([2, -3.4])
true_b = 4.2
features, labels = d2l.synthetic_data(true_w, true_b, 1000)  # generate 1000 examples from the true weights and bias
```
3.3.2 Reading the dataset
```python
# We use the data API to read the data: we pass the features and labels as parameters
# and specify batch_size for the data iterator.
def load_array(data_arrays, batch_size, is_train=True):  #@save
    # is_train represents whether the data wil ...
    dataset = data.TensorDataset(*data_arrays)
    return data.DataLoader(dataset, batch_size, shuffle=is_train)
```
3.1 Linear regression
3.1.2 Vectorization for speed
```python
%matplotlib inline
import math
import time
import numpy as np
import torch
from d2l import torch as d2l
```
```python
n = 10000  # 10000-dimensional vectors
a = torch.ones(n)
b = torch.ones(n)
a, b
```
(tensor([1., 1., 1., ..., 1., 1., 1.]),
tensor([1., 1., 1., ..., 1., 1., 1.]))
```python
class Timer:  #@save
    """Record multiple running times."""
    def __init__(self):
        self.times = []
        self.start()

    def start(self):
        """Start the timer."""
        self.tik = time.time()

    def stop(self):
        """Stop the timer and record the elapsed time in a list."""
        self.times.append(time.time() - self.tik)
        return self.times[-1]

    def avg(self):
        """Return the average time."""
        return sum(self.times) / len(self.times)

    def sum(self):
        """Return the total time."""
        return sum(self.times)

    def cumsum(self):
        """Return the accumulated times."""
        return np.array(self.times).cumsum().tolist()
```
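The comparison this section makes with Timer can be reproduced standalone; a minimal sketch using only time and NumPy instead of torch/d2l:

```python
import time
import numpy as np

n = 10000
a = np.ones(n)
b = np.ones(n)

# Elementwise addition with an explicit Python loop.
c = np.zeros(n)
tik = time.time()
for i in range(n):
    c[i] = a[i] + b[i]
loop_time = time.time() - tik

# The same addition as one vectorized operation.
tik = time.time()
d = a + b
vec_time = time.time() - tik

print(f'loop: {loop_time:.5f} sec, vectorized: {vec_time:.5f} sec')
```

The vectorized version is typically orders of magnitude faster, which is the point the section is making.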
2.6 Probability and Statistics
```python
%matplotlib inline
import random
import torch
from torch.distributions.multinomial import Multinomial
from d2l import torch as d2l
```
```python
num_toss = 100
heads = sum([random.random() > 0.5 for _ in range(num_toss)])  # random() returns a float between 0 and 1
tails = num_toss - heads
print("heads tails: ", [heads, tails])  # a complete sampling process built on the random function
```
heads tails: [59, 41]
```python
fair_probs = torch.tensor([0.5, 0.5])
Multinomial(100, fair_probs).sample()
# We take 100 draws and set the probability ...
```
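The same 100-draw experiment can be run without torch, using NumPy's generator (the seed is an arbitrary choice of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
# One multinomial sample: how many of 100 fair-coin draws fell into each of the two categories.
counts = rng.multinomial(100, [0.5, 0.5])
```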
2.4 Calculus with PyTorch
```python
%matplotlib inline
# This is a magic command that displays the plotted curves directly in the notebook, non-interactively.
import numpy as np
from matplotlib_inline import backend_inline
from d2l import torch as d2l

def f(x):
    return 3 * x ** 2 - 4 * x
```
```python
def numerical_lim(f, x, h):
    return (f(x + h) - f(x)) / h

h = 0.1
for i in range(5):
    print(f'h = {h:.5f}, numerical limit = {numerical_lim(f, 1, h):.5f}')
    # namely figure out th ...
    h *= 0.1
```
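As a quick check of the code above: since f(x) = 3x² − 4x, we have f′(x) = 6x − 4, so the numerical limit at x = 1 should approach 2 as h shrinks. A self-contained version:

```python
def f(x):
    return 3 * x ** 2 - 4 * x

def numerical_lim(f, x, h):
    # One-sided finite-difference approximation of f'(x).
    return (f(x + h) - f(x)) / h

# Shrinking h by a factor of 10 each step drives the estimate toward f'(1) = 2.
approx = [numerical_lim(f, 1, 0.1 ** (i + 1)) for i in range(5)]
```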



