Homework 3: explanation notes

Part 1:

Idea of my.grad.desc():

my.grad.desc(): x_{n+1} <- x_n - lambda * f'(x_n)   # (lambda is a small step size)

# number of iterations = large (many small steps)
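The update rule above can be sketched in a few lines. This is a hypothetical minimal implementation (the function names `grad_desc` and `f_prime` are my own, not from the homework's my.grad.desc()), shown in Python rather than R:

```python
def grad_desc(f_prime, x0, lam=0.1, n_iter=1000):
    """Minimize f by repeatedly stepping against the gradient:
    x_{n+1} = x_n - lam * f'(x_n), with lam a small fixed step size."""
    x = x0
    for _ in range(n_iter):
        x = x - lam * f_prime(x)
    return x

# toy example: f(x) = (x - 3)^2, so f'(x) = 2*(x - 3); minimum at x = 3
xmin = grad_desc(lambda x: 2 * (x - 3), x0=0.0)
print(round(xmin, 4))  # -> 3.0
```

Note that even on this easy quadratic it takes many iterations to converge, because each step with a fixed small lambda only moves a fraction of the way to the minimum.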
nlm() = non-linear minimization (uses a Newton-type method)

Newton's method: x_{n+1} <- x_n - (f''(x_n))^-1 * f'(x_n)
i.e. the step size lambda = 1 / f''(x_n)

# number of iterations = small
# cost per iteration is high (needs the second derivative / Hessian)
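A minimal sketch of the Newton update, in the same style as the gradient-descent sketch above (the names `newton_min`, `f_prime`, `f_double_prime` are hypothetical; this is not the internals of R's nlm()):

```python
def newton_min(f_prime, f_double_prime, x0, n_iter=20):
    """Newton's method for 1-D minimization:
    x_{n+1} = x_n - f'(x_n) / f''(x_n),
    i.e. an adaptive step size lambda = 1 / f''(x_n)."""
    x = x0
    for _ in range(n_iter):
        x = x - f_prime(x) / f_double_prime(x)
    return x

# same toy example: f(x) = (x - 3)^2, f'(x) = 2*(x - 3), f''(x) = 2
# for a quadratic, Newton lands exactly on the minimum in a single step
xmin = newton_min(lambda x: 2 * (x - 3), lambda x: 2.0, x0=0.0)
print(xmin)  # -> 3.0
```

This illustrates the trade-off in the notes: far fewer iterations than gradient descent, but each iteration also has to evaluate (and in higher dimensions, invert) the second derivative.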

kNN (k-nearest neighbors):

(handwritten kNN notes are in the photos IMG_7691.heic and IMG_7692.heic)

Tidyverse

Importing Data with Base R, readr, and fread