Exercises for the coming week
Week 17
Exam 2020: Problems d) and e).
Exercises for past weeks
Week 16
Extra exercises 12 and 13. You can find the zip_nn.R file, with examples of the use of nnet() and mlp(), on the 2022 page: /studier/emner/matnat/math/STK2100/v22/r-scripts/.
Exam 2021: Problem 3
- Solutions: see solutions from the previous week.
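As a starting point for the extra exercises, fitting a small single-hidden-layer network with nnet() looks roughly like the sketch below. This is illustrative only, not the course's zip_nn.R: the data set (iris as a stand-in) and the tuning values (size, decay, maxit) are assumptions.

```r
# Minimal sketch of nnet() usage (illustrative; not the course's zip_nn.R).
library(nnet)

set.seed(1)
# Fit a single-hidden-layer network; size, decay and maxit are illustrative choices.
fit <- nnet(Species ~ ., data = iris, size = 4, decay = 0.1,
            maxit = 200, trace = FALSE)

# Training-set confusion matrix.
pred <- predict(fit, iris, type = "class")
table(predicted = pred, true = iris$Species)
```

The mlp() function from the RSNNS package follows a similar pattern but takes a numeric input matrix and a coded target matrix rather than a formula.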
Week 15
Exam 2021: Problems 1, 2
Modified version of extra exercise 10: The purpose of this exercise is to reproduce Figures 8.6 and 8.8 of ISLR. The data set to be used in the analysis is the Heart data, stored in the file Heart.csv in the data folder of the course. Here are the modified exercises:
a) Do as written in the exercise.
b) Reproduce Figure 8.6 of ISLR, but use r-code-week11.R from the lecture on March 19 as inspiration.
c) Do as written in the exercise, but to get the black curve, use mtry=p (i.e. do splits based on all covariates).
d) Do as written in the exercise, and additionally, you may also use r-code-week14.R from the lecture on April 8 as inspiration.
e) Add additional curves to the plot by also fitting gradient boosted trees, with possibly different tree sizes, using r-code-week13.R from the lecture on March 26 as inspiration. Here, you only need to compute the test error.
- Solutions: R-script.
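The bagging / random forest / boosting comparison in parts c)-e) can be sketched as below. This is a sketch under assumptions, not the course's solution script: it assumes Heart.csv has been downloaded to the working directory and has a binary outcome column named AHD, and the tuning values for gbm() are illustrative.

```r
# Sketch of the bagging / random forest / boosting comparison for the Heart data.
# Assumptions: Heart.csv is in the working directory with binary outcome AHD;
# tuning values are illustrative, not taken from the course scripts.
library(randomForest)
library(gbm)

heart <- na.omit(read.csv("Heart.csv", stringsAsFactors = TRUE))
set.seed(1)
train <- sample(nrow(heart), nrow(heart) / 2)
p <- ncol(heart) - 1  # number of covariates

# Bagging: a random forest with mtry = p, i.e. all covariates tried at each split.
bag <- randomForest(AHD ~ ., data = heart[train, ], mtry = p)
# Random forest with the default mtry (about sqrt(p) for classification).
rf  <- randomForest(AHD ~ ., data = heart[train, ])

# Test errors (misclassification rates on the held-out half).
mean(predict(bag, heart[-train, ]) != heart$AHD[-train])
mean(predict(rf,  heart[-train, ]) != heart$AHD[-train])

# Gradient boosting needs a 0/1 response.
heart$y <- as.numeric(heart$AHD == "Yes")
boost <- gbm(y ~ . - AHD, data = heart[train, ], distribution = "bernoulli",
             n.trees = 1000, interaction.depth = 2, shrinkage = 0.01)
phat <- predict(boost, heart[-train, ], n.trees = 1000, type = "response")
mean((phat > 0.5) != heart$y[-train])
```

Varying interaction.depth and n.trees in the gbm() call gives the extra curves asked for in e).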
Week 13
Exercises from the book: 9.6 (in b, only fit a tree). The ozone data are available here, with some info here.
- Solution: R script.
Exercises from ISLR: Exercise 8.4, 8.8a-c,e
- Solutions: 8.4 and R-code for 8.8a-c, e.
Exam 2019: Exercise 1e (this was given by mistake last week before you had learned about decision trees, sorry about that!)
- Solution: can be found under last week.
Week 12
Exam STK2100 2019: Exercises 1a,d,e and 3
Exam STK2100 2022: Exercise 1 (in d you do not need to consider the last three methods in the table)
Week 11
Exercises from the book: 5.4 and 5.7
- Solutions: see the solution manual and/or Vera's notes (5.4 and 5.7).
Exercises from ISLR: Exercise 7.1, 7.9
- Solutions: 7.1. Note that there are some similarities between ISLR 7.1 and ESL Ex. 5.1 from week 11. For 7.9, see the R code examples in ISLR Ch. 7.8 Lab: Non-linear Modeling.
Extra exercise 7
- Solution: Vinni's solution (pages 22-27)
Week 10
Exercises from the book: 4.2 and 5.1
- Solutions: 4.2 and 5.1, or see the solution manual (but note that in 4.2 b)-c), \(\mu_1, \mu_2\) should be \(\hat{\mu}_1, \hat{\mu}_2\)).
Exercises from ISLR: 4.9 and 4.14 (a-f)
- Solutions: 4.9, and see Chapter 4.7 Lab: Classification Methods in the ISLR book for R code examples similar to 4.14.
Exam STK2100 2018: Problem 2
- Solution: /studier/emner/matnat/math/STK2100/oppgaver/STK2100_2018_fasit.pdf (in Norwegian).
Extra exercise 6
- Solution: See Vera's solution.
Week 9
Exercises from the book: 4.1
- Solution: Berit's suggested solution (similar to the one in the solution manual, but with an argument for why the solution is the eigenvector corresponding to the largest eigenvalue of \(\mathbf{W}^{-1} \mathbf{B}\))
Exercise from ISLR: 4.13 (without KNN)
- Solution: R script.
Extra exercise: Modify the example with principal component (PC) regression in the R script r-code-week7.R, so that the number of principal components is selected through cross-validation instead of through separate training and test sets. Comment on the results.
- Solution: R script.
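Selecting the number of principal components by cross-validation can be done directly with pcr() from the pls package, as in the sketch below. This is not the course's solution script: the data set (Hitters from ISLR, as a stand-in) is an assumption, and r-code-week7.R may use a different data set.

```r
# Sketch: choosing the number of principal components by cross-validation
# with pcr() from the pls package. The data set here is illustrative.
library(pls)
library(ISLR)

hitters <- na.omit(Hitters)

set.seed(1)
# validation = "CV" makes pcr() compute 10-fold cross-validated prediction error
# for every number of components, replacing the separate train/test split.
fit <- pcr(Salary ~ ., data = hitters, scale = TRUE, validation = "CV")

# Cross-validated MSEP as a function of the number of components.
validationplot(fit, val.type = "MSEP")
summary(fit)
```

The number of components minimizing the cross-validated MSEP (or the smallest number within one standard error of it) would then be the selected model.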
Week 8
Exercises from the book: 3.2 and 3.29
- Solutions: For ex. 3.2, see Vera’s script from last year or Lars’ script from 2023. For ex. 3.29, see here (which is similar to this one, but I have added some details and corrected a small typo). Detailed discussions of both exercises can also be found in this solution manual.
Exercises from ISLR: 3.9 a)-c) and e)-f)
- Solutions: See here.
Week 7
Exercises from ISLR: 3.3, 3.4, 3.6, and 3.7
- Solutions: See here.
Extra exercises 4 and 5 (you will need this: extra4.r)
- Solutions: See Vinni's solutions and extra4_extended.r.
Week 6
Exercises from ISLR: 3.8 and 3.5
- Solutions: See Vinni's solutions, this one, or that one.
Extra exercises (see link above): 1, 2 and 3
- Solutions: See Vinni's solutions, except for 2d, for which you can see Vera's solution (the lower bound is simply \(\sigma^2\), attained when \(f(x) = g(x)\)). For exercise 1, there are also Geir's solutions (excluding the R part).
Week 5
Exercises from the book: 2.7
- Solutions: See the slides from the exercise session here. Alternatively, see pages 8-9 here, but note a few typos. In (a) for linear regression: \(\hat\beta\) should be \((X^TX)^{-1}X^Ty\) and not \(X(X^TX)^{-1}X^Ty\). For k-NN, a \(y_i\) is missing in the expression. In (b) you should add and subtract \( \mathrm{E}_{\mathcal{Y}|\mathcal{X}}[\hat{f}(x_0)] \) instead of \( \mathrm{E}_{\mathcal{Y}|\mathcal{X}}[f(x_0)] \), and similarly with the unconditional expectation in exercise (c).
Exercises from ISLR: 2.1, 2.2, and 2.8 (see the webpage for the book for downloading data; the easiest alternative is to install the ISLR library (through the command install.packages("ISLR")), make the library available (through the command library("ISLR")), and then make the data available through data(College)).
- Solutions: link to R file.
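The data-loading steps described above for exercise 2.8, collected in one R block (the install check and the summary call at the end are illustrative additions):

```r
# Install the ISLR package if it is not already available (illustrative guard).
if (!requireNamespace("ISLR", quietly = TRUE)) install.packages("ISLR")

# Make the library and the College data available.
library(ISLR)
data(College)

# Quick checks before starting the exercise.
dim(College)              # number of colleges and number of variables
summary(College$Private)  # counts of private vs. public colleges
```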