C1_W2: Regression with Multiple Input Variables

This week, you’ll extend linear regression to handle multiple input features. You’ll also learn some methods for improving your model’s training and performance, such as vectorization, feature scaling, feature engineering and polynomial regression. At the end of the week, you’ll get to practice implementing linear regression in code.

C1_W2_M1 Multiple Linear Regression

C1_W2_M1_1 Multiple features

- $\vec{x}^{(i)}$ = vector of the 4 features in the $i^{th}$ row, e.g. $\vec{x}^{(2)} = [1416 \;\; 3 \;\; 2 \;\; 40]$
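
A minimal NumPy sketch of this notation, using the training set from the slide (variable names are my own):

```python
import numpy as np

# Training set from the slide: size, rooms, floors, age (one row per example)
X_train = np.array([
    [2104, 5, 1, 45],
    [1416, 3, 2, 40],
    [1534, 3, 2, 30],
    [ 852, 2, 1, 36],
])

x_vec_2 = X_train[1]     # row vector x^(2) = [1416 3 2 40]
x_1_4   = X_train[3, 0]  # scalar x_1^(4) = 852 (lecture is 1-indexed, NumPy 0-indexed)
print(x_vec_2, x_1_4)
```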

Quiz

In the training set below (see slide: C1W2_M1_1 Multiple features), what is $x_1^{(4)}$?

Ans: 852

C1_W2_M1_2 Vectorization part 1

Learning to write vectorized code lets you take advantage of modern numerical linear algebra libraries, and potentially of parallel hardware such as GPUs.
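
A sketch of the idea, assuming NumPy (parameter values are illustrative): the model $f(\vec{x}) = \vec{w} \cdot \vec{x} + b$ becomes a single `np.dot` call instead of a Python loop.

```python
import numpy as np

w = np.array([1.0, 2.5, -3.3, 0.7])  # illustrative parameter values
b = 4.0
x = np.array([10.0, 20.0, 30.0, 40.0])

# Loop version: one multiply-add per feature, executed sequentially
f = 0.0
for j in range(w.shape[0]):
    f += w[j] * x[j]
f += b

# Vectorized version: NumPy dispatches to optimized (possibly parallel) routines
f_vec = np.dot(w, x) + b
assert np.isclose(f, f_vec)
```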

C1_W2_M1_3 Vectorization part 2

How a vectorized algorithm works under the hood: the hardware can operate on many values in parallel, rather than performing one arithmetic step at a time.
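
A sketch of the parameter-update step this video walks through, with illustrative values for the current parameters and the derivative terms:

```python
import numpy as np

alpha = 0.1
w = np.array([0.5, 1.3, 3.4])  # current parameters w_j (illustrative)
d = np.array([0.3, 0.2, 0.4])  # derivative terms dJ/dw_j (illustrative)

# Without vectorization: n separate updates, one loop pass per parameter
w_loop = w.copy()
for j in range(w_loop.shape[0]):
    w_loop[j] = w_loop[j] - alpha * d[j]

# With vectorization: all n updates happen in one array operation
w_vec = w - alpha * d
assert np.allclose(w_loop, w_vec)
```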

C1_W2_Lab01: Python Numpy Vectorization

C1_W2_M1_4 Gradient descent for multiple linear regression
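
A minimal sketch of the algorithm this video derives, assuming a squared-error cost; the helper names are my own, not the lab's exact code:

```python
import numpy as np

def compute_gradient(X, y, w, b):
    """Gradients of the squared-error cost for f(x) = w . x + b."""
    m = X.shape[0]
    err = X @ w + b - y      # prediction error for all m examples at once
    dj_dw = (X.T @ err) / m  # vector of partial derivatives dJ/dw_j
    dj_db = err.sum() / m    # scalar partial derivative dJ/db
    return dj_dw, dj_db

def gradient_descent(X, y, alpha=1e-7, iters=1000):
    """Batch gradient descent over all parameters simultaneously."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(iters):
        dj_dw, dj_db = compute_gradient(X, y, w, b)
        w -= alpha * dj_dw   # simultaneous update of every w_j
        b -= alpha * dj_db
    return w, b
```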

C1_W2_Lab02: Multiple linear regression

Quiz: Multiple linear regression

  1. In the training set below, what is $x_4^{(3)}$?

| Size (feet²) | Rooms | Floors | Age (years) | Price ($1000s) |
|--------------|-------|--------|-------------|----------------|
| 2104         | 5     | 1      | 45          | 460            |
| 1416         | 3     | 2      | 40          | 232            |
| 1534         | 3     | 2      | 30          | 315            |
| 852          | 2     | 1      | 36          | 178            |
  2. Which of the following are potential benefits of vectorization?
    • It makes your code run faster
    • It makes your code shorter
    • It allows your code to run more easily on parallel compute hardware
    • All of the above
  3. To make gradient descent converge about twice as fast, a technique that almost always works is to double the learning rate $\alpha$.
    • True
    • False
Ans: 30, 4 (All of the above), False

C1_W2_M2 Gradient Descent in Practice

C1_W2_M2_01 Feature scaling part 1

:bulb: We can speed up gradient descent by scaling our features
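
A sketch of one common scaling method from this lesson, z-score normalization (the helper name is my own):

```python
import numpy as np

def zscore_normalize(X):
    """Rescale each feature column to zero mean and unit standard deviation."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    # Return mu and sigma too: new examples must be scaled the same way
    return (X - mu) / sigma, mu, sigma

X = np.array([[2104, 5], [1416, 3], [852, 2]], dtype=float)
X_norm, mu, sigma = zscore_normalize(X)  # columns now on comparable scales
```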

C1_W2_M2_02 Feature scaling part 2

Quiz:

Which of the following is a valid step used during feature scaling? (see bedrooms vs size scatterplot)

Ans: 2

C1_W2_M2_03 Checking gradient descent for convergence
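
One way to check convergence, as the lecture suggests: record the cost $J(\vec{w}, b)$ each iteration (the learning curve) and confirm it decreases every step. A hedged sketch, assuming a squared-error cost:

```python
import numpy as np

def compute_cost(X, y, w, b):
    """Squared-error cost J(w, b), averaged over the m examples."""
    err = X @ w + b - y
    return (err @ err) / (2 * X.shape[0])

def converged(J_history, epsilon=1e-3):
    """Automatic convergence test: stop once J decreases by less than epsilon.
    If J ever increases, the learning rate is likely too large."""
    return len(J_history) >= 2 and J_history[-2] - J_history[-1] < epsilon
```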

C1_W2_M2_04 Choosing the learning rate
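
The lecture's recipe: try candidate values of $\alpha$ roughly 3× apart and pick the largest one for which $J$ still decreases steadily. A self-contained sketch on a tiny illustrative dataset:

```python
import numpy as np

X = np.array([[1.0], [2.0], [3.0]])  # tiny illustrative dataset
y = np.array([2.0, 4.0, 6.0])

def run_gd(alpha, iters=100):
    """Run gradient descent and return the final cost J."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(iters):
        err = X @ w + b - y
        w -= alpha * (X.T @ err) / len(y)
        b -= alpha * err.mean()
    return ((X @ w + b - y) ** 2).mean() / 2

# Values roughly 3x apart; too-large alphas will show J blowing up (diverging)
for alpha in (0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0):
    print(f"alpha={alpha:<6} final J={run_gd(alpha):.4f}")
```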

C1_W2_M2_05 Optional Lab: Feature scaling and learning rate

C1_W2_M2_06 Feature engineering
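
The lecture's running example: given lot frontage and depth, engineer a new feature, area = frontage × depth, that may be more predictive than either original feature. A sketch with illustrative values:

```python
import numpy as np

frontage = np.array([30.0, 40.0, 25.0])  # x1 (illustrative values)
depth    = np.array([60.0, 50.0, 80.0])  # x2 (illustrative values)

area = frontage * depth                       # engineered feature x3 = x1 * x2
X = np.column_stack([frontage, depth, area])  # the model can now use all three
```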

C1_W2_M2_07 Polynomial regression
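
Polynomial regression reuses the linear-regression machinery by engineering powers of $x$ as extra features; feature scaling matters here because $x^2$ and $x^3$ span very different ranges. A sketch:

```python
import numpy as np

x = np.arange(1.0, 11.0)              # original single feature
X = np.column_stack([x, x**2, x**3])  # engineered features: x, x^2, x^3

# Z-score normalize: the three columns otherwise differ by orders of magnitude
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)
```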

C1_W2_M2_08 Optional lab: Feature engineering and Polynomial regression

C1_W2_M2_09 Optional lab: Linear regression with scikit-learn
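
A minimal sketch of fitting the same kind of model with scikit-learn's `LinearRegression` (the lab may use a different estimator such as `SGDRegressor`; the data here is illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[2104, 5, 1, 45],
              [1416, 3, 2, 40],
              [ 852, 2, 1, 36]], dtype=float)
y = np.array([460.0, 232.0, 178.0])

model = LinearRegression().fit(X, y)  # closed-form fit, no learning rate to tune
print(model.coef_, model.intercept_)  # learned w vector and b
print(model.predict(X[:1]))           # prediction for the first example
```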

C1_W2_M2_10 Practice quiz: Gradient descent in practice

C1_W2_M2_11 Week 2 practice lab: Linear regression