# Course 2 (Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization), Week 1 (Practical Aspects of Deep Learning): Programming Assignment 2, Initialization

## Initialization

Welcome to the first assignment of the hyperparameter tuning specialization. It is very important that you regularize your model properly, because doing so can dramatically improve your results.

By completing this assignment you will:

- Understand the different regularization methods that could help your model.

- Implement dropout and see it work on data.

- Recognize that a model without regularization gives you better accuracy on the training set, but not necessarily on the test set.

- Understand that you could use both dropout and regularization on your model.
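The dropout mentioned in the bullets above is usually implemented as "inverted dropout": zero out each hidden unit with probability `1 - keep_prob` during training, then divide by `keep_prob` so the expected activation is unchanged. The sketch below is illustrative only; `dropout_forward` and its `keep_prob` default are assumptions, not the assignment's graded function.

```python
import numpy as np

def dropout_forward(A, keep_prob=0.8, rng=np.random.default_rng(0)):
    """Inverted dropout on activations A: drop units with probability
    1 - keep_prob, then rescale so E[A] is unchanged."""
    D = rng.random(A.shape) < keep_prob  # boolean dropout mask
    A = A * D / keep_prob                # zero out and rescale
    return A, D                          # keep D to reuse the mask in backprop

A = np.ones((3, 4))
A_drop, D = dropout_forward(A, keep_prob=0.8)
```

Each surviving entry of `A_drop` is `1 / keep_prob = 1.25`, while dropped entries are 0; at test time no mask is applied, so no extra rescaling is needed.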

This assignment prepares you well for the upcoming assignment. Take your time to complete it and make sure you get the expected outputs when working through the different exercises. In some code blocks, you will find a "#GRADED FUNCTION: functionName" comment. Please do not modify it. After you are done, submit your work and check your results. You need to score 80% to pass. Good luck :) !

A well-chosen initialization can:

- Speed up the convergence of gradient descent.
- Increase the odds of gradient descent converging to a lower training (and generalization) error.
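The assignment compares several ways of setting the initial weights. As a minimal sketch, the helper below contrasts all-zeros initialization (every unit computes the same function, so learning stalls), large random initialization (slow, near-saturated start), and He initialization scaled for ReLU layers. The function name, `method` argument, and layer sizes are assumptions for illustration; this is not the `init_utils` implementation.

```python
import numpy as np

def initialize_parameters(layer_dims, method="he", seed=3):
    """Initialize weights for a fully connected net.
    layer_dims, e.g. [2, 10, 1], lists the size of each layer."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        n_prev, n = layer_dims[l - 1], layer_dims[l]
        if method == "zeros":      # symmetric: all units stay identical
            W = np.zeros((n, n_prev))
        elif method == "random":   # large weights: poor starting point
            W = rng.standard_normal((n, n_prev)) * 10
        else:                      # "he": variance 2/n_prev, suits ReLU
            W = rng.standard_normal((n, n_prev)) * np.sqrt(2.0 / n_prev)
        params["W" + str(l)] = W
        params["b" + str(l)] = np.zeros((n, 1))  # biases can start at zero
    return params

params = initialize_parameters([2, 10, 5, 1], method="he")
```

Biases are left at zero in all three schemes; only the weight scale differs, and that scale is what decides whether gradient descent starts from a healthy region.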

【code】
```
import numpy as np
import matplotlib.pyplot as plt
import sklearn
import sklearn.datasets
from init_utils import sigmoid, relu, compute_loss, forward_propagation, backward_propagation
from init_utils import update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec

%matplotlib inline
plt.rcParams['figure.figsize'] = (7.0, 4.0)  # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'

# load image dataset: blue/red dots in circles
train_X, train_Y, test_X, test_Y = load_dataset()
```

【result】

You would like a classifier to separate the blue dots from the red dots.
