
课程二(Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization),第一周(Practical aspects of Deep Learning) —— 2.Programming assignments:Initialization

Posted: 2017-12-15 00:46:35


Initialization

Welcome to the first assignment of the hyperparameter tuning specialization. It is very important that you regularize your model properly, because doing so can dramatically improve your results.


By completing this assignment you will:

- Understand the different regularization methods that can help your model.

- Implement dropout and see it work on data.

- Recognize that a model without regularization gives you better accuracy on the training set, but not necessarily on the test set.

- Understand that you could use both dropout and regularization on your model.

This assignment prepares you well for the upcoming assignment. Take your time to complete it and make sure you get the expected outputs when working through the different exercises. In some code blocks, you will find a "#GRADED FUNCTION: functionName" comment. Please do not modify it. After you are done, submit your work and check your results. You need to score 80% to pass. Good luck :) !

 -------------------------------------------------------------------------------------------

Initialization

Welcome to the first assignment of "Improving Deep Neural Networks".

Training your neural network requires specifying an initial value of the weights. A well chosen initialization method will help learning.

If you completed the previous course of this specialization, you probably followed our instructions for weight initialization, and it has worked out so far. But how do you choose the initialization for a new neural network? In this notebook, you will see how different initializations lead to different results.

A well chosen initialization can:

  • Speed up the convergence of gradient descent
  • Increase the odds of gradient descent converging to a lower training (and generalization) error
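As a preview of the kind of scheme this notebook builds toward, here is a minimal sketch of He initialization: random weights scaled by sqrt(2 / fan-in), biases at zero. The helper name and layer sizes below are illustrative, not the notebook's graded code:

```python
import numpy as np

def initialize_parameters_he(layer_dims, seed=3):
    """Sketch of He initialization: scale random weights by sqrt(2 / fan_in)."""
    np.random.seed(seed)
    parameters = {}
    L = len(layer_dims)
    for l in range(1, L):
        # Weights: Gaussian noise scaled for the previous layer's width
        parameters["W" + str(l)] = (np.random.randn(layer_dims[l], layer_dims[l - 1])
                                    * np.sqrt(2.0 / layer_dims[l - 1]))
        # Biases can safely start at zero
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

params = initialize_parameters_he([2, 4, 1])
print(params["W1"].shape)  # (4, 2)
```

The sqrt(2 / fan-in) factor keeps activation variances roughly constant across ReLU layers, which is what lets gradient descent converge faster from the start.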

To get started, run the following cell to load the packages and the planar dataset you will try to classify.

 

 
【code】
import numpy as np
import matplotlib.pyplot as plt
import sklearn
import sklearn.datasets
from init_utils import sigmoid, relu, compute_loss, forward_propagation, backward_propagation
from init_utils import update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec

%matplotlib inline
plt.rcParams['figure.figsize'] = (7.0, 4.0) # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'

# load image dataset: blue/red dots in circles
train_X, train_Y, test_X, test_Y = load_dataset()

 

【result】

[Figure: the planar dataset — blue and red dots arranged in circles]

You would like a classifier to separate the blue dots from the red dots.
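Before trying different schemes on this dataset, it helps to see why the choice matters at all. A minimal sketch (not part of the notebook's graded code) of the symmetry problem with all-zero weights: identical incoming weights make every hidden unit compute the same activation, so all units receive the same gradient and stay identical forever.

```python
import numpy as np

# Two hidden units with identical (all-zero) incoming weights
W1 = np.zeros((2, 3))              # 2 hidden units, 3 inputs
x = np.array([[1.0], [2.0], [-1.0]])
a = np.maximum(0, W1.dot(x))       # ReLU activations of the hidden layer

# Both units produce the same activation, so backprop sends them the
# same gradient update: symmetry is never broken.
print(a.ravel())  # [0. 0.]
```

This is why the notebook's zero-initialized model fails to do better than chance, while random (and especially well-scaled) initialization lets different units learn different features.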

 

 

 -------------------------------------------------------------------------------------------

 


Original post: http://www.cnblogs.com/hezhiyao/p/8040613.html
