
Python for Data Science - A neural network with a Perceptron


Chapter 6 - Other Popular Machine Learning Methods

Segment 2 - A neural network with a Perceptron

Perceptron

A perceptron is a neural network with just one layer.

It's a linear classifier that outputs a binary response variable.

Consequently, the algorithm is called a "linear binary classifier."
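As a minimal sketch of that decision rule (not part of the original notebook; the weights and bias below are illustrative placeholders, not learned values):

import numpy as np

def perceptron_predict(X, w, b):
    # Linear binary classifier: 1 where w.x + b > 0, otherwise 0
    return (X @ w + b > 0).astype(int)

w = np.array([0.4, -0.7])             # placeholder weight vector
b = 0.1                               # placeholder bias
X_demo = np.array([[1.0, 0.5],
                   [-1.0, 2.0]])
print(perceptron_predict(X_demo, w, b))    # [1 0]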

Linear Separability

  • Data is said to have "linear separability" if it can be cleanly classified into one of two classes.
  • Your data must be linearly separable in order for a perceptron to operate properly (see the AND/XOR sketch just below).
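A quick way to see linear separability in action is the classic AND/XOR example (an illustration added here, not from the original post): AND is linearly separable, so a perceptron fits it perfectly, while XOR is not, so a perceptron cannot reach full accuracy.

import numpy as np
from sklearn.linear_model import Perceptron

X_gate = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_and = np.array([0, 0, 0, 1])    # linearly separable
y_xor = np.array([0, 1, 1, 0])    # not linearly separable

print(Perceptron().fit(X_gate, y_and).score(X_gate, y_and))   # typically 1.0
print(Perceptron().fit(X_gate, y_xor).score(X_gate, y_xor))   # stays below 1.0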

Activation Function

An activation function is a mathematical function that is deployed on each unit in a neural network.

All units in a shared layer deploy the same activation function.

The purpose of activation functions is to enable neural networks to model complex, nonlinear phenomena.
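For illustration (these lines are not from the original post), here are a few common activation functions written directly in NumPy; the classic perceptron uses the step function, while smooth nonlinear choices such as the sigmoid or ReLU are what allow deeper networks to model nonlinear behaviour.

import numpy as np

def step(z):
    return np.where(z > 0, 1, 0)          # classic perceptron activation

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))       # smooth, outputs in (0, 1)

def relu(z):
    return np.maximum(0.0, z)             # common in modern hidden layers

z = np.array([-2.0, 0.0, 3.0])
print(step(z), sigmoid(z), relu(z))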

import numpy as np
import pandas as pd
import sklearn

from pandas import Series, DataFrame
from sklearn import datasets
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, classification_report
from sklearn.linear_model import Perceptron
iris = datasets.load_iris()

X = iris.data
y = iris.target

X[0:10,]
array([[5.1, 3.5, 1.4, 0.2],
       [4.9, 3. , 1.4, 0.2],
       [4.7, 3.2, 1.3, 0.2],
       [4.6, 3.1, 1.5, 0.2],
       [5. , 3.6, 1.4, 0.2],
       [5.4, 3.9, 1.7, 0.4],
       [4.6, 3.4, 1.4, 0.3],
       [5. , 3.4, 1.5, 0.2],
       [4.4, 2.9, 1.4, 0.2],
       [4.9, 3.1, 1.5, 0.1]])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
standardize = StandardScaler()

# Fit the scaler on the training data only, then apply the same scaling to the test data
standardized_X_train = standardize.fit_transform(X_train)

standardized_X_test = standardize.transform(X_test)
standardized_X_test[0:10,]
array([[ 0.60104076, -0.54300257,  1.03062704,  1.00726119],
       [-0.14142136,  2.04272396, -1.19635601, -1.17060084],
       [ 0.8131728 , -0.54300257,  0.87525613,  1.68784307],
       [ 0.49497475, -0.28442992,  1.03062704,  1.00726119],
       [-0.88388348,  0.7498607 , -1.09277541, -1.03448446],
       [-1.20208153, -0.54300257, -1.09277541, -1.30671722],
       [-0.56568542,  1.78415131, -0.9374045 , -0.89836809],
       [-0.45961941,  0.7498607 , -1.14456571, -1.17060084],
       [-0.88388348, -1.83586584, -0.26413055,  0.05444655],
       [-0.98994949,  0.49128804, -1.0409851 , -1.17060084]])
perceptron = Perceptron(max_iter=50, eta0=0.15, tol=1e-3, random_state=15)

perceptron.fit(standardized_X_train, y_train.ravel())
Perceptron(eta0=0.15, max_iter=50, random_state=15)
y_pred = perceptron.predict(standardized_X_test)
print(y_test)
[2 0 2 2 0 0 0 0 1 0 0 0 1 2 0 2 2 0 1 2 2 1 1 1 2 1 2 0 0 0]
print(y_pred)
[2 0 2 2 0 0 0 0 1 0 0 0 1 1 0 2 2 0 1 1 2 1 1 1 2 1 2 0 0 0]
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       1.00      1.00      1.00        13
           1       0.78      1.00      0.88         7
           2       1.00      0.80      0.89        10

    accuracy                           0.93        30
   macro avg       0.93      0.93      0.92        30
weighted avg       0.95      0.93      0.93        30
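confusion_matrix is imported above but never called; as a small follow-up (added here, not in the original), it can be printed from the same y_test and y_pred:

print(confusion_matrix(y_test, y_pred))
# Rows are true classes, columns are predictions; for the run above,
# the only errors are the two class-2 flowers predicted as class 1.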


Original post: https://www.cnblogs.com/keepmoving1113/p/14327357.html
