A Perceptron is an Artificial Neuron
In 1957, Frank Rosenblatt started something really big: he invented the Perceptron program, demonstrated on an IBM 704 computer at Cornell Aeronautical Laboratory.
Scientists had discovered that brain cells (Neurons) receive input from our senses by electrical signals.
The neurons, in turn, use electrical signals to store information and to make decisions based on previous input.
Frank had the idea that Perceptrons could simulate brain principles, with the ability to learn and make decisions.
The original Perceptron was designed to take a number of binary inputs, and produce one binary output (0 or 1).
The idea was to use different weights to represent the importance of each input, and that the sum of the values should be greater than a threshold value before making a decision like true or false (0 or 1).
Imagine a perceptron (in your brain).
The perceptron tries to decide if you should go to a concert.
Is the artist good? Is the weather good?
What weights should these facts have?
Criteria | Input | Weight |
---|---|---|
Artist is Good | x1 = 0 or 1 | w1 = 0.7 |
Weather is Good | x2 = 0 or 1 | w2 = 0.6 |
Friend will Come | x3 = 0 or 1 | w3 = 0.5 |
Food is Served | x4 = 0 or 1 | w4 = 0.3 |
Alcohol is Served | x5 = 0 or 1 | w5 = 0.4 |
Frank Rosenblatt suggested this algorithm:

1. Set a threshold value
2. Multiply each input with its weight
3. Sum all the results
4. Activate the output if the sum is greater than the threshold

If the weather weight is 0.6 for you, it might be different for someone else. A higher weight means that the weather is more important to them.

If the threshold value is 1.5 for you, it might be different for someone else. A lower threshold means they are more eager to go to the concert.
const threshold = 1.5;
const inputs = [1, 0, 1, 0, 1];
const weights = [0.7, 0.6, 0.5, 0.3, 0.4];

// Multiply each input with its weight and sum the results
let sum = 0;
for (let i = 0; i < inputs.length; i++) {
  sum += inputs[i] * weights[i];
}

// Activate the output if the sum exceeds the threshold
const activate = (sum > threshold);
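The steps above can be wrapped in a reusable function. This is a minimal sketch; the name `perceptron` and its signature are our own, not part of the tutorial:

```javascript
// Sketch: a reusable perceptron function (name and signature are our own).
// Returns 1 if the weighted sum of the inputs exceeds the threshold, else 0.
function perceptron(inputs, weights, threshold) {
  let sum = 0;
  for (let i = 0; i < inputs.length; i++) {
    sum += inputs[i] * weights[i];
  }
  return sum > threshold ? 1 : 0;
}

// Same data as the example above: 0.7 + 0.5 + 0.4 = 1.6 > 1.5
console.log(perceptron([1, 0, 1, 0, 1], [0.7, 0.6, 0.5, 0.3, 0.4], 1.5)); // 1
```

With the same inputs and weights as above, the weighted sum is 1.6, which exceeds the threshold of 1.5, so the perceptron decides you should go to the concert.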
Perceptron inputs are called nodes.
The nodes have both a value and a weight.
In the example above, the node values are: 1, 0, 1, 0, 1
The binary input values (0 or 1) can be interpreted as (no or yes) or (false or true).
Weights show the strength of each node.
In the example above, the node weights are: 0.7, 0.6, 0.5, 0.3, 0.4
The activation function maps the result (the weighted sum) into a required value like 0 or 1.
In the example above, the activation function is simple: (sum > 1.5)
The binary output (1 or 0) can be interpreted as (yes or no) or (true or false).
It is obvious that a decision is NOT made by one neuron alone.
Other neurons must provide input: Is the artist good? Is the weather good?
In Neuroscience, there is a debate if single-neuron encoding or distributed encoding is most relevant for understanding brain functions.
The Perceptron defines the first step into Neural Networks.
In the Neural Network Model, input data (yellow) is processed in a hidden layer (blue), modified in further hidden layers (green), and finally produces the output (red).
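The layered flow described above can be sketched as a tiny forward pass. All layer sizes, weights, and thresholds below are made-up illustration values, not from the tutorial:

```javascript
// Sketch: a forward pass through one hidden layer (all values are made up).
// Each layer computes a perceptron output for every neuron in the layer.
function layer(inputs, weightMatrix, threshold) {
  return weightMatrix.map(weights => {
    let sum = 0;
    for (let i = 0; i < inputs.length; i++) {
      sum += inputs[i] * weights[i];
    }
    return sum > threshold ? 1 : 0;
  });
}

const input = [1, 0, 1];                                              // input layer
const hidden = layer(input, [[0.4, 0.9, 0.4], [0.8, 0.1, 0.1]], 0.5); // hidden layer
const output = layer(hidden, [[0.6, 0.6]], 0.5);                      // output layer
console.log(output); // output is [1]
```

Each hidden neuron is itself a perceptron: it weighs the inputs, sums them, and fires 0 or 1. Stacking such layers is what turns single perceptrons into a neural network.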