"MLP" is not to be confused with "NLP", which refers to natural language processing.
A multilayer perceptron (MLP) is a name for a modern feedforward artificial neural network consisting of fully connected neurons with nonlinear activation functions, organized in at least three layers; such networks can distinguish data that are not linearly separable.[1] The name is a misnomer, because the original perceptron used a Heaviside step function rather than the nonlinear activation functions of modern networks.
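The defining ingredients above — at least three layers, full connectivity, and a nonlinear activation — can be sketched in a few lines. Below is a minimal forward pass, assuming NumPy; the layer sizes are illustrative choices, not values from the article. It also shows why the nonlinearity matters: with the identity "activation", the two weight matrices collapse into a single linear map, so the hidden layer adds nothing.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=3)          # input layer: 3 features (illustrative)
W1 = rng.normal(size=(3, 5))    # fully connected: input -> hidden
W2 = rng.normal(size=(5, 2))    # fully connected: hidden -> output

def forward(x, activation):
    # One hidden layer: affine map, activation, then the output layer.
    return activation(x @ W1) @ W2

tanh_out = forward(x, np.tanh)         # nonlinear hidden layer
linear_out = forward(x, lambda z: z)   # identity activation

# Without the nonlinearity the network equals one matrix product W1 @ W2,
# i.e. a single-layer linear model:
collapsed = x @ (W1 @ W2)
print(np.allclose(linear_out, collapsed))  # True
```

This is why an MLP with identity activations cannot separate non-linearly-separable data any better than a single linear layer, whichever depth it has.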
Modern feedforward networks are trained using the backpropagation method[2][3][4][5][6] and are colloquially referred to as "vanilla" neural networks.[7]
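Backpropagation training can be illustrated end to end on XOR, a classic non-linearly-separable problem that a single-layer perceptron cannot learn. The following is a minimal sketch assuming NumPy: the layer sizes, learning rate, and iteration count are illustrative choices, and the gradients are the chain rule applied layer by layer to a squared-error loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: no single line separates the two classes.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass: each layer is an affine map plus a sigmoid nonlinearity.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: chain rule, propagated from the output layer back.
    d_out = (out - y) * out * (1 - out)   # error at the output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # error at the hidden pre-activation

    # Gradient-descent updates (full batch).
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print("final MSE:", float(np.mean((out - y) ** 2)))
```

With these settings the squared error typically shrinks toward zero and the thresholded outputs reproduce XOR, though convergence depends on the random initialization.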
^Cybenko, G. (1989). "Approximation by superpositions of a sigmoidal function". Mathematics of Control, Signals, and Systems. 2 (4): 303–314.
^Linnainmaa, Seppo (1970). The Representation of the Cumulative Rounding Error of an Algorithm as a Taylor Expansion of the Local Rounding Errors. Master's thesis, University of Helsinki.
^Kelley, Henry J. (1960). "Gradient theory of optimal flight paths". ARS Journal. 30 (10): 947–954.
^Rosenblatt, Frank (1961). Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Washington DC: Spartan Books.
^Werbos, Paul (1982). "Applications of advances in nonlinear sensitivity analysis". System Modeling and Optimization. Springer. pp. 762–770.
^Rumelhart, David E.; Hinton, Geoffrey E.; Williams, R. J. (1986). "Learning Internal Representations by Error Propagation". In Rumelhart, David E.; McClelland, James L.; the PDP Research Group (eds.). Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1: Foundations. MIT Press.
^Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. New York, NY: Springer.