The distribution over functions corresponding to an infinitely wide Bayesian neural network.
A Neural Network Gaussian Process (NNGP) is a Gaussian process (GP) obtained as the limit of a certain sequence of neural networks. Specifically, networks from a wide variety of architectures converge in distribution to a GP as their layer widths tend to infinity.[1][2][3][4][5][6][7][8]
The concept constitutes an intensional definition: an NNGP is simply a GP, distinguished by how it is obtained.
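For a fully connected network with ReLU activations, the covariance of the limiting GP admits a closed-form, layer-by-layer recursion (the arc-cosine kernel of Cho & Saul, used in Lee et al. 2017[2]). The following is a minimal NumPy sketch of that recursion; the function name, signature, and default hyperparameters are illustrative, not a standard API:

```python
import numpy as np

def nngp_kernel(x1, x2, depth=3, sigma_w=1.0, sigma_b=0.0):
    """Sketch of the NNGP kernel for a deep fully connected ReLU network.

    sigma_w, sigma_b are the weight and bias variances; depth is the
    number of hidden layers. Inputs are 1-D arrays of equal length.
    """
    d = len(x1)
    # Layer-0 covariances induced by the (random) first affine layer.
    k11 = sigma_b**2 + sigma_w**2 * np.dot(x1, x1) / d
    k22 = sigma_b**2 + sigma_w**2 * np.dot(x2, x2) / d
    k12 = sigma_b**2 + sigma_w**2 * np.dot(x1, x2) / d
    for _ in range(depth):
        # Closed-form ReLU (arc-cosine) update for the cross-covariance.
        theta = np.arccos(np.clip(k12 / np.sqrt(k11 * k22), -1.0, 1.0))
        k12 = sigma_b**2 + sigma_w**2 / (2 * np.pi) * np.sqrt(k11 * k22) * (
            np.sin(theta) + (np.pi - theta) * np.cos(theta))
        # Diagonal entries: E[relu(u)^2] = k/2 for u ~ N(0, k).
        k11 = sigma_b**2 + sigma_w**2 * k11 / 2
        k22 = sigma_b**2 + sigma_w**2 * k22 / 2
    return k12
```

Evaluating this kernel on a dataset yields the covariance matrix of the zero-mean GP that describes the ensemble of infinitely wide networks at initialization; Bayesian inference with the network prior then reduces to standard GP regression with this kernel.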
^ Williams, Christopher K. I. (1997). "Computing with Infinite Networks". Neural Information Processing Systems.
^ Lee, Jaehoon; Bahri, Yasaman; Novak, Roman; Schoenholz, Samuel S.; Pennington, Jeffrey; Sohl-Dickstein, Jascha (2017). "Deep Neural Networks as Gaussian Processes". International Conference on Learning Representations. arXiv:1711.00165. Bibcode:2017arXiv171100165L.
^ Matthews, Alexander G. de G.; Rowland, Mark; Hron, Jiri; Turner, Richard E.; Ghahramani, Zoubin (2018). "Gaussian Process Behaviour in Wide Deep Neural Networks". International Conference on Learning Representations. arXiv:1804.11271. Bibcode:2018arXiv180411271M.
^ Novak, Roman; Xiao, Lechao; Lee, Jaehoon; Bahri, Yasaman; Yang, Greg; Abolafia, Dan; Pennington, Jeffrey; Sohl-Dickstein, Jascha (2018). "Bayesian Deep Convolutional Networks with Many Channels are Gaussian Processes". International Conference on Learning Representations. arXiv:1810.05148. Bibcode:2018arXiv181005148N.
^ Garriga-Alonso, Adrià; Aitchison, Laurence; Rasmussen, Carl Edward (2018). "Deep Convolutional Networks as Shallow Gaussian Processes". International Conference on Learning Representations. arXiv:1808.05587. Bibcode:2018arXiv180805587G.
^ Borovykh, Anastasia (2018). "A Gaussian Process Perspective on Convolutional Neural Networks". arXiv:1810.10798 [stat.ML].
^ Tsuchida, Russell; Pearce, Tim; van der Heide, Christopher; Roosta, Fred; Gallagher, Marcus (2020). "Avoiding Kernel Fixed Points: Computing with ELU and GELU Infinite Networks". arXiv:2002.08517 [cs.LG].
^ Yang, Greg (2019). "Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes". Advances in Neural Information Processing Systems. arXiv:1910.12478. Bibcode:2019arXiv191012478Y.