Neural operators are a class of deep learning architectures designed to learn maps between infinite-dimensional function spaces. They extend traditional artificial neural networks, which typically learn mappings between finite-dimensional Euclidean spaces or finite sets. Because neural operators learn operators between function spaces directly, they can accept input functions sampled at any discretization, and the learned output function can likewise be evaluated at arbitrary points.[1]
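In the neural operator literature, one common way to build such a map composes pointwise linear transformations with integral kernel operators. As an illustrative sketch (the notation below is chosen here for exposition rather than taken from a specific reference), a single hidden layer acting on a function v_t defined on a domain D has the form

    v_{t+1}(x) = \sigma\left( W\, v_t(x) + \int_{D} \kappa_{\theta}(x, y)\, v_t(y)\, \mathrm{d}y \right), \qquad x \in D,

where W is a learned pointwise linear map, \kappa_{\theta} is a learned kernel, and \sigma is a pointwise nonlinearity. Because the integral is defined over the continuous domain rather than over a fixed grid, the layer can be applied to inputs sampled at any resolution.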
The primary application of neural operators is learning surrogate maps for the solution operators of partial differential equations (PDEs),[1] which are critical tools for modeling the natural environment.[2][3] Standard PDE solvers can be time-consuming and computationally intensive, especially for complex systems. Neural operators have demonstrated improved performance in solving PDEs[4] compared to existing machine learning methodologies, while being significantly faster than numerical solvers.[5][6][7] Neural operators have also been applied in various scientific and engineering disciplines, including turbulent flow modeling, computational mechanics, graph-structured data,[8] and the geosciences.[9] In particular, they have been used to learn stress-strain fields in materials, classify complex data such as spatial transcriptomics, predict multiphase flow in porous media,[10] and simulate carbon dioxide migration. Finally, the operator learning paradigm, which learns maps between function spaces, is distinct from parallel approaches that learn maps from finite-dimensional spaces to function spaces,[11][12] and it subsumes those settings when the input is restricted to a fixed resolution.
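To make the discretization independence concrete, the following is a minimal, illustrative PyTorch sketch of a single kernel integral layer; it is not taken from the cited works, and names such as KernelIntegralLayer are hypothetical. The integral is approximated by an average over the points at which the input function is sampled, so the same learned kernel can be queried on input and output grids of any size.

import torch

# Illustrative sketch only: one kernel integral layer approximating
# (K v)(x) ~ (1/n) * sum_j kappa_theta(x, y_j) v(y_j) on a 1D domain.
class KernelIntegralLayer(torch.nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        # kappa_theta(x, y) is parameterized by a small MLP acting on the pair (x, y)
        self.kappa = torch.nn.Sequential(
            torch.nn.Linear(2, hidden),
            torch.nn.GELU(),
            torch.nn.Linear(hidden, 1),
        )

    def forward(self, x_query, y_grid, v_values):
        # x_query:  (m, 1) points at which the output function is evaluated
        # y_grid:   (n, 1) points at which the input function was sampled
        # v_values: (n, 1) samples of the input function on y_grid
        m, n = x_query.shape[0], y_grid.shape[0]
        pairs = torch.cat(
            [x_query.unsqueeze(1).expand(m, n, 1),
             y_grid.unsqueeze(0).expand(m, n, 1)], dim=-1)   # (m, n, 2)
        k = self.kappa(pairs).squeeze(-1)                     # (m, n) kernel values
        return (k @ v_values) / n                             # quadrature average, shape (m, 1)

layer = KernelIntegralLayer()
coarse = torch.linspace(0, 1, 16).unsqueeze(-1)    # input function sampled on 16 points
fine = torch.linspace(0, 1, 200).unsqueeze(-1)     # output queried on 200 different points
v = torch.sin(2 * torch.pi * coarse)               # samples of an input function
out = layer(fine, coarse, v)                        # works for any pair of grids

A conventional neural network with a fixed input dimension would, by contrast, need to be retrained or re-architected whenever the sampling resolution changes.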
^Evans, L. C. (1998). Partial Differential Equations. Providence: American Mathematical Society. ISBN 0-8218-0772-2.
^"How AI models are transforming weather forecasting: A showcase of data-driven systems". phys.org (Press release). European Centre for Medium-Range Weather Forecasts. 6 September 2023.
^Russ, Dan; Abinader, Sacha (23 August 2023). "Microsoft and Accenture partner to tackle methane emissions with AI technology". Microsoft Azure Blog.
^Hao, Karen (30 October 2020). "AI has cracked a key mathematical puzzle for understanding our world". MIT Technology Review.
^Ananthaswamy, Anil (19 April 2021). "Latest Neural Nets Solve World's Hardest Equations Faster Than Ever Before". Quanta Magazine.
^Sharma, Anuj; Singh, Sukhdeep; Ratna, S. (15 August 2023). "Graph Neural Network Operators: a Review". Multimedia Tools and Applications. 83 (8): 23413–23436. doi:10.1007/s11042-023-16440-4.
^Wen, Gege; Li, Zongyi; Azizzadenesheli, Kamyar; Anandkumar, Anima; Benson, Sally M. (May 2022). "U-FNO—An enhanced Fourier neural operator-based deep-learning model for multiphase flow". Advances in Water Resources. 163: 104180. arXiv:2109.03697. Bibcode:2022AdWR..16304180W. doi:10.1016/j.advwatres.2022.104180.
^Choubineh, Abouzar; Chen, Jie; Wood, David A.; Coenen, Frans; Ma, Fei (2023). "Fourier Neural Operator for Fluid Flow in Small-Shape 2D Simulated Porous Media Dataset". Algorithms. 16 (1): 24. doi:10.3390/a16010024.
^Jiang, Chiyu "Max"; Esmaeilzadeh, Soheil; Azizzadenesheli, Kamyar; Kashinath, Karthik; Mustafa, Mustafa; Tchelepi, Hamdi A.; Marcus, Philip; Prabhat, Mr; Anandkumar, Anima (2020). "MESHFREEFLOWNET: A Physics-Constrained Deep Continuous Space-Time Super-Resolution Framework". SC20: International Conference for High Performance Computing, Networking, Storage and Analysis. pp. 1–15. doi:10.1109/SC41405.2020.00013. ISBN 978-1-7281-9998-6.
^Lu, Lu; Jin, Pengzhan; Pang, Guofei; Zhang, Zhongqiang; Karniadakis, George Em (18 March 2021). "Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators". Nature Machine Intelligence. 3 (3): 218–229. arXiv:1910.03193. doi:10.1038/s42256-021-00302-5.