
Perceptrons: the basic building blocks of neural models.
They learn weighted combinations of their inputs.
This simple unit is the foundation of all modern neural networks.
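A weighted combination of inputs followed by a threshold can be sketched in a few lines. The weights and bias below are illustrative values (not from the text), chosen so the unit behaves like an AND gate:

```python
def perceptron(inputs, weights, bias):
    """Classic perceptron: weighted sum of inputs, then a step threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total + bias > 0 else 0

# Illustrative AND-gate weights: fires only when both inputs are 1.
print(perceptron([1, 1], [0.5, 0.5], -0.7))  # -> 1
print(perceptron([1, 0], [0.5, 0.5], -0.7))  # -> 0
```

Training adjusts the weights and bias; the unit itself never changes shape.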
Activation functions: introduce non-linearity into networks.
They enable learning of complex patterns.
The choice of activation strongly affects training dynamics.
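Two common activations, sketched as plain functions to show the non-linearity they add between layers:

```python
import math

def relu(x):
    """Rectified linear unit: passes positives, zeroes out negatives."""
    return max(0.0, x)

def sigmoid(x):
    """Squashes any real input into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

print(relu(-2.0), relu(3.0))  # -> 0.0 3.0
print(sigmoid(0.0))           # -> 0.5
```

Without such functions, stacked linear layers would collapse into a single linear map, which is why the choice of activation matters so much for training.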
Backpropagation: the algorithm for computing gradients efficiently.
It enables learning through error correction.
It is the core mechanism behind neural network training.
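A minimal sketch of the chain rule behind backpropagation, for a single sigmoid neuron with squared-error loss (the setup and variable names are illustrative, not from the text):

```python
import math

def neuron_grad(x, w, b, y):
    """Gradient of L = (a - y)^2 for one sigmoid neuron, via the chain rule."""
    z = w * x + b
    a = 1.0 / (1.0 + math.exp(-z))   # forward pass
    d_loss_d_a = 2.0 * (a - y)       # dL/da
    d_a_d_z = a * (1.0 - a)          # sigmoid derivative da/dz
    delta = d_loss_d_a * d_a_d_z     # error signal pushed back through the neuron
    return delta * x, delta          # dL/dw, dL/db

print(neuron_grad(1.0, 0.0, 0.0, 1.0))  # -> (-0.25, -0.25)
```

Full backpropagation repeats exactly this chain-rule step layer by layer, reusing each layer's error signal instead of recomputing it.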
Deep networks: networks with multiple hidden layers.
They capture hierarchical feature representations.
Deep architectures power many modern AI applications.
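Stacking layers can be sketched as repeated weighted sums with a non-linearity between them; this toy forward pass (illustrative, ReLU assumed as the activation) shows how each layer's output feeds the next:

```python
def forward(x, layers):
    """Forward pass through a stack of (weights, biases) layers with ReLU."""
    for weights, biases in layers:
        x = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x

# Two toy layers: a 2->2 identity layer, then a 2->1 summing layer.
layers = [([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0]),
          ([[1.0, 1.0]], [0.0])]
print(forward([2.0, -3.0], layers))  # -> [2.0]
```

The hierarchy comes from composition: early layers compute simple features and later layers combine them.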
Convolutional networks: specialized for grid-like data such as images.
They learn spatial and local patterns efficiently.
They are widely used in computer vision.
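The local-pattern idea reduces to sliding a small kernel over the grid; a minimal "valid" 2-D convolution sketch (lists stand in for image tensors):

```python
def conv2d(image, kernel):
    """Valid 2-D convolution: slide the kernel, sum elementwise products."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

image = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(conv2d(image, [[1, 1], [1, 1]]))  # -> [[12, 16], [24, 28]]
```

Because the same kernel is reused at every position, the layer has far fewer parameters than a fully connected one, which is the efficiency the text refers to.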
Sequence models: designed for sequential and temporal data.
They handle language, time series, and signals.
Transformers now dominate modern NLP tasks.
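The recurrent idea behind classic sequence models can be sketched as one hidden state updated at each time step (scalar weights here are illustrative; real models use matrices):

```python
import math

def rnn_step(h, x, w_h, w_x, b):
    """One recurrent update: new state mixes previous state and current input."""
    return math.tanh(w_h * h + w_x * x + b)

def run(sequence, w_h=0.5, w_x=1.0, b=0.0):
    """Fold a whole sequence into a single hidden state."""
    h = 0.0
    for x in sequence:
        h = rnn_step(h, x, w_h, w_x, b)
    return h

print(run([1.0, 2.0]))  # final hidden state summarizing the sequence
```

Transformers replace this step-by-step state with attention over all positions at once, which is a large part of why they displaced recurrent models in NLP.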
Training techniques: include learning-rate tuning, batch normalization, and dropout.
They improve convergence and stability.
They are essential for scaling deep models.
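Of the techniques listed above, dropout is the simplest to sketch; this is the inverted-dropout variant (the rescaling convention is an assumption, but a common one), where surviving activations are scaled up so the expected sum is unchanged:

```python
import random

def dropout(activations, rate, rng=random.Random(0)):
    """Inverted dropout: zero each unit with probability `rate`,
    rescale survivors by 1/(1 - rate) to preserve the expected total."""
    keep = 1.0 - rate
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

print(dropout([1.0, 1.0, 1.0, 1.0], 0.5))  # roughly half zeroed, rest doubled
```

Dropout is applied only during training; at inference time the layer is a no-op, which the inverted scaling makes possible without extra bookkeeping.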
