Contents
- Introduction
- 1. Attention Mechanisms
- 2. Self-Attention Algorithm
- 3. Implementation
- 3.1. Upgrading the Convolutional Layer
- 3.2. Self-Attention Block Class
- 3.3. Self-Attention Feed-Forward
- 3.4. Self-Attention Feed-Backward
- 3.5. Changes in the Neural Network Base Classes
- 4. Testing
- Conclusions
- References
- Programs Used in the Article
Other articles in this series:
- Neural networks made easy - MT5
- Neural networks made easy (Part 2): Network training and testing - MT5
- Neural networks made easy (Part 3): Convolutional networks - MT5
- Neural networks made easy (Part 4): Recurrent networks - MT5
- Neural networks made easy (Part 5): Multithreaded calculations in OpenCL - MT5
- Neural networks made easy (Part 6): Experimenting with the neural network learning rate - MT5
- Neural networks made easy (Part 7): Adaptive optimization methods - MT5
- Neural networks made easy (Part 8): Attention mechanisms - MT5
- Neural networks made easy (Part 9): Documenting the work - MT5
- Neural networks made easy (Part 10): Multi-Head Attention - MT5