In the previous article, we discussed relational models that use attention mechanisms in their architecture. We used such a model to build an Expert Advisor, and the resulting EA showed good results. However, we noticed that the model trained more slowly than in our earlier experiments. This is because the Transformer block used in the model is a rather complex architectural solution that performs a large number of operations. The number of these operations grows quadratically with the length of the analyzed sequence.
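To see where the quadratic growth comes from, consider the core of the attention mechanism: every element of the sequence is compared against every other element, producing a score matrix of size n×n. Below is a minimal NumPy sketch of scaled dot-product attention (an illustration only, not the article's MQL5/OpenCL implementation; all names here are illustrative):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention for a single head."""
    d_k = Q.shape[-1]
    # The score matrix is (seq_len x seq_len) -- this pairwise
    # comparison is the source of the quadratic cost.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, scores

n, d = 8, 4                      # sequence length, head dimension
rng = np.random.default_rng(0)
Q = rng.standard_normal((n, d))
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))

out, scores = attention(Q, K, V)
print(scores.shape)              # (8, 8): doubling n quadruples this matrix
```

Doubling the sequence length quadruples the number of entries in the score matrix, which is why training time climbs so quickly as the model attends over longer histories.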