Neural networks made easy (Part 19): Association rules using MQL5
In the previous article, we started studying association rule mining algorithms, which belong to the unsupervised learning methods. We considered two algorithms for solving this type of problem: Apriori and FP Growth. The bottleneck of the Apriori algorithm is the large number of database passes needed to determine the support of frequent pattern candidates. The FP Growth method solves this issue by building a tree that encodes the entire database. All further operations are carried out on this FP tree, without accessing the database. This speeds up problem solving, since the FP tree resides in RAM and accessing it is much faster than a full pass over the database.
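The idea of keeping the whole database as a prefix tree in memory can be illustrated with a minimal sketch. The class below is a hypothetical, simplified FP tree node written in MQL5; the names (CFPNode, AddPath, m_support) are illustrative and do not come from the article's library, and each transaction is assumed to be pre-sorted by descending item frequency, as FP Growth requires.

```mql5
//+------------------------------------------------------------------+
//| Illustrative FP tree node: item id, accumulated support counter  |
//| and links to child nodes. Names are hypothetical sketches, not   |
//| the classes used in the article.                                 |
//+------------------------------------------------------------------+
class CFPNode
  {
public:
   int               m_item;        // item (feature) identifier, -1 for the root
   int               m_support;     // accumulated support counter
   CFPNode          *m_parent;      // link to the parent for path extraction
   CFPNode          *m_children[];  // dynamic array of child nodes

                     CFPNode(const int item,CFPNode *parent=NULL)
                       : m_item(item),m_support(0),m_parent(parent) {}
                    ~CFPNode(void)
     {
      for(int i=0;i<ArraySize(m_children);i++)
         delete m_children[i];
     }
   //--- add one transaction path to the tree, incrementing support along the way
   void              AddPath(const int &items[],const int pos)
     {
      m_support++;
      if(pos>=ArraySize(items))
         return;
      //--- reuse an existing child with the same item, if there is one
      CFPNode *child=NULL;
      for(int i=0;i<ArraySize(m_children);i++)
         if(m_children[i].m_item==items[pos])
           { child=m_children[i]; break; }
      //--- otherwise create a new branch
      if(child==NULL)
        {
         child=new CFPNode(items[pos],GetPointer(this));
         int n=ArraySize(m_children);
         ArrayResize(m_children,n+1);
         m_children[n]=child;
        }
      child.AddPath(items,pos+1);
     }
  };
```

With such a structure, the whole database is loaded once: every transaction is passed to the root node's AddPath, and support counts accumulate in the shared prefixes, which is why subsequent mining never has to touch the database again.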
---------------------
- Neural networks made easy
- Neural networks made easy (Part 2): Network training and testing
- Neural networks made easy (Part 3): Convolutional networks
- Neural networks made easy (Part 4): Recurrent networks
- Neural networks made easy (Part 5): Multithreaded calculations in OpenCL
- Neural networks made easy (Part 6): Experimenting with the neural network learning rate
- Neural networks made easy (Part 7): Adaptive optimization methods
- Neural networks made easy (Part 8): Attention mechanisms
- Neural networks made easy (Part 9): Documenting the work
- Neural networks made easy (Part 10): Multi-Head Attention
- Neural networks made easy (Part 11): A take on GPT
- Neural networks made easy (Part 12): Dropout
- Neural networks made easy (Part 13): Batch Normalization
- Neural networks made easy (Part 14): Data clustering
- Neural networks made easy (Part 15): Data clustering using MQL5
- Neural networks made easy (Part 16): Practical use of clustering
- Neural networks made easy (Part 17): Dimensionality reduction
- Neural networks made easy (Part 18): Association rules
- Neural networks made easy (Part 19): Association rules using MQL5