Entries with no category
Forecasting time series is one of the most important elements in building an effective trading strategy. When opening a trade in one direction or another, we proceed from our own vision (forecast) of the upcoming price movement. Recent advances in deep learning, especially models based on the Transformer architecture, have demonstrated significant progress in this area and offer great potential for solving the multifaceted problems associated with long-term time series forecasting. ...
For some time series, it is possible to devise a formula for the next value in the sequence based on the values that precede it. Number walls make this possible by first generating a 'wall of numbers' in the form of a matrix, built via what is referred to as the cross rule. In generating this matrix, the primary goal is to establish whether the sequence in question is convergent, and the number-wall cross-rule algorithm answers this question if, after a few rows ...
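A minimal Python sketch of the cross rule (not the article's code): starting from the conventional border row of ones and the sequence itself, each new row is filled in from the relation centre² = above·below + left·right, and a row of zeros appearing after a few rows signals that the sequence obeys a short linear recurrence. Interior zeros, which the full number-wall algorithm handles with extra "window" rules, are deliberately not handled here.

```python
from fractions import Fraction

def number_wall(seq, max_depth=5):
    """Sketch of the number-wall cross rule.

    Rows are returned as (offset, values) pairs; each new row loses one
    column at each end. Assumes no interior zeros appear in the wall.
    """
    n = len(seq)
    rows = [(0, [Fraction(1)] * n),           # conventional row of ones
            (0, [Fraction(x) for x in seq])]  # the sequence itself
    for _ in range(max_depth):
        (off_a, above), (off_c, centre) = rows[-2], rows[-1]
        new_off, new_vals = off_c + 1, []
        for col in range(off_c + 1, off_c + len(centre) - 1):
            c = centre[col - off_c]
            l = centre[col - 1 - off_c]
            r = centre[col + 1 - off_c]
            a = above[col - off_a]
            if a == 0:
                return rows[1:]               # naive sketch: stop at a zero divisor
            # cross rule: c*c == a*b + l*r  =>  b = (c*c - l*r) / a
            new_vals.append((c * c - l * r) / a)
        rows.append((new_off, new_vals))
        if all(v == 0 for v in new_vals):     # a zero row signals a linear recurrence
            break
    return rows[1:]

# Fibonacci numbers satisfy a second-order recurrence, so a zero row
# appears after only a few rows of the wall.
fib = [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
for off, vals in number_wall(fib):
    print(' ' * off, [int(v) for v in vals])
```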
AdaBoost, short for adaptive boosting, is an ensemble machine learning model that attempts to build a strong classifier out of weak classifiers. more...
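As a brief illustration (assuming a recent scikit-learn; the estimator parameter name differs in older versions), the sketch below boosts depth-1 decision trees ("stumps") as the weak classifiers on synthetic data, re-weighting the samples each round to focus on earlier mistakes:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each weak learner is a decision stump (depth-1 tree); the ensemble
# combines 100 of them with a modest learning rate.
model = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),
    n_estimators=100,
    learning_rate=0.5,
    random_state=0,
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```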
Support Vector Regression (SVR) is a form of regression derived from Support Vector Machines. At its core, SVR uses kernel methods to map input data into higher-dimensional spaces, allowing more complex relationships to be captured, in contrast with dimensionality reduction. In this article, though, we explore strictly its role as a loss function when used with a multi-layer perceptron. A related but different form of regression we looked at in an earlier article was Gaussian Process ...
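The loss in question is the epsilon-insensitive loss, L(y, ŷ) = max(0, |y − ŷ| − ε): errors inside the ε-tube cost nothing, larger errors grow linearly. A minimal sketch (not the article's code) of plugging it into a small multi-layer perceptron with PyTorch, where the layer sizes, ε value, and toy data are illustrative assumptions:

```python
import torch
import torch.nn as nn

def epsilon_insensitive_loss(pred, target, epsilon=0.1):
    # Errors smaller than epsilon cost nothing; larger errors grow linearly.
    return torch.clamp(torch.abs(pred - target) - epsilon, min=0.0).mean()

# Small illustrative MLP: 4 inputs -> 32 hidden units -> 1 output.
mlp = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(mlp.parameters(), lr=1e-3)

# Toy regression data: y is a noisy linear function of the inputs.
X = torch.randn(256, 4)
y = X @ torch.tensor([[0.5], [-1.0], [0.25], [2.0]]) + 0.05 * torch.randn(256, 1)

for epoch in range(200):
    optimizer.zero_grad()
    loss = epsilon_insensitive_loss(mlp(X), y)
    loss.backward()
    optimizer.step()

print("final loss:", loss.item())
```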
According to Wikipedia, dimensionality reduction is the transformation of data from a high-dimensional space into a low-dimensional space so that the low-dimensional representation retains some meaningful properties of the original data, ideally close to its intrinsic dimension. more...
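As one common example of such a transformation (PCA, used here as an illustrative choice and assuming scikit-learn is available), the sketch below projects 10-dimensional points onto the two directions of largest variance:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))          # high-dimensional data
X[:, 0] *= 5.0                          # give two axes most of the variance
X[:, 1] *= 3.0

pca = PCA(n_components=2)
X_low = pca.fit_transform(X)            # low-dimensional representation

print(X_low.shape)                      # (500, 2)
print("variance retained:", pca.explained_variance_ratio_.sum())
```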