Analysis of financial markets
This research into financial markets analysed the current market situation in terms of volatility and trends, with the aim of providing automated support for decisions to buy or sell a commodity or stock.
At the beginning, we defined simple rules for making buy or sell decisions. These rules were based on the recent price development of a given commodity or stock (the size of its rise or drop) as well as on its current price. The aim of the research was to improve, streamline and, most importantly, automate this decision making, while minimizing the influence of subjective feelings on the process.
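The kind of simple rule described above can be sketched as follows. The concrete thresholds (`RISE_PCT`, `DROP_PCT`, `PRICE_CAP`) are illustrative assumptions, not the values used in the research:

```python
# Illustrative thresholds -- assumptions for this sketch, not the research values.
RISE_PCT = 0.02    # sell after a 2% rise
DROP_PCT = 0.02    # buy after a 2% drop
PRICE_CAP = 100.0  # never buy above this price level

def decide(previous_price: float, current_price: float) -> str:
    """Return 'buy', 'sell', or 'hold' from two consecutive prices."""
    change = (current_price - previous_price) / previous_price
    if change <= -DROP_PCT and current_price <= PRICE_CAP:
        return "buy"
    if change >= RISE_PCT:
        return "sell"
    return "hold"

print(decide(100.0, 97.0))   # 3% drop  -> "buy"
print(decide(100.0, 103.0))  # 3% rise  -> "sell"
print(decide(100.0, 100.5))  # small move -> "hold"
```

Because the rule depends only on two prices and fixed thresholds, it is fully mechanical: the same inputs always yield the same decision, which is exactly what removes subjective judgement from the process.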
Historical data, extremes, patterns
The first step was to analyse historical data from financial markets. We searched the data for characteristic features such as local and global extremes, significant declines and rises over shorter periods or the full time series, and repeating patterns. If a known pattern reappears, the development that followed it in the past can be used to predict future development.
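Locating local extremes in a price series can be sketched as a simple neighbourhood comparison: a point is a local maximum (or minimum) when it is strictly higher (or lower) than every other point within a window around it. The window size here is an illustrative assumption:

```python
def local_extremes(prices, window=1):
    """Return (indices of local maxima, indices of local minima).

    A point counts as an extreme only if it is the unique max/min of its
    neighbourhood, so flat plateaus are not reported.
    """
    maxima, minima = [], []
    for i in range(window, len(prices) - window):
        neighbourhood = prices[i - window:i + window + 1]
        if prices[i] == max(neighbourhood) and neighbourhood.count(prices[i]) == 1:
            maxima.append(i)
        if prices[i] == min(neighbourhood) and neighbourhood.count(prices[i]) == 1:
            minima.append(i)
    return maxima, minima

prices = [10, 12, 11, 9, 13, 12, 14, 10]
maxima, minima = local_extremes(prices)
print(maxima)  # [1, 4, 6] -> peaks at 12, 13 and 14
print(minima)  # [3, 5]    -> troughs at 9 and 12
```

In practice, libraries such as SciPy offer ready-made peak detection, but the principle is the same: extremes are the anchor points from which declines, rises, and repeating patterns are measured.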
Using these features, we were then able to predict the likely future development of a particular commodity or stock. Knowing both the predicted development and the current price makes it possible to decide whether it is a good moment to sell or, conversely, to buy.
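Once a predicted price exists, the buy/sell decision reduces to comparing it with the current price. A minimal sketch, where the `margin` parameter (a tolerance band around the current price) is an assumption of this example:

```python
def decide_from_prediction(current: float, predicted: float, margin: float = 0.01) -> str:
    """Compare the predicted price with the current one to pick an action.

    `margin` is an illustrative tolerance: moves predicted to stay within
    +/-1% of the current price are treated as noise and produce "hold".
    """
    if predicted > current * (1 + margin):
        return "buy"   # expected to rise -> buy now
    if predicted < current * (1 - margin):
        return "sell"  # expected to fall -> sell now
    return "hold"

print(decide_from_prediction(100.0, 105.0))  # predicted rise  -> "buy"
print(decide_from_prediction(100.0, 95.0))   # predicted fall  -> "sell"
print(decide_from_prediction(100.0, 100.5))  # within margin   -> "hold"
```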
Figure 1: The trading account overview after applying various trading rules. The red curve represents the most successful investment strategy while the blue curve represents the least successful one.
When predicting future development in financial markets, we used historical data, characteristic features and repeating patterns. Based on these data and a set of simple rules, we automated and streamlined the decision-making process and eliminated the influence of subjective feelings.
- Jupyter Notebook: The Jupyter Notebook tool helped carry out the research. It facilitates the work and, when the defined principles are followed, makes it possible to produce clear, readable code.
- TA-Lib: The open-source TA-Lib library, widely used for financial market analysis thanks to its large number of suitable functions, was employed.
- Scikit-learn: The subsequent prediction of future development, based on the characteristic features identified for individual price series, was made possible by classifiers from the Scikit-learn library.
- Pandas: Auxiliary operations were carried out using the well-known open-source Pandas library.
- Matplotlib, Plotly: The Matplotlib and Plotly libraries were used to show the relationships between the analysed currencies and to visualize both historical and resultant predicted data.
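How these libraries fit together in the prediction step can be sketched as follows. This is a minimal illustration, not the research setup: the synthetic random-walk prices, the five-return feature window, and the choice of `RandomForestClassifier` are all assumptions of this example.

```python
# Sketch only: synthetic data and feature choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 1, 300))  # synthetic random-walk prices

WINDOW = 5
X, y = [], []
for i in range(WINDOW, len(prices) - 1):
    # Features: the last 5 relative price changes before step i.
    returns = np.diff(prices[i - WINDOW:i + 1]) / prices[i - WINDOW:i]
    X.append(returns)
    # Label: did the price rise at the next step?
    y.append(int(prices[i + 1] > prices[i]))

X, y = np.array(X), np.array(y)
split = int(0.8 * len(X))  # simple chronological train/test split

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:split], y[:split])
accuracy = clf.score(X[split:], y[split:])
print(f"hold-out accuracy: {accuracy:.2f}")
```

On a pure random walk the accuracy hovers around chance, which is the expected baseline; the point of the real analysis is to feed the classifier features (extremes, patterns) that carry genuine predictive signal.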