Scaling backtest capacity with Dask and Kubernetes
Two months ago I upgraded the motherboard in my home Linux PC to accommodate an Intel Core i9-10900 Comet Lake 10-core CPU running at 2.8 GHz, paired with a Cooler Master MasterLiquid ML240R liquid cooler. This box runs production market data capture as well as a test trading strategy that I am working on, . . .
Building a tick-by-tick backtester for Bitcoin
I have collected about half a year of trade price data from the Phemex futures exchange, but until now I have not had the means to leverage this data to systematically test strategy performance. With the addition of a tick-by-tick backtester to Serenity with the 0.4.0 release, it is now possible to write a Serenity trading strategy in 100 . . .
Cloud tick storage with Azure and Python
Previously I built a local tickstore (covered in the blog post "Building Behemoth") based on HDF5, pandas and RAID storage. While the RAID storage gives me a degree of confidence, depending on a single Linux server sitting on the floor next to my desk is not ideal. Furthermore, it would be even better if researchers didn't have . . .
Starting Serenity's core FIX engine
After reviewing the available Python FIX implementations I decided to create my own, both as a learning exercise and to fill some functional gaps. In particular I wanted Python-friendly code generation for any FIX specification from 4.0 through 5.0 SP2, with a rich metadata model so that later on I can build tools like a FIX schema validator if I want . . .
Creating a minimum viable trading engine
This weekend I finally got Serenity 0.2.0 up and running with an example trading strategy. The code is messy; the example strategy is brittle and likely dangerous to your financial health; it's missing unit tests all over the place; etc. But it starts up and trades the Phemex testnet based on a hypothetical trading signal using live . . .
Revisiting Jupyter Lab with Kubernetes
In Research-to-Go with Docker we used Docker and Docker Compose to create a Serenity research environment. Unfortunately all good things must come to an end, and what's described in that previous article is currently broken: Jupyter Lab has moved on to the 2.x series, but the Git extension has not kept up; the Docker Compose . . .
Applying Tau for real-time market data recording
In the last post on Serenity we looked at the Tau functional reactive programming library. In this post we will move beyond simple math processing with Tau and start connecting it to I/O, in particular the Websocket channels commonly used for distributing market data from cryptocurrency exchanges. To do this we'll need to make some big . . .