My advice here was pretty standard - profile, find bottlenecks, use caching/vectorization/numpy/numba to speed up the hot code.
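A minimal sketch of that standard advice, assuming the hot spot turned out to be a rolling calculation written as a plain Python loop (the function and data are made up for illustration):

```python
# Hypothetical example: the same rolling mean three ways: a naive Python loop,
# a vectorized NumPy version, and a Numba-compiled loop.
import numpy as np
import numba


def rolling_mean_slow(prices, window):
    # Naive Python loop: the kind of hot spot profiling tends to find.
    out = np.full(len(prices), np.nan)
    for i in range(window - 1, len(prices)):
        out[i] = sum(prices[i - window + 1 : i + 1]) / window
    return out


def rolling_mean_numpy(prices, window):
    # Vectorized with a cumulative sum: O(n) instead of O(n * window).
    csum = np.cumsum(np.insert(prices, 0, 0.0))
    out = np.full(len(prices), np.nan)
    out[window - 1 :] = (csum[window:] - csum[:-window]) / window
    return out


@numba.njit(cache=True)
def rolling_mean_numba(prices, window):
    # Same loop as the slow version, JIT-compiled to machine code.
    out = np.full(len(prices), np.nan)
    acc = 0.0
    for i in range(len(prices)):
        acc += prices[i]
        if i >= window:
            acc -= prices[i - window]
        if i >= window - 1:
            out[i] = acc / window
    return out


if __name__ == "__main__":
    prices = np.random.default_rng(0).standard_normal(100_000).cumsum()
    slow = rolling_mean_slow(prices, 20)
    fast = rolling_mean_numpy(prices, 20)
    jit = rolling_mean_numba(prices, 20)
    assert np.allclose(slow[19:], fast[19:]) and np.allclose(slow[19:], jit[19:])
```

Profile first, though: there is no point compiling code that accounts for 1% of the runtime.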
But even better advice is "do less backtesting" (as pointed out by @therobotjames): https://twitter.com/Overlevered_AM/status/1382201962687557638
Third point is extremely important (and tragically often neglected): without an accurate simulator you may as well not bother running a backtest at all, and the process of validating the simulator will uncover bugs in either your simulation or your live code.
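The thread doesn't spell out the mechanics, but one way to act on this is a simple sim-vs-live reconciliation: run the same orders through both, then diff the fills. A hypothetical sketch (the field names and the 5 bps tolerance are illustrative assumptions):

```python
# Hypothetical sketch: reconcile simulated fills against live fills.
from dataclasses import dataclass


@dataclass
class Fill:
    order_id: str
    qty: float
    price: float


def reconcile(sim_fills, live_fills, price_tol_bps=5.0):
    # Return (order_id, reason) pairs for every disagreement worth a look.
    sim = {f.order_id: f for f in sim_fills}
    live = {f.order_id: f for f in live_fills}
    issues = []
    for oid in sorted(sim.keys() | live.keys()):
        s, l = sim.get(oid), live.get(oid)
        if s is None or l is None:
            issues.append((oid, "fill present in only one of sim/live"))
            continue
        if s.qty != l.qty:
            issues.append((oid, f"qty mismatch: sim {s.qty} vs live {l.qty}"))
        elif abs(s.price - l.price) / abs(l.price) * 1e4 > price_tol_bps:
            issues.append((oid, f"fill price differs by more than {price_tol_bps} bps"))
    return issues
```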
Last point is important because you *really* want to know if your turnover/risk/exposure stats differ from your expectations. If you design a strategy based on a 1-2 day forecast and in simulation it holds positions for months, something is broken!
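A cheap diagnostic for exactly this, sketched under the assumption that you have a daily positions table (rows = dates, columns = assets, dollar positions); turnover conventions vary by desk:

```python
# Hypothetical diagnostic: turnover and implied holding period from daily positions.
import numpy as np
import pandas as pd


def turnover_stats(positions: pd.DataFrame) -> dict:
    gross = positions.abs().sum(axis=1)             # gross book size each day
    traded = positions.diff().abs().sum(axis=1)     # dollars traded each day (buys + sells)
    two_sided = (traded / gross.shift(1)).dropna().mean()
    one_sided = two_sided / 2.0
    return {
        "two_sided_daily_turnover": two_sided,
        # crude heuristic: holding period ~ 1 / (one-sided daily turnover)
        "implied_holding_days": np.inf if one_sided == 0 else 1.0 / one_sided,
    }
```

If the strategy was built around a 1-2 day forecast and implied_holding_days comes back as 60, that is the kind of mismatch being described.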
There are many ways to assess the quality of an alpha/signal/strategy without running a full-blown backtest. For example, for binary signals you can look at event studies/markouts. For continuous signals you can regress future returns on your signal, use factor regressions, or look at quintile/decile charts. What these have in common is they measure what you care about, i.e. how good the signal is at predicting future returns.
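A rough sketch of two of these checks, assuming you already have prices, event timestamps, a continuous signal, and forward returns aligned on the same index (names and horizons are illustrative):

```python
# Hypothetical sketches: markouts for a binary/event signal, and an IC-style
# regression plus decile summary for a continuous signal.
import numpy as np
import pandas as pd


def markout_curve(event_times, prices: pd.Series, horizons=(1, 5, 10, 30)):
    # Average post-event return ("markout") at a few horizons, in bars.
    rows = {}
    for h in horizons:
        rets = []
        for t in event_times:
            loc = prices.index.get_loc(t)
            if loc + h < len(prices):
                rets.append(prices.iloc[loc + h] / prices.iloc[loc] - 1.0)
        rows[h] = np.mean(rets) if rets else np.nan
    return pd.Series(rows, name="avg_markout")


def signal_diagnostics(signal: pd.Series, fwd_ret: pd.Series, n_quantiles=10):
    # Correlation (IC), regression slope, and mean forward return per decile.
    df = pd.DataFrame({"sig": signal, "fwd": fwd_ret}).dropna()
    ic = df["sig"].corr(df["fwd"])
    slope = np.polyfit(df["sig"], df["fwd"], 1)[0]   # fwd ~ slope * sig + const
    decile_means = df.groupby(pd.qcut(df["sig"], n_quantiles, labels=False))["fwd"].mean()
    return {"ic": ic, "slope": slope, "decile_means": decile_means}
```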
Monetizing the prediction is the job of portfolio construction (where you trade off alpha vs. risk, costs and constraints).
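As a toy, single-period illustration of that trade-off (the penalty weights, bounds and quadratic cost model are made-up assumptions, not a production optimizer):

```python
# Hypothetical single-period sketch: pick weights by trading off alpha against
# a risk penalty and a cost penalty for moving away from current positions.
import numpy as np
from scipy.optimize import minimize


def target_weights(alpha, cov, w_prev, risk_aversion=5.0, impact_coef=1.0,
                   max_weight=0.05):
    # Maximize  alpha'w - (risk_aversion/2) * w'Cov w - impact * ||w - w_prev||^2
    # subject to simple box constraints |w_i| <= max_weight.
    # A quadratic (impact-style) cost keeps this smooth; linear commissions or
    # harder constraints would normally go to a proper convex solver instead.
    def neg_utility(w):
        ret = alpha @ w
        risk = 0.5 * risk_aversion * (w @ cov @ w)
        cost = impact_coef * np.square(w - w_prev).sum()
        return -(ret - risk - cost)

    bounds = [(-max_weight, max_weight)] * len(alpha)
    res = minimize(neg_utility, x0=w_prev, bounds=bounds, method="L-BFGS-B")
    return res.x
```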
The backtest is just to make sure that all these parts are working together as expected. In many ways a slow backtest can be a feature, because it discourages you from running hundreds of simulations and optimizing based on the results!