Hi, I'm implementing a strategy where I need to calculate Bollinger Band values at a faster rate than the candle time frame the strategy runs on.
For example: the strategy runs on the 15-minute time frame for EUR/USD, and I recalculate the Bollinger Band values on every 1-minute candle using this version of the indicator:

double[][] bbands(Instrument instrument, Period period, OfferSide side, IIndicators.AppliedPrice appliedPrice, int timePeriod, double nbDevUp, double nbDevDn, IIndicators.MaType maType, Filter filter, int numberOfCandlesBefore, long time, int numberOfCandlesAfter) throws JFException

with these parameters:

bbands(EURUSD, 15Mins, Bid, ClosePrice, 593, 1.6, 1.6, EMA, Filter.Weekends, 1, bar.getTime(), 0);

So the BBand values are calculated on the 15-minute time frame, with 1 candle before and 0 candles after, once every minute.
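For context, here is roughly how the call sits inside the strategy. This is a simplified sketch, assuming a standard IStrategy whose onBar fires on the 1-minute period; the class name and the empty callbacks are just placeholders, not my real strategy, and I'm reading the returned array as [band][candle] with upper/middle/lower in that order:

import com.dukascopy.api.*;

public class BBandSketchStrategy implements IStrategy {
    private IIndicators indicators;

    @Override
    public void onStart(IContext context) throws JFException {
        indicators = context.getIndicators();
    }

    @Override
    public void onBar(Instrument instrument, Period period, IBar askBar, IBar bidBar) throws JFException {
        if (instrument != Instrument.EURUSD || !period.equals(Period.ONE_MIN)) {
            return;
        }
        // 15-minute BBands: 593-period EMA basis, 1.6 deviations up/down, bid close prices,
        // 1 candle before the current bar time and 0 candles after
        double[][] bb = indicators.bbands(Instrument.EURUSD, Period.FIFTEEN_MINS, OfferSide.BID,
                IIndicators.AppliedPrice.CLOSE, 593, 1.6, 1.6, IIndicators.MaType.EMA,
                Filter.WEEKENDS, 1, bidBar.getTime(), 0);
        double upper  = bb[0][0];   // only one candle requested, so index 0
        double middle = bb[1][0];
        double lower  = bb[2][0];
        // ...strategy logic using upper/middle/lower...
    }

    // remaining IStrategy callbacks left empty for brevity
    @Override public void onTick(Instrument instrument, ITick tick) throws JFException {}
    @Override public void onMessage(IMessage message) throws JFException {}
    @Override public void onAccount(IAccount account) throws JFException {}
    @Override public void onStop() throws JFException {}
}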
This typically takes 50 milliseconds per call. At 15 calls per 15-minute candle, that works out at 0.75 seconds, and it very quickly slows the historical back testing down to a crawl. For some testing I need to run on a 20-second candle, which means 45 calls, or 2.25 seconds of BB calculations, per 15-minute candle. At that rate it isn't feasible to run this strategy on the historical back tester over long time periods.
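The 50 ms figure can be reproduced by simply timing the call, along these lines (a sketch only; 'console' here would be the IConsole obtained from context.getConsole()):

long start = System.nanoTime();
double[][] bb = indicators.bbands(Instrument.EURUSD, Period.FIFTEEN_MINS, OfferSide.BID,
        IIndicators.AppliedPrice.CLOSE, 593, 1.6, 1.6, IIndicators.MaType.EMA,
        Filter.WEEKENDS, 1, bidBar.getTime(), 0);
long elapsedMs = (System.nanoTime() - start) / 1_000_000;
console.getOut().println("bbands() took " + elapsedMs + " ms");   // typically around 50 ms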
So my question is: why is this Bollinger Band calculation taking 50 milliseconds each time? Surely with the historical test data stored locally on my PC this should fly?