In a historical test using the JForex API, calling the setMarginCutLevel method only sets a new value in the ITesterClient object, not in the strategy object where it's actually used.
The following is the relevant code I used to test.
In the tester class:
final ITesterClient client = TesterFactory.getDefaultInstance();
client.setLeverage(leverage);
client.setMarginCutLevel(50);
LOGGER.info("In ITesterClient: Leverage = " + client.getLeverage() + ", MarginCutLevel = " + client.getMarginCutLevel());
In the strategy class:
console.getOut().println("In IStrategy: Leverage = " + context.getAccount().getLeverage() + ", MarginCutLevel = " + context.getAccount().getMarginCutLevel());
Here is the output:
2011-02-16 22:40:04.865 INFO TestMarginCallMain - In ITesterClient: Leverage = 300, MarginCutLevel = 50
2011-02-16 22:40:04.874 INFO TestMarginCallMain - 1 strategies started.
In IStrategy: Leverage = 300.0, MarginCutLevel = 200
You can see above that setLeverage works correctly, but the MarginCutLevel reported in the strategy should be 50, yet it remains at the default value of 200. This looks like a bug, so please fix it. Thank you.
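The output is consistent with setMarginCutLevel updating only the tester client's own field without propagating the value to the account object handed to the strategy, whereas setLeverage propagates it. The following is a minimal, self-contained Java sketch of that suspected pattern; the class and field names here are hypothetical stand-ins, not the actual JForex implementation.

```java
// Hypothetical classes illustrating the suspected bug pattern.
// Names mimic the JForex API, but this is NOT the real implementation.
class Account {
    // Strategy-side settings, initialized to platform defaults.
    double leverage = 100.0;
    int marginCutLevel = 200;
}

class TesterClient {
    private double leverage = 100.0;
    private int marginCutLevel = 200;
    private final Account account = new Account();

    void setLeverage(double leverage) {
        this.leverage = leverage;
        account.leverage = leverage;   // propagated: the strategy sees the new value
    }

    void setMarginCutLevel(int level) {
        this.marginCutLevel = level;   // suspected bug: account.marginCutLevel is never updated
    }

    double getLeverage() { return leverage; }
    int getMarginCutLevel() { return marginCutLevel; }
    Account getAccount() { return account; }
}

public class MarginCutDemo {
    public static void main(String[] args) {
        TesterClient client = new TesterClient();
        client.setLeverage(300);
        client.setMarginCutLevel(50);
        // Mirrors the log output above: the client reports 50,
        // but the account the strategy reads still holds the default 200.
        System.out.println("In TesterClient: MarginCutLevel = " + client.getMarginCutLevel());
        System.out.println("In Account:      MarginCutLevel = " + client.getAccount().marginCutLevel);
    }
}
```

Running this reproduces the mismatch seen in the logs: the tester-side getter returns 50 while the account-side value stays 200, which is exactly the asymmetry between setLeverage and setMarginCutLevel described above.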