I have noticed this behavior for a while, but never got around to figuring out why or when it happens. Now I finally have.
When you run an optimization, let's say from 1 to 1.2 in steps of 0.1, the values it uses are 1.0, 1.1 and finally 1.2, as expected.
When you run an optimization from 1 to 1.3, however, the values it uses are 1.0, 1.1, 1.2, 1.3 and then, unexpectedly, 1.4 as well.
So it seems to happen when the end value is an odd decimal. Has anyone else run into this, and why does it happen?
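For what it's worth, the pattern can be reproduced with binary floating point. This is only a hypothetical sketch, since I don't know how the optimizer actually enumerates its steps: it assumes the step count is computed as ceil((end - start) / step) in doubles. Since 0.1 and 1.3 have no exact binary representation, (1.3 - 1.0) / 0.1 comes out slightly above 3 and gets rounded up to 4, producing one extra step, while (1.2 - 1.0) / 0.1 comes out slightly below 2 and behaves as expected:

```python
import math

# Hypothetical reconstruction of the optimizer's step enumeration.
# Assumption: it derives the number of steps from the range and step
# size using ceil(), then generates start + i * step for each index.
def optimization_values(start, end, step):
    # (1.3 - 1.0) / 0.1 evaluates to 3.0000000000000004 in doubles,
    # so ceil() bumps the count from 3 to 4 and overshoots the end.
    count = math.ceil((end - start) / step)
    # round() only cleans up display noise like 1.2000000000000002
    return [round(start + i * step, 10) for i in range(count + 1)]

print(optimization_values(1.0, 1.2, 0.1))  # [1.0, 1.1, 1.2]
print(optimization_values(1.0, 1.3, 0.1))  # [1.0, 1.1, 1.2, 1.3, 1.4]
```

This matches the observed behavior exactly, but the real optimizer could of course be doing something else internally.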