I didn't say that making the array bigger solved the problem; I said that my test of enlarging the array showed there was an array bounds problem, and I asked him to check the program's correctness (I can't spend the time to study his whole program). From his answer it seems he really did make an error in dimensioning the array.
With optimizations off, all variables are laid out in memory, while with optimizations on, many are kept in registers; the memory layout can be different and there can be more dead space in the data structures.
In the first case the overwriting may have hit variables that were never used afterwards, while with optimization the compiler kept almost nothing but the big arrays on the stack, so those were the ones overwritten (the counter that gets the bloated value lives on the calling function's stack; the local value looks correct). Such coincidences are so common that some programs can run for years with the bug staying hidden (there are plenty of examples from M$).
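To make the scenario concrete, here is a minimal C sketch of that kind of bug (the names sum_squares, big, scratch and iter are mine, not from the program being discussed): an off-by-one store past the end of a local array whose visible effect depends entirely on the stack layout the compiler happened to choose.

```c
#include <stdio.h>

#define N 100

static long sum_squares(void)
{
    long big[N];
    long scratch = 0;                 /* never read after the loop */

    for (int i = 0; i <= N; i++) {    /* BUG: writes big[N], one past the end */
        big[i] = (long)i * i;
        scratch += big[i];
    }
    (void)scratch;                    /* intentionally dead from here on */
    return big[N - 1];
}

int main(void)
{
    long total = 0;

    for (long iter = 0; iter < 3; iter++)   /* a counter in the caller,
                                               like the bloated one above */
        total += sum_squares();

    /* Depending on the layout, the stray store lands in a slot nobody reads
     * again (and the program "works"), or it clobbers a saved register, the
     * return address or a caller variable and the output goes wrong. */
    printf("total=%ld\n", total);
    return 0;
}
```

Compiling the same file with -O0 and then -O2 and comparing the symptoms shows exactly the kind of difference described above.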
To understand exactly what happened you would have to trace the assembly and watch the memory writes; unfortunately, with arrays that large and so much data manipulation going on, doing that would really be a nightmare.
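Short of single-stepping the assembly, one cheaper trick (my suggestion, not something from his program) is to plant guard words around the suspect array and check them at a few points, so the first stray write is caught close to where it happens. A minimal sketch, assuming the usual layout where no padding separates the struct members:

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

#define N 100
#define CANARY 0xDEADBEEFCAFEF00DULL

struct guarded {
    uint64_t pre;     /* guard word just before the data */
    long     big[N];  /* the array under suspicion       */
    uint64_t post;    /* guard word just after the data  */
};

static void check(const struct guarded *g, const char *where)
{
    /* Trips as soon as either guard word has been clobbered. */
    if (g->pre != CANARY || g->post != CANARY) {
        fprintf(stderr, "array overrun detected %s\n", where);
        assert(0 && "guard word overwritten");
    }
}

int main(void)
{
    struct guarded g = { .pre = CANARY, .post = CANARY };

    for (int i = 0; i <= N; i++)      /* same off-by-one as before */
        g.big[i] = i;

    check(&g, "after the fill loop"); /* fires here, right after the bad write */
    return 0;
}
```

A hardware watchpoint in a debugger (for example gdb's `watch` command on the corrupted variable) does the same job without recompiling, but the guard-word version is handy when the program has to run at full speed.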