Excellent. Now I see my post was about 70% rubbish; I was clearly not keeping the actual problem in mind as I read your post.
I don't think line 30 is better; it adds complexity without removing the need for the % in the first subscript.
I retract points 1, 2 and 4 unequivocally: they are just wrong.
On point 3, the point I was trying to make about n is that it is not used in the algorithm: if the file contains more data than expected then your algorithm will just run off the end of the array. You can simulate this with your test code by changing line 10 to initialise count to, say, 100 while leaving line 8 alone.
I accept your point that both this and the fact that you have put the algorithm on a single line stem from your competition coding; coding for production and coding for competition are two different things.
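For what it's worth, here is a minimal sketch of the kind of guard I mean. I don't have your code in front of me, so the function name, MAX_VALUES, and the file layout are my own assumptions:

#include <stdio.h>

#define MAX_VALUES 50   /* capacity of the destination array */

/* Read up to MAX_VALUES integers from 'in'; returns the number stored.
   The count < MAX_VALUES check is the point: a file containing more
   data than expected stops filling the array instead of running off
   the end of it. */
size_t readValues(FILE *in, int values[])
{
    size_t count = 0;
    int v;
    while (count < MAX_VALUES && fscanf(in, "%d", &v) == 1)
        values[count++] = v;
    return count;
}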
Point 5 still stands in its entirety. You may have tested it on several platforms, but one of the trickiest things about undefined behaviour is that the system is free to do anything, and that includes working as expected right up until you ship it to a customer; so it is important to be able to recognise constructs that produce it, or may produce it, and avoid them.
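To illustrate with an example of my own (not from your code): signed integer overflow is undefined behaviour, yet on most compilers it appears to "work", which is exactly what makes this class of bug so easy to ship:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    int i = INT_MAX;
    i = i + 1;          /* signed overflow: undefined behaviour */
    printf("%d\n", i);  /* often prints INT_MIN, but the compiler is
                           entitled to do absolutely anything here */
    return 0;
}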
<anecdote>
Anecdote About Undefined Behaviour
Early in the 1990s I was working on a SCADA system. One of our customers reported that the computer only worked at one end of their office.
Slightly bemused, we requested more details and were told that the system had been working fine for several months; then the client had decided to reorganise the office and, as part of that, had moved our system from the eastern end of the office to the western end (a move of about 15m). When they switched it back on it crashed, and continued to crash consistently every time. So they moved the computer back to the eastern end of the office, whereupon it started working again without issue.
They proffered the opinion that there was a Ley Line running under the western end of the office affecting operation, and that we needed to harden our software/computer against Ley Lines.
I was dispatched to investigate, equipped with a development machine, and after a day or so spent reproducing the problem and poring over the code base I found this:
void someFunction()
{
    int *p = 0;

    // 30-40 lines of code that doesn't reference p

    *p = 5;

    // Another 30-40 lines of code that doesn't reference p
}
As you can see, this is definitely undefined behaviour (although at the time I didn't know the term), as we are dereferencing a NULL pointer. Back in the 90s, dereferencing a NULL pointer like that didn't produce the instant segmentation fault that it would today: with the segmented memory map of the time, and no per-process virtual address spaces, there was far less checking and verification.
Obviously I removed those two lines of code and all was good, Ley Line Hardening done. Back at the office I checked and discovered that they had been in the code for a couple of years with no problems reported.
The moral of this story, obviously, is: always test your software on top of a Ley Line 😁
</anecdote>
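On a more practical note: these days you don't need to move the computer around the office to find this class of bug. If my memory of the flags serves, both GCC and Clang will catch the dereference at run time with the undefined-behaviour sanitiser. A hypothetical session, assuming someFunction() above is called from a main() in a file I've made up called ubdemo.c:

cc -g -fsanitize=null ubdemo.c -o ubdemo
./ubdemo    # the sanitiser reports a store to a null pointer
            # instead of silently corrupting memory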