That's not what I remember programs looking like. You'd have counters that stopped at 99 because the UI wouldn't accommodate another digit, and programmers, not really knowing what they were doing, figured it would be clearer if the software mirrored its UI and couldn't handle anything over 99 at all: if it's invalid in the UI, make it unrepresentable in the state.
Lots of stuff didn't care about the day either: you'd get 0997 and that meant September 1st, 1997, because the first (or last) of the month was inferred. Then you'd get goofy logic around year++, where you add 100 whenever you want to increment the month, and the whole thing is written modulo 1200; except whenever the date is about to hit 0000, you look at what's stored in the year field and add one to it instead of 100, because that way you have fewer variables to allocate and every "little bit" counts.
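Something like this, to give a sense of the packing. The wrap handling is my best guess at a typical implementation, not code from any particular system:

```c
#include <stdio.h>

/* A date packed as a four-digit MMYY integer: 0997 = September 1997.
 * The day of month isn't stored; callers infer the 1st (or last). */
typedef int mmyy;

/* Increment the month: adding 100 bumps MM. When MM would pass 12,
 * wrap back to January and carry one into the two-digit year. */
static mmyy next_month(mmyy d)
{
    int mm = d / 100;   /* 1..12 */
    int yy = d % 100;   /* 0..99, century assumed */

    if (mm == 12) {
        mm = 1;
        yy = (yy + 1) % 100;  /* 99 rolls over to 00 -- the Y2K trap */
    } else {
        mm += 1;              /* same as d + 100 on the packed form */
    }
    return mm * 100 + yy;
}

int main(void)
{
    printf("%04d -> %04d\n", 997,  next_month(997));   /* 0997 -> 1097 */
    printf("%04d -> %04d\n", 1299, next_month(1299));  /* 1299 -> 0100 */
    return 0;
}
```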
Everyone knows now, but lots of the people writing programs back then had no prior art to go on; they were just good at messing with computers and sort of fell into programming by accident.
It also had to do with how dates were stored in very old databases that were conceived in the 70s, when storage was at a premium - a lot of them just used two digits for the year.
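The usual retrofit once those two-digit years became a problem was windowing: pick a pivot and guess the century from it. Roughly like this (the pivot of 70 is just an illustrative choice, not any standard):

```c
#include <stdio.h>

/* Expand a two-digit stored year into a full year using a pivot
 * ("windowing"), the common fix for records that only kept YY.
 * Here 70..99 -> 1970..1999 and 00..69 -> 2000..2069. */
static int expand_year(int yy)
{
    return (yy >= 70) ? 1900 + yy : 2000 + yy;
}

int main(void)
{
    printf("97 -> %d\n", expand_year(97));  /* 1997 */
    printf("05 -> %d\n", expand_year(5));   /* 2005 */
    printf("70 -> %d\n", expand_year(70));  /* 1970, the pivot edge */
    return 0;
}
```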
"Most" programs used epoch time, like video games. The minority that didn't were the big, stable, old programs that ran on big computers. Things like payroll and flight schedules.
Guess we'll find out in 2038.
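For anyone who hasn't hit it yet: 2038 is when a signed 32-bit time_t runs out of seconds since the Unix epoch. You can print the exact moment it tops out (this assumes your libc's gmtime handles that value, which any modern one does):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* A signed 32-bit time_t counts seconds since 1970-01-01 UTC and
 * tops out at 2^31 - 1. Format that last representable second. */
int main(void)
{
    time_t last = (time_t)INT32_MAX;   /* 2147483647 seconds */
    struct tm *utc = gmtime(&last);

    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc);
    printf("32-bit time_t overflows after %s\n", buf);
    /* Expected output: 2038-01-19 03:14:07 UTC */
    return 0;
}
```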