Comment by ndriscoll
It's very clear you're out of your element on this, and you have multiple people with an actual math background telling you the objection is somewhere between meaningless and wrong.
The takeaway from trying to really nail down a definition of "integers" (or anything, really) is going to be something along the lines of "if it quacks like a duck up to unique isomorphism, it's a duck". The encoding is not important, and one frequently swaps among encodings when convenient. In any case, no one who knows any math is going to tell a child that 3 and 3.0 aren't interchangeable outside of some extremely specific contexts. In fact that's not even encoding: it's notation. They can be literally equal, not just equivalent. Those particular contexts aren't ordained, and e.g. propagation of uncertainty is "better" than significant figures if you're doing engineering anyway.
Writing something like '10/3=3' is likely to trigger the mathematicians because lots of people get confused about what '=' is supposed to mean (and often use it to mean something like "next step indicator"). '3=3.0' not so much.
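The same value-vs-encoding distinction shows up in programming languages, for what it's worth. A minimal Python sketch (my illustration, not from the thread):

```python
# 3 and 3.0 compare equal: same mathematical value.
a, b = 3, 3.0
print(a == b)              # equality compares values, not encodings

# But their encodings (types) differ.
print(type(a) is type(b))  # int vs. float
```

Here `==` means value equality, which matches the mathematician's reading; the type difference is the "encoding" that rarely matters outside specific contexts.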
> outside of some extremely specific contexts.
The exact context was given. They wanted only whole numbers.
> Writing something like '10/3=3' is likely to trigger the mathematicians
Sure, when it lacks the context that all answers should be rounded to the nearest whole number. But that was the context, and it's astounding that so many people with alleged math backgrounds are arguing things like integers aren't a thing to understand.
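For concreteness, under the stated convention (round every answer to the nearest whole number), 10/3 does come out to 3:

```python
# 10/3 ≈ 3.333..., which rounds to the nearest whole number 3,
# so under that stated convention "10/3 = 3" is the intended answer.
result = round(10 / 3)
print(result)
```

The objection in the thread is only about writing it with a bare `=`, not about the arithmetic.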