The problem with this is that incoming communications likely trigger the next outgoing communication, which means there is a high chance that r2 stays near the same bias without ever getting a chance to balance out. Confirming that would require specific testing, unless someone has reverse engineered the packet-generating functions or has in-depth knowledge of SE's server code. However, if I did as byrth suggested and put random delays between my shots, that would randomize any syncing with the server, and over enough samples we could see an image of what is going on despite the bad resolution of the packets.
No matter how you look at it, adding additional delay is the most effective way to minimize interference from packet intervals.
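The randomized-delay idea can be sketched with a quick simulation. Everything here is assumed for illustration: a hypothetical 400 ms packet interval, a made-up "true" delay, and a server that only reports events at packet boundaries. The point is just to show why firing in sync with the packets locks in a constant bias, while random delays dither the start phase so the error averages out:

```python
import math
import random

random.seed(1)

PACKET_INTERVAL = 0.4   # hypothetical packet interval in seconds (assumed value)
TRUE_DELAY = 1.23       # made-up "real" delay we are trying to measure
N = 100_000

def observed(start):
    """The server only reports at packet boundaries, so the event gets
    timestamped at the first boundary on or after when it really happened."""
    end = start + TRUE_DELAY
    return math.ceil(end / PACKET_INTERVAL) * PACKET_INTERVAL - start

# Synced firing: every shot starts exactly on a packet boundary, so the
# quantization error is identical every time and never averages out.
synced = [observed(i * PACKET_INTERVAL) for i in range(N)]

# Randomized firing: a uniform random delay spreads the start phase across
# the packet interval, so the quantization error dithers around its mean.
dithered = [observed(i * PACKET_INTERVAL + random.uniform(0.0, PACKET_INTERVAL))
            for i in range(N)]

synced_mean = sum(synced) / N        # stuck at one bucket edge, biased high
dithered_mean = sum(dithered) / N    # ~ true delay + half an interval
estimate = dithered_mean - PACKET_INTERVAL / 2

print(f"synced firing:   {synced_mean:.4f}")
print(f"random delays:   {estimate:.4f}")
```

With synced firing, every sample lands in the same packet bucket, so averaging more samples tells you nothing new; with random delays, subtracting the known half-interval offset recovers the underlying delay to well below the packet resolution, which is the "image despite bad resolution" effect described above.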