I have never seen anybody ponder this, and I don't think I have ever seen it discussed in homebrew circles (vblank timings being all the rage as the basic intro to graphics systems, e.g. http://coranac.com/tonc/text/video.htm ; https://problemkaputt.de/gbatek.htm#lcddimensionsandtimings mentions it only in passing while being all too keen to talk about oddities and failures in other aspects; latency discussions do get had for the GB Player in things like https://www.retrorgb.com/gameboyinterface.html , but that is possibly a different discussion) or in emulator dev circles (including the audio hacking crowd, which is otherwise massively anal about timings). I have seen people measure the different clock speeds between hardware revisions, but screen-wise... a few might measure frame rates in the games themselves, but that's about it.
In general video, the drop in frame rate away from a round number derived from the line frequency was usually done to make colour work on CRT screens (the classic NTSC move from 60 to ~59.94 Hz to keep the colour subcarrier out of the way), and is kind of irrelevant even for the LCDs of the GBA era. Not to mention most screens don't have an internal clock (certainly not one the system itself cares about) and will instead take a clock pulse/sync pulse from the device driving them.
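For the curious, here is a back-of-the-envelope sketch in C of where the numbers land; the GBA cycle counts are GBATEK's (linked above), and the 1000/1001 factor is the standard NTSC colour fudge just mentioned:

```c
#include <stdio.h>

int main(void)
{
    /* GBATEK LCD timings: 16.777216 MHz system clock, 1232 cycles
     * per scanline (960 draw + 272 hblank), 228 scanlines per frame
     * (160 visible + 68 vblank). */
    const double clock_hz        = 16777216.0;  /* 2^24 Hz */
    const double cycles_per_line = 1232.0;
    const double lines_per_frame = 228.0;

    double gba_hz = clock_hz / (cycles_per_line * lines_per_frame);

    /* The CRT-era drop mentioned above: NTSC colour pulled the 60 Hz
     * field rate down by 1000/1001 to keep the colour subcarrier from
     * beating against the sound carrier. */
    double ntsc_hz = 60.0 * 1000.0 / 1001.0;

    printf("GBA refresh : %.4f Hz\n", gba_hz);   /* ~59.7275 */
    printf("NTSC colour : %.4f Hz\n", ntsc_hz);  /* ~59.9401 */
    return 0;
}
```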
Similarly, on the "too fast" point: are you really able to detect 0.3 Hz?
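To put a rough number on that (assuming the 0.3 Hz figure is a flat 60 Hz clock measured against the ~59.73 Hz native rate, which is my reading of it):

```c
#include <stdio.h>

int main(void)
{
    const double native_hz = 59.7275;  /* GBA refresh, per GBATEK */
    const double output_hz = 60.0;     /* assumed "too fast" clock */

    double diff_hz   = output_hz - native_hz;        /* ~0.2725 Hz */
    double speedup   = 100.0 * diff_hz / native_hz;  /* percent fast */
    double drift_sec = 1.0 / diff_hz;                /* seconds for a full
                                                        frame of drift */

    printf("Difference : %.4f Hz\n", diff_hz);
    printf("Speedup    : %.3f %%\n", speedup);               /* ~0.456% */
    printf("Drift      : one frame every %.1f s\n", drift_sec); /* ~3.7 s */
    return 0;
}
```

So depending on how the mismatch gets absorbed, you either run everything about half a percent fast (a pitch shift of roughly 8 cents, below what most ears will flag) or eat a duplicated frame every few seconds.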
On the previous point I am obviously unable to comment on the realities, other than to say I played several tight-timing games as part of my Analogue Pocket review, having predominantly played those same games on stock hardware (give or take a flash cart*) GBAs for many years. I would have picked up on something if it was a thing.
*What, if anything, changes here I can speculate on a bit. Some complained about Mother 3 timings in some setups, but I have never found anything drilling into that, or into whether it is the game, the translation, the flash cart or something else getting in the way.