Of course it ain’t real users.
Reddit has had a major bot problem for over a decade, and little has been done to mitigate it - beyond banning legitimate users who dared to be too loud about it.
I’ve moderated a relatively small sub, and for every legitimate post in a day you’d get 6 to 10 bot posts literally reposting an older post word for word, or maybe introducing a single typo just to make detection harder…
Reddit’s response to the issue? “Hey, why don’t you pay us ~$25 a month just so you can continue using that open source automatic bot detection system we refuse to build into the site itself?”.
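For context, the “copy an old post, add a typo” trick only defeats exact-match filtering; even a trivial similarity check catches it. Here’s a rough Python sketch of that idea (function names and thresholds are made up, not what Reddit or any of the actual mod tools run):

```python
# Rough sketch of near-duplicate repost detection.
# Exact-match checks miss a single injected typo; a similarity
# ratio over normalized text still flags it. Threshold is arbitrary.
from difflib import SequenceMatcher
import re

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace/punctuation so trivial edits don't matter.
    return re.sub(r"\W+", " ", text.lower()).strip()

def is_probable_repost(new_post: str, old_post: str, threshold: float = 0.95) -> bool:
    # SequenceMatcher ratio is 1.0 for identical strings; one typo in a
    # paragraph-length post still scores well above 0.95.
    ratio = SequenceMatcher(None, normalize(new_post), normalize(old_post)).ratio()
    return ratio >= threshold

if __name__ == "__main__":
    original = "Check out this amazing trail I hiked last weekend, the views were unreal."
    reposted = "Check out this amazing trial I hiked last weekend, the views were unreal."
    print(is_probable_repost(reposted, original))  # True despite the injected typo
```

That prints True even with the typo, which is the point: bots that want to survive this kind of check have to do a lot more than swap one character, yet Reddit won’t even ship the basic version.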

It’s not the navigation that requires the server; it’s the processing of the mapping data.
Which in itself is BS, because most of these vacuums ship with hardware roughly equivalent to a top-of-the-line smartphone from about 5-6 years ago. They can easily do the raw-data-to-map conversion locally, even if it’s a bit slow and takes 20-30 seconds.
Also, if you read the article, it says the damn thing is already running Google Cartographer, which is a SLAM-based 3D map builder - one of the better pro-grade mapping suites, mind you. So the whole claim that the cloud is needed for processing is BS.
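To put the “needs the cloud” claim in perspective, here’s a toy sketch of the core step of turning lidar ranges into an occupancy-grid map, assuming the robot’s pose is already known (a real SLAM stack like Cartographer also estimates the pose; the scan format and numbers here are made up for illustration):

```python
# Toy occupancy-grid update from a single lidar scan, with a known pose.
# Real SLAM does much more, but the per-scan math is this cheap.
import math
import numpy as np

GRID_SIZE = 400          # 400 x 400 cells
RESOLUTION = 0.05        # 5 cm per cell -> a 20 m x 20 m map
grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.int8)  # 0 = free/unknown, 1 = occupied

def world_to_cell(x: float, y: float) -> tuple[int, int]:
    # Map world coordinates (meters, origin at map center) to grid indices.
    cx = int(x / RESOLUTION) + GRID_SIZE // 2
    cy = int(y / RESOLUTION) + GRID_SIZE // 2
    return cx, cy

def integrate_scan(pose_x: float, pose_y: float, pose_theta: float,
                   ranges: list[float], angle_step: float = math.radians(1.0)) -> None:
    # Mark the cell at the end of each lidar beam as occupied.
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue
        beam_angle = pose_theta + i * angle_step
        hit_x = pose_x + r * math.cos(beam_angle)
        hit_y = pose_y + r * math.sin(beam_angle)
        cx, cy = world_to_cell(hit_x, hit_y)
        if 0 <= cx < GRID_SIZE and 0 <= cy < GRID_SIZE:
            grid[cy, cx] = 1

# A fake 360-degree scan: everything 2 m away, i.e. a circular "room".
integrate_scan(0.0, 0.0, 0.0, [2.0] * 360)
print(int(grid.sum()), "cells marked occupied")  # a ring of a couple hundred cells
```

That’s a few hundred trig calls and array writes per scan. Even scaled up to real scan rates plus full pose estimation, it’s comfortably within what a 5-6 year old smartphone SoC can handle on-device.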