Doesn't surprise me one bit.
Use Puppeteer behind residential proxies and the only remaining fingerprint is tracking and behaviour. If the data is public, it's just a question of whether it's worth jumping through the hoops for the bot to complete its job.
Seen a few alt search engines that don't even bother running their own bot: they crawl masquerading as browsers, and only observe the Googlebot rules in robots.txt etc.
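For what it's worth, the "only observe Googlebot rules" trick comes down to which user-agent string the crawler checks its URLs against. A minimal sketch with Python's stdlib robots.txt parser (the robots.txt content here is a made-up example):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: Googlebot gets everything, everyone else is blocked
# from /private/.
robots_txt = """
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# A crawler that checks as "Googlebot" sails through; an honest bot name hits
# the catch-all Disallow rule.
print(rp.can_fetch("Googlebot", "/private/page"))  # True
print(rp.can_fetch("SomeAltBot", "/private/page"))  # False
```

So a crawler that simply evaluates every URL as "Googlebot" inherits the most permissive ruleset most sites publish, without the site ever having heard of it.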
If a page isn't paywalled it's public, and a lot of the time it gets re-used for purposes you probably wouldn't like. Bit of a free-for-all.