There's a trick companies like Facebook use to try to protect users from copy-pasting malicious scripts into devtools: when they detect it opening (probably via a keyboard event), they print a big scary warning using console.log/error [1].
Assuming the first thing most scrapers do is open the site in devtools, this would be a great place to print some text with a page-specific Wikidata query that pulls in exactly the same information as the current page, along with a link to a really good hacker-style tutorial plus an appendix of how-to guides. Even better would be an option to turn on some sort of dev mode with mouseover tooltips that show queries for every bit of info on the page. Anything that shortens the feedback loop between the code and the browser will increase the probability that the scraper switches to Wikidata. Think of it as a weird inverse user retention problem.
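A minimal sketch of what that console message could look like. The entity ID (Q42) and the helper name are made up for illustration; a real page would pull its own item ID from metadata, and reliably detecting devtools opening has no official browser API, so sites rely on heuristics like keyboard shortcuts or window-size deltas.

```javascript
// Hypothetical sketch: build a page-specific Wikidata query message
// to print in the devtools console when a scraper opens it.

function buildDevtoolsMessage(entityId) {
  // A minimal SPARQL query listing every direct statement on this
  // page's Wikidata item (Q42 is a placeholder entity ID).
  const query = [
    "SELECT ?property ?value WHERE {",
    `  wd:${entityId} ?property ?value .`,
    "}",
  ].join("\n");
  // Link straight into the Wikidata Query Service with the query prefilled.
  const url = "https://query.wikidata.org/#" + encodeURIComponent(query);
  return (
    "Scraping this page? All of its data is already structured in Wikidata.\n" +
    "Run this query instead: " + url + "\n" +
    "SPARQL tutorial: https://www.wikidata.org/wiki/Wikidata:SPARQL_tutorial"
  );
}

// The %c directive styles the message in browser consoles -- the same
// trick Facebook's warning uses to make it large and eye-catching.
console.log(
  "%c" + buildDevtoolsMessage("Q42"),
  "font-size: 16px; color: #0645ad;"
);
```

In a browser this could hang off the same keydown/devtools heuristics Facebook uses; the message itself is the interesting part, since it hands the scraper a working query instead of a warning.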
[1] https://imgur.com/a/0Xn1qIb