I do. Any questions?
I am Steve
My tip is: Instead of $3/month, donate $35/year. That way it’s only 1 transaction.
Apple hasn’t been for professionals for like a decade now.
That’s when you go to a federation whitelist instead of a blacklist.
That’s where simple defederation happens. It’s mostly why Beehaw cut off lemmy.world.
Which is why you’d need something else for popular sites worth targeting directly. But there are more options than standard CAPTCHAs. Replacing them isn’t necessarily a bad idea.
The bots are for the most part generic. They mostly fill in every field with randomly generated nonsense. If the site is large enough someone could write a bespoke script for it, which is why I’m not sure how well this will scale to large sites.
But that’s only the simplest option. Another I’ve seen uses a collection of movie posters: you have the user pick the title from 5 or 6 options. There are lots of simple ways to defeat bots of all kinds.
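The poster challenge above can be sketched in a few lines. This is a minimal illustration, not anyone’s actual implementation; the poster catalogue and field names are made up:

```python
import random

# Hypothetical poster catalogue: image file -> correct movie title.
POSTERS = {
    "poster_01.jpg": "Blade Runner",
    "poster_02.jpg": "Alien",
    "poster_03.jpg": "The Matrix",
    "poster_04.jpg": "Jaws",
    "poster_05.jpg": "Heat",
    "poster_06.jpg": "Seven",
    "poster_07.jpg": "Amelie",
}

def make_challenge(rng: random.Random, n_options: int = 5) -> dict:
    """Pick one poster and offer its title among n_options shuffled choices."""
    image, answer = rng.choice(sorted(POSTERS.items()))
    # Draw wrong titles from the rest of the catalogue as decoys.
    decoys = rng.sample([t for t in POSTERS.values() if t != answer],
                        n_options - 1)
    options = decoys + [answer]
    rng.shuffle(options)
    return {"image": image, "options": options, "answer": answer}

challenge = make_challenge(random.Random(42))
```

A generic bot can’t answer without actually recognizing the poster, which is exactly the kind of site-specific work that doesn’t scale across targets.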
You’d think so.
But it’s not flagged as hidden. Instead you use CSS to set display to none. So the bot needs to do more than look at the raw HTML. It needs to fully analyze all the linked HTML, CSS, and even JavaScript files. Basically it needs to be as complex as a whole browser. It can’t be a simple script anymore. It becomes impractically complicated for the bot maker.
There are other options.
I’m just a hobbyist, but I have built a couple websites with a few hundred users.
A stupidly simple and effective option I’ve been using for several years now is adding a dummy field to the application form. If you add an address field and hide it with CSS, users won’t see it and will leave it blank. Bots, on the other hand, will see it and fill it in, because they always fill in everything. So any application that has an address can be automatically dropped. Or at least set aside for manual review.
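The server-side half of this is tiny. A sketch, assuming submissions arrive as dicts and the hypothetical honeypot field is named "address":

```python
def is_probable_bot(form: dict) -> bool:
    """Flag any submission where the hidden honeypot field was filled in.
    Real users never see the field, so it arrives empty (or absent)."""
    return bool(form.get("address", "").strip())

def triage(applications: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split applications into (accepted, set_aside_for_review)."""
    accepted = [a for a in applications if not is_probable_bot(a)]
    flagged = [a for a in applications if is_probable_bot(a)]
    return accepted, flagged

apps = [
    {"username": "alice", "address": ""},            # human: left it blank
    {"username": "x9f3q", "address": "123 Fake St"}, # bot: filled everything
]
accepted, flagged = triage(apps)
```

Setting flagged entries aside for manual review, rather than silently dropping them, covers the rare user whose browser autofills the hidden field.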
I don’t know how long such a simple trick will work on larger sites. But other options are possible.