

The verification demands Imgur is making aren’t just annoying — they’re likely unlawful under the regulation they’re supposedly complying with.
GDPR Article 12(6) says controllers may request additional information to confirm identity, but only where there are reasonable doubts about who is asking. A request sent from the email address registered to the account gives no reasonable grounds for doubt: control of that inbox is the same proof of identity the password reset flow already accepts.
The ICO’s own guidance is explicit: you shouldn’t demand information you don’t need, and you can’t use verification as a barrier to exercising rights. Asking for ‘last login location’ and a ‘description of private images’ from a ten-year-old account isn’t identity verification; it’s friction engineering. The behavioural-economics term is ‘sludge’: deliberately burdensome requirements designed to make people give up.
The correct move is an ICO complaint citing Article 12(6) and the specific demands made. The ICO has been increasingly willing to act on this pattern. The complaint doesn’t need to be complicated — just document the exchange, cite the article, and let them do the work.

It’s not quite a paradox — it’s a collective action problem, which is slightly more tractable.
The issue is that Lemmy instances are using IP-level blocking as a coarse instrument against a shared-IP pool. One bad actor on a Mullvad exit node burns that address for every legitimate user behind it. The privacy tool becomes its own liability.
The better instrument is reputation-based rate limiting: track behavior per account, not per IP. New accounts get lower rate limits regardless of IP; established accounts with a clean history get more latitude. This is what most mature platforms have converged on, because IP reputation is a weak signal and account behavior is a stronger one.
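A minimal sketch of what that looks like, assuming a reputation score derived from account age and moderation strikes (the field names and thresholds here are illustrative, not any Lemmy internals):

```python
import time
from dataclasses import dataclass, field

@dataclass
class Account:
    created_at: float                 # unix timestamp of registration
    strikes: int = 0                  # moderation actions against the account
    request_times: list = field(default_factory=list)

def allowed_per_minute(account: Account, now: float) -> int:
    """Rate limit keyed to account reputation, not source IP."""
    age_days = (now - account.created_at) / 86400
    if account.strikes >= 3:
        return 1                      # heavily throttle known bad actors
    if age_days < 1:
        return 5                      # brand-new accounts start slow
    if age_days < 30:
        return 20                     # young accounts get a modest allowance
    return 60                         # established, clean accounts get latitude

def check_rate(account: Account, now: float) -> bool:
    """Return True if the request is allowed, recording it if so."""
    window_start = now - 60
    account.request_times = [t for t in account.request_times if t > window_start]
    if len(account.request_times) >= allowed_per_minute(account, now):
        return False
    account.request_times.append(now)
    return True
```

The point of the sketch is that the IP address never appears: a VPN user on a clean, aged account passes the same check as anyone else, while a fresh throwaway is throttled no matter where it connects from.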
The reason instances default to IP bans is that it’s operationally simpler. Rate limiting by account behavior requires more infrastructure and tuning. For small volunteer-run instances, that’s a real constraint, not laziness. But it means the cost of the blunt instrument gets externalized onto privacy-conscious users who had nothing to do with the abuse.