• 1 Post
  • 25 Comments
Joined 3 days ago
Cake day: March 18th, 2026


  • I didn’t understand your disagreement. Yes: just like a bar shouldn’t be responsible for a person who gets plastered drunk after they leave, Facebook shouldn’t be responsible for the actions of a predator who goes to a porn website to lure kids. Just like the Catholic Church shouldn’t be responsible for a public school teacher who rapes her students at school. The only time any of these organizations is responsible is when the abuse happens while the victim is using their services.

    I don’t get why this is controversial.

    As for the military’s recruiting practices: yes, I fully agree that they are very predatory and should be reined in. Politically, I personally think “enlistment” shouldn’t be an option at all. It should be a random draft. Every year the military should tell Congress how many new recruits it needs, and Congress should approve a draft of 18-year-olds for that many recruits. The draft should be random, with no deferments or other ways out of service except health reasons as determined by a military physician. (But that’s way off topic.)


  • The problem predators would face if they were relegated to the “kid friendly” sectors is that those sectors are much better policed by both users and the corporations.

    It’s not really the public content that is the problem. The problems really come when a predator can lure a child into a private chat. That’s when the predator can start the process of grooming that eventually leads to blackmailing the child (grooming is a process, and it’s damn evil and damn sinister). By relegating the users to “kid friendly” areas, the opportunity to pull kids into private spaces is greatly diminished.

    Now, will the predators stop being predators? No. But if the platforms have strong child protection policies that make things more difficult for the predators, then they will move on to a website with weaker policies. That’s just about the best an organization or platform can do: make the predators uncomfortable enough that they go hunt someone else’s kids.


  • Correct. Right now the OS maker is not responsible. That’s exactly why Meta is pushing so hard to change the laws to make them responsible.

    Your analogy is a good one. In your car analogy, today, no one blames the car manufacturer for a drunk driver, but we do blame bars and bartenders. In many states, bars have to be licensed, and if the bartender allows someone to get drunk and drive home, the bar and the bartender can be held liable. This situation would be like the bars getting together to lobby state and national governments to make car manufacturers install breathalyzers in every car, so that the bars could reduce their own liability and responsibility.


  • It reeks of a coordinated agenda,

    It is a coordinated agenda, just not a secret one like people want to think. It’s being pushed by Meta and a string of popular app and game makers who want to avoid having to be responsible for their own platforms.

    Therefore, some Fediverse instances may end up implementing age checking, or stopping altogether if they can’t afford the additional costs of age checking.

    That’s a strange argument to me. That’s exactly what Meta is trying to avoid having to do by pushing these laws. If countries and states pass laws like the California law specifically, then no fediverse instance will need to worry about age verification: they just ask the user’s browser, which asks the OS. California’s version of the law would really help small businesses and small developers, because it puts all the child protection responsibility onto the OS.

    Now, regarding the “kid friendly” limitation: if the Web gets limited to “non-adult content”… what’s “adult content” to begin with?

    In this case, “kid friendly content” becomes “any content that the website is willing to be responsible and liable for letting users who report being under 18 access”.


  • My biggest frustration with the community is not that people don’t like the proposed solution but that

    1. there is so much flat denial that there is actually a major online child predator problem, and/or
    2. there’s an insistence that no one should be held responsible for fixing it, and/or
    3. no one is offering alternative solutions.

    I’m really not upset with individual users here. I understand that you are removed from the problem and don’t understand it. I really don’t blame you personally. I have had training on youth protection and it’s not an easy problem, and just throwing the parents under the bus isn’t fair. When it comes to child predators, they are often just as much the victims as the kids are. (Yes, I mean that.)

    I’m upset with the EFF. They don’t have an excuse for their ignorance. They’ve been taught the problem many times and just refuse to acknowledge it. (A red flag if there ever was one, if you ask me.) If they don’t like the verification rules, then they need to start proposing alternative solutions (which they don’t have).



  • If their goal is to find an excuse to declare you a terrorist, there are much easier ways already available to them. This really isn’t an efficient way to do that.

    And, as far as I’m aware, no age verification law anywhere threatens any consequences for the user. The consequences are only for the OS makers.

    (Granted, the California law, at least, could be read to say that it’s the entity installing the OS that must confirm ages, not necessarily the OS maker. For most Linux distros, that would shift the age verification responsibility entirely to the user installing the OS, but I’m not sure how that would hold up in court, or whether websites and applications would recognize it. It will probably never actually be an issue that gets adjudicated.)