• BygoneNeutrino@lemmy.world · 11 hours ago

    I’m not necessarily against this. Right now, social media companies can knowingly target children with sophisticated conditioning techniques that can and will steer their cognitive development. Although they know for a fact that they are manipulating children, they hide behind plausible deniability.

    I don’t like age verification, but I don’t see an alternative. Between the robots and the advanced manipulation techniques, something has to give.

    • JackFrostNCola@aussie.zone · 10 hours ago

      No alternative? How about they start regulating the companies who are writing the algorithms?
      They don't just hurt kids; they are detrimental to adults too. Why allow the harmful companies to keep the status quo?
      This has always been more about increasing policing of the internet and slowly de-anonymising anyone who is online.

    • cley_faye@lemmy.world · 10 hours ago

      but I don’t see an alternative

      An alternative to what? This will not prevent children from accessing content. The weak point is the humans who willingly allow it in the first place. Adding an extra step won't stop them if it spares them the bother of actually looking after their kids.

      This is not sacrificing freedom to gain security; it’s sacrificing freedom to not gain anything.