• @petunia@lemmy.world · 39 · 2 years ago · edited

    Speaking from experience, they could fix their spam and abuse woes very easily by closing new signups or restricting them in some way. The simplest options would be invite-only registration (a built-in Mastodon feature) or restricting the signup page by an IP range whitelist/blacklist.
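    The IP-restriction option can be done at the reverse proxy without touching Mastodon itself. A minimal sketch, assuming the instance sits behind nginx: the `/auth/sign_up` path is Mastodon's signup page, but the address range and the `mastodon_web` upstream name here are illustrative placeholders.

    ```nginx
    # Gate only the signup page by source IP; the rest of the site stays open.
    location /auth/sign_up {
        allow 192.0.2.0/24;   # example trusted range -- replace with your own
        deny  all;            # everyone else receives 403 Forbidden
        proxy_pass http://mastodon_web;   # assumed upstream name
    }
    ```

    A blacklist is the same block inverted: `deny` the abusive ranges first, then `allow all`.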

    EDIT: Their domain has been reinstated, and they disabled open signups. New registrations now require moderator approval: https://pawoo.net/@pawoo_support/111249170584706318

    :pawoo: Announcement! Thank you for always using Pawoo. Due to server congestion, new registrations will now require approval by a moderator. Thank you very much for your cooperation.

      • @petunia@lemmy.world · 6 · 2 years ago

        Absolutely brain-dead speculation based on literally nothing. Complicit in what??? The current owner is a very public figure, so they gain nothing and have everything to lose. It’s just pure incompetence and mismanagement.

  • Dame · 9 · 2 years ago · edited

    Why are there people downvoting people commenting about not wanting CSAM?

    • @endhits@lemmy.world · 11 · 2 years ago

      CSAM is Child Sexual Abuse Material.

      People prefer this term over CP because the word “porn” is considered too soft. Porn is generally a consensual, adult medium made with adults for adults. CP is not that; it is, first and foremost, harm done to a child.

      • @fsxylo@sh.itjust.works · 6 · 2 years ago · edited

        That’s strange to me, because “child porn” sounds revolting, but CSAM sounds like something I can treat with aspirin.

    • Clay_pidgin · 11 · 2 years ago

      “Child Sexual Abuse Material”. It’s an awkward acronym that has mostly overtaken “Child Pornography”.

    • @TheGreatFox@lemm.ee · 13 · 2 years ago

      Nah. They’re lax about loli. Which, as distasteful as it is, does not involve any harm to actual children. They do go after actual CSAM.

      • Dame · 0 · 2 years ago

        Not really; they’re lax about CSAM and didn’t even have laws against it until 2016-2017. Even then, the laws are lax.

      • @nandeEbisu@lemmy.world · 0 · 2 years ago

        They go after it mainly to appease external forces like other countries objecting to it, but people who are convicted often get very light sentences.