• 10 Posts
  • 26 Comments
Joined 2 years ago
Cake day: July 8th, 2023

  • “90% of content moderators are foreigners. What we have experienced during the process is very hard… spending three months without receiving a salary, in a country that isn’t yours. You cannot pay the rent, you cannot buy food,” Nkuzimana explains. Cori Crider – co-director of Foxglove, a British organization that is supporting the workers in this process – adds that this situation “forces [the content moderators] to continue accepting insecure jobs to remain in [Kenya], despite the serious risk to their mental health.” Moderators have resorted to crowdfunding, so that they can support their families as the legal fight unfolds.

    Just highlighting the exploitation of migrant workers here, like much of Twitter’s remaining workforce, apparently. It also reminded me of this story, The fishermen:

    On November 22, Joanne circulated a letter among the migrant crew. “I have been made aware the crew members are contacting an outside representative,” it read, possibly referencing a call Quezon made to Stella Maris seeking help for Susada. “I am also aware that crew members have been leaving their port without permission or making our office aware. Sadly the actions by these crew members are beginning to ruin the trust and faith we have placed in our Filipino crew.” It concluded by noting they would make reports to local police and UK immigration authorities “if necessary”.

    These people are fucking sick. The whole system that denies people the legal right to work just so they can be more easily exploited is fucking sick.

    I’m going to go and punch some walls. Laters.

  • I have trouble believing that humans can’t get by without meat, or cars, or carbon fuel, or mass-produced clothes, or supermarkets, or…

    It does not matter what you believe, or what you prioritise. Other people have different beliefs and have made different choices. If you want them to think and choose differently, don’t start off by telling them that they’re scum while you polish your imaginary halo.

    And for fuck’s sake don’t fill the Fediverse up with so much narcissistic, whiny crap that everyone who isn’t you fucks off somewhere else.

    This is not hard.

  • #explore on Mastodon is a good way to find stuff you wouldn’t see on your own feed. (It’s how I found this article.)

    And there are various bots that let you follow people on Twitter (birdsite.makeup etc.). Although my instance has decided it doesn’t like those, so they’re a bit harder to find than they were.

    But yes, I think the article does a good job of articulating the problems. I hope they get solved because there’s a lot I like very much about Mastodon but it does not have the depth and breadth of content (yet). And hashtags do not work well enough as a replacement for search (I followed #BBC to get more news in my feed and ended up with a bit of news and a lot of porn).

  • They don’t seem to list the instances they trawled (just the top 25 on a random day, with a link to the site they got the ranking from, but no list of the actual instances that I can see).

    We performed a two day time-boxed ingest of the local public timelines of the top 25 accessible Mastodon instances as determined by total user count reported by the Fediverse Observer…

    That said, most of this seems to come from the Japanese instances, which most other instances defederate from precisely because of CSAM? From the report:

    Since the release of Stable Diffusion 1.5, there has been a steady increase in the prevalence of Computer-Generated CSAM (CG-CSAM) in online forums, with increasing levels of realism. This content is highly prevalent on the Fediverse, primarily on servers within Japanese jurisdiction. While CSAM is illegal in Japan, its laws exclude computer-generated content as well as manga and anime. The difference in laws and server policies between Japan and much of the rest of the world means that communities dedicated to CG-CSAM—along with other illustrations of child sexual abuse—flourish on some Japanese servers, fostering an environment that also brings with it other forms of harm to children. These same primarily Japanese servers were the source of most detected known instances of non-computer-generated CSAM. We found that on one of the largest Mastodon instances in the Fediverse (based in Japan), 11 of the top 20 most commonly used hashtags were related to pedophilia (both in English and Japanese).

    Some history for those who don’t already know: Mastodon is big in Japan. The reason why is… uncomfortable

    I haven’t read the report in full yet but it seems to be a perfectly reasonable set of recommendations to improve the ability of moderators to prevent this stuff being posted (beyond defederating from dodgy instances, which most if not all non-dodgy instances already do).

    It doesn’t seem to address the issue of some instances existing largely so that this sort of stuff can be posted.

  • It will almost always be detectable if you just read what is written, especially in academic work. It doesn’t know what a citation is, only what one looks like and where citations appear. It can’t summarise a paper accurately. It’s easy to force laughably bad output just by asking the right sort of question.

    The simplest approach for setting homework is to give students the LLM output and have them check it for errors and omissions. LLMs can’t critique their own work, and students probably learn more from chasing down errors than from filling a blank sheet of paper for the sake of it.