Great American humorist. C# developer. Open source enthusiast.

XMPP: wagesj45@chat.thebreadsticks.com
Mastodon: wagesj45@mastodon.jordanwages.com
Blog: jordanwages.com

  • 1 Post
  • 21 Comments
Joined 2 years ago
Cake day: June 12th, 2023


  • I’ve used ESET NOD32 for a long time. It costs money, but it’s relatively lightweight and doesn’t get in the way of any of the gaming, video editing, or programming work I do. It has smacked my hand a few times for clicking on risky links, blocked a few downloads (one or two, ever), and regularly blocks scam websites that attempt to run JavaScript crypto miners. Your experience may vary if you’re planning on using “rescued” media or applications. I don’t have the full internet/devices subscription, just the one for personal computers that does the actual virus detection.


  • As a thought experiment, imagine an online business declaring in 2002 that it didn’t want its website indexed by Google: a self-defeating move when that was the most popular on-ramp for finding information online.

    This is exactly what will happen as people rely more and more on these things for “searching”, since the two biggest search engines (Google and Bing) are both so gummed up with SEO spam and ads (my god, the number of ads) that people just start asking AI for the increasingly accurate answers without the bullshit. Eventually the AI solutions will become enshittified as well, but not before huge blows are dealt to both the search engines and the sites that refuse to expose themselves (see the robots.txt sketch below). Google can probably weather it, but can a site like VentureBeat? I have severe doubts.
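
    For context, the usual mechanism for “refusing to expose yourself” is a robots.txt opt-out. A minimal sketch, assuming the crawler tokens OpenAI and Google have published (GPTBot and Google-Extended); check the vendors’ current documentation before relying on these:

        # robots.txt — opt out of AI training crawlers, keep normal search indexing
        User-agent: GPTBot
        Disallow: /

        User-agent: Google-Extended
        Disallow: /

        User-agent: *
        Allow: /

    Note that robots.txt is purely advisory: compliant crawlers honor it, but nothing forces the rest to.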


  • Well, I just had Google Bard summarize this for me, and it doesn’t sound terrible. I like the tiered organization and the limits it imposes (mostly). I don’t mind the reporting requirements for training data. But if they’re going to require the “high risk” category of AI to be transparent and explainable, then those kinds of systems just might not exist for a good long while if they incorporate neural networks. Unless explainability and transparency just mean that you can explain how a model was trained and the structure of the network, the model weights will never really be explainable in a way that matters (the toy sketch below makes this concrete).
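
    To make that last point concrete, here’s a toy sketch (TypeScript, with made-up numbers): you can disclose the architecture and the training recipe completely and still have nothing human-readable to point at inside the model.

        // Toy feed-forward network — weights invented for illustration.
        // A real model has millions or billions of these numbers.
        const W1 = [[0.73, -1.12], [-0.41, 0.98]]; // input -> hidden
        const b1 = [0.05, -0.27];
        const W2 = [1.31, -0.86];                  // hidden -> output
        const b2 = 0.12;

        const relu = (x: number): number => Math.max(0, x);

        function forward(x: [number, number]): number {
          const h = [
            relu(W1[0][0] * x[0] + W1[0][1] * x[1] + b1[0]),
            relu(W1[1][0] * x[0] + W1[1][1] * x[1] + b1[1]),
          ];
          return W2[0] * h[0] + W2[1] * h[1] + b2;
        }

        // The architecture and training procedure can be stated precisely,
        // but no individual weight above "means" anything you could audit.
        console.log(forward([1, 0]));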


  • Because Meta is already illegally scraping hospital websites for your records.

    Sorry, but this is just bad web design on the hospitals’ part. This pixel tool doesn’t magically appear on websites; it has to be put there deliberately. Literally any tracking tool can capture this stuff on any page a developer puts it on (see the sketch after this comment). This is 100% the fault of the programmer at the hospital (or the admin who made them do it) who decided to put tracking scripts on sensitive pages.

    The hospital administrators decided it was more important to get their precious usage reports from Meta’s portal than to protect their patients.

    I’m pissed that I’ve had to defend Meta here, but this one isn’t on them.
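
    To illustrate: once a developer embeds a tracking snippet, that script can read whatever the page exposes. This is a generic sketch, not Meta’s actual pixel code, and collector.example.com is a made-up endpoint:

        // Hypothetical tracking snippet — illustrative only, not Meta's pixel.
        // It runs with the full privileges of the page it was pasted into.
        function reportPageView(): void {
          const payload = {
            url: location.href,      // the path alone can be sensitive, e.g. an appointments page
            title: document.title,
            referrer: document.referrer,
          };

          // Fire-and-forget beacon to a third party (hypothetical endpoint).
          navigator.sendBeacon("https://collector.example.com/t", JSON.stringify(payload));
        }

        reportPageView();

    Nothing here is an exploit: the page owner ships the script, so the page owner decides what leaves the page.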