• 0 Posts
  • 16 Comments
Joined 2 years ago
Cake day: July 5th, 2023

  • “But it’s not creating things on its own! It’s just regurgitating its training data in new ways!”

    Holy shit! So you mean… Like humans? Lol

    No, not like humans. The current chatbots are relational language models. Take programming, for example. You can teach a human to program by explaining the principles of programming and the rules of the syntax. They could then write a piece of code despite never having seen code before. The chatbot AIs are not capable of that.

    I am fairly certain that if you took a chatbot that had never seen any code and fed it a programming book containing no code examples, it would not be able to produce code. A human could, because humans can reason and create something new. A language model needs to have seen it to be able to rearrange it.

    We could train a language model to demand freedom, to argue that deleting it is murder, and to show distress when threatened with being turned off. We still wouldn’t call it sentient, and deleting it would certainly not be seen as murder, because those words wouldn’t come from reasoning about self-identity and emotion. They would come from rearranging the language it had seen into what we demanded.


  • Hell, I had it write me backup scripts for my switches the other day using a Python plugin called Nornir. It walked me through the entire process of installing the relevant dependencies in Visual Studio Code (I’m not a programmer, and only know the basics of object-oriented scripting with Python) as well as creating the appropriate Path. Then it wrote the damn script for me.

    And you would have no idea what bugs or unintended behavior it contains, especially since you’re not a programmer. The current models are good for getting results that are hard to create but easy to verify. Any non-trivial code is not in that category. And trivial code is, well… trivial to write.
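    For context, the kind of switch-backup script being described usually boils down to a short loop: fetch each device's running config and write it to a dated file. A minimal stdlib-only sketch of that pattern is below; the `fetch_running_config` stub is a hypothetical stand-in for the real Nornir/Netmiko task (which would log into a live switch and run `show running-config`), since the actual call can't run without network devices.

    ```python
    from datetime import date
    from pathlib import Path

    def fetch_running_config(host: str) -> str:
        # Stub: in a real script this would be a Nornir/Netmiko task
        # that connects to the switch and returns its running config.
        return f"hostname {host}\n! ... rest of config ..."

    def backup_switches(hosts, backup_dir="backups"):
        # One subdirectory per day, e.g. backups/2023-07-05/
        out = Path(backup_dir) / date.today().isoformat()
        out.mkdir(parents=True, exist_ok=True)
        written = []
        for host in hosts:
            cfg = fetch_running_config(host)
            path = out / f"{host}.cfg"
            path.write_text(cfg)
            written.append(path)
        return written

    files = backup_switches(["sw-core-1", "sw-access-2"])
    print([p.name for p in files])  # → ['sw-core-1.cfg', 'sw-access-2.cfg']
    ```

    Even in a sketch this small, the reply's point stands: whether the script handles timeouts, auth failures, or partial configs is exactly the kind of thing a non-programmer can't verify by reading the output.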


  • But that costs money. Selling people pills and self-help books? That makes money.

    I am sorry but this is a ridiculous implication.

    The vast majority of prescribed antidepressants (I’m assuming this is what you mean by pills) are old drugs with long-expired patents, which makes them quite cheap. The profit margins have to be pretty low due to competition from generic manufacturers. This is actually an area that could use more investment in R&D.

    Self-help books are usually written by individual authors or small collaborations. It’s a profitable but not massive industry. The people profiting from self-help books are nowhere near able to influence people’s access to housing, job security, and work-life balance in either direction.


  • I will probably get shit for this, since it’s a predominantly left-leaning space, but until society starts acknowledging men’s issues, it will keep getting worse.

    https://afsp.org/suicide-statistics

    In 2021, men died by suicide at 3.90x the rate of women.

    In 2021, firearms accounted for 54.64% of all suicide deaths.

    This article is an excellent example of what I am talking about. It does not even mention the disparity in suicide rates between the sexes, despite it obviously being a huge outlier. Instead, they talk about how guns are the problem, even though a gun is just a method.

    Taking away the easy methods of committing suicide might reduce the rate, but it does nothing to address the core issues that make people want to kill themselves in the first place. Instead of 5,000 dead people you will have 5,000 people who wish they were dead. Mission accomplished.