• horse_battery_staple@lemmy.world · 2 months ago

    This is about the hosted web chat client/app, just like OpenAI sharing data with Microsoft, or Copilot doing the same. If you self-host these LLMs, your data stays within your LAN.
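
    As a rough illustration of what "stays within your LAN" means: a locally served model answers over a loopback or LAN address, so prompts never leave your network. A minimal sketch in Python, assuming Ollama is serving on its default port 11434 and a deepseek-r1 model has already been pulled (model name and port are defaults, not requirements):

    ```python
    # Query a locally hosted model through Ollama's HTTP API.
    # Assumes Ollama is running on localhost:11434 and the model below
    # has been pulled beforehand (e.g. `ollama pull deepseek-r1:32b`).
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "deepseek-r1:32b",  # illustrative local model name
            "prompt": "Summarize the tradeoffs of self-hosting LLMs.",
            "stream": False,  # return one JSON object instead of a stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["response"])  # neither prompt nor reply left the LAN
    ```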

    • brucethemoose@lemmy.world · 2 months ago

      You can’t practically self-host DeepSeek R1.

      Look, I use the 32B distill on my 3090 every day, but it is not the same thing as the full R1, and people need to stop conflating the two.

      And (theoretically) API usage through one of the many R1 providers is private.
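
      The gap is clear from rough memory arithmetic: weights alone take about parameter-count × bytes-per-weight, so a 4-bit 32B distill fits a 24 GB 3090, while the full 671B model needs hundreds of gigabytes before you even count KV cache. A back-of-the-envelope sketch (24 GB here is just the 3090's VRAM):

      ```python
      # Back-of-the-envelope VRAM needed for model weights alone.
      # Ignores KV cache, activations, and runtime overhead, which add more.
      def weight_vram_gb(params_billion: float, bits_per_weight: int) -> float:
          return params_billion * 1e9 * bits_per_weight / 8 / 1e9  # bytes -> GB

      for name, params in [("R1 32B distill", 32), ("full R1", 671)]:
          need = weight_vram_gb(params, 4)  # assume 4-bit quantization
          verdict = "fits" if need <= 24 else "does not fit"
          print(f"{name}: ~{need:.0f} GB at 4-bit -> {verdict} a 24 GB RTX 3090")
      ```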

      • jwiggler@sh.itjust.works · 2 months ago

        I don’t really use LLMs, so I didn’t even realize there were versions with different weights and sizes. I was using the 7B, but found it pretty useless. Pretty sure I’m not going to be able to run the 32B on my rig. lmao.

        Guess I’ll continue being an LLM-less pleb.

      • horse_battery_staple@lemmy.world · 2 months ago

        I use the 32B and the 671B side by side. The performance hit is around 20%, and I keep all my data local. I’m not conflating the two; self-hosting just works fine for me. Your use case is certainly your own, but I’d rather take the performance hit for the added data privacy.

        It’s also nice to be able to set my own weights and further distill R1.

        I have a local Python expert and a local Golang expert, both have access to my local GitLab repository, and I’ve tied their respective Ollama keys to my VS Code IDE.
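
        For a sense of what a per-language "expert" can look like: the same local Ollama server queried through its /api/chat endpoint with a different system prompt per specialty. The model names and system prompts below are illustrative stand-ins, not the actual setup:

        ```python
        # Route a question to a specialized local "expert" via Ollama's chat API.
        # Model names and system prompts are illustrative stand-ins.
        import requests

        EXPERTS = {
            "python": ("deepseek-r1:32b", "You are a senior Python engineer."),
            "golang": ("deepseek-r1:32b", "You are a senior Go engineer."),
        }

        def ask_expert(lang: str, question: str) -> str:
            model, system = EXPERTS[lang]
            resp = requests.post(
                "http://localhost:11434/api/chat",
                json={
                    "model": model,
                    "messages": [
                        {"role": "system", "content": system},
                        {"role": "user", "content": question},
                    ],
                    "stream": False,
                },
                timeout=300,
            )
            resp.raise_for_status()
            return resp.json()["message"]["content"]

        print(ask_expert("golang", "How do I cancel a goroutine with a context?"))
        ```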

        • brucethemoose@lemmy.world · 2 months ago

          Depends for sure. I usually try the 32B first, but give really “hard” queries to some API model.
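
          That "local first, API for the hard stuff" split is simple to wire up: try the local model by default, and only send a query upstream when you explicitly decide it’s hard. A sketch assuming a local Ollama plus some OpenAI-compatible R1 provider (the provider URL and key are placeholders):

          ```python
          # Escalate "hard" queries from a local model to a hosted R1 provider.
          # The provider URL and API key are placeholders for whichever you trust.
          import os
          import requests

          def ask(prompt: str, hard: bool = False) -> str:
              if not hard:  # default path: query stays on the LAN
                  r = requests.post(
                      "http://localhost:11434/api/generate",
                      json={"model": "deepseek-r1:32b", "prompt": prompt, "stream": False},
                      timeout=300,
                  )
                  r.raise_for_status()
                  return r.json()["response"]
              # Explicit opt-in: send to a hosted, OpenAI-compatible endpoint.
              r = requests.post(
                  "https://api.example.com/v1/chat/completions",  # placeholder provider
                  headers={"Authorization": f"Bearer {os.environ['R1_API_KEY']}"},
                  json={
                      "model": "deepseek-r1",
                      "messages": [{"role": "user", "content": prompt}],
                  },
                  timeout=300,
              )
              r.raise_for_status()
              return r.json()["choices"][0]["message"]["content"]
          ```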

          • horse_battery_staple@lemmy.world · 2 months ago

            With the distilled models I have, I’ve been able to build and troubleshoot pretty complicated apps in Golang and Python. However, these distilled models are very specialized and will not do things like write me a story about a duck made out of duct tape or properly summarize articles. There are absolutely limits to my workflow and setup. But I’m pretty happy with it.

    • jwiggler@sh.itjust.works · 2 months ago

      I’m moving to self-host all my streaming stuff: switching from local-only Plex to self-hosting all my media (Spotify, Google Photos, LLMs) and tools behind a reverse proxy so I can access them outside my home. It’s pretty sweet and a good learning experience using reverse proxies (a sketch of the proxy config follows below).

      Edit: Plus fuck these technofeudal lords who enclose access to markets, information, and culture.
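
      For anyone curious about the reverse-proxy piece, a minimal sketch using Caddy (hostnames and upstream ports are placeholders; Caddy also handles TLS certificates for public hostnames automatically):

      ```
      # Caddyfile: route public hostnames to self-hosted services on the LAN.
      # Hostnames and ports are placeholders; adjust to your own services.
      photos.example.com {
          reverse_proxy 127.0.0.1:2342   # e.g. a self-hosted photo app
      }

      music.example.com {
          reverse_proxy 127.0.0.1:4533   # e.g. a self-hosted music server
      }

      llm.example.com {
          reverse_proxy 127.0.0.1:11434  # local Ollama API
      }
      ```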