this post was submitted on 16 Jul 2023
54 points (96.6% liked)

Piracy: ꜱᴀɪʟ ᴛʜᴇ ʜɪɢʜ ꜱᴇᴀꜱ

Pirating AI models (lemmy.dbzer0.com)
submitted 1 year ago* (last edited 1 year ago) by zaknenou@lemmy.dbzer0.com to c/piracy@lemmy.dbzer0.com
 

So it's convenient and all to use ChatGPT from OpenAI's site and to use other AI models on their official sites, but it doesn't feel very pirate-like, am I wrong? Like OpenAI staff spying on my conversations with the waifu persona I gave ChatGPT, or the Midjourney creators knowing about every picture of John Oliver I created with their Discord bot. Also, you're only allowed to use ChatGPT 3.5 for free and need to pay for GPT-4 ($20 a month for limited use, wtf), so are there any islands where a pirate can do what he does comfortably?

There are Telegram bots, and Quora offers multiple AI models on https://poe.com/ , but I'm curious whether there exists some compilation of useful pirated AI tools.

EDIT: Thank you everyone from inside and outside this instance.

all 19 comments
[–] rikudou@lemmings.world 24 points 1 year ago* (last edited 1 year ago) (1 children)

Why pirate? Most of the stuff is open source. For images, use Stable Diffusion and download models from, for example, civitai.com. I don't know what to use for chat; I'm using GPT.
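For anyone wondering what "just use Stable Diffusion" looks like in practice, here is a minimal sketch using the diffusers Python library; the model ID and prompt are placeholders of my own, not anything the commenter specified, and a finetuned checkpoint from civitai.com could be loaded the same way once converted to diffusers format.

```python
# Minimal local Stable Diffusion sketch with the diffusers library.
# Assumes an NVIDIA GPU with a few GB of VRAM; model ID and prompt are placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # base SD 1.5 weights from Hugging Face
    torch_dtype=torch.float16,          # half precision to fit consumer GPUs
)
pipe = pipe.to("cuda")

image = pipe("a pirate ship sailing the high seas, oil painting").images[0]
image.save("pirate_ship.png")
```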

[–] ApplePie@sh.itjust.works 1 points 1 year ago (1 children)

You don’t even have to get it from there… it’s pretty easy to just go to SD’s GitHub and run the whole model yourself with just a few commands.

[–] Pulp@lemmy.dbzer0.com 3 points 1 year ago (1 children)

??? The original model is trash compared to those finetuned models

[–] AngrilyEatingMuffins@kbin.social 23 points 1 year ago (1 children)

Orca 13B is coming out soon. It's open source and you can run it on your own computer. Stable Diffusion can also be run on a decent enough rig.

[–] SpringStorm@lemmy.dbzer0.com 9 points 1 year ago (1 children)

After some quick googling, it seems like it's not open source yet, just soon™. They could still pull their statement back.

[–] deCorp0@lemmy.dbzer0.com 2 points 1 year ago (1 children)

Try another search engine, partner. Google's search has become increasingly less likely to surface open source projects as its referral-commission algorithm grows greedier. Google is also no friend of our community, as it continually removes programs like Kodi from search results and targets projects like yt-dl with takedowns. We monetize them simply by searching on their platform: genius for shareholders, terrible for consumers.

[–] SpringStorm@lemmy.dbzer0.com 1 points 1 year ago (1 children)

Actually, I use DuckDuckGo nowadays; it just sounds clunky to say "duckduckgoing" or "searching on DuckDuckGo". Also, the info I got was from a Reddit post.

[–] deCorp0@lemmy.dbzer0.com 2 points 1 year ago

Aye matey, I’m so old I was using the internet back when we just said “Search online”. Kids in the US are literally taught in school to say Google, talk about product placement. Sometimes I say DDG, but honestly I’ve been using Brave Search lately.

[–] PrimaCora@lemmy.fmhy.ml 20 points 1 year ago

You can't pirate their models, and even if they leaked, running them would need an expensive machine.

There are lots of open source models. They can get close but are limited by your hardware.

If you want something close to GPT, there's the Falcon 40B model. You'll need something with more than 24 GB of VRAM, or deep CPU offload with 128 GB of RAM, I think, maybe 64.

With 24 GB of VRAM you can run a 30B model, and so on...

For reference, the GPT models are something like 135B parameters, so A100 NVLink territory.
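To make the offloading idea concrete, here is a rough sketch of loading a big model such as Falcon 40B with Hugging Face transformers and Accelerate so that whatever doesn't fit in VRAM spills into system RAM. The memory caps are illustrative assumptions, not recommendations, and older transformers versions may additionally need trust_remote_code=True for Falcon.

```python
# Sketch: split a large model across GPU VRAM and CPU RAM with device_map="auto".
# Memory limits below are illustrative; adjust to your own hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-40b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",                          # place layers on GPU first, then CPU
    max_memory={0: "24GiB", "cpu": "120GiB"},   # per-device caps (example values)
)

inputs = tokenizer("Ahoy! Explain CPU offloading in one sentence.", return_tensors="pt")
outputs = model.generate(**inputs.to(model.device), max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```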

[–] luthis@lemmy.nz 13 points 1 year ago

You can run plenty of LLM models locally or on Google Colab for free. It's not piracy, though, and you will never be able to run ChatGPT itself.

There are a ton of options for this; piracy is not required.

Stable Diffusion and Oobabooga will do what you need, and there are communities for both on lemmy.world.

[–] ArkyonVeil@lemmy.dbzer0.com 12 points 1 year ago

Tip for anyone here who wants to ask GPT-4 questions on the cheap: applying for access to their API gives you both GPT-3.5 and GPT-4 through a different interface. There you pay for what you use, which is insanely cheap with GPT-3.5 and... mildly affordable with GPT-4, so long as you keep contexts short and conversations brief.
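As a rough illustration of that pay-as-you-go route (not the commenter's exact setup), here is a minimal sketch using the official openai Python package with its current (>=1.0) interface; the model name, prompt, and token cap are placeholders, and you need your own API key.

```python
# Minimal pay-per-use sketch with the official openai package.
# Reads OPENAI_API_KEY from the environment; values below are placeholders.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",                      # billed per token, so keep the context short
    messages=[
        {"role": "user", "content": "Summarise the plot of Treasure Island in two sentences."},
    ],
    max_tokens=150,                     # cap the reply length to keep the cost down
)
print(response.choices[0].message.content)
```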

I've been making use of their playground for months now and have probably paid 20 bucks tops. It worked for my case.

If I need creativity without intelligence, I'll just use WizardLM on my 3090.

Do note that "pirating AIs" is not really practical due to the extreme hardware requirements; you'll hardly find someone willing to foot the bill for free.

[–] melmi@lemmy.blahaj.zone 6 points 1 year ago (1 children)

The only thing you can "pirate" is LLaMA, and it's not really piracy; you can just grab it from Hugging Face on the honor system. Check out GPT4All or oobabooga, both of which give you nice web interfaces to self-hosted LLMs.

You'll need some beefy hardware to get responses comparable to the commercially available models, but it is possible—if impractical.

[–] blake@monero.town 1 points 1 year ago

Yes, LLaMA got leaked, and that has opened up a tool for the people. I see the OpenLLaMA project making headway in putting out a chatbot 'for the people' as opposed to centralised control. Don't confuse it with LLaMA 2, a confab between Meta and Microsoft which is 'open source' but requires you to let them have all of your data. It seems to be an effort at damage control, and a way to still harvest some of the data that people are shovelling into machine learning programs these days. I personally don't think they're great at all, far from intelligent, and I think they're going to be bad for humanity down the line. But Pandora's box is open, so we might as well democratise it by at least breaking the corporate chokehold on A"I".

[–] HumanPerson@sh.itjust.works 2 points 1 year ago

It isn't piracy, but you can run GPT4All on your local computer. It's open source and can run on most reasonably powerful CPUs.
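For anyone curious, here is a small sketch of running GPT4All entirely on CPU through its Python bindings (pip install gpt4all); the model filename is just an example from their catalogue, and the library downloads it on first use if it isn't already on disk.

```python
# CPU-only sketch with the gpt4all Python bindings; model filename is an example.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # a small CPU-friendly model
with model.chat_session():
    reply = model.generate(
        "Give me three tips for keeping local LLMs fast on a CPU.",
        max_tokens=200,
    )
    print(reply)
```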

[–] user@lemmy.one 2 points 1 year ago

I too wish to use GPT-4 for free, but nay... :(

[–] wolfshadowheart@kbin.social 1 points 1 year ago

I think you're looking for a website like civitai