chayleaf
people always joke about this, but defenestration has never actually been that common in the Russian Empire, the USSR, or the Russian Federation
the code is FOSS, the weights aren't. This is pretty common with e.g. FOSS games; the only difference here is that weights are much costlier to remake from scratch than game assets
huh? I'd say email was quite popular there, it was just tied to the mobile operator (and has since been replaced by Line)
The right of self-determination means that a nation may arrange its life in the way it wishes. It has the right to arrange its life on the basis of autonomy. It has the right to enter into federal relations with other nations. It has the right to complete secession. Nations are sovereign, and all nations have equal rights.
different neural network types excel at different tasks - image recognition came way before LLMs, not only because of a lack of processing power, but also because the earlier architectures didn't work well with language. New architectures don't appear out of thin air; they're created with a rough idea of what's needed to make a network do a certain task (e.g. NLP) better. Even tokenization isn't blind codepoint separation, it's based on an analysis of languages. But yes, natural languages aren't "parsed" for neural networks - they don't even have a formal grammar.
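To illustrate the tokenization point: real tokenizers (like BPE) learn a subword vocabulary from corpus statistics, so token boundaries reflect linguistic structure rather than arbitrary codepoint splits. A toy sketch, with a hand-picked hypothetical vocab standing in for what a real tokenizer would learn from data:

```python
# Toy sketch of subword tokenization: greedy longest-match against a
# learned vocabulary. NOT any real tokenizer's algorithm - just shows
# that token boundaries come from learned units, not codepoints.

def tokenize(word, vocab):
    """Split `word` into the longest vocab entries, left to right,
    falling back to single characters when nothing matches."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            if word[i:j] in vocab or j == i + 1:
                tokens.append(word[i:j])
                i = j
                break
    return tokens

# hypothetical learned vocabulary
vocab = {"un", "break", "able", "ing"}
print(tokenize("unbreakable", vocab))  # ['un', 'break', 'able']
```

A real BPE tokenizer derives that vocabulary by repeatedly merging the most frequent adjacent pairs in a training corpus, which is why common morphemes end up as single tokens.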
i'm not talking about knowing how humans perceive/learn languages, i'm talking about language structure. Perhaps it's wrong to call it "how languages work"
While I agree that LLMs can eventually achieve human-tier efficiency at most tasks (some architectural changes will be necessary, but the core approach seems sound), it's wrong to say they're modeled after the human brain. We have no real idea how brains work, they're far too complex; artificial neural networks are built from the ground up. AI draws on centuries' worth of math, but given our current mathematical knowledge, the code isn't too complicated. Human brains aren't like that: they can't be summed up in a few lines of code, because DNA is a huge mess that contains so much more than just "learning", with plenty of inactive or redundant bits and pieces. We're building LLMs with knowledge of how languages work, not how brains work.
it might work with obfuscation. In general my preferred solution is VPN+proxy: the proxy handles bypassing the DPI, doesn't have to meet particularly high standards, and can be easily swapped out, while the VPN runs through the proxy and does the actual L3 traffic routing
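A minimal sketch of that layering, assuming shadowsocks-libev as the DPI-resistant proxy and OpenVPN (in TCP mode) as the L3 VPN; the server address, port, and credentials below are placeholders:

```shell
# 1. DPI-resistant hop: expose the remote shadowsocks server as a
#    local SOCKS5 endpoint. This layer is cheap to swap out.
ss-local -s proxy.example.com -p 8388 \
         -k "$SS_PASSWORD" -m chacha20-ietf-poly1305 \
         -l 1080

# 2. Run the VPN through that SOCKS5 endpoint. OpenVPN must use TCP
#    (proto tcp-client) for --socks-proxy to work.
openvpn --config client.ovpn --proto tcp-client \
        --socks-proxy 127.0.0.1 1080
```

One design note: WireGuard is UDP-only and can't ride a SOCKS5 proxy directly, which is part of why OpenVPN-over-TCP (or an extra UDP-over-TCP wrapper) tends to be the easier fit for this setup.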
Well, Tor (with bridges) still works just fine, and I don't really know any other "crowdsourced" proxy networks. Telegram isn't blocked (it used to be, but everyone used it anyway, including people in the government, so they unblocked it), so any info there is freely available. WireGuard and OpenVPN are blocked (even within Russia, for some reason), shadowsocks is throttled on certain connections but works fine, and I haven't extensively tested anything else.
Also, mobile networks are used for testing stricter blocking measures before rolling them out to landline connections
not "any", but some very specific ones
I use sway on my phone, had to add a secondary menu bar with a few keys for stuff like opening rofi, but it works perfectly fine otherwise