So I’m no expert, but I’ve been a hobbyist C and Rust dev for a while now, and I’ve installed tons of programs from GitHub and whatnot that required manual compilation or other hoops to jump through. Yet I am constantly befuddled installing Python apps. They always seem to need a very specific (often outdated) version of Python, require a bunch of venv nonsense, googling gives tons of outdated info that no longer works, and they generally seem incredibly non-portable. As someone who doesn’t work in Python, its ecosystem seems more obtuse than any other language’s. Why is it like this?

  • it_depends_man@lemmy.world
    10 days ago

    The difficulty with python tooling is that you have to learn which tools you can and should completely ignore.

    Unless you are a 100x engineer managing 500 projects with conflicting versions, build systems, docker, websites, and AAAH…

    • you don’t really need venvs
    • you should not use more than one package manager (I recommend pip), and you should cling to it with all your might and never switch. Mixing package managers, e.g. conda with Linux system installers like apt, is the problem. Just using one is fine (see the sketch after this list).
    • You don’t “need” any other tools. They are bonuses that you should learn and use exactly when you need them, and not before (type checkers, linters, test runners, etc.).
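
    To make that concrete, here’s a minimal sketch of the “just pip” workflow (the package name is only a placeholder):

        python3 -m pip install --user somepackage   # installed by pip, for your user; no sudo, no apt involved
        python3 -m pip list                         # everything listed here is managed by pip and pip alone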

    Why is it like this?

    Isolation for reliability, because it costs businesses real $$$ when stuff goes down.

    venvs exist to prevent the case where “project 1” and “project 2” use the same installed copy of a library, “foobar”. Except “project 1” is old, its maintainer is held up and can’t update as fast, while “project 2” is a cutting-edge startup that always uses the newest tech.

    When Python imports a library, it uses whichever copy is installed. If project 2 uses foobar version 15.9, which changed functionality, and project 1 uses foobar version 1.0, you always get a bug in either project 1 or project 2. Venvs solve this by giving each project its own set of libraries and its own interpreter.
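
    In practice that isolation looks something like this (a minimal sketch; “foobar” is the same placeholder library as above):

        cd project1
        python3 -m venv .venv                  # isolated environment in ./project1/.venv
        .venv/bin/pip install "foobar==1.0"    # project 1 keeps the old API

        cd ../project2
        python3 -m venv .venv                  # a completely separate environment
        .venv/bin/pip install "foobar==15.9"   # project 2 gets the new one, no conflict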

    In practice, for many if not most users, this is meaningless: if all you’re doing is making a plot with matplotlib, the API you rely on won’t change out from under you. But people have “best practices”, so they do this stuff even when they don’t need it.

    It is a tradeoff between accepting occasional breakage and fixing it when it occurs, and not accepting breakage at all. The two approaches won’t mix.

    very specific (often outdated) version of python,

    They are giving you the version that they know worked. Often you can just remove the specific version pinning and it will work fine, because again, things don’t actually change that much. But still, what’s published online was the known working state.
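
    For example (again with a placeholder package name), the difference is just:

        pip install "foobar==1.0.3"   # exactly the version the author knows worked
        pip install foobar            # unpinned; often still fine, but now it’s your gamble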

    • ebc@lemmy.ca
      10 days ago

      Coming at this from the JS world… Why the heck would 2 projects share the same library? Seems like a pretty stupid idea that opens you up to a ton of issues, so what, you can save 200kb on your hard drive?

      • jacksilver@lemmy.world
        9 days ago

        Yeah, not sure I would listen to this guy. Setting up a venv for each project is about the bare minimum for all the teams I’ve worked on.

        That being said, Python envs can be GBs in size (especially when doing data science).

        • NostraDavid@programming.dev
          9 days ago

          especially when doing data science

          500MB for Ray, another 500MB for Polars (though that was a bug IIRC), a few more megs for whatever binaries to read out those weird weather files (NetCDF and Grib2).

      • it_depends_man@lemmy.world
        9 days ago

        Why the heck would 2 projects share the same library?

        Coming from the olden days, with good package management, infrequent updates, and the idea that you did indeed want to save that x number of bytes on disk and in memory, installing only one copy was the way to go.

        Python also wasn’t exactly a highbrow academic effort to brainstorm the next big thing; it was built to be a simple tool, and for that, just fetching whatever library was installed on your system was good enough. It only ended up being popular because it is very easy to get your feet wet and do something quick.