Sorry Python but it is what it is.

  • @waz@lemmy.world · 6 points · 2 years ago

    Getting into rust is still on my to-do list, otherwise I’ve no major problem with pip or npm. They both have their flaws, but both work well enough to do what I need them for. If I had to prefer one it would be pip simply to sustain my passionate hate for all things JavaScript.

  • @gerryflap@feddit.nl · 19 points · 2 years ago

    This is why I use poetry for python nowadays. Pip just feels like something ancient next to Cargo, Stack, Julia, npm, etc.

    • @hatchet@sh.itjust.works · 6 points · 2 years ago

      I actually vastly prefer this behavior. It allows me to jump to (readable) source in library code easily in my editor, as well as experiment with different package versions without having to redownload, and (sort of) work offline too. I guess, I don’t really know what it would do otherwise. I think Rust requires you to have the complete library source code for everything you’re using regardless.

      I suppose it could act like NPM, and keep a separate copy of every library for every single project on my system, but that’s even less efficient. Yes, I think NPM only downloads the “built” files (if the package uses a build system & is properly configured), but it’s still just minified JS source code most of the time.

      • @Espi@lemmy.world · 5 points · 2 years ago

        With python and virtualenv you can also keep the entire source of your libraries in your project.
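        For anyone who hasn’t tried it, a minimal sketch of what that looks like (paths and the example package are illustrative):

```shell
# Create a venv inside the project; installed libraries will live under ./.venv
python3 -m venv .venv
# Print the directory where pip would place library sources
./.venv/bin/python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])"
# e.g. ./.venv/bin/pip install requests would put the full requests source there,
# so you can jump to readable library code from your editor, just like with Cargo
```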

    • @barsoap@lemm.ee · 10 points · 2 years ago

      cached copies of crates that you downloaded

      Meh, what else is it supposed to do, delete sources all the time? Then people with slow connections will complain.

      Also size-wise that’s actually not even much (though they could take the care to compress it), what actually takes up space with rust is compile artifacts, per workspace. Have you heard of kondo?

            • @Anafabula@discuss.tchncs.de · 2 points · 2 years ago

              You can globally share compile artifacts by setting a global target directory in the global Cargo config.

              In $HOME/.cargo/config.toml:

              [build]
              target-dir = "/path/to/dir"
              

              The only problems I had when I did it were some cargo plugins and some dependencies with build.rs files that expected the target folder in its usual location.

    • @Pipoca@lemmy.world · 4 points · 2 years ago

      Python virtual environments feel really archaic. It’s by far the worst user experience I’ve had with any kind of modern build system.

      Even a decade ago in Haskell, you only had to type cabal sandbox init once, rather than source venv/bin/activate every time you cd to the project dir.

      I’m not really a python guy, but having to start touching a python project at work was a really unpleasant surprise.
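      To make the comparison concrete, a sketch of the recurring step being complained about (bash syntax, venv named .venv by convention):

```shell
# One-time setup
python3 -m venv .venv
# The per-session ritual: must be re-run in every new shell
. .venv/bin/activate
python -c "import sys; print(sys.prefix)"   # now resolves inside .venv
deactivate
# Workaround: call the venv's interpreter directly, no activation needed
./.venv/bin/python -c "import sys; print(sys.prefix)"
```

      Activation just prepends .venv/bin to PATH for the current shell, which is why it doesn’t persist across sessions the way a sandbox config file does.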

  • spez · 5 points · 2 years ago

    Pip has a good looking loading thingy though.

    • @Sprout4426@lemmy.world · 1 point · 2 years ago

      I really dislike pnpm. If everything you do is install and build, then it doesn’t matter what you use, but if you do anything complex pnpm will come back to bite you. Yarn is a good middle ground.

      • Andrew · 1 point · 2 years ago

        You literally didn’t give any arguments for why you really dislike pnpm. The most obvious benefit is installations that are several times faster. It has also resolved some peer dependency issues (I don’t remember the details).

    • @olutukko@lemmy.world · 3 points · 2 years ago

      What’s the difference? I’m currently doing my web development 2 course where we started using React, so I’m typing npm into the terminal all the time :D

  • Oha · 30 points · 2 years ago

    npm is just plain terrible. It has never worked for me on the first try without doing weird stuff.

    • @goatbeard@lemm.ee · 37 points · 2 years ago

      Try not to learn too much from memes, they’re mostly wrong. Conda is good, if you’re looking for something more modern (for Python) I’d suggest Poetry

        • @mog77a@lemmy.ml · 2 points · 2 years ago

          100% this. I remember really really trying to get the hang of them and eventually just giving up and doing it manually every time. I somehow always eventually mess something up or god forbid someone who isn’t me messes it up and I end up spending 4 hours dependency hunting. Venv and pip while still annoying are at least reliable and dead simple to use.

          However, a container is now my preferred way of sharing software for at least the past 6 years.

          • @Pantoffel@feddit.de · 1 point · 2 years ago

            Yup. A container is slow to rebuild, but it’s at least the most robust option. This is my preferred way to share Python code when there are system dependencies involved.
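            A sketch of what that can look like (base image tag, the system package, and the entrypoint are placeholders):

```dockerfile
# Pin the interpreter, system libraries, and Python dependencies together
FROM python:3.11-slim
# Example system dependency (placeholder); clean apt lists to keep the image small
RUN apt-get update && apt-get install -y --no-install-recommends libpq5 \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "main.py"]
```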

          • Farent · 6 points · 2 years ago

            Isn’t it called a requirements.txt because it’s used to export your project requirements (dependencies), not all packages installed in your local pip environment?

          • @JakobDev@feddit.de · 5 points · 2 years ago

            Yes, but this file is created by you and not pip. It’s not like package.json from npm. You don’t even need to create this file.

            • Well, if the file had to be created by hand, that would be very cumbersome.

              But what is sometimes done to create it automatically is running

              pip freeze > requirements.txt

              inside your virtual environment.

              You said I don’t need to create this file? How else would I distribute my environment so that it can be easily reused? There are a lot of other standards, like setup.py etc., so it’s only one possibility. But the fact that there are multiple competing standards shows that how pip handles this is kind of bad.
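              A self-contained sketch of that workflow (the --all flag is used here only so the example also works in a freshly created, empty venv):

```shell
python3 -m venv .venv
. .venv/bin/activate
# Snapshot every installed package at its exact version
pip freeze --all > requirements.txt   # --all also lists pip itself
grep '^pip==' requirements.txt        # each line is pinned as name==version
# Elsewhere, the environment is reproduced with:
#   pip install -r requirements.txt
deactivate
```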

              • Vash63 · 2 points · 2 years ago

                I work with Python professionally and would never do that. I add my actual imports to the requirements, and if I forget, I do it later when the package fails CI/CD tests.

              • @JakobDev@feddit.de · 2 points · 2 years ago

                If you try to keep your dependencies low, it’s not very cumbersome. I usually do that.

                A setup.py/pyproject.toml can replace requirements.txt, but it is for creating packages and does way more than just installing dependencies, so they are not really competing.

                For scripts which have just 1 or 2 packages as dependencies, it’s also usual to just tell people to run pip install .
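                For illustration, a minimal pyproject.toml of the kind being described, written out via a shell heredoc (the project name and dependency are made up):

```shell
# Hypothetical minimal pyproject.toml; `pip install .` reads the
# [project] table and installs the declared dependencies
cat > pyproject.toml <<'EOF'
[project]
name = "example-tool"
version = "0.1.0"
dependencies = ["requests>=2.28"]
EOF
grep 'dependencies' pyproject.toml
```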

      • SSUPII · 4 points · 2 years ago

        Honestly it’s a simple and straightforward solution. What’s wrong with it?

        • @theFibonacciEffect@feddit.de · 1 point · edited · 2 years ago

          If newer versions are released and dependencies change you would still install the old dependencies. And if the dependencies are not stored you can’t reproduce the exact same environment.

    • @ExLisper@linux.community (OP) · 0 points · 2 years ago

      cargo just works, it’s great and everyone loves it.

      npm has a lot of issues but in general does the job. When docs say do ‘npm install X’ you do it and it works.

      pip is a mess. In my experience doing ‘pip install X’ will maybe install something but it will not work because some dependencies will be screwed up. Using it to distribute software is pointless.

      • @krimson@feddit.nl · 21 points · 2 years ago

        I use pip extensively and have zero issues.

        npm pulls in a million dependencies for even the simplest functionality.

        • qaz · 4 points · 2 years ago

          You’ve never had broken dependencies?

          • @krimson@feddit.nl · 5 points · 2 years ago

            Nope. I know mixing pip with Python packages installed through your system’s package manager can be a problem, but that’s why I containerize everything.

            • qaz · 1 point · 2 years ago

              I separate everything into virtual environments myself, but in my opinion you shouldn’t need to do that simply to avoid breaking your system.

        • @ExLisper@linux.community (OP) · -1 points · 2 years ago

          It probably works for your own local project. After using it for a couple of days to install some 3rd-party tools, my conclusion is that it has no idea about dependencies. It just downloads some dependencies at some random versions and then it never works. Completely useless.

    • Lucky · 9 points · 2 years ago

      I’ve never had an issue with nuget, at least since dotnet core. My experience has it far ahead of npm and pip

      • @jubilationtcornpone@sh.itjust.works · 8 points · edited · 2 years ago

        I’ll second this. I would argue that .Net Core’s package/dependency management in general is way better than Python or JavaScript. Typically it just works and when it doesn’t it’s not too difficult to fix.

        • @dan@upvote.au · 2 points · 2 years ago

          It’s also much faster to install packages than npm or pip since it uses a local package cache and each package generally only has a few DLL files inside.

    • Pxtl · 2 points · edited · 2 years ago

      What’s wrong with nuget? I have to say I like the “I want latest” / “no, all your dependencies are pinned; if you want to update to latest you’ve got to decide to do it” workflow. I can think of some bad problems when you try to do fancy things with it, but for the basic case of “I just want to fetch my program’s dependencies” it’s fine.

      • Lucky · 2 points · edited · 2 years ago

        I’m guessing they only used it 10 years ago when it was very rough around the edges. It didn’t integrate well with the old .NET Framework because it conflicted with how web.config managed dependencies and poor integration with VS. It was quite bad back then… but so was .NET Framework in general. Then they rebuilt from the ground up with dotnet core and it’s been rock solid since

        Or they just hate Microsoft, which is a common motif to shit on anything Microsoft does regardless of the actual product.

        • Pxtl · 2 points · 2 years ago

          Imho the VS integration has always been good; it’s the web.config that’s always been a trash fire, and that’s not new.

          • Lucky · 1 point · 2 years ago

            The project I’m on right now originally had nuget.exe saved in source control because they had to manually run it through build scripts; it wasn’t built into VS until VS2012.