• @Devouring@lemmy.world
    link
    fedilink
    19
    2 years ago

    If you’re doing something that takes real skill, ChatGPT will make the dumbest suggestions ever…

    ChatGPT is good for learning ideas and new things, as an aggregate of what everyone thinks about them. But as a coding tool it cannot reason properly and has rubber-stamp solutions for everything.

    • @DudeDudenson@lemmings.world
      link
      fedilink
      10
      2 years ago

      Well, yes, its responses are based on what the average of the internet would say.

      I’m surprised it doesn’t constantly tell you to format Windows and reinstall, no matter what you ask.

  • @guywithoutaname@lemm.ee
    link
    fedilink
    35
    2 years ago

    I strongly advise not to do that. As others pointed out, it really is just predicting the next word. It is worth learning how to problem-solve and recognizing that the only way to become a better programmer is practice. It’s better to get programming advice from real people online and to read the documentation for the functions and languages you are trying to use.

  • @9thSun@midwest.social
    link
    fedilink
    English
    7
    2 years ago

    As someone who is learning, I think it’s imperative to understand that ChatGPT has limitations that cannot be overlooked. It’s pretty good when I make some silly syntax or formatting errors, but at the core I have to understand what I’m working with if I want to be a better programmer. I love the conversational nature because I often have a hard time wording questions, so it helps me in that regard as well. Idk, if you want to be truly good at something, you have to rely more on yourself than on external tools.

    • @1984@lemmy.today
      link
      fedilink
      2
      edit-2
      2 years ago

      The thing is, in some fields like devops, there are so many tools that you can’t remember or know all of them very well. So asking ChatGPT how to do something saves a lot of time. It can write Ansible playbooks, Dockerfiles, web server configurations, etc. They almost never work perfectly, but they give a very good starting point to modify.

      It used to be that you could be very good at specific languages or tools, but today there isn’t enough time. Everyone is always in a hurry to get something out as quickly as possible, too.

    • @wizardbeard@lemmy.dbzer0.com
      link
      fedilink
      English
      22
      edit-2
      2 years ago

      I’ve got no issues with people using Stack Overflow or ChatGPT as a reference. The problem has always been when someone just skims what they found and pastes it in without understanding it, without looking at the rest of the comments, the further discussion, or any other search results for more insight and context.

      I think chatGPT makes this sort of “carelessness” (as opposed to carefulness) even easier to do, as it appears to be responding with an answer to your exact question and not just something the search algorithm thinks is related.

  • UnfortunateShort
    link
    fedilink
    3
    2 years ago

    ChatGPT was never made for programming and is horrible at generating code. It is nice for a pair-programming kind of setup though, because it can quickly point you towards tools, libraries, APIs, etc. to use.

    • @oldfart@lemm.ee
      link
      fedilink
      4
      edit-2
      2 years ago

      It generated a GUI OCR tool in Qt5, customized to my needs. I don’t know a single bit of Qt5 and went from zero to a working tool in half an hour.

      The tool takes a screenshot, lets me select an area on the screen, OCRs it and displays the text in a window.
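
      For illustration, here is a minimal sketch of what such a tool could look like. It assumes PyQt5 for the GUI and pytesseract plus Pillow for the OCR step; it is a rough reconstruction of the idea, not the code ChatGPT actually produced.

      ```python
      # Minimal sketch: screenshot -> drag-select an area -> OCR -> show text.
      # Assumes PyQt5, Pillow, and pytesseract (with the tesseract binary) are installed.
      import io
      import sys

      import pytesseract
      from PIL import Image
      from PyQt5.QtCore import QBuffer, QRect, Qt
      from PyQt5.QtGui import QPainter
      from PyQt5.QtWidgets import QApplication, QMessageBox, QRubberBand, QWidget


      class SnipOverlay(QWidget):
          """Full-screen overlay: drag a rectangle over a frozen screenshot to OCR it."""

          def __init__(self, screenshot):
              super().__init__()
              self.screenshot = screenshot  # QPixmap of the whole screen
              self.origin = None
              self.rubber_band = QRubberBand(QRubberBand.Rectangle, self)
              self.setWindowFlags(Qt.FramelessWindowHint | Qt.WindowStaysOnTopHint)
              self.showFullScreen()

          def paintEvent(self, event):
              # Draw the frozen screenshot so selection happens on a static image
              # (this sketch ignores HiDPI scaling for brevity).
              painter = QPainter(self)
              painter.drawPixmap(self.rect(), self.screenshot)

          def mousePressEvent(self, event):
              self.origin = event.pos()
              self.rubber_band.setGeometry(QRect(self.origin, self.origin))
              self.rubber_band.show()

          def mouseMoveEvent(self, event):
              if self.origin is not None:
                  self.rubber_band.setGeometry(QRect(self.origin, event.pos()).normalized())

          def mouseReleaseEvent(self, event):
              selection = QRect(self.origin, event.pos()).normalized()
              self.rubber_band.hide()
              self.hide()
              # Crop the selected area, convert QPixmap to a PIL image, run OCR.
              cropped = self.screenshot.copy(selection)
              buffer = QBuffer()
              buffer.open(QBuffer.ReadWrite)
              cropped.save(buffer, "PNG")
              image = Image.open(io.BytesIO(bytes(buffer.data())))
              text = pytesseract.image_to_string(image)
              QMessageBox.information(None, "OCR result", text or "(no text found)")
              QApplication.quit()


      def main():
          app = QApplication(sys.argv)
          screenshot = app.primaryScreen().grabWindow(0)  # grab the entire screen
          overlay = SnipOverlay(screenshot)               # keep a reference alive
          sys.exit(app.exec_())


      if __name__ == "__main__":
          main()
      ```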

      If ChatGPT isn’t made for programming, then I’m looking forward to a product that is.

  • Semi-Hemi-Demigod
    link
    fedilink
    49
    2 years ago

    In days of yore, before Google or even AltaVista, you could tell the quality of a team by how many O’Reilly books they had on the shelves.

    • @corsicanguppy@lemmy.ca
      link
      fedilink
      1
      2 years ago

      I should sell mine. Maybe I’ll keep the crab book and the white book, but the latter’s not even an O’Reilly.

  • @Mr_Lobster@lemm.ee
    link
    fedilink
    English
    22
    2 years ago

    I literally cannot comprehend coding with ChatGPT. How can I expect something to work if I don’t understand it, and how can I understand it if I don’t code and debug it myself? How can you expect to troubleshoot any issues afterwards if you don’t understand the code? I wouldn’t trust GPT for anything more complex than Hello World.

    • @Psythik@lemm.ee
      link
      fedilink
      4
      2 years ago

      I haven’t been in web development in over 20 years; thanks to ChatGPT, I was able to get up to speed and start building websites again, when in the past I would never have been able to do so.

      GPT is a powerful tool that can allow anyone to do anything if they’re willing to put in the effort. We should be praising it, not making fun of it. It’s as revolutionary as the internet itself.

    • @worldsayshi@lemmy.world
      link
      fedilink
      11
      edit-2
      2 years ago

      You shouldn’t use code that you don’t understand. ChatGPT outputs quite readable and understandable code, makes sure to explain a lot of it, and you can ask questions about it.

      It can save quite a lot of effort, especially for tasks that are more tedious than hard. Even more so if you have a general idea of what you want to do but aren’t familiar with the specific tools and libraries you want to use for the task.

      • @III@lemmy.world
        link
        fedilink
        English
        10
        2 years ago

        It’s also wrong a lot, hence the requirement for understanding. It can be helpful to get through a stretch, but it will fuck up before too long, and relying on it entirely is a bad idea.

    • Just yesterday, I wrote a first version of a fairly complex method, then pasted it into GPT-4. It explained my code to me clearly, I was able to have a conversation with it about the code, and when I asked it to write a better version, that version ended up having a couple of significant logical simplifications. (And a silly defect that I corrected it on.)

      The damn thing hallucinates sometimes (especially with more obscure/deep topics) and occasionally makes stupid mistakes, so it keeps you on your toes a bit, but it is nevertheless a very valuable tool.

      • @philm@programming.dev
        link
        fedilink
        8
        2 years ago

        That only really works if the method is self-contained and written in a language that GPT has seen often (such as Python). I stopped using it, because for every successful try out of 10, I waste time on the other 9…

    • @corsicanguppy@lemmy.ca
      link
      fedilink
      1
      2 years ago

      I use it to give me prototypes for Ansible, because Ansible is junk. Then I build my stuff from the mishmash and have GPT check it. Cuts down a lot of time that I’d rather spend doing any-bloody-thing else.

    • @1984@lemmy.today
      link
      fedilink
      2
      edit-2
      2 years ago

      Often the code is self-explanatory. I understand the code very often, but I still couldn’t write it correctly from scratch. Do you never feel like that?

      This is how code examples in books work too. You get some code to look at and try to understand. Otherwise it’s like ignoring code examples while learning programming.

    • @philm@programming.dev
      link
      fedilink
      4
      2 years ago

      This.

      If I’m writing something slightly more complex, ChatGPT(4) mostly fails.

      If I’m writing complex code, I don’t even consider using ChatGPT, because I only get disappointed, and in the end I waste more time trying to “engineer” the prompt, only to be disappointed again.

      I currently cannot imagine using ChatGPT for coding. I was excited in the beginning, and it’s sometimes useful, but mostly not really for coding…

      • @worldsayshi@lemmy.world
        link
        fedilink
        3
        2 years ago

        If you’re already knee-deep in existing code and looking for bugs, or need to write quite specific algorithms, it doesn’t seem very useful. But if you for some reason need to write stuff that has the slightest feeling of boilerplate, like “how do I interact with well-established framework or service X while doing A, B, C”, it can be really useful.

        • @oldfart@lemm.ee
          link
          fedilink
          2
          2 years ago

          Also, it often does a great job if you paste a stack trace into it, and maybe some surrounding code. I used it to fix someone else’s Java code, as well as to upgrade some third-party WordPress junk to the latest PHP. I barely know Java and stopped following PHP news around version 5.6.

    • threelonmusketeers
      link
      fedilink
      English
      15
      2 years ago

      Of course the first programmer did, but everyone who came after just copied her work and tweaked it a bit to suit their needs.

      • Basically, yeah. Dennis Ritchie wrote the C compiler because he knew exactly what he wanted to use it for and the kinds of code that he wanted to write. Then he went on to write the book that everyone used to learn the language.

        This is true of probably every language, library, framework, etc. The original designer writes it because he knows what he wants to do with it and does so. Then everyone else follows. People then add more features and provide demonstrations of how to use them, and others copy them. It is extremely hard to just look at an API and use that to figure out exactly which calls should be made and in what order. Everyone just reads from the examples and adapts them as needed.

    • @Skyrmir@lemmy.world
      link
      fedilink
      47
      2 years ago

      It’s not the language. ChatGPT is about as useful as a decent code manual. It won’t actually solve any problems for you, but it can show you the general format for doing so.

        • @EatATaco@lemm.ee
          link
          fedilink
          English
          2
          2 years ago

          Yeah, I’ve used it for boilerplate stuff for things I’ve not done before, but I always then read up on what it did and make sure I understand it and where to look further.

          • @marcos@lemmy.world
            link
            fedilink
            6
            2 years ago

            Ok, I’ll use the “usually” wildcard to absorb this one.

            Odds are that ChatGPT can help you better with C# than the documentation. It’s also easier to navigate, because you don’t need to know the answer before being able to ask a good question, while merely knowing the answer and having a search engine won’t help you find the right Microsoft doc.

        • Sometimes whatever you are working with will have outdated or really poor docs, so an advanced internet info aggregator is useful in that sense.

          I started learning Nix before ChatGPT and it was a nightmare. I had to continually ask for help on Discord, of all places, for things that should really be in the docs.

          ChatGPT makes Nix easier, except not really, because its info is outdated a lot of the time.

  • Today we have chatbots. Yesterday we had search engines and Stack Overflow. Before that we had books. And before that? Well, what do you know… software programming is a relatively novel field. It’s almost as if nobody has perfected how it should be learned.

    The most valuable knowledge comes from experience. I copied plenty of code around during my learning days as well, and I still do it today. The most important part, however, is trying to understand the code you’re working with. If you can understand it, know when it fails, test it in the right way, etc., then sure, you could probably learn to code from chatbots. They provide the information, and you’re at liberty to do what you want with it. If you just copy it and forget, you’ll be a bad programmer. But it’s not like you couldn’t do that before either with the other sources that were available - there were plenty of bad programmers before we had these tools available too.

    That said, there is a risk that these chatbots do not provide any useful context around the code that they produce. When you learned from a book or Stack Overflow, you were reading from a reasonably authoritative source that could explain the code that was produced. But the authority behind the code from chatbots is probably much weaker than what we have from Stack Overflow, which in turn was probably also weaker than what we have from books. Does it have an effect on learning? I have no clue. But I still think you can learn from chatbots if you use the output that they provide in the right way. (Disclaimer: I have never used one of them and have no experience with them.)

  • @EnderMB@lemmy.world
    link
    fedilink
    85
    2 years ago

    ChatGPT is banned by my employer, because they don’t want trade secrets being leaked, which IMO is fair enough. We work on ML stuff anyway.

    Anyway, we have a junior engineer who has been caught using ChatGPT several times, whether it’s IT flagging its use, seeing a tab open in their browser during a demo, or simply seeing code they obviously didn’t write in code I’m reviewing.

    I recently tried to help them out on a project that uses React, and it is clear as day that this engineer cannot write code without ChatGPT. The library use is all over the place, they’ll just “invent” certain APIs, or they’ll use things that are deprecated or don’t work, which you’d notice if you’d even attempted to think about the problem. IMO, reliance on ChatGPT is much worse than how juniors used to rely on Stack Overflow to find answers to copy-paste.

      • @EnderMB@lemmy.world
        link
        fedilink
        25
        edit-2
        2 years ago

        One of the dirty secrets at FAANG companies is that lots of people join from internships, and can get all the way to senior and above without ever needing to go through a standard, full technical loop. If you have a formal apprenticeship scheme, sometimes you’ll join through a non-tech loop.

    • @Nahdahar@lemmy.world
      link
      fedilink
      26
      edit-2
      2 years ago

      The underlying problem is the same; it has just become more accessible to copy code you don’t understand (you don’t even need to come up with a search query that leads you to some kind of answer, ChatGPT will interpret your words and come up with something). Proper use of ChatGPT can boost productivity, but people (both critics of ChatGPT and people who don’t actually know how to code) misuse it, looking at it as a “magic solution box” instead of a tool that can assist development and lead you to solutions.

    • @canni@lemmy.one
      link
      fedilink
      0
      2 years ago

      I’ve always, always been an intuition-only guy. Meaning I almost never use anything other than blind guessing about how languages and libraries work. I genuinely don’t feel I’m missing out on anything; my farts already smell better than my peers’, and I just don’t feel the need to learn the modern tools of my trade.

    • @alignedchaos@sh.itjust.works
      link
      fedilink
      17
      2 years ago

      Sometimes there are better methods to implement something, and we can learn from others’ mistakes without having to make them ourselves.

        • @TrickDacy@lemmy.world
          link
          fedilink
          1
          2 years ago

          The fact that you people pretend to only use documentation, like some elitist boy scouts, actually does say something about you.

          I don’t believe you, you’re lying, you just want to seem smart. I don’t give a flying fuck if random internet people think I’m smart, or whatever the hell else you’re suggesting. I just flat-out don’t care. I know there is nothing wrong with using search engines and Stack Overflow, and that we all do it. Pretty weird that you all pretend otherwise. Kind of sad, really, that your ego requires this of you.