Earlier this week I discussed an example of ChatGPT giving ‘placeholder’ answers in lieu of real answers. Below is an example of what that looks like. I could swear this didn’t used to happen, but it basically just ‘doesn’t’ answer your question. I’m interested in how often other people see this behavior.

    • TropicalDingdong@lemmy.worldOP
      4 months ago

      yeah well I’m not paying for pseudo code. It’s a waste of my time and a waste of a prompt. It’s also not an answer to the question being asked.

  • magiccupcake@lemmy.world
    4 months ago

    I see this a lot too. I think ChatGPT anticipates the task being difficult?

    I just try to be more specific, or have it expand its placeholder.