that’s more of a comment on the usage than on the technology itself.
remember that google deepdream thing that would hallucinate dogs everywhere? it’s the same tech.
*shrugs
I think calling it a hallucination is anthropomorphizing the technology.
so is calling it fabrication. something incapable of knowing what is true cannot lie.
also, GPTs and image generators are fundamentally different technologies that share very little code beyond the basic matrix manipulation stuff, so the definition of truth needs to be very different for each.
that’s literally how it works though: the software is trained to remove noise from images, and then you feed it pure noise and tell it there’s an image behind it. if that’s not hallucination, idk what would be.
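fwiw, the "denoise pure noise until an image appears" loop sketched above looks roughly like this. to be clear, this is a toy sketch, not real diffusion-model code: `fake_denoise` is a stand-in for a trained neural denoiser (it just nudges pixels toward a fixed target so the loop has something to converge to), and the step count and sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for "what the trained model believes is behind the noise".
# In a real diffusion model this prior lives in the network weights.
TARGET = rng.random((8, 8))

def fake_denoise(x, strength=0.1):
    """Toy denoiser: move the image slightly toward TARGET each call.
    A real model would instead predict and subtract estimated noise."""
    return x + strength * (TARGET - x)

def sample(steps=200):
    """Start from pure noise and repeatedly ask the denoiser to
    'remove the noise' — an image emerges even though none was there."""
    x = rng.standard_normal((8, 8))  # pure noise, no image behind it
    for _ in range(steps):
        x = fake_denoise(x)
    return x

img = sample()
```

the point of the sketch: `img` ends up looking like the model's prior (`TARGET`) no matter what noise you start from — the "image behind the noise" was never in the input, only in the model.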