Lugh@futurology.today to Futurology@futurology.today · English · 7 months ago
Evidence is growing that LLMs will never be the route to AGI. They are consuming exponentially increasing energy, to deliver only linear improvements in performance. (arxiv.org)
conciselyverbose@sh.itjust.works · English · 7 months ago
A combination of unique, varied parts is a complex algorithm.
A bunch of the same part repeated is a complex model.
Model complexity is not in any way similar to algorithmic complexity. They’re only described using the same word because language is abstract.
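To make the distinction concrete, here is a minimal Python sketch (my own illustration, not from the comment above): a "complex model" in this sense is one small part repeated many times, so its size can grow without the algorithm describing it getting any more elaborate.

```python
def block(x, weight):
    """One identical 'part': a toy layer that scales and shifts its input."""
    return [weight * v + 1.0 for v in x]

def model(x, weights):
    """A 'complex model': N copies of the same block stacked in sequence.
    Adding layers adds parameters, but the algorithm stays one short loop."""
    for w in weights:
        x = block(x, w)
    return x

# Doubling the depth doubles the parameter count...
shallow = model([1.0, 2.0], weights=[0.5] * 4)
deep = model([1.0, 2.0], weights=[0.5] * 8)
# ...yet the algorithmic description is unchanged: "apply block N times".
print(shallow, deep)
```

The point of the sketch is that the deeper model is "bigger" (more repeated parts, more parameters) without containing any new kind of step, which is the sense in which model complexity and algorithmic complexity come apart.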