So the whole chip is a complicated lens that can somehow perform multiplication using ‘analogue computation’.
We tricked the same rocks we use for doing math into bending light like glass, and we use that for doing math, yes.
Imo, analog computation is the way forward with this whole AI thing. It seems like a waste to perform calculations bit-by-bit when neural nets are generally okay with “fuzzy math” anyway.
Idk, maybe. But I think you may have issues with tolerances and reproducibility. With analog and neural nets you're going to have edge cases where some devices give vastly differing outcomes. For some applications that's fine, but not for others.
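To make the tolerance point concrete, here's a toy sketch (not any real hardware model; the noise levels are made up) of a dot product where each multiply picks up device-dependent gain error, the way an analog multiplier would:

```python
import random

def analog_dot(weights, inputs, noise_sigma):
    """Dot product where every multiply is scaled by a random
    device-dependent gain error, mimicking an analog multiplier."""
    total = 0.0
    for w, x in zip(weights, inputs):
        total += w * x * (1.0 + random.gauss(0.0, noise_sigma))
    return total

random.seed(0)
w = [0.5, -1.2, 0.8]
x = [1.0, 0.3, -0.7]

# Exact digital answer: 0.5 - 0.36 - 0.56 = -0.42
exact = sum(wi * xi for wi, xi in zip(w, x))

# Two "devices" with different tolerances give different answers:
low_noise = analog_dot(w, x, noise_sigma=0.01)
high_noise = analog_dot(w, x, noise_sigma=0.10)
```

A well-trained net may shrug off the 1% device; whether the 10% device is acceptable depends entirely on the application, which is the point above.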
I don’t want fuzzy math anywhere near autonomous armed machines. You want ED-209? Because that’s how you get ED-209.
human brains are the epitome of fuzzy math machines
Yeah so nothing will change
Digital is also analog.
Automobile analogy: there is no replacement for displacement… until there is?
Computers are starting to use staggering amounts of electricity. There is a trade-off here between the utility of the tasks they perform and the climate damage caused by generating all the electricity they need. Bitcoin mining alone is estimated to use around 2% of America’s electricity, which seems an especially egregious waste of energy.
Radically diminishing computers’ electricity requirements as they become more powerful should be seen as an urgent task.
Aren’t modern computers using way less energy per unit of work than before? We just keep using more compute faster than the per-unit energy decreases?
Yes and yes