Will A.I. Take Over The World?
Posted: Sun Jan 15, 2012 6:42 pm
Hi,
I've heard that scientists have managed to simulate half of a mouse's brain using a massive supercomputer. From this we can do some rough estimates...
A mouse's brain weighs about 0.4 grams, so half a mouse's brain is about 0.2 grams. An average human's brain is around 1500 grams. This implies that (if weight alone is a reasonable indicator) simulating a human's brain would take around 7500 times the processing power of simulating half a mouse's brain; or about 7500 massive supercomputers.
We all (should) know that twice as many processors doesn't mean twice as much processing - there are scalability problems, interconnects, etc. It'd be much more reasonable to assume you'd need 10000 massive supercomputers to get the amount of processing needed. Then, if you assume that the software used for the mouse brain was only "good" and could be improved by a factor of 10 (which I think is unlikely, but why not), that gets us down to 1000 massive supercomputers to simulate a human brain.
But simulating an entire human brain would be overkill. It's fair to assume that about half of a real human's brain is wasted on things like where they left their car keys, what they ate for breakfast, remembering their spouse's birthday, breathing, etc. You'd probably be able to replace a human with half a simulated human brain. This means we're only looking at 500 massive supercomputers to replace one human.
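Just to make the chain of guesses explicit, here it is as a quick Python sketch (every factor here is one of the rough guesses above, not a measured value):

# Back-of-envelope: how many "massive supercomputers" to replace one human?
half_mouse_brain_g = 0.2     # half a mouse's brain, in grams
human_brain_g = 1500.0       # an average human's brain, in grams

naive = human_brain_g / half_mouse_brain_g   # scale by weight alone: 7500.0
with_overhead = 10000                        # rounded up for scalability losses
improved = with_overhead / 10                # assume 10x better software: 1000.0
to_replace_one_human = improved / 2          # assume half a brain is enough: 500.0

print(naive, with_overhead, improved, to_replace_one_human)
# 7500.0 10000 1000.0 500.0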
Of course the "massive supercomputer" was actually an IBM Blue Gene with 4096 processors (running at 700 MHz) and 1 TiB of RAM. If we need 500 of these then we'd be looking at a computer with 2048000 CPUs and 500 TiB of RAM (definitely not the average notebook you'd get from Walmart).
If Moore's law continues to be a reasonable guide (in its most optimistic reading, twice as many transistors every year), and we start from this year with 16 CPUs and 8 GiB of RAM; then we might expect "2048000 CPU" systems in about 17 years, since going from 16 CPUs to 2048000 CPUs is about 17 doublings. Unfortunately this isn't a correct interpretation of Moore's law - twice as many transistors doesn't equate to twice as much processing power. A more realistic guide would be "twice as much processing power every 3 years". In that case we're looking at 51 years before we get enough processing power to simulate a human brain in "commodity" computers.
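For anyone who wants to check the arithmetic, the number of doublings is easy to compute (the 16 CPU and 2048000 CPU figures are the guesses from above):

import math

cpus_now = 16
cpus_needed = 500 * 4096                       # 500 machines x 4096 CPUs = 2048000

doublings = math.log2(cpus_needed / cpus_now)  # log2(128000), about 17
print(round(doublings))                        # ~17 years at one doubling per year
print(round(doublings) * 3)                    # ~51 years at one doubling every 3 years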
However, no sane person thinks transistor counts or processing power can continue to increase indefinitely. Even Gordon Moore has stated (in 2005) that it can't continue forever. There are bound to be physical limits (e.g. the speed of electrons/light), and the closer we get to those limits the harder further improvement is going to be. The thing is, we've already reached the "maximum clock speed" limit - only a few CPU manufacturers have managed to get up to 6 GHz, and for a practical CPU (with good yields, etc) the limit is closer to 4 GHz. You'll notice that Intel managed to get close to 4 GHz with the Pentium 4 (back around 2004/2005) and hasn't been able to improve clock speed much since. The next big problem is heat.
To reduce heat the normal approach is to reduce clock speed, because power consumption rises faster than linearly with clock speed (lowering the clock also lets you lower the voltage). For example, rather than having 100 CPUs at 1 GHz it's better to have 400 CPUs at 500 MHz and do (in theory) twice as much processing for the same amount of heat. That's why the Blue Gene CPUs were only running at 700 MHz. To pack enough processing power into a small enough size, you'd probably need to reduce CPU speed by a factor of 4 (e.g. down to about 200 MHz) and use 4 times as many CPUs (e.g. about 8000000 CPUs); and then use water cooling and/or refrigeration to remove the heat quickly enough. Maybe we will have enough processing power to replace a human with A.I. in about 50 years (and they'd be about the size of a refrigerator, be twice as noisy and generate around 100 times more heat).
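Here's that trade-off as a rough model, assuming per-CPU dynamic power scales with the square of clock speed (a common simplification of "power ~ frequency x voltage squared, with voltage tracking frequency" - not a measured figure); the units are arbitrary:

def total_power(cpu_count, clock_ghz):
    # Assumed: per-CPU power proportional to clock speed squared.
    return cpu_count * clock_ghz ** 2

def total_throughput(cpu_count, clock_ghz):
    # Idealised: throughput proportional to aggregate clock cycles.
    return cpu_count * clock_ghz

# 100 CPUs at 1 GHz vs 400 CPUs at 500 MHz:
print(total_power(100, 1.0), total_throughput(100, 1.0))    # 100.0 100.0
print(total_power(400, 0.5), total_throughput(400, 0.5))    # 100.0 200.0 - same heat, twice the work

# 2048000 CPUs at 700 MHz vs 8000000 CPUs at 200 MHz:
print(total_power(2048000, 0.7), total_throughput(2048000, 0.7))  # ~1003520 ~1433600
print(total_power(8000000, 0.2), total_throughput(8000000, 0.2))  # ~320000 ~1600000

More CPUs at a lower clock give similar throughput for a fraction of the heat - the catch, as noted above, is that the software has to scale across all of them.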
So... it does seem plausible! BUT...
... it ignores the laws of supply & demand.
Some jobs are easy - for example, A.I. and robotics have already replaced a lot of the mundane/repetitive work in factories. As A.I. gets more advanced it's reasonable to assume it will replace more human workers, starting from other mundane/repetitive work and gradually moving up to more complicated work. While this is happening the demand for human workers will decrease, and therefore the cost of human workers will decrease. Eventually we'd reach a point where (for complex work) the cost of human workers is less than the cost of adequate A.I. If you take this to the extreme, you're going to have an oversupply of bored humans who are willing to work for free just for the challenge, where the cost of A.I. (including initial purchase, training, maintenance and power consumed) simply can't be justified.
The end result isn't A.I. taking over the world. The end result is that maybe in 50 years' time (and probably more like 100 years) A.I. will take away all the boring work that humans don't want to do anyway.
I'd also expect a decrease in the cost of goods and services, and a decrease in the time humans spend working (e.g. maybe for humans "full-time work" will be 20 hours per week instead of 38). I'd also expect a lot more online jobs (where people don't go to work, but work at home instead). Combined with an increase in online shopping we'll see a huge reduction in things like road traffic (and a downturn in things like car manufacturing and an increase in personal transport - moped, push bike, Segway). I'd also expect a huge increase in entertainment (more movies, more computer games and more porn), simply because people have more time to spend.
Cheers,
Brendan