Thursday, January 29, 2009


In Brief...


This enterprise is canceled for now. I've been into hardware engineering, and I see how much time it takes and how complicated it is to implement algorithms which are actually so simple. :)
Software is a much better approach for research into new architectures. Of course!


You know GeForce and GPUs? :) GPGPU stands for General-Purpose GPU. My turn back to CGI, and a friend of mine who's deep into it, reminded me that this is maybe a better tool for reaching high computing power if needed.

- AI, FPGA, GPGPU, GRID, Computing power???

Generally I don't think computing power is the problem of AI anyway. IMHO we've got more than enough. The problem is the lack of adequate human thinking power. All those zillions of instructions per second and, more importantly, all that MEMORY accessible to that computing power, are used to calculate boring static things like a zillion triangles (sorry, not that it's not cool), each of them resulting in something as trivial as a new triangle. Or a pixel, which results in another pixel. Or physics calculations, which are reduced to the movement of a barrel. Or a weather forecast, which is finally reduced to degrees Celsius.

Of course, if the algorithms running on these machines are static and do not evolve, they will not lead to anything new like intelligence.

We've got so much memory now, especially those guys with big heads and big supercomputers. The problem is us, not the hardware.

Con: You're dreaming! You're a mad scientist! Do you know how many neurons, each neuron, 5 zillion PETAFLOPS, blah-blah, brain, evolution, blah-blah. When we reach that computing power, blah-blah-blah! In 15-20 years! Blah!

- I agree that the cumulative abstract computing power of the Universe needed for a part of it to evolve into an AI certainly has to be reached, but I do not think that it is tied to local computing power. Oh, sorry, I will not argue. Yes, I am a mad scientist; watch me in Twenkid Studio's "The Lectures of Professor Matematikov" when it is screened on your TV or on selected Youtube channels. Bye now!
