Teramachine is the second milestone of the examachine research program. It features teraflop-scale incremental machine learning and embodies an integrated long-term memory that works in black-box fashion from the user’s point of view. I can really feel that this is high technology; it’s a far cry from those neural nets and k-NN’s and SVM’s. Now may be a good time to utter the slogan:

Algorithmic Probability Theory is to genetic programming what Statistical Learning Theory is to artificial neural networks

That is to say, ALP helps us place the essential work of constructing the right programs under the theoretical microscope and design the best ways to achieve it, rather than trying a gazillion ad hoc methods with no way to really gauge progress on the problems. I think that is one of the very reasons SVM’s became hugely popular, even beating hand-crafted neural net solutions in some cases. If we can reason about the generalization error, about how effective a method is and how hard it is to compute, then we can build a lot more on top of such machinery as it becomes a reliable tool. We can understand the performance, imagine ways to improve it, and find solutions; otherwise no solution seems preferable to any other, except for what some random experiments seem to suggest. Without a solid theoretical foundation, it’s a shot in the dark, really. There is a chance of hitting something, but not a big one.
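To make the slogan a bit more concrete, here are the two kinds of statements I have in mind, quoted in their standard textbook forms rather than anything specific to teramachine. Statistical learning theory gives uniform guarantees on generalization error; one classical VC-style bound says that, with probability at least 1 − δ, every hypothesis h in a class of VC dimension d trained on n samples satisfies

\[
R(h) \;\le\; \hat{R}_n(h) \;+\; \sqrt{\frac{d\left(\ln\frac{2n}{d}+1\right)+\ln\frac{4}{\delta}}{n}} .
\]

Algorithmic probability, on the other side of the slogan, gives Solomonoff’s universal prior over the outputs of a prefix universal machine U,

\[
\mathbf{M}(x) \;=\; \sum_{p \,:\, U(p)=x\ast} 2^{-|p|} ,
\]

whose convergence theorems bound the total expected prediction error by the complexity of the environment. That is the sense in which it licenses the same kind of a priori reasoning about program induction that the VC bound licenses about classifiers.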

I now have conclusive quantitative evidence that my Heuristic Algorithmic Memory design is effective. Is it only a matter of time, then, before teramachine reaches primary-school-level intelligence? I guess it is not that easy, since

  1. There are still many update algorithms whose design I have not finished, or whose implementation I have not even started.
  2. Designing training sequences is a difficult problem.
  3. Even making the induction system + transfer learning work well is no guarantee that your cognitive architecture will function well (the general flavor of such transfer is sketched right after this list).

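To give a flavor of what item 3 refers to, here is a toy illustration in OCaml of transfer learning through a guiding distribution. This is only an illustration of the general idea, not one of the actual HAM update algorithms, and every name in it is made up for the example: programs that solved earlier problems bias the search for programs on the next problem.

(* Toy illustration only, not HAM: transfer learning through a guiding
   distribution. Count how often each primitive occurs in programs that
   solved earlier problems, then sample candidate programs for the next
   problem from the resulting (smoothed) distribution. *)
module H = Hashtbl

type primitive = string
type program = primitive list

(* Counts of primitives seen in earlier solutions. *)
let counts : (primitive, int) H.t = H.create 97

(* Update step: fold a successful program into the counts. *)
let learn (solution : program) : unit =
  List.iter
    (fun p -> H.replace counts p (1 + (try H.find counts p with Not_found -> 0)))
    solution

(* Add-one smoothed probability of a primitive under the guiding distribution. *)
let prob (alphabet : primitive list) (p : primitive) : float =
  let total = H.fold (fun _ c acc -> acc + c) counts 0 in
  let c = try H.find counts p with Not_found -> 0 in
  float_of_int (c + 1) /. float_of_int (total + List.length alphabet)

(* Sample a candidate program of length n, biased toward primitives that
   occurred in earlier solutions. *)
let sample (alphabet : primitive list) (n : int) : program =
  let pick () =
    let r = Random.float 1.0 in
    let rec go acc = function
      | [] -> assert false
      | [ p ] -> p                              (* absorb floating-point slack *)
      | p :: rest ->
        let acc = acc +. prob alphabet p in
        if r < acc then p else go acc rest
    in
    go 0.0 alphabet
  in
  List.init n (fun _ -> pick ())

let () =
  Random.self_init ();
  learn [ "dup"; "add"; "add" ];                (* pretend this solved problem 1 *)
  sample [ "dup"; "add"; "swap"; "drop" ] 5     (* biased guesses for problem 2 *)
  |> String.concat " " |> print_endline

The real update algorithms are considerably more involved, but they share this shape: learn from past solutions, then let that knowledge reweight the search.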
Right now, I can declare that I have a working Phase 1 Alpha system of Solomonoff’s grand design. The objective is to implement Alpha in its full glory; I might call that Aleph, in reference to Halo. There are also some pending practical issues, such as finishing the port of the code to CUDA. I have had to work around the limitations of the CUDA architecture; in fact, the very first implementation simply crashed although there was nothing wrong with the code.

It seems I am having bad luck with systems code. In the last two weeks, ocamlmpi + openmpi prevented me from seeing errors (some exceptions were hidden and the program just seemed to hang; a sketch of the kind of wrapper that makes such failures visible follows below), I had to fix a memory corruption error in the ocaml gmp bindings, and I had to wrestle with InfiniBand drivers to get the parallel code to work. Furthermore, it turned out that I had butchered one of the update algorithms, probably intending to make a major change and then leaving it broken; I finally noticed it and fixed it.

Nevertheless, I have eliminated all of those minor hurdles, and the four synergistic update algorithms are now working together. With your permission, the teramachine is blasting off right now; there are so many cores to burn! Onwards to the petamachine!
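Here is that sketch of a workaround for the hidden-exception symptom: a minimal illustration, not actual teramachine code, of wrapping each MPI rank’s work so that an exception gets printed rather than silently swallowed. It assumes Xavier Leroy’s ocamlmpi bindings, i.e. a module Mpi exposing comm_rank and comm_world; everything else is made up for the example.

(* Minimal sketch, not teramachine code: wrap a rank's work so that any
   exception is printed with a backtrace and aborts the process, instead
   of being swallowed and leaving the other ranks waiting forever. *)
let run_rank (work : unit -> unit) : unit =
  Printexc.record_backtrace true;
  try work () with e ->
    let rank = Mpi.comm_rank Mpi.comm_world in
    Printf.eprintf "rank %d died: %s\n%!" rank (Printexc.to_string e);
    Printexc.print_backtrace stderr;
    exit 1  (* a non-zero exit that mpirun can notice, instead of a silent hang *)

(* Usage: run_rank (fun () -> ...the usual worker body for this rank...) *)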

In the following weeks, I might give some more details about the capabilities of teramachine and the next milestone.

Eray Özkural

Eray Özkural has obtained his PhD in computer engineering from Bilkent University, Ankara. He has a deep and long-running interest in human-level AI. His name appears in the acknowledgements of Marvin Minsky's The Emotion Machine. He has collaborated briefly with the founder of algorithmic information theory, Ray Solomonoff, and, in response to a challenge he posed, invented Heuristic Algorithmic Memory, which is a long-term memory design for general-purpose machine learning. Some other researchers have been inspired by HAM and call the approach "Bayesian Program Learning". He has designed a next-generation general-purpose machine learning architecture. He is the recipient of the 2015 Kurzweil Best AGI Idea Award for his theoretical contributions to universal induction. He has previously invented an FPGA virtualization scheme for Global Supercomputing, Inc. which was internationally patented. He has also proposed a cryptocurrency called Cypher, and an energy-based currency which can drive green energy proliferation. You may find his blog at https://log.examachine.net and some of his free software projects at https://github.com/examachine/.
