• Training neural nets

    From sean.c4s.vn@gmail.com@21:1/5 to All on Tue Aug 2 09:09:30 2016
    I thought of a way of training deep neural nets by evolving training pairs on a lower-dimensional manifold. That is a very abstract type of training, but my intuition is that it should work well and greatly reduce the computational effort required. We'll see how it turns out.
    Just arbitrarily, I was going to try with, say, images composed of 100 circles chosen by the evolutionary algorithm. I suppose fractals are a type of lower-dimensional manifold too, if there were some parameterized fractal you could use.
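    The circle idea above could be sketched like this: each image is fully determined by 100 circles (position, radius, intensity), so an evolutionary search only has to explore that low-dimensional parameter space rather than all pixel values. The post gives no implementation details, so the rendering scheme and the simple (1+1) mutation loop below are my own illustrative assumptions.

    ```python
    import numpy as np

    N_CIRCLES = 100   # assumed, per the post's "100 circles"
    IMG_SIZE = 64     # illustrative image resolution

    def render(params, size=IMG_SIZE):
        """Render an image from flat circle parameters (x, y, radius, intensity)."""
        img = np.zeros((size, size))
        ys, xs = np.mgrid[0:size, 0:size]
        for x, y, r, v in params.reshape(-1, 4):
            mask = (xs - x * size) ** 2 + (ys - y * size) ** 2 <= (r * size) ** 2
            img[mask] += v
        return np.clip(img, 0.0, 1.0)

    def mutate(params, rng, scale=0.02):
        """Gaussian perturbation of the circle parameters, kept in [0, 1]."""
        return np.clip(params + rng.normal(0.0, scale, params.shape), 0.0, 1.0)

    def evolve(target, steps=200, rng=None):
        """(1+1) evolution strategy: keep a mutation only if it lowers the error."""
        rng = rng or np.random.default_rng(0)
        best = rng.random(N_CIRCLES * 4)
        best_err = np.mean((render(best) - target) ** 2)
        for _ in range(steps):
            cand = mutate(best, rng)
            err = np.mean((render(cand) - target) ** 2)
            if err <= best_err:
                best, best_err = cand, err
        return best, best_err
    ```

    The point of the parameterization is that the search space is 400-dimensional regardless of image resolution, which is where the claimed reduction in computational effort would come from.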
    Off topic, I read the science news every day on phys.org and the general world news on bing.com. The science news has people working very constructively for a better life for everyone. The general world news is a depiction of utter lunacy.
    I find the contrast quite extraordinary.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From sean.c4s.vn@gmail.com@21:1/5 to All on Wed Aug 3 16:57:45 2016
    I did work out an alternative way using two different spreading transforms multiplied together, though I doubt that code will ever get completed.
    Instead I figured out a reasonable way to grow tree neurons, i.e. the neuron is a decision tree.
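    One way to read "the neuron is a decision tree" is that the unit's activation is computed by a small regression tree over its inputs instead of a weighted sum. The post doesn't say how the trees are grown, so the greedy variance-reducing split rule below is my assumption, not the author's method.

    ```python
    import numpy as np

    class TreeNeuron:
        """A unit whose activation is produced by a small regression tree (sketch)."""

        def __init__(self, max_depth=3, min_samples=2):
            self.max_depth = max_depth
            self.min_samples = min_samples
            self.tree = None

        def fit(self, X, y):
            self.tree = self._grow(X, y, depth=0)
            return self

        def _grow(self, X, y, depth):
            # Stop growing: emit a leaf holding the mean target value.
            if depth >= self.max_depth or len(y) < self.min_samples:
                return ("leaf", float(np.mean(y)))
            best = None  # (sum of squared errors, feature index, threshold)
            for j in range(X.shape[1]):
                for t in np.unique(X[:, j])[:-1]:
                    left, right = y[X[:, j] <= t], y[X[:, j] > t]
                    if len(left) == 0 or len(right) == 0:
                        continue
                    sse = (((left - left.mean()) ** 2).sum()
                           + ((right - right.mean()) ** 2).sum())
                    if best is None or sse < best[0]:
                        best = (sse, j, t)
            if best is None:
                return ("leaf", float(np.mean(y)))
            _, j, t = best
            mask = X[:, j] <= t
            return ("split", j, t,
                    self._grow(X[mask], y[mask], depth + 1),
                    self._grow(X[~mask], y[~mask], depth + 1))

        def activate(self, x):
            """Route one input vector down the tree to a leaf value."""
            node = self.tree
            while node[0] == "split":
                _, j, t, lo, hi = node
                node = lo if x[j] <= t else hi
            return node[1]
    ```

    Unlike a linear neuron, a single tree neuron can represent non-linearly-separable functions such as XOR, which is presumably part of the appeal.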
