Topic: Genetic Algorithms for AI DL

ToolTech
posted 3/23/2017 20:29
New method to improve backprop

 Article

keghn
posted 3/23/2017 20:59
What is HW?


ToolTech
posted 3/23/2017 21:27
HW stands for hardware: typically a GPU, an FPGA, or other hardware used to evaluate and train AI models.


keghn
posted 3/23/2017 22:47
I have my own style of genetic algorithm for NNs to use in an AGI infrastructure.

How does yours work?


ToolTech
posted 3/23/2017 23:01
Difficult to answer here. I have spent my spare time since 2013 developing an entire ecosystem of software to form a complete solution for deep genetic algorithms.

In short, I have found a mechanism that shortens training time and finds better solutions for sparse datasets.


keghn
posted 3/23/2017 23:49
Well, I consider myself a pretty good NN theorist.

The deep learning boom ran from 2013 to late 2015.

What you are saying is that you have a one-shot NN.

I should be able to understand yours, somewhat.


 One-Shot Learning - Fresh Machine Learning #1

ToolTech
posted 3/24/2017 00:10
The main thing I am working on is my findings about the duality between gradient descent and genetic search.

A link below:

 http://www.tooltech-software.com/cortex/genetic_simulation.pdf
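
Roughly, the flavor of it, as a toy numpy sketch of the general duality (not the exact construction in the PDF; the target and loss are made up):

    # Toy sketch of the duality: averaging random weight "mutations",
    # weighted by how each one changes the loss, reproduces the ordinary
    # gradient-descent direction.
    import numpy as np

    rng = np.random.default_rng(0)
    target = np.array([1.0, -2.0, 0.5])

    def f(w):                                    # toy quadratic loss
        return np.sum((w - target) ** 2, axis=-1)

    w = np.array([0.3, 0.7, -0.1])
    sigma, n = 0.05, 100_000

    eps = rng.standard_normal((n, 3))            # random mutations of the weights
    delta = f(w + sigma * eps) - f(w)            # fitness change per mutation
    genetic_grad = (delta @ eps) / (n * sigma)   # gradient estimated by random search
    analytic_grad = 2 * (w - target)             # gradient descent's exact direction

    print(genetic_grad)   # ~ [-1.4,  5.4, -1.2]
    print(analytic_grad)  #   [-1.4,  5.4, -1.2]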

ToolTech
posted 3/24/2017 00:18
The other part is that I view the NN in a different way: like a compiler generating assembler output.

 https://www.linkedin.com/pulse/introduction-deep-learning-assembly-language-anders-mod%C3%A9n
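
A toy illustration of the analogy (hypothetical weights; nothing here comes from the linked article): one dense layer flattened into multiply-accumulate pseudo-assembly.

    # Toy "NN compiler": flatten a dense layer into a list of
    # multiply-accumulate pseudo-assembly instructions.
    import numpy as np

    W = np.array([[0.5, -1.0], [2.0, 0.25]])   # made-up trained weights
    b = np.array([0.1, -0.2])

    def compile_layer(W, b):
        asm = []
        for i in range(W.shape[0]):             # one output neuron at a time
            asm.append(f"LOAD  acc, {b[i]}")    # start from the bias
            for j in range(W.shape[1]):
                asm.append(f"MAC   acc, x[{j}], {W[i, j]}")  # acc += x[j] * w[i][j]
            asm.append("RELU  acc")
            asm.append(f"STORE y[{i}], acc")
        return asm

    print("\n".join(compile_layer(W, b)))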

keghn
posted 3/24/2017 03:20
https://groups.google.com/forum/#!topic/comp.ai.neural-nets/96FECpvKr1w

Google NN forum

keghn
posted 3/24/2017 15:43
I believe I understand most of what you are saying. It just happens I am working on audio waves right now, in C/C++, for my AGI audio infrastructure project. The AGI vision infrastructure will come next.

So I can really relate to that sine wave example. I like assembly language, but by the time I had mastered 6502 assembly the field had moved on. C/C++ is so much more portable, and it has a long-lasting continuity that I like. I had it with real mode, flat mode, and DOS extenders.

So now I prefer Linux.

And I am still getting around to an inline assembler, if I find an easy tutorial.

So back to the sine wave example:

http://www.tooltech-software.com/cortex/genetic_simulation.pdf

Right now I am not recording with an RNN or LSTM; maybe later. I am just copying audio directly into memory and hitting it with FFT algorithms.

My approach to genetic NN algorithms is just like yours: I start with one little NN with random weights, then feed in the data and record the output data. All of this tiny-NN simulation would happen in a very small region of memory.
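
Something like this sketch, with made-up sizes (numpy stand-ins for the real audio path):

    # Sketch of the idea above: raw audio in a buffer, FFT features, then one
    # tiny random-weight NN whose outputs get recorded. Sizes are made up.
    import numpy as np

    rng = np.random.default_rng(1)

    sr = 16000
    t = np.arange(sr) / sr
    audio = np.sin(2 * np.pi * 440.0 * t)            # stand-in for recorded audio

    frame = audio[:1024]
    spectrum = np.abs(np.fft.rfft(frame))            # "hit it with FFT algorithms"
    features = spectrum / (spectrum.max() + 1e-9)    # 513 values, normalized

    # One little NN with random weights; just run it and record the output.
    W1 = rng.standard_normal((16, features.size)) * 0.1
    W2 = rng.standard_normal((4, 16)) * 0.1

    hidden = np.tanh(W1 @ features)
    output = np.tanh(W2 @ hidden)
    print(output)                                    # recorded for later comparison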



 nn function autoencoder

keghn
posted 3/24/2017 16:03
Then a clone of it would be made in a nearby memory area. One weight would be allowed to re-randomize and the others are pinned. A link is made so that I have a linked list to all of the mutated clones, and to the others that follow. A search would look through each one and test it with gradient descent.
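
A minimal sketch of that loop (a plain Python list stands in for the linked list, and I just score each clone's loss instead of a full gradient-descent test; the toy net and data are made up):

    # Clone-and-mutate scheme: copy the network, re-randomize a single weight,
    # keep the clones in a list, and keep the best one.
    import numpy as np

    rng = np.random.default_rng(2)

    def loss(w, x, y):
        return np.mean((np.tanh(w @ x) - y) ** 2)

    x = rng.standard_normal((8, 32))                 # toy batch: 8 inputs, 32 samples
    y = np.sin(x.sum(axis=0))                        # toy target
    parent = rng.standard_normal(8) * 0.1

    clones = []                                      # stands in for the linked list
    for _ in range(64):
        child = parent.copy()                        # clone into "nearby memory"
        i = rng.integers(len(child))
        child[i] = rng.standard_normal()             # re-randomize ONE weight, pin the rest
        clones.append(child)

    # The search walks the list and tests each candidate.
    best = min(clones, key=lambda w: loss(w, x, y))
    if loss(best, x, y) < loss(parent, x, y):
        parent = best
    print(loss(parent, x, y))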

In a real brain this approach would work really well, because NNs work like a decentralized CPU: everything runs in parallel.

https://www.youtube.com/watch?v=Bs4LwCjA12o

nn function autoencoder

AiHasBeenSolved
posted 3/25/2017 14:33
 
keghn wrote @ 3/24/2017 4:03:00 PM:
Then a clone of it would be made in a nearby memory area.

 
My AI Minds use the AudListen mind-module to send input into the auditory memory.

 Perl Mind Programming Journal

ToolTech
posted 3/25/2017 17:04
Are you interested in my framework? I am thinking of letting it out to a few beta testers.


keghn
posted 3/25/2017 23:08
I find your framework interesting.


keghn
posted 3/28/2017 16:08
Evolution Strategies as a Scalable Alternative to Reinforcement Learning


https://blog.openai.com/evolution-strategies/
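
Their core update, as a minimal numpy sketch (a toy reward stands in for an RL episode return; sizes and hyperparameters are made up):

    # Minimal evolution-strategies update in the style of the OpenAI post:
    # perturb the parameters, weight each perturbation by its reward, step.
    import numpy as np

    rng = np.random.default_rng(3)

    def reward(w):
        return -np.sum((w - 3.0) ** 2)               # toy reward, maximized at w == 3

    w = np.zeros(5)
    npop, sigma, alpha = 50, 0.1, 0.02

    for gen in range(300):
        eps = rng.standard_normal((npop, 5))         # one perturbation per "individual"
        R = np.array([reward(w + sigma * e) for e in eps])
        A = (R - R.mean()) / (R.std() + 1e-9)        # normalize returns
        w += alpha / (npop * sigma) * (eps.T @ A)    # ES gradient estimate

    print(w)                                         # close to [3, 3, 3, 3, 3]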


ToolTech
posted 3/28/2017 17:03
Yes, I saw that. It's actually very similar to what I have been working on since last year; I have published a lot of details about it since 2016.


DisAlteraVisum
posted 4/3/2017 07:49
A huge factor of DNA is the proteins that it produces. Another is epigenetic effects. Try taking into account which proteins are produced at which times rather than the DNA structure. Then work your way back to the DNA and whether it is switched on.


ToolTech
posted 4/3/2017 14:03
Off topic; not at all what we are talking about. We are talking about deep learning and neural networks.


ToolTech
posted 4/3/2017 19:34
http://tooltech-software.com/cortex/genetic_simulation.pdf


keghn
posted 4/5/2017 18:00
DNA is a four-symbol number system; I store each base as a 4-bit value, a nibble. I work with it a lot for testing data compression.
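
For example, a toy nibble packer (the names and base codes here are my own, not a standard):

    # Toy nibble packer for DNA text, two bases per byte. Codes are nonzero
    # so that an empty low nibble can mark odd-length padding.
    BASE = {"A": 0x1, "C": 0x2, "G": 0x4, "T": 0x8}      # one nibble per base
    INV = {v: k for k, v in BASE.items()}

    def pack(seq: str) -> bytes:
        out = bytearray()
        for i in range(0, len(seq), 2):
            hi = BASE[seq[i]]
            lo = BASE[seq[i + 1]] if i + 1 < len(seq) else 0
            out.append((hi << 4) | lo)
        return bytes(out)

    def unpack(data: bytes) -> str:
        seq = []
        for b in data:
            seq.append(INV[b >> 4])
            if b & 0x0F:
                seq.append(INV[b & 0x0F])
        return "".join(seq)

    packed = pack("GATTACA")          # 7 chars -> 4 bytes
    assert unpack(packed) == "GATTACA"
    print(packed.hex())               # 41881210, half the size of the ASCII text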


@ToolTech I have seen that in the neural network world there are encoders, perceptrons, and decoders.

In the post above there is a decoder that does a sine wave: input data is transformed into output data.

What do you think of perceptrons, used to detect something and then push out an activation on a dedicated channel? The way I envision a simple detector is an NN that detects the height of a sine wave. Let us say the sine wave bounces between 0 and 1 and the net can be trained to detect 0.5. This perceptron would have one input and eleven outputs, for:
0.0
0.1
0.2
0.3
0.4
0.5
0.6
0.7
0.8
0.9
0.999999999999
When 0.5 is detected it would be trained to generate a 0.99999 on output nerve ending number five (counting from zero). All of the other outputs would be less than 0.999999999.
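
A toy version of that detector (my own sizes and training setup; softmax over the eleven outputs):

    # One-input / eleven-output detector: train it so the output whose index
    # matches round(10 * x) goes high. Toy sizes, not a prescription.
    import numpy as np

    rng = np.random.default_rng(4)

    # A hidden layer so the net can carve [0, 1] into eleven bins.
    W1, b1 = rng.standard_normal((32, 1)), np.zeros(32)
    W2, b2 = rng.standard_normal((11, 32)) * 0.1, np.zeros(11)
    lr = 0.5

    for step in range(3000):
        x = rng.random((1, 64))                    # heights sampled from [0, 1)
        k = np.clip(np.round(10 * x[0]), 0, 10).astype(int)
        T = np.eye(11)[k].T                        # one-hot targets, shape (11, 64)

        h = np.tanh(W1 @ x + b1[:, None])
        z = W2 @ h + b2[:, None]
        p = np.exp(z - z.max(axis=0)); p /= p.sum(axis=0)   # softmax

        dz = (p - T) / x.shape[1]                  # cross-entropy gradient
        dh = (W2.T @ dz) * (1 - h ** 2)
        W2 -= lr * dz @ h.T;  b2 -= lr * dz.sum(axis=1)
        W1 -= lr * dh @ x.T;  b1 -= lr * dh.sum(axis=1)

    h = np.tanh(W1 @ np.array([[0.5]]) + b1[:, None])
    print(np.argmax(W2 @ h + b2[:, None]))         # expect 5: the "0.5" output fires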



