VectorLinux
Topic: Hopfield experimentation

Triarius Fidelis
« on: January 02, 2009, 07:05:22 am »

What with my interest in artificial intelligence, I wrote a Hopfield network implementation early this morning, and tested it.
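
For anyone curious, the core of the model is tiny. Here's a minimal sketch of the standard Hebbian-storage Hopfield net (an illustration using numpy, not my actual code):

Code:
import numpy as np

def train(patterns):
    # Hebbian/outer-product storage of a list of +/-1 numpy vectors.
    n = patterns[0].size
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / n

def recall(W, state, sweeps=5):
    # Asynchronous updates until the state (hopefully) settles into an attractor.
    s = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(s.size):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s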

Well, the first thing I learned is that I'll have to rewrite my Python module in C if I want to use it on any substantial data, because it's slow as a pig for anything but small images. That being said, I started out with one image:

[image: the stored pattern]
When I asked the Hopfield network for the same, it gave me:

[image: the recalled pattern]
...which, when negated, is the same as the original. I remember reading that Hopfield networks have a tendency to remember the inverse of their pattern. So far so good.
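
That checks out mathematically, by the way: the network's energy function E(s) = -1/2 * s^T * W * s satisfies E(-s) = E(s), so the negation of every stored pattern is an equally deep attractor, and the network settles into whichever of the two is closer to its starting state.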

The next thing I tried was recalling one of several stored images, which proved more difficult. After making a poor selection of Elder Futhark runes, I attempted to remember Ansuz and got this:

[image: garbled recall of Ansuz]
Messed up. Looks like Thurisaz and something else. My selection of runes was too self-similar. I tried again, consciously trying to pick out a distinct set: Raidho, Algiz, Sowilo, Tiwaz, Dagaz and Othala. I tried to remember Tiwaz.

[image: recall attempt at Tiwaz]
Closer, but it looks like Othala ruined the image. After removing Othala:

[image: recall of Tiwaz with Othala removed]
"I know a twelfth ..."

I tried Sowilo, but Tiwaz jinxed that, and after a while I figured out that my network could only remember Raidho and Tiwaz together reliably. Here's Raidho:

[image: recall of Raidho]
Well that experiment kind of sucked, but it showed me that, especially if I have monochrome images, the originals shouldn't look too similar. I'm going to try with Chinese characters next time, and introduce distortion if that next experiment turns out to be successful. Stay tuned!!

rbistolfi
« Reply #1 on: January 02, 2009, 08:59:12 am »

Awesome, staying tuned...

PS: Dude, pass some wisdom :)

Triarius Fidelis
« Reply #2 on: January 03, 2009, 02:31:58 am »

Alright, I learned more today. The first thing is that reimplementing in C a data structure for which almost every operation is O(n^2) does not necessarily save much time or space.
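
To put numbers on it: my net has 4,096 neurons, so the weight matrix alone has 4,096^2 = 16,777,216 entries, and a full recall pass touches every one of them. C shrinks the constant factor, but the n^2 is still there.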

Also, my latest round was successful, if agonizingly slow, because the images were very distinct in terms of how many bits would be 1 if they were XORed together. Here's what the Hopfield network was presented with:

[image: the stored patterns]
Perfect recall (I checked) of the latter, even with 25% noise:

[image: recall from a 25%-noised input]
Awesome.
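
(25% noise here just means flipping a random quarter of the bits before recall. A sketch of the idea, numpy assumed, not my actual driver code:)

Code:
import numpy as np

def add_noise(pattern, fraction=0.25):
    # Flip a randomly chosen `fraction` of the +/-1 bits.
    noisy = pattern.copy()
    flips = np.random.choice(noisy.size, int(noisy.size * fraction), replace=False)
    noisy[flips] *= -1
    return noisy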

My next neural network experiment will be with backpropagation, which is much less computationally expensive.

By the way, here's the actual code I used for the Hopfield net implementation:

http://paste.lisp.org/display/72998

I'd show you the driver code, but it is exceedingly ugly.

I'd also like to figure out how 'distinct' images should be (using logical XOR as my measuring stick as mentioned earlier) for the network to recall them effectively.
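
(The XOR measure is just the Hamming distance between two patterns, i.e. the number of bit positions where they differ. Something like this, as a sketch:)

Code:
import numpy as np

def hamming(a, b):
    # Count positions where two patterns differ (the popcount of a XOR b).
    return int(np.sum(a != b))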

wcs
« Reply #3 on: January 29, 2009, 03:13:11 am »

Very cool...
Now and then I experiment with neural networks, but not much. I'd like to take it more seriously... I'll be starting with a crappy model that was proposed for some psychological data, and I suspect it will fail miserably when confronted with the data I've gathered... but let's see. Sometimes the results can be quite surprising.

I've used Matlab for it before, but it can be a pain with all the registration hassle. I need to have the CD in the drive just to launch the program, and the like, so I'd be interested in some open-source alternatives (ideally R or Python).

How do you do them in Python? Are you programming the whole thing, or is there some module for it?

Triarius Fidelis
« Reply #4 on: January 29, 2009, 01:58:52 pm »

If you click on the link in my previous post, it takes you to the source code I personally wrote for the Hopfield model. I believe there are already Python implementations of back-propagating neural networks; in fact, you should Google bpnn.
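
The core of a one-hidden-layer backprop net is short, too. A rough sketch of one training step (sigmoid units, plain gradient descent on squared error, biases omitted, numpy assumed; an illustration, not any particular library's code):

Code:
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, target, W1, W2, lr=0.1):
    # Forward pass through hidden and output layers.
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)
    # Backpropagate the error through the sigmoid derivatives.
    delta_out = (y - target) * y * (1 - y)
    delta_hid = (W2.T @ delta_out) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * np.outer(delta_out, h)
    W1 -= lr * np.outer(delta_hid, x)
    return y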

I learned how to implement the Hopfield network model from the book Neural Network Architectures by Judith Dayhoff. I recommend it because, while it lacks any code, it has reasonably detailed intermediate-level descriptions of neural networks that hand-wave enough to be very readable, but not so much that they are useless for implementation. In fact, the book was very useful for implementation; it's not hard to finagle a proper algorithm out of the author's descriptions. If you want to go further and read all the gory details about gradient descent and the like, there are plenty of books that cover those too.

wcs
« Reply #5 on: January 30, 2009, 08:17:15 am »

Thank you! I'll have a look at your code and bpnn.
I would only need to implement a backprop network with one hidden layer, so it looks like it won't be much of a problem...
I would need to train it with lots of patterns, though, so it might be quite slow...

Triarius Fidelis
« Reply #6 on: January 30, 2009, 09:44:22 am »

In my experience, the number of input patterns isn't the issue so much as the size of the network. On the other hand, I believe there is an upper bound on how many input patterns you can have for a BPNN of a given size, but I'm not sure. I remember that for a Hopfield network, you can present it with something like 0.15 * n input patterns, where n is the number of neurons. Don't quote me on anything with regard to BPNNs.
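
By that figure, my 4,096-neuron net should top out at around 0.15 * 4096 ≈ 614 stored patterns, at least in theory.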

wcs
« Reply #7 on: January 30, 2009, 09:57:54 am »

In my case, I would need at least around 200-300 nodes in the input and output layers... the number of hidden units is to be determined, but should be more than 100 or so. Then the training would be on (at least) 3,000 patterns until the performance was good enough... I don't know if this qualifies as "big", though.
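
(Counting connections, that's roughly 300 * 100 + 100 * 300 = 60,000 weights at the upper end, before any biases, with every one updated on every pattern presentation.)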

Triarius Fidelis
« Reply #8 on: January 30, 2009, 05:49:46 pm »

That's not a lot. My Hopfield network used 4,096 neurons.