I've always been fascinated by Neural Networks - the basic principles are simple and, more importantly, intuitive. Yet the final result is complex and rarely explainable. For some time, Neural Networks were the best learning method for complex, highly non-linear problems. That is no longer the case: there are now many other learning methods - kernel-based Support Vector Machines, Bayesian methods and others - each better suited to different problems. Despite my subjective liking for Neural Networks, I must admit that I have never found a problem that was better solved with Neural Networks than with, for example, Support Vector Machines. Still, there are tons of problems out there and almost as many different types of Neural Networks, so in many cases it's worth considering Neural Networks as one of the available problem-solving options.
Andrew Kirillov has published an interesting Neural Network library on CodeProject, which includes several important network architectures and learning algorithms (Back Propagation, Kohonen Self-Organizing Map, Elastic Network, Delta Rule Learning, Perceptron Learning). It seems to me like a good way to better understand how Neural Networks work, and maybe to integrate them into applications that require solving complex problems. Check it out!
P.S.: I have never heard of a similar library for Support Vector Machines. Maybe when I have some time, I will implement something myself and publish it. I don't promise anything - time is currently the resource I lack the most...