Tuesday, October 11, 2011

Open research problems with Evolving Connectionist Systems

I described Evolving Connectionist Systems (ECoS) in an earlier post. A couple of years ago, I published a review article (PDF preprint) in which I described the state of the art of ECoS and identified several open research problems. There hasn't been much progress made on solving these problems, so I'm going to briefly describe them here and hopefully stimulate a bit more work in this area. Of course, I'm doing a bit of work on some of these myself, but as I have a real job to do, I don't get as much time to spend on them as I'd like.

1) Input significance. With other ANNs, especially the venerable MLP, it is possible to get an indication of how important each input variable is to the model. These methods are based on an analysis of the magnitudes of the connection weights attached to each input neuron. Such methods won't work with ECoS networks, however, because the connection weights of an ECoS represent points in the input space. That is, the magnitude of the weight on an input neuron's connection has nothing to do with how important that input is.
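To illustrate the kind of weight-magnitude analysis that works for an MLP but not for an ECoS, here is a minimal sketch of Garson-style input importance for a single-hidden-layer, single-output MLP. The function name and the toy weight matrices are my own for illustration; this is one of several published weight-based methods, not a specific ECoS technique.

```python
import numpy as np

def garson_importance(W_ih, W_ho):
    """Relative importance of each input, from weight magnitudes.

    W_ih: (n_inputs, n_hidden) input-to-hidden weights
    W_ho: (n_hidden,) hidden-to-output weights (single output neuron)
    Returns an array of importances that sums to 1.
    """
    # contribution of input i through hidden neuron j
    c = np.abs(W_ih) * np.abs(W_ho)
    # each input's share of each hidden neuron's total incoming magnitude
    r = c / c.sum(axis=0, keepdims=True)
    imp = r.sum(axis=1)
    return imp / imp.sum()

# Toy example: input 0 carries much larger weights than input 1,
# so it is reported as more important.
W_ih = np.array([[2.0, 2.0],
                 [0.1, 0.1]])
W_ho = np.array([1.0, 1.0])
importance = garson_importance(W_ih, W_ho)
```

The point of the problem above is precisely that this reasoning fails for ECoS: an evolving-layer weight is a coordinate of an exemplar, so a large magnitude just means the exemplar sits far from the origin along that dimension.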

2) Optimisation of ECoS networks. While ECoS algorithms are fast learning, the networks can grow to be quite large, which makes them expensive in terms of memory and computational load. Ideally, it would be possible to reduce their size without sacrificing their accuracy. That is, it would be ideal if we could somehow eliminate redundant information in the ECoS and retain only that which is necessary for maintaining accuracy. I investigated a couple of methods of doing this in my PhD, and a few other people have looked at it as well, but no one has yet come up with an optimisation algorithm that will significantly reduce the size of a trained ECoS network without significantly reducing its accuracy. Also, the most effective optimisation methods in the published work use evolutionary algorithms such as genetic algorithms or evolution strategies. These are so computationally intensive that the speed advantages of ECoS are lost. An ECoS optimisation algorithm would ideally be as fast, or nearly as fast, as the ECoS training algorithm. It may be that this is inherently impossible.
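One cheap (if crude) direction is to merge evolving-layer neurons whose input weight vectors, i.e. the points in input space they represent, lie close together. The sketch below is a greedy single-pass aggregation of my own devising for illustration; the function name and threshold parameter are assumptions, and published results suggest this kind of simple merging does cost accuracy.

```python
import numpy as np

def aggregate_nodes(centres, threshold):
    """Greedily merge evolving-layer neurons whose input weight
    vectors (rows of `centres`) lie within `threshold` of an
    already-kept neuron; the kept centre becomes the running mean
    of the neurons merged into it."""
    merged, counts = [], []
    for c in centres:
        for i, m in enumerate(merged):
            if np.linalg.norm(c - m) < threshold:
                counts[i] += 1
                merged[i] = m + (c - m) / counts[i]  # incremental mean
                break
        else:
            merged.append(np.array(c, dtype=float))
            counts.append(1)
    return np.array(merged)

# Two nearly coincident exemplars collapse into one; the distant
# exemplar survives, so three neurons reduce to two.
centres = np.array([[0.0, 0.0], [0.05, 0.0], [1.0, 1.0]])
reduced = aggregate_nodes(centres, threshold=0.2)
```

A single pass like this runs in time comparable to ECoS training itself, which is the speed target named above; whether any method this cheap can preserve accuracy is exactly the open question.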

3) Non-triangular fuzzy membership functions in EFuNN. The Evolving Fuzzy Neural Network (EFuNN) has triangular fuzzy membership functions (MF) embedded in its structure. These are fast and efficient to compute, but other MF types (such as Gaussian) may be better suited to some applications.
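For readers unfamiliar with the two MF shapes, here is a minimal side-by-side sketch. The function names and parameters are my own; the triangular form matches the standard definition with feet at a and c and peak at b.

```python
import numpy as np

def triangular_mf(x, a, b, c):
    """Triangular MF: zero outside [a, c], peak of 1 at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def gaussian_mf(x, centre, sigma):
    """Gaussian MF: peak of 1 at `centre`, smooth tails everywhere."""
    return np.exp(-((x - centre) ** 2) / (2.0 * sigma ** 2))

# At its peak, each MF returns full membership of 1.
tri_peak = triangular_mf(0.5, 0.0, 0.5, 1.0)
gau_peak = gaussian_mf(0.5, 0.5, 0.1)
# Outside its support the triangular MF is exactly zero,
# while the Gaussian is small but never reaches zero.
tri_tail = triangular_mf(1.5, 0.0, 0.5, 1.0)
gau_tail = gaussian_mf(1.5, 0.5, 0.1)
```

The trade-off is visible in the code: the triangular MF is two subtractions, two divisions, and a clamp, while the Gaussian needs an exponential, but its smooth, never-zero tails can be more useful when gradient information or graded similarity matters.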

4) Learning in the MF of EFuNN. The fuzzy MF in EFuNN are fixed; that is, they are set once and do not change during the life of the EFuNN. This is in contrast to the open, adaptive nature of EFuNN itself. An extension of the EFuNN learning algorithm that allowed the MF to adapt as the rest of the network adapts would be extremely useful for data mining applications. Such an algorithm would have to be as fast as the rest of the EFuNN learning algorithm, which may rule out backpropagation training of the MF, as is used in other fuzzy system optimisation.
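As a sketch of what a backpropagation-free MF update might look like, here is a simple one-pass rule that nudges an MF centre toward each incoming example, scaled by how strongly that example activates the MF. This is purely an illustrative idea of my own, not part of the EFuNN algorithm; the function name and learning rate are assumptions.

```python
def adapt_centre(centre, x, membership, lr=0.1):
    """One-pass MF centre update: move the centre toward the new
    example `x`, with step size proportional to the example's
    membership degree. Cost is one multiply-add per example,
    matching the speed of one-pass ECoS-style learning."""
    return centre + lr * membership * (x - centre)

# An example at 1.0 with full membership pulls a centre at 0.5
# a fraction of the way toward it.
new_centre = adapt_centre(0.5, 1.0, membership=1.0, lr=0.1)
```

A rule of this shape keeps the per-example cost constant, which is the requirement stated above; whether such a local update converges to useful MF placements is, of course, part of the open problem.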

Although ECoS networks are very useful algorithms, they could be made even more useful if the problems above were solved. I'm working on some of them, but I would love to see others working on them as well. Contact me if you are interested in collaborating.
