A Comparison of Learning Rules for Mixed Order Hyper Networks

Kevin Swingler

2015

Abstract

A mixed order hyper network (MOHN) is a neural network in which weights can connect any number of neurons, rather than the usual two. MOHNs can be used as content addressable memories with higher capacity than standard Hopfield networks. They can also be used for regression, clustering, classification, and as fitness models for use in heuristic optimisation. This paper presents a set of methods for estimating the values of the weights in a MOHN from training data. The methods are compared with one another and with a standard MLP trained by back propagation; the MOHN is found to be faster to train than the MLP and more reliable, because its error function contains no local minima.
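The abstract's description of a MOHN as a network whose weights connect arbitrary subsets of neurons can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical illustration (not the paper's implementation): it evaluates an output of the form f(x) = sum_k w_k * prod_{i in S_k} x_i, where each weight w_k attaches to a subset S_k of neurons and neuron states are +/-1. The function and variable names are assumptions made here for clarity; estimating the w_k values from training data is the learning task the paper's rules address.

import numpy as np

# Evaluate f(x) = sum_k w_k * prod_{i in S_k} x_i for a mixed order network.
# 'weights' is a list of (index_tuple, value) pairs; the empty tuple is a bias.
def mohn_output(x, weights):
    return sum(value * np.prod(x[list(indices)]) for indices, value in weights)

# Example: a bias, a second-order (pairwise) weight and a third-order weight.
weights = [((), 0.5), ((0, 1), -1.0), ((0, 1, 2), 2.0)]
x = np.array([1, -1, 1])          # neuron states in {-1, +1}
print(mohn_output(x, weights))    # 0.5 + 1.0 - 2.0 = -0.5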

Paper Citation


in Harvard Style

Swingler K. (2015). A Comparison of Learning Rules for Mixed Order Hyper Networks. In Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015) ISBN 978-989-758-157-1, pages 17-27. DOI: 10.5220/0005588000170027

in Bibtex Style

@conference{ncta15,
author={Kevin Swingler},
title={A Comparison of Learning Rules for Mixed Order Hyper Networks},
booktitle={Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015)},
year={2015},
pages={17-27},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005588000170027},
isbn={978-989-758-157-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015)
TI - A Comparison of Learning Rules for Mixed Order Hyper Networks
SN - 978-989-758-157-1
AU - Swingler K.
PY - 2015
SP - 17
EP - 27
DO - 10.5220/0005588000170027