University of Maryland Libraries, Digital Repository at the University of Maryland (DRUM)

    Dynamic Attractors and Basin Class Capacity in Binary Neural Networks

    TR_95-82.pdf (1.711Mb)

    Date
    1995
    Author
    Dayhoff, Judith E.
    Palmadesso, Peter J.
    Abstract
The wide repertoire of attractors and basins of attraction that appear in dynamic neural networks not only serves as a model of brain activity patterns but also creates possibilities for new computational paradigms that use attractors and their basins. To develop such computational paradigms, it is first critical to assess a neural network's capacity for attractors and for differing basins of attraction, as a function of the number of neurons and the weights. In this paper we analyze the attractors and basins of attraction of recurrent, fully connected, single-layer binary networks. We use the network transition graph, a graph that shows all transitions from one state to another for a given neural network, to display all oscillations and fixed-point attractors along with their basins of attraction. Conditions are given under which pairs of transitions are possible in the same neural network. We derive a lower bound of 2^(n^2 - n) on the number of possible transition graphs for an n-neuron network. Simulation results show a wide variety of transition graphs and basins of attraction, and networks sometimes have more attractors than neurons. We count thousands of basin classes (networks with differing basins of attraction) in networks with as few as five neurons. Dynamic networks show promise for overcoming the limitations of static neural networks through the use of dynamic attractors and their basins. We show that dynamic networks have high capacity for basin classes, can have more attractors than neurons, and have more stable basin boundaries than the Hopfield associative memory.
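The transition-graph construction described in the abstract can be sketched briefly. The following is a minimal illustration, not the paper's own code: it assumes a synchronous sign-threshold update s' = sgn(W s) over states in {-1, +1}^n with tie-break sgn(0) = +1, and a hypothetical 3-neuron weight matrix; the paper's exact update rule and networks may differ.

```python
import itertools

def transition_graph(W):
    """Map every state in {-1, +1}^n to its successor under a
    synchronous sign-threshold update s' = sgn(W s).  This update
    rule (and the tie-break sgn(0) = +1) is an assumption made for
    illustration."""
    n = len(W)
    step = {}
    for s in itertools.product([-1, 1], repeat=n):
        v = [sum(W[i][j] * s[j] for j in range(n)) for i in range(n)]
        step[s] = tuple(1 if x >= 0 else -1 for x in v)
    return step

def attractor_of(step, s):
    """Follow transitions from s until a state repeats; the repeating
    cycle is the attractor (a fixed point or an oscillation)."""
    seen = []
    while s not in seen:
        seen.append(s)
        s = step[s]
    return frozenset(seen[seen.index(s):])

# Hypothetical 3-neuron weight matrix (not taken from the paper).
W = [[0, 1, -1],
     [-1, 0, 1],
     [1, -1, 0]]
step = transition_graph(W)

# Group the 2^n states into basins, keyed by their attractor.
basins = {}
for s in step:
    basins.setdefault(attractor_of(step, s), []).append(s)
print(len(basins), "attractors; basin sizes:",
      sorted(len(b) for b in basins.values()))
```

For this particular weight matrix the eight states split into two basins, one draining to the fixed point (+1, +1, +1) and one to a three-state oscillation, illustrating how the transition graph exposes both fixed-point and oscillatory attractors together with their basins.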
    URI
    http://hdl.handle.net/1903/5665
    Collections
    • Institute for Systems Research Technical Reports

    DRUM is brought to you by the University of Maryland Libraries
    University of Maryland, College Park, MD 20742-7011 (301)314-1328.
    Please send us your comments.
    Web Accessibility
