
AdaptiveResonance

A Julia package for Adaptive Resonance Theory (ART) algorithms.


This package is developed and maintained by Sasha Petrenko with sponsorship by the Applied Computational Intelligence Laboratory (ACIL). This project is supported by grants from the Night Vision Electronic Sensors Directorate, the DARPA Lifelong Learning Machines (L2M) program, Teledyne Technologies, and the National Science Foundation. The material, findings, and conclusions here do not necessarily reflect the views of these entities.

Please read the documentation for detailed usage and tutorials.

Contents

  • Overview
  • Installation
  • Quickstart
  • Implemented Modules
  • Structure
  • Contributing
  • History
  • Credits
  • License

Overview

Adaptive Resonance Theory (ART) is a neurocognitive theory of how recurrent cellular networks can learn distributed patterns without supervision. As a theory, it provides coherent and consistent explanations of how real neural networks learn patterns through competition, and it predicts the phenomena of attention and expectation as central to learning. In engineering, the theory has been applied to a myriad of algorithmic models for unsupervised machine learning, though it has been extended to supervised and reinforcement learning frameworks. This package provides implementations of many of these algorithms in Julia for both scientific research and engineering applications. Basic installation is outlined in Installation, while a quickstart is provided in Quickstart. Detailed usage and examples are provided in the documentation.

Installation

This project is distributed as a Julia package, available on JuliaHub. Its usage follows the usual Julia package installation procedure, interactively:

] add AdaptiveResonance

or programmatically:

using Pkg
Pkg.add("AdaptiveResonance")

You may also add the package directly from GitHub to get the latest changes between releases:

] add https://github.com/AP6YC/AdaptiveResonance.jl

Quickstart

Load the module with

using AdaptiveResonance

The stateful information of ART modules is stored in structs with default constructors, such as

art = DDVFA()

You can pass module-specific options during construction with keyword arguments such as

art = DDVFA(rho_ub=0.75, rho_lb=0.4)

For more advanced users, options for the modules are contained in Parameters.jl structs. These options can be set with keyword arguments before being passed to instantiate the model:

opts = opts_DDVFA(rho_ub=0.75, rho_lb=0.4)
art = DDVFA(opts)
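
Training and inference follow the same pattern across modules. Below is a minimal sketch, assuming the package's convention of data matrices with features along rows and samples along columns, and using hypothetical random data purely for illustration:

using AdaptiveResonance

# Instantiate an unsupervised DDVFA module with default options
art = DDVFA()

# Hypothetical data: 4 features, 100 samples, already in [0, 1]
X = rand(4, 100)

# Train in unsupervised mode; returns the cluster label of each sample
y_hat_train = train!(art, X)

# Infer cluster labels for new samples without further learning
y_hat = classify(art, rand(4, 10))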

Implemented Modules

This project has implementations of the following ART (unsupervised) and ARTMAP (supervised) modules (a supervised training sketch follows the list):

  • ART
    • DDVFA: Distributed Dual Vigilance Fuzzy ART
    • DVFA: Dual Vigilance Fuzzy ART
    • GNFA: Gamma-Normalized Fuzzy ART
  • ARTMAP
    • SFAM: Simplified Fuzzy ARTMAP
    • FAM: Fuzzy ARTMAP
    • DAM: Default ARTMAP
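
The ARTMAP modules are trained with integer labels alongside the data. The following sketch, again with hypothetical random data and the same features-by-samples convention, illustrates the supervised workflow with SFAM and the exported performance method:

using AdaptiveResonance

# Instantiate a Simplified Fuzzy ARTMAP module
art = SFAM()

# Hypothetical labeled data: 4 features, 100 samples, two classes
X_train = rand(4, 100)
y_train = rand(1:2, 100)

# Train in supervised mode with the integer labels
train!(art, X_train, y_train)

# Classify held-out samples and compute the classification accuracy
X_test = rand(4, 20)
y_test = rand(1:2, 20)
y_hat = classify(art, X_test)
perf = performance(y_hat, y_test)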

In addition to these modules, this package contains the following accessory methods:

  • ARTSCENE: the ARTSCENE algorithm's multiple-stage filtering process is implemented as artscene_filter. Each filter stage is exported if further granularity is required.
  • performance: classification accuracy is implemented as performance
  • complement_code: complement coding is implemented with complement_code. However, the training and classification methods complement code their inputs automatically unless they are passed preprocessed=true (see the sketch after this list).
  • linear_normalization: the first step of complement coding, linear_normalization normalizes input arrays to the range [0, 1].
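
Complement coding concatenates each normalized sample x with its complement 1 .- x, doubling the feature dimension. A minimal sketch of manual preprocessing, assuming the data_setup! helper and the config keyword described in the package documentation:

using AdaptiveResonance

# Hypothetical raw data: 4 features, 100 samples, values outside [0, 1]
X = 10 .* rand(4, 100)

art = DDVFA()

# Assumption: the module's data configuration must be set up before
# passing preprocessed data
data_setup!(art.config, X)

# Normalize each feature to [0, 1] and complement code to 8 features
X_cc = complement_code(X, config=art.config)

# Train on preprocessed data, skipping internal complement coding
train!(art, X_cc, preprocessed=true)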

Structure

The following file tree summarizes the project structure:

AdaptiveResonance
├── .github/workflows       // GitHub: workflows for testing and documentation.
├── data                    // Data: CI data location.
├── docs                    // Docs: documentation for the module.
│   └───src                 //      Documentation source files.
├── examples                // Source: example usage scripts.
├── src                     // Source: majority of source code.
│   ├───ART                 //      ART-based unsupervised modules.
│   ├───ARTMAP              //      ARTMAP-based supervised modules.
│   └───CVI                 //      Cluster validity indices.
├── test                    // Test: Unit, integration, and environment tests.
├── .appveyor               // Appveyor: Windows-specific coverage.
├── .gitignore              // Git: .gitignore for the whole project.
├── LICENSE                 // Doc: the license to the project.
├── Project.toml            // Julia: the Pkg.jl dependencies of the project.
└── README.md               // Doc: this document.

Contributing

Please raise an issue.

History

  • 7/10/2020 - Begin project.
  • 11/3/2020 - Complete baseline modules and tests.
  • 2/8/2021 - Formalize usage documentation.

Credits

Authors

  • Sasha Petrenko

Software

Adaptive Resonance Theory has been developed in theory and in application by many research groups since its conception, so this project was not developed in a vacuum. It is built upon the wisdom and precedent of decades of previous ART work in a variety of programming languages, and the code in this repository is inspired by prior open-source ART implementations.

Datasets

Boilerplate clustering datasets are periodically used to test, verify, and provide examples of the functionality of the package:

  1. UCI machine learning repository: http://archive.ics.uci.edu/ml

  2. Fundamental Clustering Problems Suite (FCPS): https://www.uni-marburg.de/fb12/arbeitsgruppen/datenbionik/data?language_sync=1

  3. Datasets package: https://www.researchgate.net/publication/239525861_Datasets_package

  4. Clustering basic benchmark: http://cs.uef.fi/sipu/datasets

License

This software is openly maintained by the ACIL of the Missouri University of Science and Technology under the MIT License.
