Update README.md #499

Closed · wants to merge 3 commits
82 changes: 47 additions & 35 deletions README.md
@@ -16,83 +16,95 @@ Issue tracker at [issues.numenta.org](https://issues.numenta.org/browse/NPC).

For more detailed documentation, see the [OPF wiki page](https://github.com/numenta/nupic/wiki/Online-Prediction-Framework).

__Encoders__ turn raw values into sparse distributed representations (SDRs). A good encoder will capture the semantics of the data type in the SDR using overlapping bits for semantically similar values.
*__Encoders__* turn raw values into sparse distributed representations (SDRs). A good encoder will capture the semantics of the data type in the SDR using overlapping bits for semantically similar values.

__Models__ take sequences of SDRs and make predictions. The CLA is implemented as an OPF model.
*__Models__* take sequences of SDRs and make predictions. The CLA is implemented as an OPF model.

__Metrics__ take input values and predictions and output scalar representations of the quality of the predictions. Different metrics are suitable for different problems.
*__Metrics__* take input values and predictions and output scalar representations of the quality of the predictions. Different metrics are suitable for different problems.

__Clients__ take input data and feed it through encoders, models, and metrics and store or report the resulting predictions or metric results.
*__Clients__* take input data and feed it through encoders, models, and metrics and store or report the resulting predictions or metric results.
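
As a rough illustration of the encoder idea (not part of this PR's diff), the Python sketch below encodes three scalar values and compares how many active bits they share; nearby values overlap heavily, distant values barely at all. The `ScalarEncoder` import path and constructor parameters are assumptions based on NuPIC's Python API of this era and may need adjusting against the current code.

    # Illustrative only: assumes NuPIC's ScalarEncoder API; adjust imports/params as needed.
    import numpy
    from nupic.encoders import ScalarEncoder

    # 21 active bits out of 256 total, covering the range 0-100 (illustrative parameters).
    encoder = ScalarEncoder(w=21, minval=0, maxval=100, n=256)

    a = encoder.encode(20.0)   # SDR for 20
    b = encoder.encode(22.0)   # SDR for a semantically similar value
    c = encoder.encode(80.0)   # SDR for a distant value

    # Count shared active bits: similar values overlap heavily, distant ones barely.
    print "overlap(20, 22):", int(numpy.sum(a * b))
    print "overlap(20, 80):", int(numpy.sum(a * c))

A model then consumes the stream of such SDRs, and a metric scores its predictions.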

## Installation

For all installation options, see the [Getting Started](https://github.com/numenta/nupic/wiki/Getting-Started) wiki page.

Currently supported platforms:
#### Currently supported platforms:

* Linux (32/64bit)
* Mac OSX
* Raspberry Pi (ARMv6)
* [VM images](https://github.com/numenta/nupic/wiki/Running-Nupic-in-a-Virtual-Machine)

Dependencies:
#### Dependencies:

* Python (2.6-2.7) (with development headers)
* GCC (4.6-4.8), or Clang
* Make
* Make or any IDE supported by CMake (Visual Studio, Eclipse, XCode, KDevelop, etc)

The dependencies are included in platform-specific repositories for convenience:

* [nupic-linux64](https://github.com/numenta/nupic-linux64) for 64-bit Linux systems
* [nupic-darwin64](https://github.com/numenta/nupic-darwin64) for 64-bit OS X systems

Add the following to your .bashrc file. Change the paths as needed.

# Installation path
export NTA=$HOME/nta/eng
# Target source/repo path. Defaults to $PWD
export NUPIC=/path/to/repo
# Convenience variable for temporary build files
export BUILDDIR=/tmp/ntabuild
# Number of jobs to run in parallel (optional)
export MK_JOBS=3

# Set up the rest of the necessary env variables. Must be done after
# setting $NTA.
source $NUPIC/env.sh

The complete set of Python requirements is documented in [requirements.txt](/external/common/requirements.txt),
compatible with [pip](http://www.pip-installer.org/en/latest/cookbook.html#requirements-files):

pip install -r external/common/requirements.txt

Build and install NuPIC:
## Build and test NuPIC:

### Using command line

#### Generate build files:

mkdir (source)/build
**Member:**

Can we explicitly define what (source) means here? Or maybe we can add a line above that specifies that all commands are being run from the nupic checkout directory, and assume that's where we are. Then we don't even have to use a reference to the source code dir and can make all the commands relative.

**Contributor Author:**

You're right Matt, (source) [or maybe (source dir)] is the nupic checkout directory itself (ex: /Volumes/DavidRagazzi/Desktop/nupic-master), not a literal value. I'm avoiding env variables because that saves a newbie from having to read CMakeLists or shell scripts to find definitions. Your idea is good; we could add a line saying that (source dir) is the location where the user downloaded the repository.

PS: You could choose another name for (source), such as (source dir), (repo dir), (source folder), etc., and other symbols to enclose it ('[', '{', etc.). (source) is the first name that came to my mind because it is extremely simple.. hehe. Maybe "source dir" is better; "source" is a bit generic.

**Member:**

@Davidragazzi Are you suggesting they create a build directory inside the source repository? Currently we don't recommend that. Could it be (source)/../build?

**Member:**

What do you think about keeping the existing $NUPIC instead of (source)? My concern is: current users are used to it, and all existing documentation (mails, wiki, ...) refers to it. Also, $XX makes sense from the shell point of view, while (source) does not. Just my 2c.

**Contributor Author:**

I understand your concern.

But if we continue to use env variables, we have two problems:

  • Portability: We remain dependent on the shell, which is not portable. To tell the truth, I really don't like Windows, but I recognise that most systems are based on it (unfortunately the curse of 90% market share remains true), and some developers love Visual Studio, which works only on it.
  • Not-so-useful complexity: We would have to create a shell file setting variables just for 4 or 5 shell commands. CMake creates env variables, but only for runtime purposes, i.e. so that Python files know where the SWIG modules and the Python path are.

BUT...........
we could simply add one more line, such as:
export NUPIC=path/to/source

This way we would have:
export NUPIC=/path/to/source
mkdir $NUPIC/build_system
cd $NUPIC/build_system
cmake $NUPIC

However, if we want Windows in the near future, we would have to port these same instructions to:
set NUPIC=/path/to/source
mkdir %NUPIC%/build_system
cd %NUPIC%/build_system
cmake %NUPIC%

In my opinion, this would increase complexity. I really don't know if the price is fair.. :-(

**Member:**

no, I just meant to use the term "$NUPIC" (which we are used to) instead of "(source)" in the readme.

**Contributor Author:**

ah.. ok..

**Contributor Author:**

Update:
$NUPIC and $NTA will remain, but they will be created by CMake, not shell scripts. After creation, users will be able to change $NTA and point it to wherever they want.

**Member:**

Just to be clear... if I don't have cmake installed, will make work instead? Then $NUPIC and $NTA will get created in a make file?

**Contributor Author:**

> if I don't have cmake installed, will make work instead? Then $NUPIC and $NTA will get created in a make file?

Actually, $NUPIC and $NTA are created by the CMake script itself. There's a CMake command called "execute_process()" with which we can call any shell command without needing to create .sh files for this purpose (this is why I insisted on removing the shell scripts, as they are not portable [even across UNIX-like OSes]).

Note that CMake doesn't build anything; it simply generates (nicely) the rule scripts for whichever build tool you choose (no matter if Make, Visual Studio, Eclipse, etc.). So the repository won't have any Makefile or configure.ac (to our delight!), but they will be generated by the CMake script ONLY IF the user wants to generate them from the command line (Make is the default generator when CMake is called from the shell, but one can also choose Ninja as the build tool, which builds faster).

So CMake is mandatory but Make is optional.

I'm not a Travis expert, but I believe the Travis script should first call CMake to generate the Make (or Ninja) files and then build the project from those generated scripts.

Update:

A simple Travis script that I adapted from the internet (http://stackoverflow.com/questions/13051880/is-there-an-example-project-in-c-that-uses-opencv-and-travis-ci):

env:
  global:
    - NUPIC=$TRAVIS_BUILD_DIR/source
    - NTA=$TRAVIS_BUILD_DIR/release

...

install:
  - if [ $PY_VERSION != "2.6" ]; then (cd nupic-linux64/ && mkdir -p lib/python${PY_VERSION}/site-packages && make > /dev/null); fi
  # Workaround for multiprocessing.Queue SemLock error from run_opf_bechmarks_test.
  # See: travis-ci/travis-cookbooks#155
  - "sudo rm -rf /dev/shm && sudo ln -s /run/shm /dev/shm"
  - pip install -q -r $NUPIC/external/common/requirements.txt
  - mkdir $TRAVIS_BUILD_DIR/build_system
  - cd $TRAVIS_BUILD_DIR/build_system
  - cmake $NUPIC
  - make

Note that this line:
  - "$NUPIC/build.sh"

was replaced by this:
  - mkdir $TRAVIS_BUILD_DIR/build_system
  - cd $TRAVIS_BUILD_DIR/build_system
  - cmake $NUPIC
  - make

Assuming that $TRAVIS_BUILD_DIR is the repository root folder (right?), it's just a matter of replacing that text with the corresponding lines in the current Travis script.

cd (source)/build
ccmake ../(source)
**Member:**

Wouldn't a simple cmake be a better first choice for the "configure" step? ccmake looks like some "visual" addition to cmake, and it was asking me some stuff and I didn't even know how to make it just run.

**Contributor Author:**

You are right. My bad, the correct command is cmake, not ccmake.


#### Build:

cd (source)/build
make

#### Run the C++ tests:

cd (source)/bin
htmtest
testeverything

### Using graphical interface

$NUPIC/build.sh
#### Generate the IDE solution:

NuPIC should now be installed in $NTA!
* Open the CMake executable.
* Specify the source folder.
* Specify the build folder, i.e. where the IDE solution will be created (ex: (source)/build).
* Click 'Generate'.
* Choose the IDE that interests you (remember that the IDE choice is limited by your OS, i.e. Visual Studio is available only in CMake for Windows).

## Try it out!
#### Build:

### Tests
* Open the 'Nupic.*proj' solution file located in the build folder.
* Run 'ALL_BUILD' project from your IDE.

Run the C++ tests:
#### Run the C++ tests:

$NTA/bin/htmtest
$NTA/bin/testeverything
* Run 'HtmTest' and 'TestEverything' projects from your IDE (check 'output' panel to see the results).

Run the Python unit tests:
### Run the Python unit tests:

cd $NTA
./bin/run_tests.sh
cd (source)/bin
run_tests.sh

### Examples

You can run the examples using the OpfRunExperiment OPF client:

python $NUPIC/examples/opf/bin/OpfRunExperiment.py $NUPIC/examples/opf/experiments/multistep/hotgym/
python (source)/examples/opf/bin/OpfRunExperiment.py $NUPIC/examples/opf/experiments/multistep/hotgym/

There are also some sample OPF clients. You can modify these to run your own
data sets. One example is the hotgym prediction client:

python $NUPIC/examples/opf/clients/hotgym/hotgym.py
python (source)/examples/opf/clients/hotgym/hotgym.py
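
To give a flavour of what such a client does (illustrative rather than part of this PR's diff), here is a heavily abbreviated Python sketch of a hotgym-style prediction loop: create a model, declare the predicted field, and feed it one record at a time. The `model_params` module, the field name, and the inference key are assumptions drawn from the bundled example and may differ in the current code.

    # Hypothetical, abbreviated OPF client loop in the spirit of hotgym.py.
    # model_params is assumed to be the parameters module shipped next to the example.
    from datetime import datetime
    from nupic.frameworks.opf.modelfactory import ModelFactory

    import model_params

    model = ModelFactory.create(model_params.MODEL_PARAMS)
    model.enableInference({"predictedField": "consumption"})

    # In the real client these records are read from the example's CSV data file.
    records = [
        (datetime(2010, 7, 2, 0, 0), 21.2),
        (datetime(2010, 7, 2, 1, 0), 16.4),
    ]
    for timestamp, consumption in records:
        result = model.run({"timestamp": timestamp, "consumption": consumption})
        # One-step-ahead prediction for the predicted field.
        print result.inferences["multiStepBestPredictions"][1]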

Also check out other uses of the CLA on the [Getting Started](https://github.com/numenta/nupic/wiki/Getting-Started#next-steps) wiki page.