style/travis: Allow travis to build on container based system #47

Merged: 1 commit, Feb 21, 2016
.travis.yml: 13 changes (8 additions, 5 deletions)
@@ -1,3 +1,4 @@
sudo: false
language: rust
rust:
- nightly
Expand All @@ -23,20 +24,22 @@ script:
travis-cargo build &&
travis-cargo test &&
travis-cargo bench &&
travis-cargo doc
travis-cargo --only stable doc
addons:
apt:
packages:
- libcurl4-openssl-dev
- libelf-dev
- libdw-dev
- libblas-dev
install:
- sudo apt-get update
- sudo apt-get install fglrx opencl-headers
- fglrx
- opencl-headers
- binutils-dev
- nvidia-opencl-dev

after_success:
- travis-cargo doc-upload
- travis-cargo coveralls --no-sudo
- travis-cargo coveralls --no-sudo --verify
notifications:
email:
on_success: never
Expand Down
Cargo.toml: 6 changes (2 additions, 4 deletions)
@@ -16,10 +16,8 @@ license = "MIT"
 [dependencies]
 phloem = "~0.3.0"
 collenchyma = "= 0.0.3"
-
-log = "0.3.2"
-
-clippy = { version = "0.0.23", optional = true }
+log = "~0.3.2"
+clippy = { version = "0.0.41", optional = true }
 
 [features]
 default = []
src/lib.rs: 12 changes (9 additions, 3 deletions)
@@ -18,7 +18,7 @@
 //! The Network defines the entire model, by defining the hirarchical structure of layers from
 //! bottom to top. At execution time, the Network passes the data, flowing through the Network,
 //! from one layer to the next. The output of one layer is the input for the layer on top. On a
-//! backward pass, the Network passes the deriviates inverted through the Network.
+//! backward pass, the Network passes the derivatives inverted through the Network.
 //!
 //! Layers, the building block of a Leaf Network, are small units, describing computation over
 //! numerical input data. Generally speaking Layers take input and produce an output, but
@@ -37,8 +37,8 @@
 //! memory management and synchronization.
 //!
 //! The learning and optimization of the Network happens at the [Solver][solver] and is decoupled
-//! from the Network making the setup clean and flexibel. One of the four layer types is a Loss
-//! Layer, which is used for the interaction of Network and Solver. The Network procudes the loss
+//! from the Network making the setup clean and flexible. One of the four layer types is a Loss
+//! Layer, which is used for the interaction of Network and Solver. The Network produces the loss
 //! and gradients, which the Solver uses to optimize the Network through parameter updates. Beside
 //! that, the Solver provides housekeeping and other evaluations of the Network. All operation
 //! on the Solver happen through Collenchyma, therefore can be executed on Cuda, OpenCL or native
@@ -116,6 +116,12 @@
         unsafe_code,
         unused_import_braces, unused_qualifications)]
 
+// used when run with cargo test --no-run --features clippy
+// or cargo build --features clippy
+#![cfg_attr(feature="clippy", feature(plugin))]
+#![cfg_attr(feature="clippy", plugin(clippy))]
+#![cfg_attr(feature="clippy", deny(clippy, clippy_pedantic))]
+
 #[macro_use]
 extern crate log;
 extern crate phloem;
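
For context, the crate documentation touched above describes the architecture in prose: data flows bottom to top through a stack of layers, the output of one layer is the input of the next, the backward pass sends derivatives through the layers in reverse order, and the Solver uses the loss and gradients to update parameters. The following is a minimal Rust sketch of that flow; the Layer trait, Network and Solver types, and the plain f32 buffers are illustrative assumptions for this example only, not Leaf's actual API (which operates on Collenchyma-backed blobs).

// Hypothetical sketch, not Leaf's actual API: illustrates the forward/backward
// flow and the Solver role described in the crate documentation above.

/// A layer maps an input buffer to an output buffer and can propagate gradients back.
trait Layer {
    fn forward(&self, input: &[f32]) -> Vec<f32>;
    fn backward(&self, output_gradient: &[f32]) -> Vec<f32>;
}

/// A network is an ordered stack of layers; data flows bottom to top.
struct Network {
    layers: Vec<Box<dyn Layer>>,
}

impl Network {
    /// Forward pass: the output of one layer becomes the input of the next.
    fn forward(&self, input: Vec<f32>) -> Vec<f32> {
        self.layers
            .iter()
            .fold(input, |data, layer| layer.forward(&data))
    }

    /// Backward pass: derivatives travel through the layers in reverse order.
    fn backward(&self, loss_gradient: Vec<f32>) -> Vec<f32> {
        self.layers
            .iter()
            .rev()
            .fold(loss_gradient, |grad, layer| layer.backward(&grad))
    }
}

/// The Solver drives learning: it runs the network, takes the gradient of the
/// loss, and would use the propagated gradients to update network parameters.
struct Solver {
    learning_rate: f32,
}

impl Solver {
    fn step(&self, network: &Network, input: Vec<f32>, loss_gradient: Vec<f32>) -> Vec<f32> {
        let _output = network.forward(input);
        // In the real framework the returned gradients feed a parameter update
        // (e.g. SGD); here they are only scaled and returned to show the flow.
        network
            .backward(loss_gradient)
            .iter()
            .map(|g| g * self.learning_rate)
            .collect()
    }
}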