
Complete unittest for trainer_config_helpers. #108

Merged (3 commits, Sep 28, 2016)

Commits on Sep 27, 2016

  1. Complete unittest of TrainerConfigHelper

    * Refine img_conv_op
    * Add conv_shift layer
    * Add tensor layer, refine selective fc
    * Try fix some styles
    * simple rnn layers
    * Add image layers
    * util_layers
    * Use collections.Sequence instead of list|tuple
    * BugFix for print_layer
    * Unittest for poolings.
    * Add test_grumemory_layer
    * add last_first_seq to unittest
    * Add test expand layer, test ntm layers
    * add *args for outputs
    * Add trans layer
    * Add test for hsigmoid
    * also add interface to set param attr
    * sampling_id, linear_comb
    * ctc_layer, crf_layer
    * cost_layers
    * beam_search
    reyoung committed Sep 27, 2016
    Commit SHA: 6343d4a
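
    To make the scope of this commit concrete: trainer_config_helpers tests are plain Python config files that build a network out of the helper layers and hand the result to outputs(). Below is a minimal sketch of such a config covering two of the layers touched above (conv_shift, tensor); the sizes, settings values, and exact keyword arguments are illustrative assumptions, not the test code added in this commit.

        # Minimal sketch of a trainer_config_helpers test config.
        # Layer names follow the commit bullets above; sizes and keyword
        # arguments are illustrative assumptions.
        from paddle.trainer_config_helpers import *

        settings(batch_size=100, learning_rate=1e-4)

        a = data_layer(name='a', size=100)
        b = data_layer(name='b', size=100)
        kernel = data_layer(name='kernel', size=5)

        # conv_shift and tensor layers were added/refined in this commit;
        # outputs() accepts *args as of this change.
        outputs(
            conv_shift_layer(a=a, b=kernel),
            tensor_layer(a=a, b=b, size=100),
        )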

Commits on Sep 28, 2016

  1. Merge branch 'master' into unittest_for_helpers

    Conflicts:
    	python/paddle/trainer_config_helpers/layers.py
    reyoung committed Sep 28, 2016
    Commit SHA: 96df7e5
  2. Follow comments

    * Fix a pre-existing unittest bug:
      * dotmul inputs' sizes should be equal
    * add equation to recurrent_layer
    * unify parameter names: x,y and x1,x2 => a,b
    * log a fatal error when lstm/gru layers are given a wrong size
    reyoung committed Sep 28, 2016
    Commit SHA: 584c0fc
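
    Two notes on the bullets above, both paraphrases rather than quotes from the code: the equation added to recurrent_layer is the usual simple-RNN recurrence, out_t = act(in_t + out_{t-1} * W); and the lstm/gru size check reflects that lstmemory/grumemory derive their size from the input projection (4 gate blocks for LSTM, 3 for GRU), so an explicitly passed size that disagrees with the input now fails loudly instead of being silently accepted. A rough sketch of that kind of check follows; the helper name and message are hypothetical, not the code in layers.py.

        import logging

        logger = logging.getLogger(__name__)

        # Hypothetical sketch of the size validation described above:
        # the input projection feeding an lstmemory layer carries 4 gate
        # blocks, so a user-supplied size must equal input_size / 4.
        def check_lstm_size(input_size, size):
            if size is not None and input_size != 4 * size:
                logger.fatal("lstmemory size should be input_size / 4: "
                             "got size=%s, input_size=%s", size, input_size)
                raise ValueError("wrong lstmemory size")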