make mattest #6305

Open
hanxiaoyu opened this issue Mar 20, 2018 · 8 comments
hanxiaoyu commented Mar 20, 2018

When I run `make mattest`, the problem appears as:

cd matlab; /usr/local/MATLAB/R2016b/bin/matlab -nodisplay -r caffe.run_tests(), exit()

                            < M A T L A B (R) >
                  Copyright 1984-2016 The MathWorks, Inc.
                   R2016b (9.1.0.441655) 64-bit (glnxa64)
                             September 7, 2016

 
To get started, type one of these: helpwin, helpdesk, or demo.
For product information, visit www.mathworks.com.
 
Cleared 0 solvers and 0 stand-alone nets
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0320 17:58:08.510360  2344 net.cpp:51] Initializing net from parameters: 
name: "testnet"
force_backward: true
state {
  phase: TRAIN
  level: 0
}
layer {
  name: "data"
  type: "DummyData"
  top: "data"
  top: "label"
  dummy_data_param {
    data_filler {
      type: "gaussian"
      std: 1
    }
    data_filler {
      type: "constant"
    }
    num: 5
    num: 5
    channels: 2
    channels: 1
    height: 3
    height: 1
    width: 4
    width: 1
  }
}
layer {
  name: "conv"
  type: "Convolution"
  bottom: "data"
  top: "conv"
  param {
    decay_mult: 1
  }
  param {
    decay_mult: 0
  }
  convolution_param {
    num_output: 11
    pad: 3
    kernel_size: 2
    weight_filler {
      type: "gaussian"
      std: 1
    }
    bias_filler {
      type: "constant"
      value: 2
    }
  }
}
layer {
  name: "ip"
  type: "InnerProduct"
  bottom: "conv"
  top: "ip"
  inner_product_param {
    num_output: 13
    weight_filler {
      type: "gaussian"
      std: 2.5
    }
    bias_filler {
      type: "constant"
      value: -3
    }
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip"
  bottom: "label"
  top: "loss"
}
I0320 17:58:08.510404  2344 layer_factory.hpp:77] Creating layer data
I0320 17:58:08.510412  2344 net.cpp:84] Creating Layer data
I0320 17:58:08.510417  2344 net.cpp:380] data -> data
I0320 17:58:08.510424  2344 net.cpp:380] data -> label
I0320 17:58:08.510435  2344 net.cpp:122] Setting up data
I0320 17:58:08.510442  2344 net.cpp:129] Top shape: 5 2 3 4 (120)
I0320 17:58:08.510447  2344 net.cpp:129] Top shape: 5 1 1 1 (5)
I0320 17:58:08.510448  2344 net.cpp:137] Memory required for data: 500
I0320 17:58:08.510450  2344 layer_factory.hpp:77] Creating layer conv
I0320 17:58:08.510457  2344 net.cpp:84] Creating Layer conv
I0320 17:58:08.510462  2344 net.cpp:406] conv <- data
I0320 17:58:08.510468  2344 net.cpp:380] conv -> conv
I0320 17:58:08.716840  2344 net.cpp:122] Setting up conv
I0320 17:58:08.716861  2344 net.cpp:129] Top shape: 5 11 8 9 (3960)
I0320 17:58:08.716864  2344 net.cpp:137] Memory required for data: 16340
I0320 17:58:08.716887  2344 layer_factory.hpp:77] Creating layer ip
I0320 17:58:08.716897  2344 net.cpp:84] Creating Layer ip
I0320 17:58:08.716902  2344 net.cpp:406] ip <- conv
I0320 17:58:08.716907  2344 net.cpp:380] ip -> ip
I0320 17:58:08.717002  2344 net.cpp:122] Setting up ip
I0320 17:58:08.717006  2344 net.cpp:129] Top shape: 5 13 (65)
I0320 17:58:08.717010  2344 net.cpp:137] Memory required for data: 16600
I0320 17:58:08.717013  2344 layer_factory.hpp:77] Creating layer loss
I0320 17:58:08.717017  2344 net.cpp:84] Creating Layer loss
I0320 17:58:08.717034  2344 net.cpp:406] loss <- ip
I0320 17:58:08.717036  2344 net.cpp:406] loss <- label
I0320 17:58:08.717042  2344 net.cpp:380] loss -> loss
I0320 17:58:08.717051  2344 layer_factory.hpp:77] Creating layer loss
I0320 17:58:08.717181  2344 net.cpp:122] Setting up loss
I0320 17:58:08.717187  2344 net.cpp:129] Top shape: (1)
I0320 17:58:08.717190  2344 net.cpp:132]     with loss weight 1
I0320 17:58:08.717221  2344 net.cpp:137] Memory required for data: 16604
I0320 17:58:08.717226  2344 net.cpp:198] loss needs backward computation.
I0320 17:58:08.717231  2344 net.cpp:198] ip needs backward computation.
I0320 17:58:08.717233  2344 net.cpp:198] conv needs backward computation.
I0320 17:58:08.717236  2344 net.cpp:200] data does not need backward computation.
I0320 17:58:08.717239  2344 net.cpp:242] This network produces output loss
I0320 17:58:08.717243  2344 net.cpp:255] Network initialization done.
[The same net-initialization log is printed a second time (timestamps 17:58:08.743…), identical to the block above.]
Running caffe.test.test_net
..W0320 17:58:09.747437  2344 net.hpp:41] DEPRECATED: ForwardPrefilled() will be removed in a future version. Use Forward().
..[The net-initialization log is printed twice more during the test run (timestamps 17:58:09.836… and 17:58:09.841…), identical to the block above.]
.
Done caffe.test.test_net
__________

Attempt to restart MATLAB? [y or n]>>

After I enter y:

       Segmentation violation detected at Tue Mar 20 17:58:09 2018
------------------------------------------------------------------------

Configuration:
  Crash Decoding      : Disabled - No sandbox or build area path
  Crash Mode          : continue (default)
  Current Graphics Driver: Unknown software 
  Current Visual      : None
  Default Encoding    : UTF-8
  Deployed            : false
  GNU C Library       : 2.23 stable
  Host Name           : han-B250M-D3V
  MATLAB Architecture : glnxa64
  MATLAB Entitlement ID: 6257193
  MATLAB Root         : /usr/local/MATLAB/R2016b
  MATLAB Version      : 9.1.0.441655 (R2016b)
  OpenGL              : software
  Operating System    : Linux 4.13.0-37-generic #42~16.04.1-Ubuntu SMP Wed Mar 7 16:03:28 UTC 2018 x86_64
  Processor ID        : x86 Family 6 Model 158 Stepping 9, GenuineIntel
  Virtual Machine     : Java 1.7.0_60-b19 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode
  Window System       : No active display

Fault Count: 1


Abnormal termination:
Segmentation violation

Register State (from fault):
  RAX = 006f0062006c006f  RBX = 006f0062006c006e
  RCX = 00007f21548b1460  RDX = 006f0062006c006e
  RSP = 00007f220bdf3fd0  RBP = 00007f220bdf4020
  RSI = 00007f220bdf4100  RDI = 00007f220bdf4020

   R8 = 006f7f8354f714ce   R9 = 0000000000000000
  R10 = 000000000000007b  R11 = 00007f222a6039c0
  R12 = 00007f220bdf4100  R13 = 00007f2154899060
  R14 = 00007f220bdf4270  R15 = 00007f220bdf4bf8

  RIP = 00007f222a603a5b  EFL = 0000000000010206

   CS = 0033   FS = 0000   GS = 0000

Stack Trace (from fault):
[  0] 0x00007f222a603a5b /usr/local/MATLAB/R2016b/bin/glnxa64/libboost_filesystem.so.1.56.0+00059995 _ZNK5boost10filesystem4path8filenameEv+00000155
[  1] 0x00007f222a604b36 /usr/local/MATLAB/R2016b/bin/glnxa64/libboost_filesystem.so.1.56.0+00064310 _ZNK5boost10filesystem4path9extensionEv+00000022
[  2] 0x00007f222a604c62 /usr/local/MATLAB/R2016b/bin/glnxa64/libboost_filesystem.so.1.56.0+00064610 _ZN5boost10filesystem4path17replace_extensionERKS1_+00000034
[  3] 0x00007f219737aad8 /home/han/caffe/matlab/+caffe/private/caffe_.mexa64+00727768
[  4] 0x00007f219737af30 /home/han/caffe/matlab/+caffe/private/caffe_.mexa64+00728880
[  5] 0x00007f2197314acf /home/han/caffe/matlab/+caffe/private/caffe_.mexa64+00309967
[  6] 0x00007f2197311e7f /home/han/caffe/matlab/+caffe/private/caffe_.mexa64+00298623 mexFunction+00000169
[  7] 0x00007f221ce50caa     /usr/local/MATLAB/R2016b/bin/glnxa64/libmex.so+00175274 mexRunMexFile+00000106
[  8] 0x00007f221ce491a3     /usr/local/MATLAB/R2016b/bin/glnxa64/libmex.so+00143779
[  9] 0x00007f221ce4a345     /usr/local/MATLAB/R2016b/bin/glnxa64/libmex.so+00148293
[ 10] 0x00007f221c1498a3 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_dispatcher.so+00768163 _ZN8Mfh_file16dispatch_fh_implEMS_FviPP11mxArray_tagiS2_EiS2_iS2_+00000947
[ 11] 0x00007f221c14a16e /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_dispatcher.so+00770414 _ZN8Mfh_file11dispatch_fhEiPP11mxArray_tagiS2_+00000030
[ 12] 0x00007f2218f84847 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+11675719
[ 13] 0x00007f2218f84aab /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+11676331
[ 14] 0x00007f2218fea411 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+12092433
[ 15] 0x00007f2218910930 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+04909360
[ 16] 0x00007f2218912c3c /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+04918332
[ 17] 0x00007f221890f410 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+04903952
[ 18] 0x00007f221890a855 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+04884565
[ 19] 0x00007f221890ab69 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+04885353
[ 20] 0x00007f221890f20d /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+04903437
[ 21] 0x00007f221890f2e2 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+04903650
[ 22] 0x00007f2218a06688 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+05916296
[ 23] 0x00007f2218a08b2f /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+05925679
[ 24] 0x00007f2218e8710e /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+10637582
[ 25] 0x00007f2218e4eeab /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+10407595
[ 26] 0x00007f2218e4efb3 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+10407859
[ 27] 0x00007f2218e510d9 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+10416345
[ 28] 0x00007f2218ec9bbe /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+10910654
[ 29] 0x00007f2218eca072 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_lxe.so+10911858
[ 30] 0x00007f221b869941 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwm_interpreter.so+02443585 _Z51inEvalCmdWithLocalReturnInDesiredWSAndPublishEventsRKSbIDsSt11char_traitsIDsESaIDsEEPibbP15inWorkSpace_tag+00000065
[ 31] 0x00007f221cbaafc1   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00696257 _ZNK3iqm18InternalEvalPlugin24inEvalCmdWithLocalReturnERKSbIDsSt11char_traitsIDsESaIDsEEP15inWorkSpace_tag+00000097
[ 32] 0x00007f221cbac9db   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00702939 _ZN3iqm18InternalEvalPlugin7executeEP15inWorkSpace_tagRN5boost10shared_ptrIN14cmddistributor17IIPCompletedEventEEE+00000123
[ 33] 0x00007f221c4206cd   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwmcr.so+00624333
[ 34] 0x00007f221cb9fa0a   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00649738
[ 35] 0x00007f221cb8beb2   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00569010
[ 36] 0x00007f221b3e705a /usr/local/MATLAB/R2016b/bin/glnxa64/libmwbridge.so+00159834
[ 37] 0x00007f221b3e7617 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwbridge.so+00161303
[ 38] 0x00007f221b3ee519 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwbridge.so+00189721
[ 39] 0x00007f221b3ee614 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwbridge.so+00189972
[ 40] 0x00007f221b3eefa9 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwbridge.so+00192425 _Z8mnParserv+00000617
[ 41] 0x00007f221c40b243   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwmcr.so+00537155
[ 42] 0x00007f221c40d1ce   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwmcr.so+00545230
[ 43] 0x00007f221c40d849   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwmcr.so+00546889 _ZN5boost6detail17task_shared_stateINS_3_bi6bind_tIvPFvRKNS_8functionIFvvEEEENS2_5list1INS2_5valueIS6_EEEEEEvE6do_runEv+00000025
[ 44] 0x00007f221c40c236   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwmcr.so+00541238
[ 45] 0x00007f221cbd3b49   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00863049
[ 46] 0x00007f221cbc051c   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00783644 _ZN5boost6detail8function21function_obj_invoker0ISt8functionIFNS_3anyEvEES4_E6invokeERNS1_15function_bufferE+00000028
[ 47] 0x00007f221cbc01fc   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00782844 _ZN3iqm18PackagedTaskPlugin7executeEP15inWorkSpace_tagRN5boost10shared_ptrIN14cmddistributor17IIPCompletedEventEEE+00000428
[ 48] 0x00007f221cb9fa0a   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00649738
[ 49] 0x00007f221cb8b690   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00566928
[ 50] 0x00007f221cb8e048   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwiqm.so+00577608
[ 51] 0x00007f222c7e040a /usr/local/MATLAB/R2016b/bin/glnxa64/libmwservices.so+02634762
[ 52] 0x00007f222c7e19af /usr/local/MATLAB/R2016b/bin/glnxa64/libmwservices.so+02640303
[ 53] 0x00007f222c7e20e6 /usr/local/MATLAB/R2016b/bin/glnxa64/libmwservices.so+02642150 _Z25svWS_ProcessPendingEventsiib+00000102
[ 54] 0x00007f221c40b8c6   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwmcr.so+00538822
[ 55] 0x00007f221c40bc42   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwmcr.so+00539714
[ 56] 0x00007f221c3f98d6   /usr/local/MATLAB/R2016b/bin/glnxa64/libmwmcr.so+00465110
[ 57] 0x00007f222b3f96ba              /lib/x86_64-linux-gnu/libpthread.so.0+00030394
[ 58] 0x00007f222b12f41d                    /lib/x86_64-linux-gnu/libc.so.6+01078301 clone+00000109
[ 59] 0x0000000000000000                                   <unknown-module>+00000000


This error was detected while a MEX-file was running. If the MEX-file
is not an official MathWorks function, please examine its source code
for errors. Please consult the External Interfaces Guide for information
on debugging MEX-files.

If this problem is reproducible, please submit a Service Request via:
    http://www.mathworks.com/support/contact_us/

A technical support engineer might contact you with further information.

Thank you for your help.

** This crash report has been saved to disk as /home/han/matlab_crash_dump.2301-1 **
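The stack trace above faults in boost::filesystem::path code inside MATLAB's bundled libboost_filesystem.so.1.56.0, reached from caffe_.mexa64. That pattern is the classic symptom of a Boost version clash between the libraries MATLAB ships and the ones the MEX file was linked against. A minimal diagnostic sketch from the MATLAB prompt (the paths are taken from the trace above; the clash itself is an assumption, not a confirmed cause):

% Sketch: compare the Boost libraries caffe_.mexa64 was linked against
% with the copies MATLAB R2016b bundles (paths from the stack trace above).
system('ldd /home/han/caffe/matlab/+caffe/private/caffe_.mexa64 | grep boost');
system('ls /usr/local/MATLAB/R2016b/bin/glnxa64/libboost_filesystem*');

If the two versions differ, a commonly reported workaround is to start MATLAB with the system Boost/libstdc++ preloaded via LD_PRELOAD, so the MEX file resolves against the libraries it was built with.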


Warning: The following error was caught while executing 'caffe.Solver' class
destructor:
Error using caffe_
Usage: caffe_('delete_solver', hSolver)

Error in caffe.Solver/delete (line 40)
      caffe_('delete_solver', self.hSolver_self);

Error in caffe.Solver (line 17)
    function self = Solver(varargin)

Error in caffe.test.test_solver (line 22)
      self.solver = caffe.Solver(solver_file);

Error in caffe.run_tests (line 14)
  run(caffe.test.test_solver) ... 
> In caffe.Solver (line 17)
  In caffe.test.test_solver (line 22)
  In caffe.run_tests (line 14) 
Caught "std::exception" Exception message is:
FatalException
Caught MathWorks::System::FatalException
[Please exit and restart MATLAB]>>
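The delete_solver failure above is secondary fallout rather than the root problem: the segfault aborted the caffe.Solver constructor, so the destructor runs on an object whose hSolver_self handle was never populated. A hypothetical guard illustrating the failure mode (sketch only, not the shipped matcaffe code):

% Sketch of a guarded destructor for caffe.Solver (hypothetical):
% only release the native solver if the constructor actually created it.
function delete(self)
  if ~isempty(self.hSolver_self)
    caffe_('delete_solver', self.hSolver_self);
  end
end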

After I enter restart:

Undefined function or variable 'restart'.

So does anyone know how to solve this problem? Thank you very much!

Operating system: Ubuntu 16.04
Compiler: make mattest
CUDA version (if applicable): 8.0
CUDNN version (if applicable): 5.1
BLAS:
Python or MATLAB version (for pycaffe and matcaffe respectively): MATLAB R2016b
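For reference, the log shows caffe.test.test_net finishing ('Done caffe.test.test_net') and the crash starting once caffe.run_tests reaches caffe.test.test_solver (see the 'Error in caffe.test.test_solver (line 22)' frame above). The two suites can be run separately to confirm where it dies; a minimal sketch, assuming Caffe's matlab/ directory is the current folder:

% Run the matcaffe test suites one at a time to isolate the crash.
% caffe.run_tests normally runs both back to back, per the log above.
run(caffe.test.test_net)     % completes in the log above
run(caffe.test.test_solver)  % the suite that segfaults in this report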

Noiredd added the Matlab label Mar 20, 2018

zsqciel commented Mar 23, 2018

I met the same problem. Would you please share your solution if you solve it?

3 similar comments
@EswarSaiKrish

I met the same problem. Would you please share your solution if you solve it?

@yuexingyu

I met the same problem. Would you please share your solution if you solve it?


techping commented Jun 7, 2018

I met the same problem. Would you please share your solution if you solve it?

@soulslicer

I met the same problem. Would you please share your solution if you solve it?

No, but seriously: matcaffe is completely borked. I tried to make it work on multiple versions of MATLAB and prior commits of Caffe, but could never get it to work. Your only solution is to switch to Python.

@BruceW91

I met the same problem. Would you please share your solution if you solve it?

I tried changing the GCC/g++ version from 5.4 to 4.9, but it did not work. My MATLAB version is also R2016b, so I wonder whether this is caused by a version incompatibility, because there is a statement about this on the official website.
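One way to check the compiler angle from inside MATLAB (a sketch; R2016b's officially supported Linux compiler was GCC 4.9, so a caffe_.mexa64 built against GCC 5.4's libstdc++ is a plausible mismatch, though not confirmed here):

% Show which C++ compiler MATLAB is configured to build MEX files with.
cc = mex.getCompilerConfigurations('C++', 'Selected');
fprintf('%s (version %s)\n', cc.Name, cc.Version);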

@MaxChanger

I met the same problem (operating system: Ubuntu 16.04; compiler: make mattest; CUDA version: 8.0; cuDNN version: 5.1), exactly as the original poster describes. But I have seen blogs that pass this test with MATLAB R2016b, so I don't think it's directly related to the MATLAB version. I also don't know how to solve the problem.


Jee-King commented Apr 7, 2019

I met the same problem. It may be caused by the Caffe version. This is my solution, which might help others. Download and compile this version:
https://github.com/gy1874/caffe-rpnbf-cudnn5
