
Wrong normal estimation under certain circumstances #560

Closed
CMun opened this issue Mar 6, 2014 · 10 comments

Comments

@CMun
Contributor

CMun commented Mar 6, 2014

My code estimates the normals exactly according to the tutorial example ("Estimating Surface Normals in a PointCloud").

I use a small radius, leading to approximately 30-50 points as neighbours. Most of the time everything works as expected.

However, some of my 3D points are coplanar in the XZ plane (with some low 3D scanner noise). For those points, normal estimation calculates many wildly different normals, even though they lie more or less on the same plane.

I had a look into the PCL 1.7.1 code of the method computeMeanAndCovarianceMatrix in the file centroid.hpp. In the error cases, some of the values in the accu accumulator are exactly 0. This problem seems to stem from a patch ("added two new methods to compute centroid and covariance at once...", commit 2d7b045) contributed by @gedikli some time ago.

The normal estimation in normal_3d.h (also the OMP version) uses the float version of that code. The patch itself notes that it is less accurate than the previous version: it first sums up (potentially large) raw values, then divides and subtracts, which can cancel away most of the significant digits and lead to numerical problems.
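
To see the effect in isolation, here is a small toy example (not PCL code): with a large constant offset, the single-pass formula E[x^2] - E[x]^2 in float cancels away almost everything, while subtracting the mean first stays accurate.

#include <cstdio>

int main ()
{
  const int n = 50;
  float xs[n];
  float sum = 0.0f, sum_sq = 0.0f;                         // single-pass accumulators
  for (int i = 0; i < n; ++i)
  {
    xs[i] = 1000.0f + ((i % 2) ? 0.001f : -0.001f);        // large offset, tiny "scanner noise"
    sum += xs[i];
    sum_sq += xs[i] * xs[i];
  }
  const float mean = sum / n;

  // Single pass: two huge, nearly equal numbers are subtracted.
  const float var_single_pass = sum_sq / n - mean * mean;

  // Two passes: subtract the mean first, then accumulate small numbers.
  float var_two_pass = 0.0f;
  for (int i = 0; i < n; ++i)
    var_two_pass += (xs[i] - mean) * (xs[i] - mean);
  var_two_pass /= n;

  std::printf ("single pass: %g\n", var_single_pass);      // wildly wrong, may even be negative
  std::printf ("two passes:  %g\n", var_two_pass);         // close to the true value of about 1e-6
  return 0;
}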

Three possibilities to fix the problem arise:

  1. regroup the math for the covariance calculation, so that the centroid is calculated first, then the differences of the 3D points to it are accumulated in accu. Performance should be similar, since the data for a sensible number of points for normal estimation should fit into the cache.
  2. explicitly use the double version of computeMeanAndCovarianceMatrix, at least for pcl::NormalEstimation.
  3. (my current workaround) revert to the old behaviour by first explicitly calculating the centroid with compute3DCentroid, then invoking the (still existing) computeCovarianceMatrixNormalized with that centroid.

I have tested the 3rd possibility, but since it's a hack I would rather stick to option 1 or 2.
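
For reference, a minimal sketch of that workaround for a single query point (nn_indices stands for the neighbour indices returned by the radius search; the function name is just for illustration):

#include <vector>
#include <pcl/point_types.h>
#include <pcl/common/centroid.h>
#include <pcl/features/normal_3d.h>   // pcl::solvePlaneParameters

// Two passes: centroid first, then the covariance around it, instead of the
// single-pass computeMeanAndCovarianceMatrix.
void
twoPassNormal (const pcl::PointCloud<pcl::PointXYZ> &cloud,
               const std::vector<int> &nn_indices,
               float &nx, float &ny, float &nz, float &curvature)
{
  Eigen::Vector4f centroid;
  Eigen::Matrix3f covariance;
  pcl::compute3DCentroid (cloud, nn_indices, centroid);                              // pass 1: mean
  pcl::computeCovarianceMatrixNormalized (cloud, nn_indices, centroid, covariance);  // pass 2: covariance
  // The eigenvector belonging to the smallest eigenvalue is the surface normal.
  pcl::solvePlaneParameters (covariance, nx, ny, nz, curvature);
}

Option 2 would instead mean calling the double version of computeMeanAndCovarianceMatrix (Eigen::Matrix3d / Eigen::Vector4d arguments) inside pcl::NormalEstimation.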

I know this is an unfortunate situation, since most of the time the normal estimation works fast and flawlessly. However, I have verified that in the aforementioned situation the covariance matrix is calculated wrongly (compared to e.g. MATLAB).

Thanks for your consideration,
Chris

@jspricke
Member

Hi Chris, that's quite an important find. Can you share some of your experiments or, even better, come up with a unit test case? If regrouping is enough, I'm fine with doing that; otherwise, I think using double in there sounds like a good idea as well. @gedikli, what do you think?
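
Roughly, such a test could compare the single-pass float result against a two-pass reference on a distant, nearly coplanar patch; the following is only a sketch, and the point values and tolerance would need tuning:

#include <gtest/gtest.h>
#include <pcl/point_types.h>
#include <pcl/common/centroid.h>

// Sketch of a regression test: a nearly coplanar patch far from the origin,
// where the single-pass float covariance loses precision.
TEST (PCL, CovarianceOfDistantCoplanarPatch)
{
  pcl::PointCloud<pcl::PointXYZ> cloud;
  for (int i = 0; i < 10; ++i)
    for (int j = 0; j < 10; ++j)
      // plane at y ~ 0 with 1e-3 "scanner noise", shifted far away in x and z
      cloud.push_back (pcl::PointXYZ (100.0f + 0.01f * i,
                                      ((i + j) % 2 ? 1e-3f : -1e-3f),
                                      100.0f + 0.01f * j));

  // Single-pass (float) version under test
  Eigen::Matrix3f cov_single;
  Eigen::Vector4f mean_single;
  pcl::computeMeanAndCovarianceMatrix (cloud, cov_single, mean_single);

  // Two-pass reference
  Eigen::Vector4f centroid;
  Eigen::Matrix3f cov_two_pass;
  pcl::compute3DCentroid (cloud, centroid);
  pcl::computeCovarianceMatrixNormalized (cloud, centroid, cov_two_pass);

  for (int r = 0; r < 3; ++r)
    for (int c = 0; c < 3; ++c)
      EXPECT_NEAR (cov_two_pass (r, c), cov_single (r, c), 1e-5);
}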

@CMun
Contributor Author

CMun commented Apr 3, 2014

Hi Jochen,

I have uploaded my test data and my current workaround at https://gist.github.com/CMun/9951212.

Maybe @jspricke and @gedikli could have a look.

Remember: If something is changed, the OMP variant has to be updated too!

@saimanoj18

Is this fixed in the trunk?

@andersgb1
Contributor

Just as an addendum to the issue, the Wikipedia pages on covariance (http://en.wikipedia.org/wiki/Covariance#Definition and http://en.wikipedia.org/wiki/Algorithms_for_calculating_variance#Na.C3.AFve_algorithm) confirm the problem with the one-pass algorithm and advise against using it.

I think 2) is a bad choice, whereas 1) and 3) definitely avoid the problem by using two passes.

In my opinion, using 1) as a solution would not be bad, as the complexity simply goes from O(N) to O(2N), which is still O(N). Alternatively, maybe someone could look into a stable one-pass algorithm; at the bottom of http://en.wikipedia.org/wiki/Algorithms_for_calculating_variance#Covariance it is described how to do that using co-moments.
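
A rough sketch of such a co-moment update for the 3x3 case, written with Eigen (just the textbook update from that page, not PCL code):

#include <vector>
#include <Eigen/Core>

// Numerically stable single-pass covariance via co-moments: the running mean
// is updated incrementally, and the co-moment matrix accumulates the product
// of the deviation from the old mean and the deviation from the new mean.
Eigen::Matrix3f
onePassCovariance (const std::vector<Eigen::Vector3f> &points)
{
  Eigen::Vector3f mean = Eigen::Vector3f::Zero ();
  Eigen::Matrix3f comoment = Eigen::Matrix3f::Zero ();
  if (points.empty ())
    return comoment;

  for (std::size_t i = 0; i < points.size (); ++i)
  {
    const Eigen::Vector3f delta_old = points[i] - mean;        // deviation from the old mean
    mean += delta_old / static_cast<float> (i + 1);            // update the running mean
    comoment += delta_old * (points[i] - mean).transpose ();   // old-mean delta times new-mean delta
  }
  return comoment / static_cast<float> (points.size ());       // population covariance
}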

@VictorLamoine
Contributor

Test code to reproduce the issue (it computes normals on the original cloud and on a copy translated so its centroid is at the origin, then shows both side by side):

#include <pcl/point_types.h>
#include <pcl/io/pcd_io.h>
#include <pcl/features/normal_3d.h>
#include <pcl/visualization/pcl_visualizer.h>
#include <pcl/surface/mls.h>
#include <pcl/common/centroid.h>
#include <pcl/common/transforms.h>

typedef pcl::PointXYZRGB PointT;
typedef pcl::PointCloud<PointT> PointCloudT;

int
main (int argc,
      char *argv[])
{
  // Load point cloud from disk
  if (argc != 2)
  {
    PCL_ERROR("Please provide one PCD file as argument\n");
    return -1;
  }

  PointCloudT::Ptr cloud (new PointCloudT);
  PointCloudT::Ptr cloud_centered (new PointCloudT);
  if (pcl::io::loadPCDFile (argv[1], *cloud) < 0)
    return -1;

  for (size_t i = 0; i < cloud->size(); ++i)
  {
    cloud->points[i].r = 255;
    cloud->points[i].g = 155;
    cloud->points[i].b = 155;
  }

  // Compute 3D centroid of the point cloud
  Eigen::Vector4f centroid;
  pcl::compute3DCentroid (*cloud, centroid);
  std::cout << "Centroid\n" << centroid.head<3>() << std::endl;
  // Translate point cloud centroid to origin
  Eigen::Affine3f transformation (Eigen::Affine3f::Identity());
  transformation.translation() << -centroid.head<3>();
  pcl::transformPointCloud(*cloud, *cloud_centered, transformation);

  // Normal estimation
  pcl::PointCloud<pcl::Normal>::Ptr cloud_normals (new pcl::PointCloud<pcl::Normal>);
  pcl::PointCloud<pcl::Normal>::Ptr cloud_centered_normals (new pcl::PointCloud<pcl::Normal>);
  pcl::search::KdTree<PointT>::Ptr tree (new pcl::search::KdTree<PointT> ());
  pcl::NormalEstimation<PointT, pcl::Normal> ne;
  ne.setSearchMethod (tree);
  ne.setRadiusSearch (0.2);
  ne.setViewPoint (0, 0, 1.0);
  // Compute normals on original cloud
  ne.setInputCloud (cloud);
  ne.compute (*cloud_normals);
  // Compute normals on centered cloud
  ne.setInputCloud (cloud_centered);
  ne.compute (*cloud_centered_normals);

  // Visualization
  pcl::visualization::PCLVisualizer viewer ("Normals visualizer");
  int v1(0); int v2(1);
  viewer.createViewPort (0.0, 0.0, 0.5, 1.0, v1);
  viewer.createViewPort (0.5, 0.0, 1.0, 1.0, v2);
  viewer.setBackgroundColor(0.1, 0.1, 0.1, v2);

  // Add point clouds
  viewer.addPointCloud (cloud, "cloud", v1);
  viewer.addPointCloudNormals<PointT, pcl::Normal> (cloud, cloud_normals, 1, 0.05, "normals", v1);
  viewer.addPointCloud (cloud_centered, "cloud_centered", v2);
  viewer.addPointCloudNormals<PointT, pcl::Normal> (cloud_centered, cloud_centered_normals, 1, 0.05, "centered_normals", v2);

  while (!viewer.wasStopped ())
    viewer.spinOnce ();

  return 0;
}

@gcasey
Contributor

gcasey commented Oct 29, 2015

Hi all, I wrote this a while ago to address the issue, but have not had time to work on it further since:

https://github.com/gcasey/pcl/tree/robust-mean-and-covariance-estimate

@gcasey
Contributor

gcasey commented Oct 29, 2015

I created PR #1407, but I haven't looked at it in some time, so it might need updating for compilation or unit tests. I'm happy to update it if you give me some direction.

Thanks

@gcasey
Contributor

gcasey commented Nov 2, 2015

Could one of the reporters test #1407? I believe it should be ready to fix this problem. Thanks.

@stale

stale bot commented May 19, 2020

Marking this as stale due to 30 days of inactivity. It will be closed in 7 days if no further activity occurs.

@mvieth
Member

mvieth commented Nov 28, 2021

Fixed by pull request #4983

@mvieth mvieth closed this as completed Nov 28, 2021