
Wrong Color frame relative to Depth #5675

Closed
AItah opened this issue Jan 19, 2020 · 16 comments

Comments

@AItah

AItah commented Jan 19, 2020


Required Info
Camera Model: D435
Firmware Version: 5.11.15.00
Operating System & Version: Win 10
Kernel Version (Linux Only): -
Platform: PC
SDK Version: 2.30.0
Language: C++
Segment: Robot

Issue Description

Dear friends,

Capturing a frame after translating the camera results in the wrong color frame relative to the depth frame.
I ran the rs-pointcloud example from the librealsense2 solution and captured two examples (attached) that illustrate the issue. The test is to shift my hand in front of the camera; as can be seen, the color is not synced with the depth.
The problem appeared when we moved a robotic arm with the D435 attached to it. The arm moves, and after it stops we acquire a frame; unfortunately the images we get are not synced. We know for sure that the arm is in position and not moving or even vibrating!
Mapping to Infra instead - the problem does not occur!
Using the RealSense Viewer - the problem does not occur!
We have ruled out low FPS and overly long exposure times.

We read the info at the following link:
https://github.com/IntelRealSense/librealsense/wiki/Frame-Buffering-Management-in-RealSense-SDK-2.0#syncer
According to it, wait_for_frames should be a simplified version of syncing frames, yet somehow the problem still occurs.

Can you please point us to a simple code example showing how to correctly acquire the next frame with all streams synced (we would like each call to return only new frames and drop the old ones)?

Best regards,
Amir

(Attached screenshots: ColorAndDepthSync, ColorAndDepthSync_2)

@dorodnic
Contributor

Hi @spaltiel
Please run the following script to ensure per-frame metadata is enabled. Next, please check that the AUTO_EXPOSURE_PRIORITY_MODE option is disabled on the RGB sensor, and set the two sensors to the same FPS (say 60). Hardware sync should then kick in and be handled by the software (via wait_for_frames).
One final point - make sure you build and run in Release; frame drops will cause mismatches.

@AItah
Author

AItah commented Jan 19, 2020

Dear Sergey,

Thanks for your quick response. I will try your suggestion and update.
Also, is it possible to do it "manually"?
Something like the following (based on a wiki page we read):

sensors[0].open(IR_stream_profile);
sensors[1].open(Color_stream_profile);

rs2::syncer sync(1); // queue of single frame

sensors[0].start(sync);
sensors[1].start(sync);

rs2::frameset fset = sync.wait_for_frames();
rs2::frame depth = fset.first_or_default(RS2_STREAM_DEPTH);
rs2::frame color = fset.first_or_default(RS2_STREAM_COLOR);

Best Regards,
Amir I.

@dorodnic
Contributor

Yes, it is.
In case of missing frames, the pipeline's wait_for_frames will "borrow" from the last viable frame (prioritizing full-size framesets over exact matching). Using the syncer directly will output only matched frames (assuming all the preconditions described above are met), so you can get essentially NULL in one of the frames. This happens when the SDK is confident that the expected match was lost.

@AItah
Author

AItah commented Jan 19, 2020

10X!
I will probably try both ways and update later.

@AItah
Author

AItah commented Jan 20, 2020

Dear Sergey,

Thanks! Both solutions work fine.

  1. For "pipe.wait_for_frames ...": indeed, enabling metadata fixes the problem.
  2. Directly following the wiki page also fixed the problem.

One last point regarding this issue:

I tried to implement the syncer method directly (similar to the wiki page) inside a new thread.
It works almost great, except that the CPU usage is now high.

Immediately after the lines:
sensors[0].start(sync);
sensors[1].start(sync);
the CPU jumps, as a step function, to nearly 75% and stays there. It decreases only after sensors.stop(); we did not even need to call wait_for_frames.
BTW, exactly the same code inside the main thread runs with normal CPU usage and no issues.

Are there any guidelines to follow when implementing the syncer directly in a separate thread that constantly acquires framesets?

The reason I am asking is that we really want to make sure we get a current, new image, and we are not sure that wait_for_frames guarantees that 100%. So we implemented the syncer, and we verify that depth and color are both non-null.

Thanks!
Amir I.

@dorodnic
Contributor

Just to make sure - is this a Release build?
(There is no other obvious reason for such high CPU utilization.)

@dorodnic
Contributor

Also, I did some changes to make metadata installation more convenient - #5678

@AItah
Author

AItah commented Jan 20, 2020

Yes it is in release.

@AItah
Author

AItah commented Jan 20, 2020

BTW, the first link for enabling metadata was really helpful.

@AItah
Author

AItah commented Jan 20, 2020

Regarding enabling metadata: is installing the "RS400Camera.inf" enough to enable the metadata, so we can skip the first procedure of changing the registry, etc.?

@dorodnic
Contributor

dorodnic commented Jan 20, 2020

If you install the INF you don't need to do anything else; it should just work (for D400 devices).
Can you share the 100% CPU code snippet?

@AItah
Author

AItah commented Jan 21, 2020

I will try the INF on a different PC - 10X!

Regarding the CPU utilization - I rewrote a clean acquisition flow as a test (see below) and it works great. Maybe it is not great coding; however, it might be helpful for someone.

Thanks a lot,
Amir i.

SyncerTh::SyncerTh()
{
	devices = ctx.query_devices();
	selected_device = devices[0];
	sensors = selected_device.query_sensors();
	sync = rs2::syncer(1); // queue of a single frameset - always the newest
}

void SyncerTh::start()
{
	auto spIR = sensors[0].get_stream_profiles();
	auto spRGB = sensors[1].get_stream_profiles();
	sensors[1].open(spRGB[61]); // device-specific profile indices
	sensors[0].open(spIR[76]);
	sensors[0].start(sync);
	sensors[1].start(sync);

	theTh = new std::thread(SyncLoop, this);
}

int SyncerTh::SyncLoop(void* inMyself)
{
	SyncerTh* me = (SyncerTh*)inMyself;
	rs2::points points;
	rs2::pointcloud pc;
	rs2::align align_to(RS2_STREAM_COLOR); // create once, outside the loop

	while (true)
	{
		rs2::video_frame color = rs2::frame();
		rs2::depth_frame depth = rs2::frame();
		rs2::frameset fset;

		// Poll until a frameset containing both streams arrives
		// (note: logical ||, not bitwise | as in the original)
		while (!color || !depth)
		{
			if (me->sync.try_wait_for_frames(&fset) && fset.size() > 1)
			{
				fset = align_to.process(fset);
				color = fset.first_or_default(RS2_STREAM_COLOR).as<rs2::video_frame>();
				if (color)
				{
					pc.map_to(color);
					depth = fset.first_or_default(RS2_STREAM_DEPTH).as<rs2::depth_frame>();
					if (depth)
						points = pc.calculate(depth);
				}
			}
		}
	}
	return 1; // unreachable; the loop runs until the thread is terminated
}

@RealSenseCustomerSupport
Collaborator


Hi @spaltiel, do you have further questions on this?
If we don't hear from you in 7 days, we will close this issue.

Thanks,

@AItah
Author

AItah commented Jan 28, 2020

No, we are good.

Again, many thanks to Sergey.

@AItah AItah closed this as completed Jan 28, 2020
@AItah
Author

AItah commented Sep 11, 2020

Dear friends,

We would like to add accel and gyro as well.

Using the same convention:
sensor.open(accel stream profile)
sensor.open(gyro stream profile)

results in a crash.

How can we open accel and gyro using the same syncer concept?

br,
a.i.

@AItah AItah reopened this Sep 11, 2020
@AItah
Author

AItah commented Sep 13, 2020

We found our bug and how to fix it.

Originally we called rs2::sensor::open(stream_profile) twice on the same sensor, which results in an error.
Later we noticed that sensor.open has an overload that takes a std::vector<stream_profile>.
Now we do something like the following:

std::vector<stream_profile> vsp{};
vsp.push_back(stream_profile_1);
vsp.push_back(stream_profile_2);
...
sensor.open(vsp);
