How to adjust the dense point cloud filtering #911
Meshlab won't import Alembic, the format for the point cloud. Blender will but as Alembic is a scene you'll get all the cameras as well. Bit messy to clean up. But none of that matters much as I've never been able to generate a new Alembic outside of Meshroom that works when brought back in. Pics of your issue would help with suggestions on what to tweak. |
You could use DepthMapFilter to mask out unwanted areas. Experimental: |
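For reference, a minimal sketch of what tweaking those DepthMapFilter settings outside the GUI could look like, assuming the usual .mg layout (a JSON "graph" keyed by node name) and assuming the attribute names `minViewAngle`/`maxViewAngle` derived from the UI labels; treat the names and file paths as placeholders and check them against your Meshroom version:

```python
import json
from pathlib import Path

# Hypothetical file names; point these at your own project.
src = Path("project.mg")
dst = Path("project_depthfilter.mg")

# A Meshroom project file is plain JSON: {"header": {...}, "graph": {"NodeName_1": {...}, ...}}.
project = json.loads(src.read_text())

for name, node in project["graph"].items():
    if node.get("nodeType") == "DepthMapFilter":
        inputs = node.setdefault("inputs", {})
        # Attribute names are assumptions based on the UI labels
        # ("Min/Max View Angle"); verify them in your Meshroom version.
        inputs["minViewAngle"] = 10.0   # discard depth samples seen only at very shallow angles
        inputs["maxViewAngle"] = 70.0
        print(f"patched {name}")

dst.write_text(json.dumps(project, indent=4))
```

The patched copy can then be opened in Meshroom as usual; the original project file is left untouched.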
Awesome - thanks! I’m running trunk, so can give that a try. Been reading all the docs again and some of it is making more sense. There was a setting on one of the main nodes re using the angle (larger == closer) as a discriminator. I’ll try that first. @natowi I’d like to try and help with the docs, but the subject is intimidating; what’s the best way to discuss some starting points? Open an issue on the docs repo, or is there a google group? And a de-rail, is it better to ask general questions here or is the group preferred? I’m unclear which is the appropriate place (but the google group seems noisier.) I have some questions re general approaches for a couple of scenarios that may be common starting points. Would be willing to write up the final info as a guide for the docs. |
Hi @julianrendell, |
That is a good idea. This would also be a good place to discuss your other ideas @julianrendell |
@NexTechAR-Scott here's the Metashape result: and here's the Meshroom result: Meshroom has picked up more of the ground, but that's a trivial fix. But I'm not sure what's going on with the "wings"/"ears" - extra mesh to the sides at the top. |
@fabiencastan @natowi be happy to chat re where you need help and if I can be of assistance. I'm in the PST timezone. |
Update - improvements and losses... Here are the results of the default pipeline. The second branch is for the "experiments" below. (BTW - now that I'm starting to understand the pipeline, I'm really appreciating it. Great design choice!) I've oriented it to show the object of interest as well as the SFM point cloud. You can see the "ears."

Here's the best I've managed so far: first I tried modifying the DepthMapFilter node - ended up with min view angle = 10 (the max). It removed only a little bit of the "ears". Then I repeated a few times looking at the DepthMap node - adjusting min view angle to 10 (max) here removed ~25% of them. Much more effect than the DepthMapFilter. The biggest effect came from changing the SFM options: Min observation for triangulation -> 4, and Min angle for Triangulation -> 8. Even smaller changes really cleaned up the point cloud in areas away from the subject of interest. But now I'm starting to lose the trunk, and I still have some small artifacts at the top.

Here's a close-up of the point cloud looking down from the top: And the mesh as a solid: It looks to me like there is a "fold" in the mesh at the edge of the subject of interest, and some sort of error has accumulated in the meshing stage. There don't seem to be any points in the point cloud around this area.

@natowi @fabiencastan @NexTechAR-Scott I'm guessing that these remnants are really coming from the meshing process - maybe the Min observations for SfM space estimation, or Min observation angle for SfM space estimation? Can someone explain what these parameters control?

Appreciate your feedback - I feel like I'm starting to get a feel for how to set this up as a series of experiments with each step of the pipeline. When I get my head around this a bit better and figure out the render farm, I think I'll be ready to add this as a module to the CAD/CAM classes I sometimes teach. |
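Since the comment above frames this as a series of experiments, here is a rough sketch of how one could stamp out several project variants sweeping the StructureFromMotion triangulation settings. The attribute names `minNumberOfObservationsForTriangulation` and `minAngleForTriangulation` are assumptions based on the UI labels, and the file names are placeholders, so double-check both against your own project.mg:

```python
import copy
import json
from pathlib import Path

base = json.loads(Path("project.mg").read_text())  # placeholder: your current project file

# Candidate values to try, loosely based on the experiments described above.
sweeps = [
    {"minNumberOfObservationsForTriangulation": 3, "minAngleForTriangulation": 5.0},
    {"minNumberOfObservationsForTriangulation": 4, "minAngleForTriangulation": 8.0},
    {"minNumberOfObservationsForTriangulation": 4, "minAngleForTriangulation": 10.0},
]

for i, params in enumerate(sweeps):
    variant = copy.deepcopy(base)
    for node in variant["graph"].values():
        if node.get("nodeType") == "StructureFromMotion":
            # Attribute names are assumed from the UI labels; verify in your .mg file.
            node.setdefault("inputs", {}).update(params)
    out = Path(f"project_sfm_sweep_{i}.mg")
    out.write_text(json.dumps(variant, indent=4))
    print("wrote", out)
```

Each variant can then be opened in Meshroom (or queued on a render farm) and the resulting point clouds compared side by side.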
maybe you can also try to play with the filtering options of the MeshFiltering node, like keeping only the largest mesh or removing large triangles (see the sketch below) |
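A small sketch of where those filtering switches could live in the project file; `keepLargestMeshOnly` and `filterLargeTrianglesFactor` are assumed attribute names for the "keep only largest mesh" and large-triangles options, so verify them against the MeshFiltering node in your Meshroom version:

```python
import json
from pathlib import Path

project = json.loads(Path("project.mg").read_text())  # placeholder file name

for node in project["graph"].values():
    if node.get("nodeType") == "MeshFiltering":
        inputs = node.setdefault("inputs", {})
        inputs["keepLargestMeshOnly"] = True           # assumed name: drop disconnected islands
        inputs["filterLargeTrianglesFactor"] = 30.0    # assumed name: remove oversized triangles

Path("project_meshfilter.mg").write_text(json.dumps(project, indent=4))
```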
SFM angles are essentially a bounding box. Think of 0 as looking straight ahead, meaning everything in the background of the subject would get rendered; 90 would effectively be looking straight down on the object. The higher the angle, the "smaller" the bounding box for reconstruction. Obviously too high an angle can be as bad as too low; 40 is a reasonable compromise when shooting in the wild. If using a light box the default is fine. I've got a few settings I can post if that would help. |
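One way to make the "larger angle == closer" rule of thumb concrete is a back-of-the-envelope sketch, assuming a simple two-view setup where the triangulation angle of a point is roughly 2·atan(baseline / (2·depth)):

```python
import math

def triangulation_angle_deg(baseline_m: float, depth_m: float) -> float:
    """Approximate angle subtended at a point by two cameras 'baseline_m' apart."""
    return math.degrees(2.0 * math.atan(baseline_m / (2.0 * depth_m)))

# Two cameras 1 m apart:
for depth in (2.0, 5.0, 20.0, 100.0):
    print(f"depth {depth:5.1f} m -> ~{triangulation_angle_deg(1.0, depth):4.1f} deg")
# depth   2.0 m -> ~28.1 deg
# depth   5.0 m -> ~11.4 deg
# depth  20.0 m -> ~ 2.9 deg
# depth 100.0 m -> ~ 0.6 deg
```

Far-away background points subtend only a fraction of a degree, so raising the minimum triangulation angle mostly discards background and effectively shrinks the reconstructed volume around the subject.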
Thanks @simogasp! Already selected keep only largest mesh; these 'ears' are connected. Will try the large triangles setting. |
Thanks @NexTechAR-Scott ! That’s a good explanation. Just realizing what it can mean to be able to tweak these angles at each stage. If you’d be willing to share a couple of pipelines (outdoor vs light box) that’d be really appreciated. I have personal and local community projects at both scales. (One of my sons is blender mad right now- itching to try the blender plugin and see what we can come up with!) |
Here is the default project.mg I start with and tweak from there as needed. It works for 90% of what I do with no other tuning, but a lot of this is dependent on subject matter. If you have not already noticed, bold entries in node parameters indicate a change from default. Open it in MR and add your data set. |
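As a convenience, here is a small sketch of how one could list those changed-from-default ("bold") parameters from the command line, assuming a pristine project saved from an untouched default pipeline to diff against; the file names are placeholders, and node connections (values like "{CameraInit_1.output}") will also show up in the listing:

```python
import json
from pathlib import Path

def inputs_by_node_type(path):
    """Map nodeType -> inputs dict for every node in a Meshroom project file."""
    graph = json.loads(Path(path).read_text())["graph"]
    return {node["nodeType"]: node.get("inputs", {}) for node in graph.values()}

default = inputs_by_node_type("default_pipeline.mg")  # saved from an untouched default graph
tuned = inputs_by_node_type("shared_project.mg")      # placeholder name for the shared file

# Print every parameter whose value differs from the default project.
for node_type, inputs in tuned.items():
    for key, value in inputs.items():
        if default.get(node_type, {}).get(key) != value:
            print(f"{node_type}.{key} = {value}")
```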
@NexTechAR-Scott - appreciated! Had noticed the bold for changes. Looking forward to trying your settings on the sample dataset later today. |
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. |
This issue is closed due to inactivity. Feel free to re-open if new information is available. |
I ran both Meshroom (latest source) and Agisoft Metashape (trial) against the Monstree full dataset.
In the viewer, they both have amazing results.
Meshroom picked up the holes at the top-left of the knot hole! Metashape has made it black textured geometry.
But Metashape has done a better job of excluding outliers/focusing on the tree trunk in the mesh reconstruction. In Meshroom, both sides of the trunk have "little wings" which look to be heading towards two sparse clusters of points at both edges of the dense point cloud. (Note: Metashape also appears to have these two sparse clusters in its point cloud, but they appear to be filtered out of the dense point cloud.)
Would these "wings" be gone if the "not-related" points were filtered out?
What options should I tweak to try and filter these?
Is it possible to load the dense point cloud in, say, MeshLab and manually delete this noise? (And is it as simple as just saving the edit and continuing with the processing in Meshroom?)
Or am I focusing on the wrong stage- is there somewhere else in the pipeline I should be making adjustments?
Thanks in advance!