Merge pull request #344 from princeton-vl/develop
v1.12.2
araistrick authored Jan 8, 2025
2 parents d86762e + efc95b9 commit f0040a7
Showing 14 changed files with 142 additions and 61 deletions.
8 changes: 8 additions & 0 deletions docs/CHANGELOG.md
@@ -180,3 +180,11 @@ v1.12.1
- Bugfix stdout passthrough mode crashing due to no logfile created
- Add normalmaps to integration test viewer, misc test fixes
- Avoid rare duplicate names in indoor solver

v1.12.2
- Fix excessive time/memory/crashes in nature scenes due to inactive viewpoint filter
- Fix blendergt tasks not being set to a 1hr time limit by slurm_1h.gin
- Add get_cmd.child_debug flag
- Usability improvements for integration test scripts
- Fix static asset import #391
- Fix indoor_asset_semantics.py typo #398
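The new `get_cmd.child_debug` flag listed above is a gin-configurable parameter of `get_cmd`. As a hedged sketch (assuming the usual datagen gin-binding mechanism; the exact scope may differ in your config), it could be enabled with a binding like:

```gin
# Hypothetical binding: forward --debug to all child task commands.
# "all" enables debug logging everywhere; a module name would scope it.
get_cmd.child_debug = "all"
```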
2 changes: 1 addition & 1 deletion docs/HelloRoom.md
@@ -116,7 +116,7 @@ Each of these commandline args demonstrates a different way in which you can res
- `restrict_solving.consgraph_filters=[\"counter\",\"sink\"]` says to throw out any `constraints` or `score_terms` keys from `home_furniture_constraints()` that do not contain `counter` or `sink` as substrings, producing a simpler constraint graph.
- `compose_indoors.solve_steps_large=30 compose_indoors.solve_steps_small=30` says to spend fewer optimization steps on large/small objects. You can also do the same for medium. These values override the defaults provided in `fast_solve.gin` and `infinigen_examples/configs_indoor/base.gin`.

These settings are intended for debugging or for generating tailored datasets. If you want more granular control over what assets are used for what purposes, please customize `infinigen_examples/indoor_asset_semantics.py` which defines this mapping.
These settings are intended for debugging or for generating tailored datasets. If you want more granular control over what assets are used for what purposes, please customize `infinigen_examples/constraints/semantics.py` which defines this mapping.

If you are using the commands from [Creating large datasets](#creating-large-datasets) you will instead add these configs as `--overrides` to the end of your command, rather than `-p`.
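The substring-filtering behavior of `restrict_solving.consgraph_filters` described above can be illustrated with a small self-contained sketch. This is an illustration of the idea only, not Infinigen's actual implementation; `filter_consgraph` and the constraint names are made up:

```python
# Sketch: keep only constraint-graph keys that contain at least one
# of the requested substrings, discarding everything else.
def filter_consgraph(constraints: dict, filters: list[str]) -> dict:
    return {
        name: term
        for name, term in constraints.items()
        if any(f in name for f in filters)
    }

constraints = {
    "counter_alignment": "...",
    "sink_spacing": "...",
    "sofa_orientation": "...",
}
kept = filter_consgraph(constraints, ["counter", "sink"])
print(sorted(kept))  # ['counter_alignment', 'sink_spacing']
```

Keys that match no filter (here `sofa_orientation`) are dropped, yielding the simpler constraint graph mentioned above.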

4 changes: 2 additions & 2 deletions docs/StaticAssets.md
@@ -79,7 +79,7 @@ If you want to add more categories, just add more lines with `{CategoryName}` as

## Define Semantics

Infinigen allows the user to specify high-level semantics for the objects in the scene. These semantics are then used to define high-level constraints. For example, we want to say that our static shelf factory is a type of storage unit, which will be placed against the wall, and there will be a bunch of objects on top of it. In general, if you want your static object factory to be treated like an existing asset factory, you can just imitate the semantics of the existing asset factory. Let's demonstrate this idea by defining semantics for our static shelf. We go to `infinigen_examples/indoor_asset_semantics.py` and search for `LargeShelfFactory`. We see that it is used as `Semantics.Storage` and `Semantics.AssetPlaceholderForChildren`. We want our static shelf to be used as a storage unit as well, so we add a line for our new static factory:
Infinigen allows the user to specify high-level semantics for the objects in the scene. These semantics are then used to define high-level constraints. For example, we want to say that our static shelf factory is a type of storage unit, which will be placed against the wall, and there will be a bunch of objects on top of it. In general, if you want your static object factory to be treated like an existing asset factory, you can just imitate the semantics of the existing asset factory. Let's demonstrate this idea by defining semantics for our static shelf. We go to `infinigen_examples/constraints/semantics.py` and search for `LargeShelfFactory`. We see that it is used as `Semantics.Storage` and `Semantics.AssetPlaceholderForChildren`. We want our static shelf to be used as a storage unit as well, so we add a line for our new static factory:
![alt text](images/static_assets/image3.jpg)

Similarly, we add `StaticShelfFactory` to `Semantics.AssetPlaceholderForChildren`. This will replace the placeholder bounding box for the shelf before placing the small objects.
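The imitate-the-existing-factory pattern described above amounts to adding the new factory wherever the existing one appears in the semantics mapping. A simplified, self-contained sketch follows; the `Semantics` enum and `used_as` dict here are stand-ins for the real definitions in `infinigen_examples/constraints/semantics.py`:

```python
from enum import Enum, auto

class Semantics(Enum):
    Storage = auto()
    AssetPlaceholderForChildren = auto()

# Stand-ins for the real factory classes.
class LargeShelfFactory: ...
class StaticShelfFactory: ...

used_as = {
    Semantics.Storage: {LargeShelfFactory},
    Semantics.AssetPlaceholderForChildren: {LargeShelfFactory},
}

# Treat the static shelf like the existing shelf: a storage unit whose
# placeholder bounding box is replaced before small objects are placed.
for sem in (Semantics.Storage, Semantics.AssetPlaceholderForChildren):
    used_as[sem].add(StaticShelfFactory)
```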
@@ -166,7 +166,7 @@ StaticMyCategoryFactory = static_category_factory("infinigen/assets/static_asset

5. Add a line in `infinigen/assets/static_assets/__init__.py` to import the factory from other files.

6. Define the semantics for the objects in `infinigen_examples/indoor_asset_semantics.py`. E.g.
6. Define the semantics for the objects in `infinigen_examples/constraints/semantics.py`. E.g.

```python
used_as[Semantics.Furniture] = {...
2 changes: 1 addition & 1 deletion infinigen/__init__.py
@@ -6,7 +6,7 @@
import logging
from pathlib import Path

__version__ = "1.12.1"
__version__ = "1.12.2"


def repo_root():
14 changes: 6 additions & 8 deletions infinigen/assets/static_assets/utils.py
@@ -28,13 +28,11 @@ def collapse_hierarchy(root_obj):
mesh_objects = []

def process_object(obj, parent=None):
new_obj = obj
if obj.type != "MESH":
new_obj = create_empty_mesh_object(obj.name, parent)
new_obj.matrix_world = obj.matrix_world
else:
if parent:
new_obj.parent = parent
new_obj = create_empty_mesh_object(obj.name, parent)
new_obj.matrix_world = obj.matrix_world

if obj.type == "MESH":
new_obj.data = obj.data.copy()

mesh_objects.append(new_obj)

@@ -54,9 +52,9 @@ def process_object(obj, parent=None):

bpy.context.view_layer.objects.active = mesh_objects[0]
bpy.ops.object.join()

final_obj = bpy.context.active_object

# Delete the original hierarchy
butil.delete(list(butil.iter_object_tree(root_obj)))

return final_obj
11 changes: 9 additions & 2 deletions infinigen/core/placement/camera.py
@@ -753,6 +753,8 @@ def animate_cameras(
def anim_valid_camrig_pose_func(cam_rig: bpy.types.Object):
assert len(cam_rig.children) > 0

scores = []

for cam in cam_rig.children:
score = keep_cam_pose_proposal(
cam,
@@ -765,10 +767,15 @@ def anim_valid_camrig_pose_func(cam_rig: bpy.types.Object):
**kwargs,
)

frame = bpy.context.scene.frame_current
logger.debug(f"Checking {cam.name=} {frame=} got {score=}")

if score is None:
return False
return None

scores.append(score)

return True
return np.min(scores)

for cam_rig in cam_rigs:
if policy_registry is None:
105 changes: 70 additions & 35 deletions infinigen/core/placement/placement.py
@@ -128,7 +128,7 @@ def scatter_placeholders(locations, factory: AssetFactory):
return col


def get_placeholder_points(obj):
def get_placeholder_points(obj: bpy.types.Object) -> np.ndarray:
if obj.type == "MESH":
verts = np.zeros((len(obj.data.vertices), 3))
obj.data.vertices.foreach_get("co", verts.reshape(-1))
@@ -148,51 +148,82 @@ def parse_asset_name(name):
return list(match.groups())


def filter_populate_targets(
placeholders: list[bpy.types.Object],
cameras: list[bpy.types.Object],
dist_cull: float,
vis_cull: float,
verbose: bool,
) -> list[tuple[bpy.types.Object, float, float]]:
if verbose:
placeholders = tqdm(placeholders)

results = []

for i, p in enumerate(placeholders):
classname, *_ = parse_asset_name(p.name)

if classname is None:
raise ValueError(f"Could not parse {p.name=}, got {classname=}")

mask, min_dists, min_vis_dists = split_in_view.compute_inview_distances(
get_placeholder_points(p),
cameras,
dist_max=dist_cull,
vis_margin=vis_cull,
verbose=False,
)

dist = min_dists.min()
vis_dist = min_vis_dists.min()

if not mask.any():
logger.debug(
f"{p.name=} culled, not in view of any camera. {dist=} {vis_dist=}"
)
continue

results.append((p, dist, vis_dist))

return results


def populate_collection(
factory: AssetFactory,
placeholder_col: bpy.types.Collection,
cameras,
asset_col_target=None,
cameras=None,
dist_cull=None,
vis_cull=None,
verbose=True,
cache_system=None,
**asset_kwargs,
):
logger.info(f"Populating placeholders for {factory}")

if asset_col_target is None:
asset_col_target = butil.get_collection(f"unique_assets:{repr(factory)}")

all_objs = []
updated_pholders = []
placeholders = [o for o in placeholder_col.objects if o.parent is None]

if verbose:
placeholders = tqdm(placeholders)

for i, p in enumerate(placeholders):
classname, fac_seed, _, inst_seed = parse_asset_name(p.name)
if classname is None:
continue
if cameras is not None:
logger.info(f"Checking visibility for {placeholder_col.name=}")
targets = filter_populate_targets(
placeholders, cameras, dist_cull, vis_cull, verbose
)
else:
targets = [(p, detail.scatter_res_distance(), 0) for p in placeholders]

if cameras is not None:
mask, min_dists, min_vis_dists = split_in_view.compute_inview_distances(
get_placeholder_points(p), cameras, verbose=verbose
)
print(
f"Populating {len(targets)} placeholders for {factory=} out of {len(placeholders)} total"
)

dist = min_dists.min()
vis_dist = min_vis_dists.min()
all_objs = []
updated_pholders = []

if not mask.any():
logger.debug(
f"{p.name=} culled, not in view of any camera. {dist=} {vis_dist=}"
)
continue
if verbose:
targets = tqdm(targets)

else:
dist = detail.scatter_res_distance()
vis_dist = 0
for i, (p, dist, vis_dist) in enumerate(targets):
classname, inst_seed, *_ = parse_asset_name(p.name)

if cache_system:
if (
@@ -209,10 +240,12 @@ def populate_collection(
cache_system.link_fire(full_sim_folder, sim_folder, obj, factory)
else:
break
else:
obj = factory.spawn_asset(
i, placeholder=p, distance=dist, vis_distance=vis_dist, **asset_kwargs
)

continue

obj = factory.spawn_asset(
i, placeholder=p, distance=dist, vis_distance=vis_dist, **asset_kwargs
)

if p is not obj:
p.hide_render = True
@@ -268,11 +301,13 @@ def populate_all(
)
continue

fac_inst = factory_class(int(fac_seed), **kwargs)

new_assets, pholders = populate_collection(
factory_class(int(fac_seed), **kwargs),
col,
asset_target_col,
camera=cameras,
fac_inst,
placeholder_col=col,
cameras=cameras,
asset_target_col=asset_target_col,
dist_cull=dist_cull,
vis_cull=vis_cull,
cache_system=cache_system,
15 changes: 10 additions & 5 deletions infinigen/core/placement/split_in_view.py
@@ -149,12 +149,17 @@ def compute_inview_distances(
bpy.context.scene.frame_set(frame)
for cam in cameras:
dists, vis_dists = compute_vis_dists(points, cam)
mask |= (dists < dist_max) & (vis_dists < vis_margin)
if mask.any():
min_vis_dists[mask] = np.minimum(vis_dists[mask], min_vis_dists[mask])
min_dists[mask] = np.minimum(dists[mask], min_dists[mask])
frame_cam_mask = (dists < dist_max) & (vis_dists < vis_margin)

logger.debug(f"Computed dists for {frame=} {cam.name} {mask.mean()=:.2f}")
if frame_cam_mask.any():
min_vis_dists[frame_cam_mask] = np.minimum(
vis_dists[frame_cam_mask], min_vis_dists[frame_cam_mask]
)
min_dists[frame_cam_mask] = np.minimum(
dists[frame_cam_mask], min_dists[frame_cam_mask]
)

mask |= frame_cam_mask

return mask, min_dists, min_vis_dists

1 change: 1 addition & 0 deletions infinigen/datagen/configs/compute_platform/slurm_1h.gin
@@ -10,6 +10,7 @@ queue_coarse.hours = 1
queue_fine_terrain.hours = 1
queue_populate.hours = 1
queue_render.hours = 1
ground_truth/queue_render.hours = 1
queue_mesh_save.hours = 1
queue_opengl.hours = 1

10 changes: 10 additions & 0 deletions infinigen/datagen/job_funcs.py
@@ -48,6 +48,7 @@ def get_cmd(
driver_script="infinigen_examples.generate_nature", # replace with a regular path to a .py, or another installed module
input_folder=None,
process_niceness=None,
child_debug=None,
):
if isinstance(task, list):
task = " ".join(task)
@@ -72,9 +73,18 @@ def get_cmd(
cmd += "--input_folder " + str(input_folder) + " "
if output_folder is not None:
cmd += "--output_folder " + str(output_folder) + " "

cmd += f"--seed {seed} --task {task} --task_uniqname {taskname} "

if child_debug is not None:
if child_debug == "all":
cmd += "--debug "
else:
cmd += f"--debug {child_debug} "

if len(configs) != 0:
cmd += f'-g {" ".join(configs)} '

cmd += "-p"

return cmd.split()
4 changes: 1 addition & 3 deletions infinigen/datagen/manage_jobs.py
@@ -346,9 +346,7 @@ def update_symlink(scene_folder, scenes):
std_out = scene_folder / "logs" / f"{scene.job_id}_0_log.out"

if not std_out.exists():
raise FileNotFoundError(
f"{std_out=} does not exist during attempt to symlink from {to=}"
)
std_out.touch()

if os.path.islink(to):
os.unlink(to)
2 changes: 2 additions & 0 deletions infinigen/datagen/util/upload_util.py
@@ -194,6 +194,8 @@ def get_upload_func(method="smbclient"):
return smb_client.upload
elif method.startswith("copyfile"):
return lambda x, y: copy_upload_file(x, y, root_dir=method.split(":")[-1])
elif method == "mock":
return lambda x, y: print(f"Mock upload {x} to {y}")
else:
raise ValueError(f"Unrecognized {method=}")

23 changes: 20 additions & 3 deletions infinigen/tools/results/analyze_crash_reasons.py
@@ -38,8 +38,15 @@ def get_configs(log_path, stage):
return match.groups()[0]


def main(args):
crash_reasons = (args.input_folder / "crash_summaries.txt").read_text().split("\n")
def parse_run_folder(run_folder: Path, args: argparse.Namespace):

crash_reasons = (run_folder / "crash_summaries.txt")

if not crash_reasons.exists():
print(f"Could not find crash reasons for {run_folder}")
return

crash_reasons = crash_reasons.read_text().split("\n")

regex = re.compile(
".*\s.*\s(.*\/([a-zA-Z0-9]*)\/logs\/(.*))\sreason=[\"'](.*)[\"']\snode='(.*)'"
@@ -70,6 +77,9 @@ def main(args):

df = pd.DataFrame.from_records(records)

return df

def visualize_results(df: pd.DataFrame, args: argparse.Namespace):
df["reason_canonical"] = df["reason"].apply(canonicalize_reason)

print("COMMON CRASH REASONS")
@@ -108,10 +118,17 @@ def main(args):
print(f" {row}")
print("")

def main(args):

run_dfs = [parse_run_folder(run_folder, args) for run_folder in args.input_folder]
run_dfs = [x for x in run_dfs if x is not None]

df = pd.concat(run_dfs)
visualize_results(df, args)

if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument("--input_folder", type=Path, required=True)
parser.add_argument("--input_folder", type=Path, required=True, nargs="+")
args = parser.parse_args()

main(args)
2 changes: 1 addition & 1 deletion tests/integration/launch.sh
@@ -19,7 +19,7 @@ INFINIGEN_VERSION=$(python -c "import infinigen; print(infinigen.__version__)")
COMMIT_HASH=$(git rev-parse HEAD | cut -c 1-6)
DATE=$(date '+%Y-%m-%d')
JOBTAG="${DATE}_ifg-int"
BRANCH=$(git rev-parse --abbrev-ref HEAD | sed 's/_/-/g; s/\//_/g')
BRANCH=$(git rev-parse --abbrev-ref HEAD | sed 's/_/-/g; s|/|-|g; s/\//_/g')
VERSION_STRING="${DATE}_${BRANCH}_${COMMIT_HASH}_${USER}"

mkdir -p $OUTPUT_PATH
