
2. Building and running code


We use CMake as our build system, and Python setuptools to bundle our Python wheels that get uploaded to PyPI.

In everyday use, you may often find yourself interacting only with the C++ code. When that happens, your points of entry for running C++ code are the tests. If you've followed the instructions in setting up your environment, you now have a VSCode instance set up and are ready to build the code. The VSCode instance also comes bundled with a number of helpful plugins, one of which is for interacting with CMake. That's the easiest way to run (and debug) tests.

Here's the official tutorial on the CMake plugin. It's definitely recommended reading.

Configure your project

If you don't have time to read docs, here are some simple instructions to get up and running:

  1. On the left toolbar, at the bottom of the list of tools, you'll find one with the CMake logo (when you mouse over it, it says "CMake"). Click on that.
  2. At the top of the panel that pops up, directly to the right of "CMAKE: PROJECT OUTLINE", is an icon of a document with an arrow sticking out of it to the right (it says "Configure all projects" when you mouse over it). Click on that. If you get an error about CMakeCache being out of date, open a terminal (at the bottom of your window) and rm -rf build/ to clear your build cache, then try again.
  3. If you get a popup which says "Select a Kit for diffdart", choose "GCC 9.3.1 x86_64-redhat-linux".
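If you prefer the terminal, you can also configure the project by hand from the repo root. This is just a sketch; the build/ directory and the Debug build type are examples, and you can pass whatever extra arguments you need to cmake:

# Wipe any stale CMake cache, then configure into build/
rm -rf build
mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Debug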

Building and Running C++ Tests

Once you've configured your project, a number of folders appear under a heading "dart":

  • dart
  • python
  • unittests
  • examples
  • tutorials
  • uninstall

Click on unittests to open it up. In here you should see a bunch of new folders:

  • benchmarks
  • comprehensive
  • regression
  • unit
  • gtest
  • gtest_main

Most useful tests live either in comprehensive or unit. If you open those up, you'll see a long list of tests. You can right click on any of these tests and select "Run in Terminal" or "Debug" from the list of options that appears.

The corresponding source code lives at the same paths in our codebase, under unittests/comprehensive and unittests/unit.

For example, try right clicking on unittests/comprehensive/test_CollideGradient and selecting "Debug". That should compile the code (which can take a while) and then run the unittests/comprehensive/test_CollideGradient.cpp code in the VSCode debugger (which is just a nice GUI on top of gdb).

Building and running all tests

If you'd like to build and run all tests at once, you can do this from the command line of the docker container:

cd build
make tests -j<num_jobs>
make test

Here's an explanation of what each command does:

  • make tests: builds all of the tests.
    • This builds whatever build variant (Debug, Release) VSCode is set to, because VSCode's CMake plugin creates the build folder automatically when the build variant is set in VSCode. Alternatively, you could run mkdir build && cd build && cmake .. yourself and pass any args you want to the cmake call.
  • -j<num_jobs>: tells Make to build the tests in parallel with num_jobs threads. Without num_jobs (just running -j by itself), Make places no limit on the number of parallel jobs. Each job can use GBs of RAM, so set num_jobs accordingly to prevent your machine from running out of memory and crashing. You can always start with -j1 or -j2. On my machine the build takes roughly 11 minutes with -j4.
  • make test: runs all of the tests. This takes around 1.5 hours on my machine.

This effectively does the same thing as the CI (see 6. Pushing changes to PyPI and the Website), but running the tests locally can be faster than waiting for CI, as long as your computer isn't ancient.
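If you only need one test, you don't have to build and run everything. Here's a rough sketch (test_CollideGradient is just an example; it assumes the CMake target shares the test's name, and uses CTest's -R flag to filter tests by name):

cd build
# Build just the one test target
make test_CollideGradient -j2
# Run it via CTest, printing output if it fails
ctest -R test_CollideGradient --output-on-failure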

Writing your own unit tests

  1. Under unittests, you can create your own unit test file under any of the folders (e.g., comprehensive, unit); there's a minimal sketch of this workflow after this list.
  2. Open up CMakeLists.txt under the folder you've created your file in.
  3. Add the line dart_add_test("<folder_name>" <test_name>) for your test within the list of dart_add_test lines at the beginning of the file (following alphabetical order).
  4. If your test requires additional libraries, you may need to additionally link that library by calling target_link_libraries. See CMakeLists.txt files for examples of how to do this. For instance, if you're loading a robot file, you probably need to link the dart-utils library (and also dart-utils-urdf if loading URDF files).
    • An example error you might see is undefined reference to 'dart::utils::UniversalLoader::loadWorld(std::string const&)'. You'll want to link dart-utils to fix this error.
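To make steps 1-3 concrete, here's a minimal sketch of adding a throwaway test under unittests/unit (the file name, test name, and contents are made up for illustration):

# From the repo root: create a minimal gtest file
cat > unittests/unit/test_MyFeature.cpp << 'EOF'
#include <gtest/gtest.h>

// A trivial sanity check, just to show the gtest skeleton
TEST(MyFeature, OnePlusOne)
{
  EXPECT_EQ(1 + 1, 2);
}
EOF
# Then add dart_add_test("unit" test_MyFeature) to unittests/unit/CMakeLists.txt
# and reconfigure, and the new test shows up in the CMake plugin like any other.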

Selecting the build variant CMAKE_BUILD_TYPE from VSCode:

You can build for Debug, Release, RelWithDebInfo, and MinSizeRel. These correspond to passing -DCMAKE_BUILD_TYPE=<variant> (e.g. -DCMAKE_BUILD_TYPE=Release) to cmake at the command line.

You can change variants by pressing shift+ctrl+p and finding "CMake: Select Variant". That command will let you choose which mode to build. You can also click on the part of the bottom bar that says CMake: [<build variant>] to select the variant.

Tips:

  • If you're trying to use the debugger and you're seeing a lot of "optimized-out" for variable values and you're having trouble setting breakpoints, you probably forgot to switch to Debug mode.
  • Debug is much faster to build than Release, but is ~100x slower at runtime.

Seeing what's going on (running the 3D visualizer):

If you want to see a GUI while you're working on C++, you'll need to launch our web GUI from source, connect to it in a browser, and then launch a C++ test that creates a GUI server to display progress.

To launch the web GUI, open up a terminal in VSCode, and run the following commands:

cd javascript
npm install
npm run dev

That last command will boot a server on http://localhost:9000. You should be able to open that URL up in a browser on your computer and see the GUI (this works because the Docker container you're working in is automatically set up to forward port 9000 to your local machine).

Once the web GUI is running, try booting up a test that runs a GUI server from the CMake plugin. For example, try right clicking on unittests/comprehensive/test_GUIWebsocketServer (in the CMake interface) and selecting "Run in Terminal".

Once that's running, now you should visit (or refresh) http://localhost:9000 and you should see an Atlas robot on your screen!

Building and running Python wheels:

You may want to build and run your own Python binary that reflects any code changes you make during development (e.g., modifying the core C++ code, or adding Python bindings). There are two ways to do this.

Building a Python binary locally

Building the Python binary

From the CMake plugin in VSCode, you can navigate to python/_nimblephysics/_nimblephysics, right click, and select "Build". This will build ONLY the _nimblephysics.so binary (with the build variant you chose), and will not package it up with Python. This is a good option if you're just trying to check whether your code compiles. If compilation succeeds, the resulting binary is created at build/python/_nimblephysics/_nimblephysics.so in your repo.
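If you'd rather build this target from a terminal than from the CMake plugin, something like the following should work (the _nimblephysics target name is an assumption based on the module name shown above; adjust it if your CMake target is named differently):

cd build
# Build only the Python binding module, not the packaged wheel
make _nimblephysics -j4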

Testing the Python binary

To test this binary locally, you can manually replace the binary in your installed copy of the nimblephysics production wheel (i.e., the one you get when you run pip install nimblephysics). The binary is located at:

<LIB_DIR>/lib/python3.8/site-packages/nimblephysics_libs/_nimblephysics.so

You can use the command pip3 show nimblephysics_libs (or it might be nimblephysics for you) to figure out the exact path. Here's an example of the output you might see:

Name: nimblephysics
Version: 0.6.15.1
Summary: A differentiable fully featured physics engine
Home-page: UNKNOWN
Author: Keenon Werling
Author-email: keenonwerling@gmail.com
License: MIT
Location: /home/<USERNAME>/.local/lib/python3.8/site-packages
Requires: torch, numpy
Required-by: 

The Location field tells you where the nimblephysics package is installed. Once you've figured out where the nimblephysics_libs package is, you can manually replace _nimblephysics.so there:

# Optional: Back up the original binary. Example command:
mv <LIB_DIR>/lib/python3.8/site-packages/nimblephysics_libs/_nimblephysics.so <LIB_DIR>/lib/python3.8/site-packages/nimblephysics_libs/_nimblephysics.old.so

# Now replace with the new _nimblephysics.so
mv build/python/_nimblephysics/_nimblephysics.so <LIB_DIR>/lib/python3.8/site-packages/nimblephysics_libs/

Now when you do import nimblephysics in Python, your custom binary is what gets executed.
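As a quick sanity check that Python is actually picking up the package you just modified (just a sketch, not a definitive test):

# Print where the package is imported from; it should match the Location you found above
python3 -c "import nimblephysics; print(nimblephysics.__file__)"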

Notes

  • This Python binary will only work on a machine with the same setup as the docker container (or when run within the docker container itself), because you built the binary inside the docker container. See Pushing changes to PyPI and the Website for instructions on how to push your changes so that a generic Python binary that works across multiple platforms gets built.
  • Currently, typecasting between Eigen (C++) and numpy (Python) seems to only work in Release mode, not debug mode.

Packaging Python wheels for others

You can also run ${PYTHON} setup.py sdist bdist_wheel from a VSCode terminal. This will build an entire Python wheel, similar to the ones we ship to PyPI. If you then run ${PYTHON} -m auditwheel repair NAME_OF_YOUR_WHEEL_HERE, auditwheel will bundle any dynamically linked libraries into the wheel alongside _nimblephysics.so (to make it portable), and your new wheel will be essentially the same as the ones our CI system generates for PyPI. You can move this wheel around, install it, send it to friends, etc.

Note: When installing the wheel, use --force-reinstall so that the new wheel overrides the previously installed one.
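Putting the whole packaging flow together, it looks roughly like this (the wheel filename patterns and the wheelhouse/ output directory are assumptions based on auditwheel's defaults; substitute the actual filenames your build prints):

# Build the source distribution and wheel (PYTHON points at the interpreter you're targeting)
${PYTHON} setup.py sdist bdist_wheel
# Bundle dynamically linked libraries into the wheel to make it portable
${PYTHON} -m auditwheel repair dist/nimblephysics-*.whl
# Install the repaired wheel, overriding any previously installed version
pip3 install --force-reinstall wheelhouse/nimblephysics-*.whl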