-
Hi, first of all thanks for your work! I am trying to use Theseus essentially for its PyTorch implementation of Levenberg-Marquardt. I know the implementation is different, but I am trying to use it in the same context as `scipy.optimize.least_squares`. I have a residuals function that returns the errors and a jacobian function, both written in PyTorch. I followed the tutorials (especially tutorial 3) to create "Theseus wrappers" for my functions, so I have a single cost function for my whole optimization problem that wraps my residuals and jacobian functions.

I run into a problem during optimization in `dense_linearization.py`, at the line `self.A[:, row_slice, col_slice] = var_jacobian`. Looking at the shapes in `_linearize_jacobian_impl`, everything seems coherent except for the number of columns of `A`, which seems to come from the `Linearization` class initialization: `self.num_cols = col_counter`. The number of columns seems to come from the dimensions of the optimization variables. Shouldn't it be defined with respect to the returned error dimension? I am probably misunderstanding something. Is this behavior normal and did I define my cost function wrongly? Thanks for your help!
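To make this concrete, here is a minimal sketch of what my wrapper looks like, following the custom cost function pattern from tutorial 3 (assuming a recent Theseus version where variables expose `.tensor`; `residual_fn`, `jac_fn`, `ERR_DIM` and the 2-dof variable are toy stand-ins for my actual functions):

```python
import torch
import theseus as th

ERR_DIM = 3  # stand-in: dimension of my residual vector

def residual_fn(x: torch.Tensor) -> torch.Tensor:
    # Toy stand-in for my real residuals: (batch_size, 2) -> (batch_size, ERR_DIM).
    return torch.stack([x[:, 0] - 1.0, x[:, 1] + 2.0, x[:, 0] * x[:, 1]], dim=1)

def jac_fn(x: torch.Tensor) -> torch.Tensor:
    # Toy stand-in for my real jacobian: (batch_size, 2) -> (batch_size, ERR_DIM, 2).
    jac = x.new_zeros(x.shape[0], ERR_DIM, 2)
    jac[:, 0, 0] = 1.0
    jac[:, 1, 1] = 1.0
    jac[:, 2, 0] = x[:, 1]
    jac[:, 2, 1] = x[:, 0]
    return jac

class WrappedCost(th.CostFunction):
    def __init__(self, var: th.Vector, cost_weight: th.CostWeight, name=None):
        super().__init__(cost_weight, name=name)
        self.var = var
        self.register_optim_vars(["var"])

    def error(self) -> torch.Tensor:
        # Error of shape (batch_size, ERR_DIM).
        return residual_fn(self.var.tensor)

    def jacobians(self):
        # One jacobian per optimization variable, plus the error.
        return [jac_fn(self.var.tensor)], self.error()

    def dim(self) -> int:
        return ERR_DIM

    def _copy_impl(self, new_name=None):
        return WrappedCost(self.var.copy(), self.weight.copy(), name=new_name)
```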
-
Hi @HassanLH. Thanks for your message! Our jacobians are expected to be of shape `(batch_size, cost_function.dim(), var.dof())`, with one such tensor per optimization variable. So the rows of `A` correspond to the error dimensions, and its columns to the degrees of freedom of the optimization variables; `num_cols` being derived from the variable dimensions is the expected behavior.
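For example, here is a minimal sanity check of that convention using an `AutoDiffCostFunction` (the 2-dof variable, the 3-dimensional error function, and the batch size are all made up for illustration):

```python
import torch
import theseus as th

x = th.Vector(dof=2, name="x")

def err_fn(optim_vars, aux_vars):
    (v,) = optim_vars
    t = v.tensor  # shape (batch_size, 2)
    # Three residuals per batch element -> error of shape (batch_size, 3).
    return torch.stack([t[:, 0] - 1.0, t[:, 1] + 2.0, t[:, 0] * t[:, 1]], dim=1)

w = th.ScaleCostWeight(1.0)
cost = th.AutoDiffCostFunction([x], err_fn, 3, cost_weight=w, name="my_cost")
x.update(torch.randn(4, 2))  # batch_size = 4

jacs, err = cost.jacobians()
print(err.shape)      # torch.Size([4, 3])    -> (batch_size, cost.dim())
print(jacs[0].shape)  # torch.Size([4, 3, 2]) -> (batch_size, cost.dim(), x.dof())
```

If a manually written `jacobians()` returns tensors whose last dimension doesn't match the variable's dof, the assignment `self.A[:, row_slice, col_slice] = var_jacobian` in `dense_linearization.py` will fail with exactly this kind of shape mismatch.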
-
`cost_function.dim()` should return the error's dimension. If you have some minimal code to reproduce this error, I can help you debug.