Commit

Fixed issue where _create_*_transformers sometimes would not return a value
stewarthe6 committed Dec 11, 2024
1 parent 4689f9e commit cd14bd9
Showing 1 changed file with 5 additions and 1 deletion.
6 changes: 5 additions & 1 deletion atomsci/ddm/pipeline/model_wrapper.py
@@ -338,6 +338,8 @@ def _create_output_transformers(self, dataset):
         # TODO: Just a warning, we may have response transformers for classification datasets in the future
         if self.params.prediction_type=='regression' and self.params.transformers is True:
             return [trans.NormalizationTransformerMissingData(transform_y=True, dataset=dataset)]
+        else:
+            return []

# ****************************************************************************************

@@ -1612,7 +1614,9 @@ def _create_output_transformers(self, dataset):
         """
         # TODO: Just a warning, we may have response transformers for classification datasets in the future
         if self.params.prediction_type=='regression' and self.params.transformers is True:
-            self.transformers = [trans.NormalizationTransformerHybrid(transform_y=True, dataset=dataset)]
+            return [trans.NormalizationTransformerHybrid(transform_y=True, dataset=dataset)]
+        else:
+            return []

# ****************************************************************************************
class ForestModelWrapper(ModelWrapper):
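For context on why this fix matters, here is a minimal standalone illustration (hypothetical function names, not the actual pipeline code): a Python function that only returns in one branch implicitly returns None in the other, which breaks callers expecting a list.

```python
def create_transformers_buggy(prediction_type, use_transformers):
    # Pre-fix shape: no explicit else, so non-regression calls
    # fall off the end of the function and return None.
    if prediction_type == 'regression' and use_transformers:
        return ['normalizer']

def create_transformers_fixed(prediction_type, use_transformers):
    # Post-fix shape: every code path returns a list.
    if prediction_type == 'regression' and use_transformers:
        return ['normalizer']
    else:
        return []

print(create_transformers_buggy('classification', True))   # None
print(create_transformers_fixed('classification', True))   # []
# Iterating over the buggy result raises TypeError: 'NoneType' object is not iterable.
```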

2 comments on commit cd14bd9

@paulsonak (Collaborator)

Is returning a list, instead of setting self.transformers to the list, OK for this HybridModel class? The function docstring specifically describes the old behavior (overwriting self.transformers), but this is apparently different from the way the other model classes work. We need to double-check that whatever calls this function has also been updated to receive the returned list and set the transformers there.

@stewarthe6 (Collaborator, Author)

Yeah, it's supposed to return transformers now. I haven't had time to go back and update the documentation yet.

for k, td in training_datasets.items():

You can see there that the returned list of transformers gets assigned to the correct fold in the calling function.
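The calling pattern described above can be sketched as follows. This is a minimal, simplified stand-in, not the actual AMPL pipeline code: the `transformers` dict keyed by fold, and the names `build_transformers` and `normalizer_for_*`, are assumptions based on the quoted `training_datasets` loop.

```python
class ModelWrapperSketch:
    """Illustrates why _create_output_transformers must always return a
    list: the caller assigns the return value directly to a fold key, so
    a branch with no return would silently store None instead."""

    def __init__(self, prediction_type, use_transformers):
        self.prediction_type = prediction_type
        self.use_transformers = use_transformers
        self.transformers = {}

    def _create_output_transformers(self, dataset):
        # After the fix: every branch returns a list.
        if self.prediction_type == 'regression' and self.use_transformers:
            return [f"normalizer_for_{dataset}"]
        else:
            return []

    def build_transformers(self, training_datasets):
        # Mirrors the quoted loop: the returned list is assigned
        # to the matching fold in the calling function.
        for k, td in training_datasets.items():
            self.transformers[k] = self._create_output_transformers(td)


wrapper = ModelWrapperSketch('classification', True)
wrapper.build_transformers({'fold_0': 'ds0', 'fold_1': 'ds1'})
# Every fold maps to a list (here empty), never None.
print(wrapper.transformers)
```

Because classification runs now receive an empty list rather than None, downstream code can iterate over each fold's transformers unconditionally.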
