bump albumentation to >2 and fix breaking changes #3174
arashsm79 wants to merge 1 commit into DeepLabCut:main from
Conversation
- Fix name and signature changes in albumentations 2.x
Pull request overview
This PR updates the albumentations dependency from version <=1.4.3 to >=2 and addresses all breaking API changes. The migration includes parameter name changes, method signature updates, and consistent handling of keypoints/bboxes as numpy arrays throughout the codebase.
Changes:
- Updated albumentations dependency to >=2 in setup.py and requirements.txt
- Migrated all deprecated API calls (always_apply → p=1.0, get_params_dependent_on_targets → get_params_dependent_on_data, etc.)
- Changed keypoint and bbox handling to consistently use numpy arrays instead of lists
Reviewed changes
Copilot reviewed 5 out of 6 changed files in this pull request and generated 3 comments.
Show a summary per file
| File | Description |
|---|---|
| setup.py | Updated albumentations constraint from <=1.4.3 to >=2 |
| requirements.txt | Updated albumentations constraint from <=1.4.3 to >=2 |
| deeplabcut/pose_estimation_pytorch/data/transforms.py | Comprehensive migration of all albumentations API calls to v2 syntax |
| deeplabcut/pose_estimation_pytorch/data/utils.py | Convert keypoints to numpy array before passing to transforms |
| tests/pose_estimation_pytorch/data/test_transforms.py | Updated keypoint format to use lists (converted to arrays internally) and adjusted precision tolerance |
| tests/pose_estimation_pytorch/other/test_custom_transforms.py | Updated keypoint format to numpy array |
Comments suppressed due to low confidence (2)
deeplabcut/pose_estimation_pytorch/data/transforms.py:485
- The `ElasticTransform` class still passes the deprecated `always_apply` parameter to the parent `A.ElasticTransform` class. In albumentations v2, this parameter has been removed and should be converted to `p=1.0 if always_apply else p` before calling `super().__init__()`, similar to other transforms in this file (see the Grayscale, CoarseDropout, and ScaleToUnitRange examples).
super().__init__(
    alpha,
    sigma,
    alpha_affine,
    interpolation,
    border_mode,
    value,
    mask_value,
    always_apply,
    approximate,
    same_dxdy,
    p,
)
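The conversion pattern the reviewer refers to can be sketched as follows. The base class below is an illustrative stand-in for the albumentations transform base classes, so the example stays self-contained; the class and argument names are not the PR's actual code:

```python
class BaseTransform:
    """Illustrative stand-in for an albumentations v2 transform base class."""

    def __init__(self, p=0.5):
        self.p = p


class ElasticTransformCompat(BaseTransform):
    # albumentations v2 removed `always_apply`; callers that still pass it
    # keep the old semantics by folding it into the apply probability `p`.
    def __init__(self, always_apply=False, p=0.5):
        super().__init__(p=1.0 if always_apply else p)
```

With this shim, `always_apply=True` forces the transform to fire on every call, matching the v1 behavior, while `always_apply=False` leaves the given `p` untouched.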
deeplabcut/pose_estimation_pytorch/data/transforms.py:437
- Empty error message in ValueError. Should provide a descriptive message such as '`alpha` must be a float, int, or tuple of two numbers.' to help users understand what went wrong.
raise ValueError("")
| min_width=crop_sampling["width"], | ||
| border_mode=cv2.BORDER_CONSTANT, | ||
| always_apply=True, | ||
| p=1.0 #always apply |
Missing space after the '#' symbol in the comment. Should be '# always apply'.
| p=1.0 #always apply | |
| p=1.0 # always apply |
| bboxes_out = bbox_xyxy | ||
| if bboxes_extra is not None: | ||
| bboxes_out = [bbox + extra for bbox, extra in zip(bboxes_out, bboxes_extra)] | ||
| bboxes_out = np.column_stack([bbox_xyxy] + [np.array(bboxes_extra)]) |
The bboxes_extra stacking logic is incorrect. bboxes_extra is already a list of arrays/tuples from line 620. Wrapping it in [np.array(bboxes_extra)] creates a 2D array instead of properly stacking the extra columns. This should be np.column_stack([bbox_xyxy, bboxes_extra]) or similar to correctly append the extra columns.
| bboxes_out = np.column_stack([bbox_xyxy] + [np.array(bboxes_extra)]) | |
| bboxes_extra_arr = np.asarray(bboxes_extra) | |
| bboxes_out = np.concatenate([bbox_xyxy, bboxes_extra_arr], axis=1) |
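The suggested fix can be checked with a small numpy sketch. The shapes are assumptions based on the discussion (`bbox_xyxy` is an `(N, 4)` array of boxes and `bboxes_extra` holds extra per-box values), not the actual DeepLabCut data:

```python
import numpy as np

bbox_xyxy = np.array([[0.0, 0.0, 10.0, 10.0],
                      [5.0, 5.0, 20.0, 20.0]])  # (N, 4) boxes
bboxes_extra = [(0.9,), (0.8,)]                 # one extra column per box

# Append the extra columns so each output row is [x1, y1, x2, y2, extra...].
bboxes_extra_arr = np.asarray(bboxes_extra)     # -> shape (N, 1)
bboxes_out = np.concatenate([bbox_xyxy, bboxes_extra_arr], axis=1)
print(bboxes_out.shape)  # (2, 5)
```

`np.concatenate(..., axis=1)` makes the column-append intent explicit and fails loudly if the row counts disagree, which is why the reviewer prefers it over wrapping the list in an extra array before stacking.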
| class ScaleToUnitRange(A.ImageOnlyTransform): | ||
| def __init__(self, always_apply=True, p=1.0): | ||
| super().__init__(always_apply=always_apply, p=p) | ||
| super().__init__(p= 1.0 if always_apply else p) |
Extra space after '=' in 'p= 1.0'. Should be 'p=1.0' to follow consistent formatting with other lines in the file (e.g., lines 429, 549).
| super().__init__(p= 1.0 if always_apply else p) | |
| super().__init__(p=1.0 if always_apply else p) |
We may need some additional tests for ElasticTransform; I'm not sure it's being tested anywhere. (@deruyter92)