mindspore.dataset.vision.py_transforms.RandomAffine

class mindspore.dataset.vision.py_transforms.RandomAffine(degrees, translate=None, scale=None, shear=None, resample=Inter.NEAREST, fill_value=0)[source]

Apply random affine transformation to the input PIL Image.

Parameters
  • degrees (Union[float, Sequence[float, float]]) – Range of degrees to select from. If a float is provided, the rotation degree will be randomly selected from (-degrees, degrees). If Sequence[float, float] is provided, it must be in (min, max) order.

  • translate (Sequence[float, float], optional) – Sequence (tx, ty) of maximum absolute fractions for horizontal and vertical translations. The horizontal and vertical shifts are randomly selected from (-tx * width, tx * width) and (-ty * height, ty * height) respectively. Default: None, means no translation.

  • scale (Sequence[float, float], optional) – Range of scaling factor to select from. Default: None, means to keep the original scale.

  • shear (Union[float, Sequence[float, float], Sequence[float, float, float, float]], optional) – Range of shear factor to select from. If a float is provided, a shearing parallel to the X axis with a factor selected from (-shear, shear) will be applied. If Sequence[float, float] is provided, a shearing parallel to the X axis with a factor selected from (shear[0], shear[1]) will be applied. If Sequence[float, float, float, float] is provided, a shearing parallel to the X axis with a factor selected from (shear[0], shear[1]) and a shearing parallel to the Y axis with a factor selected from (shear[2], shear[3]) will be applied. Default: None, means no shearing.

  • resample (Inter, optional) –

    Method of interpolation. It can be Inter.BILINEAR, Inter.NEAREST or Inter.BICUBIC. If the input PIL Image is in mode of “1” or “P”, Inter.NEAREST will be used directly. Default: Inter.NEAREST.

    • Inter.BILINEAR, bilinear interpolation.

    • Inter.NEAREST, nearest-neighbor interpolation.

    • Inter.BICUBIC, bicubic interpolation.

  • fill_value (Union[int, tuple[int, int, int]], optional) – Pixel value for areas outside the transformed image. If an int is provided, it will be used for all RGB channels. If tuple[int, int, int] is provided, it will be used for the R, G, B channels respectively. Only supported with Pillow 5.0.0 and above. Default: 0.
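The sampling rules described above can be sketched in plain Python. Note that `sample_affine_params` is an illustrative helper written for this explanation, not part of the MindSpore API, and the actual internal sampling may differ:

```python
import random

def sample_affine_params(degrees, translate=None, scale=None, shear=None,
                         width=224, height=224):
    """Illustrative sketch of RandomAffine-style parameter sampling."""
    # degrees: a float means a symmetric range (-degrees, degrees)
    if isinstance(degrees, (int, float)):
        degrees = (-degrees, degrees)
    angle = random.uniform(degrees[0], degrees[1])

    # translate: fractions of image size, sampled symmetrically
    if translate is not None:
        tx = random.uniform(-translate[0], translate[0]) * width
        ty = random.uniform(-translate[1], translate[1]) * height
    else:
        tx, ty = 0.0, 0.0

    # scale: uniform in the given (min, max) range, else keep original scale
    scale_factor = random.uniform(scale[0], scale[1]) if scale else 1.0

    # shear: float -> X-axis only; 2 values -> X-axis range;
    # 4 values -> X-axis and Y-axis ranges
    if shear is None:
        shear_x, shear_y = 0.0, 0.0
    elif isinstance(shear, (int, float)):
        shear_x, shear_y = random.uniform(-shear, shear), 0.0
    elif len(shear) == 2:
        shear_x, shear_y = random.uniform(shear[0], shear[1]), 0.0
    else:
        shear_x = random.uniform(shear[0], shear[1])
        shear_y = random.uniform(shear[2], shear[3])

    return angle, (tx, ty), scale_factor, (shear_x, shear_y)
```

For example, `sample_affine_params(15, translate=(0.1, 0.1), scale=(0.9, 1.1), shear=10)` yields a rotation angle in (-15, 15), pixel shifts within 10% of the image size, a scale in (0.9, 1.1), and an X-axis shear in (-10, 10).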

Supported Platforms:

CPU

Examples

>>> import mindspore.dataset.vision.py_transforms as py_vision
>>> from mindspore.dataset.transforms.py_transforms import Compose
>>>
>>> transforms_list = Compose([py_vision.Decode(),
...                            py_vision.RandomAffine(degrees=15, translate=(0.1, 0.1), scale=(0.9, 1.1)),
...                            py_vision.ToTensor()])
>>> # apply the transform to dataset through map function
>>> image_folder_dataset = image_folder_dataset.map(operations=transforms_list,
...                                                 input_columns="image")