I'm learning about image denoising and PyTorch. I want to generate a burst of images from a single image. For example, I have an image and randomly crop a patch of a specific size from it. Then I want to add a 1- or 2-pixel shift to it to get a new image with a tiny difference. What could I do? Is it better to use some technique in PIL or something else?
- Possible duplicate of [Data Augmentation in PyTorch](https://stackoverflow.com/questions/51677788/data-augmentation-in-pytorch) – Paul92 Apr 27 '19 at 14:47
- I have thought about it, but it seemed difficult to restrict the misalignment to 1 or 2 pixels. In other words, I want the first crop position to be random, but the next crops to be near the first one. I don't know how augmentation in PyTorch could do that. – Hanlin Apr 27 '19 at 15:07
1 Answer
You should use torchvision's `transforms` to do some image augmentation for your problem. As your comment notes, you can restrict `translate=(a, b)` to get tiny random shifts in both dimensions:

```python
import PIL.Image
import torchvision.transforms as transforms

# a and b are fractions of the image width/height, both in [0, 1];
# e.g. a = b = 2 / 256 caps the shift at 2 pixels on a 256-px image.
a = b = 2 / 256
transform = transforms.RandomAffine(degrees=0, translate=(a, b),
                                    scale=None, shear=None,
                                    resample=False, fillcolor=0)

img = PIL.Image.open('path/img')
new_img = transform(img)
```

If you want to perform more transforms such as a crop as well, group them all into one big transform using `transforms.Compose`. Here is your reference.
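Independent of any library, the burst idea from the question — one random base patch, then further patches whose corners are jittered by at most a couple of pixels — can be sketched in pure Python (`burst_crops` is an illustrative helper, not a torchvision API; a 2-D list stands in for the image):

```python
import random

def burst_crops(img, patch, n, max_shift=2, seed=0):
    """Crop one random base patch, then n-1 more patches whose
    top-left corners are jittered by at most max_shift pixels."""
    rng = random.Random(seed)
    h, w = len(img), len(img[0])
    # Keep the base corner max_shift away from every border so all
    # jittered crops stay inside the image.
    top = rng.randint(max_shift, h - patch - max_shift)
    left = rng.randint(max_shift, w - patch - max_shift)
    crops = []
    for i in range(n):
        dy = dx = 0
        if i > 0:  # first crop is the reference, the rest are shifted
            dy = rng.randint(-max_shift, max_shift)
            dx = rng.randint(-max_shift, max_shift)
        crops.append([row[left + dx:left + dx + patch]
                      for row in img[top + dy:top + dy + patch]])
    return crops

# Toy 16x16 "image" whose pixel value encodes its coordinates,
# so the shift between crops is easy to read off.
img = [[y * 16 + x for x in range(16)] for y in range(16)]
burst = burst_crops(img, patch=8, n=4)
```

The same loop works with `PIL.Image.crop((left+dx, top+dy, left+dx+patch, top+dy+patch))` on a real image.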

David Ng
- I cannot restrict translate = (2, 2). It raises the error "ValueError: translation values should be between 0 and 1". – Hanlin Apr 28 '19 at 03:35
- I just edited the answer (my mistake) to `translate = (a, b)`, as the doc indicates `-img_width * a < dx < img_width * a`. So you can compute the value of `a` that matches `dx = 2 pixels`; same for `dy`. – David Ng Apr 28 '19 at 03:48
- I am confused about why a and b should be multiplied by img_width or img_height. If -img_width < dx < img_width, won't the image be shifted completely out of place, leaving a whole black or white image? Anyway, I finally used PIL to achieve this, and after getting these image data I will transform them to tensors. Thank you anyway! – Hanlin Apr 29 '19 at 06:57
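To clear up the last comment: `translate=(a, b)` takes *fractions* of the image size, so the maximum shift is `img_width * a` pixels — the image is only shifted by its full width if `a = 1`. The arithmetic for a 2-pixel cap can be sketched in pure Python (`translate_fraction` is an illustrative helper, not part of torchvision):

```python
def translate_fraction(max_shift_px, size_px):
    """Fraction to pass to RandomAffine's translate so the random
    shift is at most max_shift_px on an image of size size_px."""
    frac = max_shift_px / size_px
    # RandomAffine rejects fractions outside [0, 1] with a ValueError.
    if not 0.0 <= frac <= 1.0:
        raise ValueError("translation fraction should be between 0 and 1")
    return frac

a = translate_fraction(2, 256)  # width 256  -> a = 2/256
b = translate_fraction(2, 256)  # height 256 -> b = 2/256
```

With these values, `RandomAffine(degrees=0, translate=(a, b))` draws `dx` uniformly from `[-2, 2]` pixels, which is exactly the 1–2 pixel misalignment the question asks for.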