I'm writing a Python program that reads the colour of each pixel of an image and then writes a file containing, for each pixel, an ASCII character of similar brightness (a dot if the original pixel is dark, and a "#" if it is bright). I realised that ASCII characters are much larger than single pixels, so the resulting ASCII art would be too large to display on a normal monitor.
Now I want to apply this process to only one in every 10 pixels. How should I do this? How can I set the step of my for loop to something other than 1 (the default)?
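To make the question concrete, here is a minimal sketch of the process described above. It uses a small hard-coded grid of brightness values (0-255) in place of real image data, since in the actual program these would come from an image library's per-pixel reads; the grid and the threshold of 128 are illustrative assumptions.

```python
# Hypothetical stand-in for per-pixel brightness reads from an image
# (0 = black, 255 = white).
pixels = [
    [0,   40,  200, 255],
    [30,  180, 220, 10],
    [250, 90,  60,  130],
]

lines = []
for y in range(len(pixels)):          # range(...) with the default step, 1
    line = ""
    for x in range(len(pixels[y])):   # this is the loop I want to thin out
        # Dark pixels become ".", bright pixels become "#".
        line += "." if pixels[y][x] < 128 else "#"
    lines.append(line)

ascii_art = "\n".join(lines)
print(ascii_art)
```

Both loops currently visit every index; the question is how to make `range` advance by 10 instead of 1.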