You can build the full list of page URLs by expanding the templates into another list. The code is below; hope this is what you're looking for.
final_urls = []
start_urls = [
    'https://www.xxxxxxx.com.au/home-garden/page-%s/c18397',
    'https://www.xxxxxxx.com.au/automotive/page-%s/c21159',
    'https://www.xxxxxxx.com.au/garden/page-%s/c25449',
]
# Substitute page numbers 1-49 into each template; the outer loop is
# the page number, so the result is grouped page by page.
final_urls.extend(url % page for page in range(1, 50) for url in start_urls)
Output snippet:
final_urls[1:20]
['https://www.xxxxxxx.com.au/automotive/page-1/c21159',
'https://www.xxxxxxx.com.au/garden/page-1/c25449',
'https://www.xxxxxxx.com.au/home-garden/page-2/c18397',
'https://www.xxxxxxx.com.au/automotive/page-2/c21159',
'https://www.xxxxxxx.com.au/garden/page-2/c25449',
'https://www.xxxxxxx.com.au/home-garden/page-3/c18397',
'https://www.xxxxxxx.com.au/automotive/page-3/c21159',
'https://www.xxxxxxx.com.au/garden/page-3/c25449',
'https://www.xxxxxxx.com.au/home-garden/page-4/c18397',
'https://www.xxxxxxx.com.au/automotive/page-4/c21159',
'https://www.xxxxxxx.com.au/garden/page-4/c25449',
'https://www.xxxxxxx.com.au/home-garden/page-5/c18397',
'https://www.xxxxxxx.com.au/automotive/page-5/c21159',
'https://www.xxxxxxx.com.au/garden/page-5/c25449',
'https://www.xxxxxxx.com.au/home-garden/page-6/c18397',
'https://www.xxxxxxx.com.au/automotive/page-6/c21159',
'https://www.xxxxxxx.com.au/garden/page-6/c25449',
'https://www.xxxxxxx.com.au/home-garden/page-7/c18397',
'https://www.xxxxxxx.com.au/automotive/page-7/c21159']
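If the double-`for` generator expression looks opaque, it is equivalent to the explicit nested loop below. The `example.com` templates are placeholders standing in for the real category URLs:

```python
# Placeholder templates standing in for the real category URLs.
start_urls = [
    'https://example.com/home-garden/page-%s/c18397',
    'https://example.com/automotive/page-%s/c21159',
    'https://example.com/garden/page-%s/c25449',
]

final_urls = []
for page in range(1, 50):        # pages 1 through 49 (outer loop)
    for url in start_urls:       # each category template (inner loop)
        final_urls.append(url % page)

# 49 pages x 3 templates = 147 URLs, grouped page by page,
# which matches the ordering in the output snippet above.
```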
Regarding your latest enquiry, have you tried this?
def start_requests(self):
    # Yield one request per generated URL. Doing this in parse() would
    # re-enqueue every URL for every response (Scrapy's default callback
    # is parse), so start_requests is the right place for the seed URLs.
    for link in final_urls:
        yield scrapy.Request(link, callback=self.parse)