According to the Scrapy docs, the CloseSpider exception can only be raised from a spider callback (by default, the parse method). Raising it in a pipeline will crash the spider. To achieve a similar result from a pipeline, you can initiate a shutdown signal that closes Scrapy gracefully. (Note: the snippet below relies on scrapy.project, which was removed in Scrapy 1.0; on current versions, call spider.crawler.engine.close_spider(spider, 'some reason') from process_item instead.)
from scrapy.project import crawler  # pre-1.0 Scrapy only; removed in later releases
crawler._signal_shutdown(9, 0)
Keep in mind that Scrapy may still process requests that were already fired, or even just scheduled, after the shutdown signal is initiated.
To do it from the spider instead, set a flag on the spider from the pipeline, like this:
def process_item(self, item, spider):
    if some_condition_is_met:
        spider.close_manually = True
    return item  # pipelines must return the item (or raise DropItem)
After this, you can raise CloseSpider in your spider's callback:
from scrapy.exceptions import CloseSpider

def parse(self, response):
    # getattr guards against the flag not having been set yet
    if getattr(self, 'close_manually', False):
        raise CloseSpider('Already been scraped.')
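To see the hand-off between the two pieces without running a crawl, here is a minimal sketch of the flag pattern using plain classes as stand-ins for the Scrapy objects. CloseSpider, DedupPipeline, MySpider, and the already_seen key are all hypothetical names for illustration; in a real project you would import CloseSpider from scrapy.exceptions and let the engine drive parse().

```python
class CloseSpider(Exception):
    """Stand-in for scrapy.exceptions.CloseSpider."""

class DedupPipeline:
    def process_item(self, item, spider):
        if item.get('already_seen'):        # some_condition_is_met
            spider.close_manually = True    # flag the spider to stop
        return item

class MySpider:
    close_manually = False                  # default, so parse() is always safe

    def parse(self, response):
        if self.close_manually:
            raise CloseSpider('Already been scraped.')
        return response

# The pipeline sets the flag; the next callback invocation raises.
spider = MySpider()
pipeline = DedupPipeline()
pipeline.process_item({'already_seen': True}, spider)
try:
    spider.parse('next-page')
except CloseSpider as exc:
    print(exc)  # Already been scraped.
```

The point of the class-level close_manually = False default is that the callback can check the flag on every response without worrying about whether the pipeline has run yet.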