
I want to build a web crawler. I'm currently reading a text file with 12,000 URLs, and I want to fetch them concurrently, but the requests don't work.

typealias escHandler = (URLResponse?, Data?) -> Void

func getRequest(url: URL, _ handler: @escaping escHandler) {
    let session = URLSession(
        configuration: .default,
        delegate: nil,
        delegateQueue: nil)

    var request = URLRequest(url: url)
    request.httpMethod = "GET"

    let task = session.dataTask(with: request) { (data, response, error) in
        handler(response, data)
    }
    task.resume()
}


for sUrl in textFile.components(separatedBy: "\n") {
    let url = URL(string: sUrl)!

    getRequest(url: url) { response, data in
        print("RESPONSE REACHED")
    }
}
eduardo

1 Answer


If you have your URLSession working correctly, all you need to do is create a separate OperationQueue, create an Operation for each of the async tasks you want completed, add it to the operation queue, and set the queue's maxConcurrentOperationCount to control how many of your tasks can run at one time. Pseudo code:

let operationQueue = OperationQueue()
operationQueue.qualityOfService = .utility
operationQueue.maxConcurrentOperationCount = 4 // limit how many run at once

let exOperation = BlockOperation(block: {
    // Your URLSession work goes here.
})
exOperation.completionBlock = {
    // A completion block, if needed
}

// Adding the operation to the queue starts it automatically;
// do not also call exOperation.start() yourself.
operationQueue.addOperation(exOperation)

Using an OperationQueue subclass and an Operation subclass will give you additional utilities for dealing with multiple threads.
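Because the URLSession task is itself asynchronous, a plain BlockOperation finishes as soon as it has *started* the request, so maxConcurrentOperationCount would not actually limit in-flight requests. A minimal sketch of an asynchronous Operation subclass that stays "executing" until the network callback fires (the class name NetworkOperation and the limit of 4 are illustrative assumptions, not from the original post):

```swift
import Foundation

// Illustrative sketch: an Operation that remains executing until its
// URLSession task completes, so the queue's maxConcurrentOperationCount
// genuinely bounds the number of simultaneous requests.
final class NetworkOperation: Operation {
    private let url: URL
    private let handler: (URLResponse?, Data?) -> Void

    private var _executing = false
    private var _finished = false

    override var isAsynchronous: Bool { return true }
    override var isExecuting: Bool { return _executing }
    override var isFinished: Bool { return _finished }

    init(url: URL, handler: @escaping (URLResponse?, Data?) -> Void) {
        self.url = url
        self.handler = handler
    }

    override func start() {
        guard !isCancelled else { finish(); return }
        willChangeValue(forKey: "isExecuting")
        _executing = true
        didChangeValue(forKey: "isExecuting")

        let task = URLSession.shared.dataTask(with: url) { data, response, _ in
            self.handler(response, data)
            self.finish()   // only now does the operation count as done
        }
        task.resume()
    }

    private func finish() {
        // Manual KVO notifications are required for asynchronous operations.
        willChangeValue(forKey: "isExecuting")
        willChangeValue(forKey: "isFinished")
        _executing = false
        _finished = true
        didChangeValue(forKey: "isExecuting")
        didChangeValue(forKey: "isFinished")
    }
}

// Usage with the question's 12,000-URL file: cap concurrency so the
// requests don't all fire at once.
let queue = OperationQueue()
queue.maxConcurrentOperationCount = 4
for sUrl in textFile.components(separatedBy: "\n") {
    guard let url = URL(string: sUrl) else { continue }  // skip blank/bad lines
    queue.addOperation(NetworkOperation(url: url) { response, data in
        print("RESPONSE REACHED")
    })
}
```

The guard also avoids the force-unwrap crash the question's loop would hit on an empty trailing line.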

Jacob Boyd
  • thank you Jacob, your suggestion fits perfectly for this problem. – eduardo Nov 10 '16 at 00:12
  • @eduardo - It's not that simple because these tasks you're adding to the queue are, themselves, asynchronous. So you have to create asynchronous operation. See http://stackoverflow.com/a/40560463/1271826. – Rob Nov 12 '16 at 10:22