Assuming you don't read one file for two hours, but instead read many files, each of which can be processed in under five minutes, you can use Azure Functions on the Consumption plan. So the first criterion to consider is whether that holds, or whether you can slice your FTP work into more, smaller requests.
Azure Functions use the same SDK as WebJobs under the hood, but you get the benefit of faster time to running code, and you worry less about managing them. That is good for the general case; if you want more control, WebJobs provide it. At the other end of the spectrum, for full control, you can use Azure VMs.
This answer gives you a nice overview of Functions vs WebJobs.
One more idea: if you are willing to move your CSV files, you could use Azure Data Lake in combination with U-SQL.
I am not really sure how your current pipeline receives data, but you can set up a pipeline that uses Azure Functions to store all your data in Data Lake. Since you store files that you receive over FTP, you don't need a long-running Azure Function. You could run an Azure Function on the Consumption plan that stores only a small batch of data in Data Lake per run, so that it doesn't time out.
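As a rough sketch of that idea, here is what one invocation of such a function might do (you would wire it up behind a timer trigger). It uses the `azure-datalake-store` package for Data Lake Store Gen1; the FTP host, credentials, store name, folder paths, and `BATCH_SIZE` are all placeholders you would replace with your own values:

```python
import ftplib
from azure.datalake.store import core, lib

BATCH_SIZE = 20  # files per invocation; tune so one run stays well under the timeout


def copy_batch_to_data_lake():
    # Authenticate against Azure Data Lake Store with a service principal.
    token = lib.auth(tenant_id="<tenant-id>",
                     client_id="<app-id>",
                     client_secret="<app-secret>")
    adl = core.AzureDLFileSystem(token, store_name="<your-adls-account>")

    ftp = ftplib.FTP("ftp.example.com")  # placeholder host
    ftp.login("<user>", "<password>")

    # Take only a small slice of the pending files each run,
    # so a single invocation never approaches the timeout.
    for name in ftp.nlst("/incoming")[:BATCH_SIZE]:
        chunks = []
        ftp.retrbinary("RETR " + name, chunks.append)
        # Land the raw CSV in Data Lake for U-SQL to pick up later.
        with adl.open("/raw/" + name.split("/")[-1], "wb") as out:
            out.write(b"".join(chunks))
        ftp.delete(name)  # one simple way to mark the file as processed

    ftp.quit()
```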
You can then prepare the data using the analytical capabilities of U-SQL and the other Data Lake Analytics services, and load it into an Azure SQL Database with UDOs or Azure Data Factory. A good thing about U-SQL is that, as with Azure Functions, you pay for compute only when you use it, so the whole pipeline can run on "serverless" computing.
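For illustration, a minimal U-SQL job over the files the Function landed might look like this; the schema, paths, and cleanup rule are invented and would depend on what your CSVs actually contain:

```
@raw =
    EXTRACT id int,
            name string,
            amount decimal
    FROM "/raw/{*}.csv"              // file set: every CSV the Function landed
    USING Extractors.Csv(skipFirstNRows: 1);

@cleaned =
    SELECT id,
           name.Trim() AS name,      // C# string methods work inline in U-SQL
           amount
    FROM @raw
    WHERE amount > 0;                // hypothetical cleanup rule

OUTPUT @cleaned
TO "/curated/cleaned.csv"
USING Outputters.Csv(outputHeader: true);
```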