In SOA, you can adapt the BizTalk or SAP BusinessObjects Data Integrator style of processing. Basically, it is a scheduled job / Windows service, or something similar. You provide two service endpoints: one for the scheduler to retrieve the data from, and another for the scheduler to send the data to. The scheduler's only responsibility is to run periodically and transform the data.
So the basic steps are as follows (a minimal code sketch follows the steps):
Step 1: The scheduler runs and gets the data from Service A
Scheduler --get--> Service A
Service A --data--> Scheduler
Step 2: The scheduler performs the data transformation
[ Conversion --> Conversion --> Conversion --> Conversion ]
Step 3: The scheduler sends the data to the other service
Scheduler --data--> Service B
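Putting the three steps together, here is a minimal sketch of such a scheduler job in Python. The endpoint URLs, the JSON payloads, the `transform_record` helper, and the use of the `requests` library are all my assumptions for illustration; nothing here is BizTalk- or Data Integrator-specific.

```python
# Minimal sketch of the scheduler job, assuming both services expose
# HTTP/JSON endpoints. The URLs and transform_record are hypothetical.
import requests

SERVICE_A_URL = "https://service-a.example.com/api/orders"  # source
SERVICE_B_URL = "https://service-b.example.com/api/orders"  # destination

def transform_record(record):
    # Step 2: the conversion chain; a trivial field mapping stands in
    # for whatever conversions your integration really needs.
    return {"order_id": record["id"], "total": record["amount"]}

def run_once():
    # Step 1: the scheduler runs and gets the data from Service A.
    response = requests.get(SERVICE_A_URL, timeout=30)
    response.raise_for_status()
    records = response.json()

    # Step 2: data transformation.
    transformed = [transform_record(r) for r in records]

    # Step 3: send the transformed data to Service B.
    result = requests.post(SERVICE_B_URL, json=transformed, timeout=30)
    result.raise_for_status()

if __name__ == "__main__":
    run_once()  # triggered periodically by cron / Windows Task Scheduler
```

Note that the scheduling itself stays outside the script: cron, Windows Task Scheduler, or a Windows service wrapping a timer.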
In both BizTalk and SAP BusinessObjects Data Integrator, these steps are configurable (they can retrieve data from any service and can apply scripted data transformations), so the approach is more flexible.
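You can approximate that configurability in plain code by driving the conversion chain from data instead of hard-coding it. A sketch; the step names and the `TRANSFORMS` registry are made up for illustration:

```python
# Hypothetical config-driven pipeline: the conversion chain is data, not code.
TRANSFORMS = {
    "trim": lambda r: {k: v.strip() if isinstance(v, str) else v for k, v in r.items()},
    "uppercase_codes": lambda r: {**r, "code": r["code"].upper()},
}

# This list could come from a config file, much like the steps you
# configure in BizTalk or Data Integrator.
PIPELINE = ["trim", "uppercase_codes"]

def run_pipeline(record):
    for step in PIPELINE:
        record = TRANSFORMS[step](record)
    return record

print(run_pipeline({"code": " ab12 ", "name": " Widget "}))
# {'code': 'AB12', 'name': 'Widget'}
```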
However, the usual ETL processing problems can still occur here. For example: data that is too big, network performance impact, RTOs, duplicated data, etc. So the ETL best practices are still a requirement (staging tables, logging, etc.).
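For instance, here is a sketch of the staging-table and logging practices using Python's built-in sqlite3. The table name and schema are invented; in a real setup this would be your actual staging database:

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

conn = sqlite3.connect("etl.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS staging_orders (order_id TEXT PRIMARY KEY, total REAL)"
)

def load_to_staging(records):
    # Land the raw batch in a staging table first; INSERT OR REPLACE makes
    # the load idempotent, which guards against duplicated data on retry.
    with conn:
        conn.executemany(
            "INSERT OR REPLACE INTO staging_orders (order_id, total) VALUES (?, ?)",
            [(r["order_id"], r["total"]) for r in records],
        )
    log.info("staged %d records", len(records))

load_to_staging([{"order_id": "A1", "total": 9.99}])
```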
"But are the performance degradation and additional points of failure worth it?"
There will be a performance impact, since you now have an extra connection/authentication step (to the web service) and an extra transport step (web service to scheduler over the protocol). But as for points of failure, I think these are the same errors you need to handle for any other service call.
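In other words, you wrap the extra hop in the same timeout/retry handling you would use for any service call. A sketch, with arbitrary retry counts and backoff values:

```python
import time
import requests

def call_with_retry(url, attempts=3, backoff=2.0):
    # The extra web-service hop fails the same way any service call can:
    # handle it with timeouts and retries rather than special-casing it.
    for attempt in range(1, attempts + 1):
        try:
            response = requests.get(url, timeout=30)
            response.raise_for_status()
            return response.json()
        except requests.RequestException:
            if attempt == attempts:
                raise
            time.sleep(backoff * attempt)  # simple linear backoff
```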
Is it worth it? It depends. If you are working in the same environment (the same database), then it's debatable. If you are working across different environments (two different systems, for example ASP.NET to SAP, or at least different database instances), then this architecture is your best bet for handling ETL.