I am taking over a project to replace an ancient legacy system from the ground up. Before I came on, the company hired a consultant who put together a basic sketch of the system and pushed SOA heavily. This resulted in a long list of "entity services", intended to be composed into more complex service combinations. For instance, a user wanting committee info would hit the "Committee" service, which in turn calls the "Person" service to get the committee's members, the "Meeting" service to get its meetings, and so on.
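To make the consultant's design concrete, here is roughly what that composition would look like. Nothing has been built yet, so the endpoints, URLs, and payload handling below are made up purely for illustration:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical composition inside a "Committee" entity service:
// every piece of related data costs another network hop plus
// serialization/deserialization of a service message.
public class CommitteeService {
    private final HttpClient client = HttpClient.newHttpClient();

    public String getCommittee(String committeeId) throws Exception {
        String committee = get("http://services.local/committee/" + committeeId);
        // Members and meetings each come from their own entity service.
        String members  = get("http://services.local/person?committee=" + committeeId);
        String meetings = get("http://services.local/meeting?committee=" + committeeId);
        // Assemble the composite response from the three service payloads.
        return "{ \"committee\": " + committee
             + ", \"members\": "   + members
             + ", \"meetings\": "  + meetings + " }";
    }

    private String get(String url) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(URI.create(url)).GET().build();
        return client.send(req, HttpResponse.BodyHandlers.ofString()).body();
    }
}
```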
I understand the flexibility gains in this, but my concern is performance. It seems to me that a system built from services this fine-grained will spend too many resources translating and shuttling service messages, and that its performance will be unacceptable. It also seems to me that the same flexibility gains can be had with basic reusable objects, although in that case the technology-agnostic interface is traded away for performance.
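For contrast, the "basic reusable objects" alternative I have in mind looks something like the sketch below (the repository names and types are hypothetical). The composition of Committee from Person and Meeting is the same; the difference is that each step is an in-process method call rather than a service message:

```java
import java.util.List;

// Same composition done with plain reusable objects in one process:
// Committee is still assembled from Person and Meeting data, but each
// lookup is a method call instead of a network hop and message translation.
record Person(String name) {}
record Meeting(String date) {}
record Committee(String id, List<Person> members, List<Meeting> meetings) {}

// Hypothetical repositories; in practice these would wrap the data layer.
interface PersonRepository  { List<Person>  findByCommittee(String committeeId); }
interface MeetingRepository { List<Meeting> findByCommittee(String committeeId); }

class CommitteeAssembler {
    private final PersonRepository people;
    private final MeetingRepository meetings;

    CommitteeAssembler(PersonRepository people, MeetingRepository meetings) {
        this.people = people;
        this.meetings = meetings;
    }

    Committee find(String committeeId) {
        return new Committee(committeeId,
                             people.findByCommittee(committeeId),
                             meetings.findByCommittee(committeeId));
    }
}
```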
For more background: The organization requesting this software does not currently have a stable of third-party software suites that need to be integrated with; this software will replace all of them. There are also currently no outside consumers who need to access the data other than through the provided website interface -- all service calls will come from other pieces of our own system. The choice of SOA in this case seems to be based entirely on the concept of "preparation".
So my question -- what level of granularity is acceptable in a stable of services without sacrificing performance? Am I being too skeptical of the performance hits we'll take by implementing all our entities as services? Should functionality be exposed as web services only when it is actually needed, with the "preparation" focus instead going into designing the business layer so that services can later be dropped on top of it?