0

I am stuck on a scenario and wondering about the best practice to apply to it. I am working on an e-commerce website dealing in orders, shipments, invoices, etc.

Apart from the UI, my application also sends/receives data to/from different suppliers (through supplier APIs, XML/JSON). The application can post orders/shipments created in my application to a supplier, or fetch all orders/shipments from a supplier to import into my app. The scenario varies from supplier to supplier.

My question is: what would be the best approach to handle this? Below are the two approaches I have thought of. I am using the first approach right now, but I am wondering whether it is the right one for this case.

1.) I have created generic code that generates JSON/XML from XPath-like path expressions. For example, for the XML below I use the path `Orders.order.orderNumber`:

    <Orders><order><orderNumber>testorder</orderNumber></order></Orders>

The paths are stored in the database, and based on the different configured paths the complete JSON/XML is generated and sent to the supplier (GET/POST is also configured in the database).
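The path-to-XML step of approach 1 can be sketched roughly like this (class and method names are mine, not from the actual application; a real implementation would also handle repeated elements, attributes, and multiple paths per document):

```java
// Minimal sketch of approach 1: build an XML fragment from a dot-separated
// path (as stored in the database) plus a value. Illustrative only.
public class PathXmlBuilder {

    // Turns "Orders.order.orderNumber" + "testorder" into
    // <Orders><order><orderNumber>testorder</orderNumber></order></Orders>
    public static String build(String path, String value) {
        String[] parts = path.split("\\.");
        StringBuilder sb = new StringBuilder();
        for (String p : parts) {                     // opening tags, outermost first
            sb.append('<').append(p).append('>');
        }
        sb.append(value);
        for (int i = parts.length - 1; i >= 0; i--) { // closing tags in reverse order
            sb.append("</").append(parts[i]).append('>');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(build("Orders.order.orderNumber", "testorder"));
    }
}
```

The per-request cost here is a single pass over the path segments; the "big loops" concern in the question comes from doing this for many configured paths on every call.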

The advantage I see in this approach is minimal work to add new suppliers to the system. The disadvantage is that the XML/JSON generation runs through big loops. Since the supplier APIs to be called (Orders GET, etc.) are mostly fixed, this seems wasteful.

2.) I create a separate service to handle each supplier's calls, with methods for each call and the XML hardcoded into the application (without configuration in the DB through XPath). For example, take two suppliers, SuppA and SuppB: for SuppA an orders list must be downloaded, and for SuppB shipments must be posted. So there will be two services in my application, one for SuppA and one for SuppB, each handling its supplier's calls independently of the other. The SuppA service will call the Orders GET API and the SuppB service will call the Shipments POST API.
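Approach 2 might look roughly like this (the class names follow the SuppA/SuppB example above; the HTTP calls are stubbed out, and the common interface is my assumption, not something the question mandates):

```java
// Sketch of approach 2: one small service per supplier. A shared interface
// keeps the calling code uniform even though each service exposes only the
// operations that supplier actually needs. Names are hypothetical.
interface SupplierService {
    String supplierName();
}

class SuppAService implements SupplierService {
    public String supplierName() { return "SuppA"; }

    // Would call SuppA's Orders GET endpoint; stubbed with a fixed response.
    public String fetchOrdersXml() {
        return "<Orders><order><orderNumber>A-1001</orderNumber></order></Orders>";
    }
}

class SuppBService implements SupplierService {
    public String supplierName() { return "SuppB"; }

    // Would POST a shipment to SuppB's Shipments endpoint; stubbed here.
    public boolean postShipment(String shipmentJson) {
        return shipmentJson != null && !shipmentJson.isEmpty();
    }
}
```

Each service hardcodes exactly the payloads and endpoints its supplier supports, which is where the speed advantage described below comes from.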

The advantage of this, I think, is speed: no generic XML/JSON generation is needed, since only the required calls are coded in each supplier's service. The disadvantage is that only the fixed, coded services can be called. In the first approach anything can be changed through the DB, but here the code needs to be changed.

Please advise: which approach is best for this scenario, the generic DB configuration or the hardcoded separate supplier services?

Raghav
  • 552
  • 1
  • 9
  • 34
  • Are the data models or schemas different for each supplier? – cool Oct 03 '15 at 16:01
  • Yes, the data model for each supplier is different. They are exposed through their service APIs. We may send one set of fields to one supplier but a different set to another, depending on the request XML/JSON they support. – Raghav Oct 05 '15 at 05:32

2 Answers

0

Start by writing a unit test for the code. This will more than likely lead you to ask yourself what makes for a more testable and maintainable application, as well as what solves the problem with the most quality. The answer is potentially a combination of both #1 and #2. Hardcoding "magic" values is never a great option, and having to call a heavy persistence layer on every request is not great either, specifically with regard to performance and/or scalability.

I suggest:

  1. Store the values in a configurable place that is easy to maintain.
  2. Use a caching layer like Redis or Memcache to allow the code to get the values in a scalable and extensible manner.
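The second suggestion could be sketched as a read-through cache; here a `ConcurrentHashMap` stands in for Redis or Memcache, and the loader function stands in for the database lookup (all names are illustrative):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Sketch of a read-through config cache. In production the map would be
// Redis/Memcache; dbLoader would query the configuration tables.
class ConfigCache {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Function<String, String> dbLoader;

    ConfigCache(Function<String, String> dbLoader) {
        this.dbLoader = dbLoader;
    }

    // Return the cached value, falling back to the database on a miss.
    String get(String key) {
        return cache.computeIfAbsent(key, dbLoader);
    }

    // Invalidate on config change so the app picks up new values
    // without a restart.
    void invalidate(String key) {
        cache.remove(key);
    }
}
```

The point is that the supplier-call path only pays the database cost once per key, while config edits still take effect at runtime via invalidation.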

Benefits:

  1. Prevents hardcoding.
  2. Prevents hard resets/restarts on app config changes.
  3. More extensible code for adding/removing a new supplier.
  4. Better overall app agility.

Last note: as a fun thought, you may be able to use a factory pattern to generate supplier objects on the fly, allowing suppliers to be added to and removed from the running system without re-coding it. This is a bit more involved but could be an option.
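One way to sketch that factory idea (a registry-based variant; all names are hypothetical, and a real version might load the registry entries from the cached configuration above):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// A supplier handler performs one integration call; kept minimal here.
interface SupplierHandler {
    String call(String payload);
}

// Registry-based factory: handlers are registered under a key, so suppliers
// can be added or removed at runtime without touching the dispatch code.
class SupplierFactory {
    private final Map<String, Supplier<SupplierHandler>> registry = new ConcurrentHashMap<>();

    void register(String name, Supplier<SupplierHandler> ctor) {
        registry.put(name, ctor);
    }

    void unregister(String name) {
        registry.remove(name);
    }

    SupplierHandler create(String name) {
        Supplier<SupplierHandler> ctor = registry.get(name);
        if (ctor == null) {
            throw new IllegalArgumentException("Unknown supplier: " + name);
        }
        return ctor.get();
    }
}
```

Adding a supplier then amounts to one `register(...)` call rather than a code change in every place that dispatches to suppliers.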

jsh
  • 338
  • 1
  • 8
0

Personally, I think the second approach is closer to the solution I would choose: you have a number of external interfaces with which you have to communicate. From your description I understand that they all handle either XML or JSON. Considering the case where new suppliers should be added, this quickly becomes a question of maintainability.

So I would define an interface for each supplier with a matching data model. On your side you will have a different data model to represent the orders, shipments, invoices and whatnot. You can then use a mapping framework like Dozer to convert the data from one representation to the other.

Then you can use another framework, JAXB or GSON, to convert your objects to XML or JSON.
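The shape of that setup, in miniature (the models and field names are invented for illustration; the hand-written mapper is the step Dozer would take over, and the string-building `toJson` is a stand-in for a real `new Gson().toJson(order)` call):

```java
// Your internal model: one canonical representation of an order.
class InternalOrder {
    String orderNumber;
    int quantity;
}

// SuppA's model: mirrors that supplier's API schema (hypothetical fields).
class SuppAOrder {
    String orderNo;
    int qty;
}

class SuppAMapper {
    // Hand-written field mapping; Dozer would do this declaratively.
    static SuppAOrder toSuppA(InternalOrder o) {
        SuppAOrder out = new SuppAOrder();
        out.orderNo = o.orderNumber;
        out.qty = o.quantity;
        return out;
    }

    // Stand-in for GSON serialization, to keep the sketch dependency-free.
    static String toJson(SuppAOrder o) {
        return "{\"orderNo\":\"" + o.orderNo + "\",\"qty\":" + o.qty + "}";
    }
}
```

The internal model never leaks into supplier payloads; adding a supplier means adding one model plus one mapping, while the rest of the app is untouched.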

hotzst
  • 7,238
  • 9
  • 41
  • 64
  • One more question: I also have an integration with Quickbooks, which used to offer a class/object SDK instead of XML/JSON. I wrote generic code using the Reflection API to implement it. The class and method names were stored in the DB; based on that, classes were created dynamically at runtime, methods were called through reflection, and data values were set. Would it be good practice to apply the second approach there too, since otherwise we'd need to call all setters and getters explicitly in the code for every case? – Raghav Oct 05 '15 at 17:17
  • `GSON` accesses the fields by default and for `JAXB` it can be configured whether to use fields or getter/setter methods. For more details take a look at http://stackoverflow.com/questions/11385214/how-can-i-get-gson-to-use-accessors-rather-than-fields and http://stackoverflow.com/questions/6203487/why-does-gson-use-fields-and-not-getters-setters – hotzst Oct 06 '15 at 05:39
  • Thanks a lot for this. The combination of Dozer with JAXB or GSON would do the trick. :) – Raghav Oct 06 '15 at 15:08