
I have a use case where I receive some attributes in the request like this:

"filters": [
  {
    "field": "fName",
    "value": "Tom"
  },
  {
    "field": "LName",
    "value": "Hanks"
  }
]

I don't have a model defined for this; I just receive these attributes in the request and fire an Elasticsearch query using them. My records in Elasticsearch have the same attribute names.

Now I have to support a legacy application where the attribute names are completely different, e.g. fName becomes firstName and lName becomes lastName.

Problem: I need to accept the old attribute names in the request and convert them to the new ones so that they match my Elasticsearch records, then fetch the data with the new attribute names and convert back to the old ones before sending the response out of the application.

NOTE: I don't have POJOs defined for these records.

How can this be achieved effectively? I was thinking of using the Orika mapper, but I'm not sure how that will work without defining classes first.

User0911
  • I've run into a similar problem when syncing info between two systems. They had very different naming conventions, and one of them had changed naming conventions multiple times in the past few years (I had to sync all of it). The way I solved this was with a config file that mapped the new fields to the old fields. When a request to sync came in, I found the matching field, switched it to the field it was paired with, and sent it off. For the response I grabbed the mapping again using the alternate name and sent it back. Maybe something similar can help? – gkgkgkgk Aug 07 '18 at 20:41
  • Sure. I don't really have much research on this, but I found it was the best solution because the person I was distributing the app to could change it if they ever needed to add/delete fields. The other approach I took (which I scrapped because it was too hard-coded) was writing a custom object serializer/deserializer and running the request/reply through that. – gkgkgkgk Aug 07 '18 at 20:47
  • @gkgkgkgk OK, do you have any gist or pointers available for the config solution? – User0911 Aug 07 '18 at 20:49
  • Unfortunately I don't... But the config shouldn't be too hard to set up. I basically wrote a JSON file with the mappings between new and old names, then in my code found the field value that contained the field I was looking for. I then grabbed the mapped value and used that one. I can write up a basic flow/pseudocode if that is helpful. – gkgkgkgk Aug 07 '18 at 21:29
  • I understand you have no POJO currently defined, but what is your desired output for your Elasticsearch query? Is it a POJO with fName, lName fields filled with values based on the request? – Magdrop Aug 07 '18 at 23:40
  • 3
    What prevents you from writing a transformer from request JSON to your normalized JSON? – Mạnh Quyết Nguyễn Aug 08 '18 at 02:45
  • 3
    I would also rather skip the object mapping and directly map JSON to normalized JSON... In the end that is what you want... You do not need to invent classes/objects that are of no benefit to you and additionally use a mapping utility just to accomplish what you already could accomplish via plain json transformation... maybe the following answer is of help to you: [JSON to JSON transformer mentioning also ElasticSearch as use case](https://stackoverflow.com/a/17413190/6202869) – Roland Aug 08 '18 at 11:00

3 Answers


What prevents you from writing a transformer from request JSON to your normalized JSON?

The normal flow I can think of is:

Request JSON -> POJO -> POJO with normalized value -> Normalized JSON

So your POJO looks like:

public class Filter {

    private List<FieldFilter> filters;

    public List<FieldFilter> getFilters() { return filters; }
    public void setFilters(List<FieldFilter> filters) { this.filters = filters; }

    public static class FieldFilter {
        private String field;
        private String value;

        public FieldFilter(String field, String value) { this.field = field; this.value = value; }

        public String getField() { return field; }
        public String getValue() { return value; }
    }
}

Now you will have a transformation map like:

Map<String, String> fieldNameMapping = new HashMap<>();
fieldNameMapping.put("fName", "firstName");
fieldNameMapping.put("firstName", "firstName");

// The process of populating this map can be done either by a static initializer, or config/properties reader
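
The config/properties-reader variant could look roughly like this (the file name and helper method below are made up for illustration, not part of any framework):

// needs java.util.Properties, java.util.HashMap, java.io.InputStream, java.io.IOException
// field-mapping.properties (hypothetical) would contain lines such as:
//   fName=firstName
//   firstName=firstName
static Map<String, String> loadFieldNameMapping() throws IOException {
    Properties props = new Properties();
    try (InputStream in = Filter.class.getResourceAsStream("/field-mapping.properties")) {
        props.load(in);   // an NPE here simply means the file is missing from the classpath
    }
    Map<String, String> mapping = new HashMap<>();
    for (String name : props.stringPropertyNames()) {
        mapping.put(name, props.getProperty(name));
    }
    return mapping;
}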

Then you transform your POJO:

Filter filterRequest; // the incoming request, already bound to the Filter POJO
List<FieldFilter> normalizedFilters =
    filterRequest.getFilters().stream()
                 .map(f -> new FieldFilter(fieldNameMapping.get(f.getField()), f.getValue()))
                 .collect(Collectors.toList());

Then convert the Filter class to your normalized JSON.
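
As a minimal sketch of that last step with Jackson's ObjectMapper (assuming the Filter/FieldFilter classes and the normalizedFilters list from above):

// needs com.fasterxml.jackson.databind.ObjectMapper
ObjectMapper mapper = new ObjectMapper();
Filter normalizedRequest = new Filter();
normalizedRequest.setFilters(normalizedFilters);
String normalizedJson = mapper.writeValueAsString(normalizedRequest); // ready for the Elasticsearch query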

Mạnh Quyết Nguyễn

We have a similar scenario and we are using Apache JOLT. If you want to try some samples, you can refer to the jolt-demo-online-utility.
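
For illustration only, a minimal sketch of applying a JOLT shift spec with Chainr to rename the Elasticsearch field names back to the legacy ones (the class name, inline spec, and extra age attribute are invented for the example):

import com.bazaarvoice.jolt.Chainr;
import com.bazaarvoice.jolt.JsonUtils;

public class JoltRenameSketch {
    public static void main(String[] args) {
        // Shift spec: rename fName/lName to the legacy names; "*": "&" passes
        // every other attribute through unchanged.
        Chainr chainr = Chainr.fromSpec(JsonUtils.jsonToList(
                "[{\"operation\":\"shift\",\"spec\":"
              + "{\"fName\":\"firstName\",\"lName\":\"lastName\",\"*\":\"&\"}}]"));

        Object esRecord = JsonUtils.jsonToObject(
                "{\"fName\":\"Tom\",\"lName\":\"Hanks\",\"age\":62}");

        Object legacyRecord = chainr.transform(esRecord);
        System.out.println(JsonUtils.toJsonString(legacyRecord));
        // prints something like {"firstName":"Tom","lastName":"Hanks","age":62}
    }
}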

Vivek Shukla

Use a JSON-to-JSON transformer instead. Good answers regarding this can be found here: JSON to JSON transformer, and here: XSLT equivalent for JSON.

In the end you do not require an intermediate object type here. You even said that you do not have such a type yet, and inventing it just to transform it doesn't really make sense.
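
As a rough sketch of that idea with Jackson's tree model and no POJO at all (the mapping table and method name are made up for the example; Map.of needs Java 9+):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

import java.util.Map;

public class DirectJsonTransform {

    // legacy name -> name used in the Elasticsearch records (illustrative)
    private static final Map<String, String> FIELD_NAMES =
            Map.of("firstName", "fName", "lastName", "lName");

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Rewrites the "field" values of the incoming filters to the Elasticsearch names.
    public static JsonNode normalize(String legacyRequestJson) throws Exception {
        JsonNode root = MAPPER.readTree(legacyRequestJson);
        for (JsonNode filter : root.path("filters")) {
            String legacy = filter.path("field").asText();
            ((ObjectNode) filter).put("field", FIELD_NAMES.getOrDefault(legacy, legacy));
        }
        return root;
    }
}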

Roland