4

I have a large JSON response: around 50k rows with 15 columns that I have to insert into a SQLite DB with the same structure. In other words, I have to copy the same data stored in a Postgres DB into the SQLite DB inside my app. Is there an efficient way to do it? Is there an API or something that could help with the work?

I should mention that I can do this with ORMLite for JSON data that isn't large, but when I try it with bigger payloads my app crashes with an out-of-memory error.

If you have an idea or an example I could follow, I would appreciate it a lot! Thanks in advance!

Gray
  • I would say use a pre-compiled insert statement with transaction management inside an AsyncTask, and notify the user with a proper message (a sketch follows these comments). Refer to this: http://stackoverflow.com/a/31962559/1686269 – Durgesh Patel Aug 21 '15 at 13:26
  • try using Gson to parse the JSON into Java objects and then use these objects to write to SQLite; there are plenty of ORM libraries for SQLite, just google "ORM android" and you will find results – Bhargav Aug 21 '15 at 13:27
  • I see. So, I make the JSON request from a servlet, then parse the data with Gson and use ORMLite to manage the SQLite DB. But as I said, the problem appears when the JSON response is large. – Andres Melgarejo Ledesma Aug 21 '15 at 13:40
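
A minimal sketch of the pre-compiled statement plus transaction idea from the first comment, assuming a plain android.database.sqlite.SQLiteDatabase handle; the Row type, table and column names are placeholders for illustration, not from the post:

import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteStatement;
import java.util.List;

// One compiled statement reused for every row; all inserts run in a
// single transaction so SQLite commits once instead of once per row.
// `Row`, `my_table` and the columns are hypothetical placeholders.
void bulkInsert(SQLiteDatabase db, List<Row> rows) {
    SQLiteStatement stmt = db.compileStatement(
            "INSERT INTO my_table (col_a, col_b) VALUES (?, ?)");
    db.beginTransaction();
    try {
        for (Row row : rows) {
            stmt.clearBindings();
            stmt.bindLong(1, row.colA);
            stmt.bindString(2, row.colB);
            stmt.executeInsert();
        }
        db.setTransactionSuccessful(); // marks the whole batch for commit
    } finally {
        db.endTransaction();
    }
}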

3 Answers

2

You can also use Google's official Gson Streaming library.

Gson Streaming: Gson Streaming

JsonReader plays a very important role when parsing JSON with the streaming API; a minimal sketch follows.
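
For illustration, a hedged sketch of streaming a large top-level JSON array with JsonReader, binding one element at a time (MyRow stands in for your mapped class; it is not from the answer):

import com.google.gson.Gson;
import com.google.gson.stream.JsonReader;
import java.io.InputStream;
import java.io.InputStreamReader;

// Reads the array element by element, so the full response never
// has to sit in memory at once.
void readAll(InputStream in) throws Exception {
    Gson gson = new Gson();
    JsonReader reader = new JsonReader(new InputStreamReader(in, "UTF-8"));
    reader.beginArray();
    while (reader.hasNext()) {
        // Gson deserializes exactly one object from the stream per call
        MyRow row = gson.fromJson(reader, MyRow.class);
        // ... insert `row` into SQLite here, ideally inside one transaction
    }
    reader.endArray();
    reader.close();
}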

Durgesh Patel
1

Because the JSON is so large, you can't load it completely into memory. Even the standard JSONObject will result in an out-of-memory error on many devices.

What I've done in a similar situation was to use Jackson for the parsing. Jackson can parse from a stream, so memory usage is not a problem at all. The downside is that the API is not as straightforward to use as the usual options.

Here is an example I found: Jackson Streaming
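
By way of illustration, a hedged sketch of the same streaming idea with Jackson's JsonParser (MyRow is again a placeholder; this is not the code from the linked example):

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.InputStream;

// Advances through the array token by token and materializes
// one object at a time.
void readAll(InputStream in) throws Exception {
    ObjectMapper mapper = new ObjectMapper();
    JsonParser parser = new JsonFactory().createParser(in);
    if (parser.nextToken() == JsonToken.START_ARRAY) {
        while (parser.nextToken() == JsonToken.START_OBJECT) {
            // binds the current object and leaves the parser after it
            MyRow row = mapper.readValue(parser, MyRow.class);
            // ... write `row` to SQLite here
        }
    }
    parser.close();
}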

azertiti
  • Thanks for your answer. What do you know about Gson? I use Gson for the parsing into Java objects; do you think Gson holds everything in the phone's memory to do that work? Probably yes, given the error. Do you recommend Jackson, @azertiti? – Andres Melgarejo Ledesma Aug 21 '15 at 14:22
  • As far as I know, yes, it's going to parse the whole JSON at once. You can also watch memory usage in Android Studio when the parsing starts; if memory climbs sharply, you have your answer. I don't use Gson in any of my projects, so I can't easily test, but I solved the same problem in the past with the Jackson Streaming API. – azertiti Aug 21 '15 at 14:29
  • OK, thanks a lot! I'll try it right now and post back if it solves the problem. – Andres Melgarejo Ledesma Aug 21 '15 at 14:35
0

Gson streaming works for me (Gson - streaming), but I have to say it takes its time: around 4 minutes for 68k rows with 10 columns, for example. Anyway, it solves my issue. So I have this JSON response:

{
    "preciosArtPK": {
        "codLista": 1,
        "codArticulo": 11348,
        "cansiVenta": 1,
        "fecVigencia": 1435781252000
    },
    "siglaVenta": "UN",
    "precioVenta": 0,
    "margenPct": 100,
    "codUsuario": 1,
    "vigente": "S",
    "nomModulo": "MIGRACION"
}

The JSON above is one element of an array response, but I need that nested "preciosArtPK" object included in the Gson deserialization. How could I do that? I have the class that handles my serialization:

@DatabaseTable(tableName = "preciosart")
public class PreciosArt {

    public static final String PRECIOS_COD_LISTA = "_id";
    public static final String PRECIOS_COD_ARTICULO = "cod_articulo";
    public static final String PRECIOS_CANSI_VENTA = "cansi_venta";
    public static final String PRECIOS_FEC_VIGENCIA = "fec_vigencia";
    public static final String PRECIOS_SIGLA_VENTA = "sigla_venta";
    public static final String PRECIOS_PRECIO_VENTA = "precio_venta";
    public static final String PRECIOS_MARGEN_PCT = "margen_pct";
    public static final String PRECIOS_COD_USUARIO = "cod_usuario";
    public static final String PRECIOS_VIGENTE = "vigente";
    public static final String PRECIOS_NOM_MODULO = "nom_modulo";

    @DatabaseField(id = true, unique = true, columnName = PRECIOS_COD_LISTA)
    private Integer codLista;

    @DatabaseField(unique = true, columnName = PRECIOS_COD_ARTICULO)
    @SerializedName("codArticulo") // to test
    private Integer codArticulo;

    @DatabaseField(unique = true, columnName = PRECIOS_CANSI_VENTA)
    private Integer cansiVenta;

    @DatabaseField(unique = true, columnName = PRECIOS_FEC_VIGENCIA)
    private Long fecVigencia;

    @DatabaseField(columnName = PRECIOS_SIGLA_VENTA)
    @SerializedName("siglaVenta")
    private String siglaVenta;

    @DatabaseField(columnName = PRECIOS_PRECIO_VENTA)
    @SerializedName("precioVenta")
    private Double precioVenta;

    @DatabaseField(columnName = PRECIOS_MARGEN_PCT)
    @SerializedName("margenPct")
    private Float margenPct;

    @DatabaseField(columnName = PRECIOS_COD_USUARIO)
    @SerializedName("codUsuario")
    private Integer codUsuario;

    @DatabaseField(columnName = PRECIOS_VIGENTE)
    @SerializedName("vigente")
    private String vigente;

    @DatabaseField(columnName = PRECIOS_NOM_MODULO)
    @SerializedName("nomModulo")
    private String nomModulo;

    // getters and setters omitted
}

but this doesn't fill those fields (codArticulo, cansiVenta and fecVigencia). I read that nested JSON objects like this should be deserialized into a class of their own, so I did just that:

public class PreciosArtPK {

    @SerializedName("codLista")
    private Integer codLista;

    @SerializedName("codArticulo")
    private Integer codArticulo;

    @SerializedName("cansiVenta")
    private Integer cansiVenta;

    @SerializedName("fecVigencia")
    private Long fecVigencia;

    // getters and setters omitted
}

Problem is: how could I fill those fields from my deserialized JSON? (A sketch of one possible mapping follows the method below.) I use ORMLite to do the work; this method is for that purpose:

public long updatePreciosart(PreciosArt preciosArt){
    long resultUpdate = -1;
    try {
        getHelper().getPreciosartDao().createOrUpdate(preciosArt);
        resultUpdate = 0;
    } catch (SQLException e) {
        e.printStackTrace();
        resultUpdate = -1;
    }
    return resultUpdate;
}
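
One possible way to fill those fields (a sketch under assumptions: the getter names and the flattening helper are illustrative, not from the post) is to give PreciosArt a field for the nested object so Gson can bind it, then copy the values into the flat ORMLite columns before saving:

// Inside PreciosArt: lets Gson bind the nested "preciosArtPK" object.
@SerializedName("preciosArtPK")
private PreciosArtPK preciosArtPK;

// Hypothetical helper: copies the nested values into the flat columns
// before createOrUpdate(); getters on PreciosArtPK are assumed to exist.
public void flattenPK() {
    this.codLista = preciosArtPK.getCodLista();
    this.codArticulo = preciosArtPK.getCodArticulo();
    this.cansiVenta = preciosArtPK.getCansiVenta();
    this.fecVigencia = preciosArtPK.getFecVigencia();
}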

I hope you understand what the problem is and can help me with it! Thanks again!
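
As a side note on the four-minute timing: ORMLite can run the createOrUpdate calls as one batch, which wraps them in a single transaction and usually speeds up bulk writes considerably. A hedged sketch, assuming the same DAO used by the method above:

import com.j256.ormlite.dao.Dao;
import java.util.List;
import java.util.concurrent.Callable;

// Wraps all writes in one batch so SQLite commits once
// instead of once per row.
public void bulkUpdatePreciosart(final Dao<PreciosArt, Integer> dao,
                                 final List<PreciosArt> rows) throws Exception {
    dao.callBatchTasks(new Callable<Void>() {
        @Override
        public Void call() throws Exception {
            for (PreciosArt row : rows) {
                dao.createOrUpdate(row);
            }
            return null;
        }
    });
}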

  • I took a quick glance at this. I suspect that each insert searches for the column index by column name; perhaps part of the reason for the slow performance lies there. – f470071 Feb 19 '17 at 09:17