
I'm trying to load about 20k items into an SQLite3 database using a small Go program.

But it takes a long time... about 45 minutes.

I've tried doing:

_, errExec := db.Exec("Insert into ...")

Is there any way to get the insert (or update) of those roughly 20k items under 5 minutes? If so, how?

Thank you for your answers.

EDIT: the full function:

func updateDB() {
	file, err := ioutil.ReadFile("./database/id.json")
	if err != nil {
		fmt.Printf("File error: %v\n", err)
		os.Exit(1)
	}

	var ids []int
	json.Unmarshal(file, &ids)

	db, err := sql.Open("sqlite3", "./database/gwitem.db")
	if err != nil {
		log.Fatal(err.Error())
	}
	defer db.Close()

	var prix price
	for i := 0; i < len(ids); i++ {
		url := "https://api.guildwars2.com/v2/commerce/prices/" + strconv.Itoa(ids[i])
		getJson(url, &prix)

		_, errExec := db.Exec("Insert into trade values(null," + strconv.FormatInt(prix.Id, 10) + "," + strconv.FormatInt(prix.Buys.Quantity, 10) + "," + strconv.FormatInt(prix.Buys.Unit_price, 10) + "," + strconv.FormatInt(prix.Sells.Quantity, 10) + "," + strconv.FormatInt(prix.Sells.Unit_price, 10) + ",0)")
		if errExec != nil {
			log.Fatal(errExec)
		}
	}
}
plosxh
  • Executing all of the `INSERT`s inside of a transaction will significantly reduce the processing time. –  Jan 09 '17 at 20:58
  • That's the only way I know to insert data into my database. I've done some research to see whether there is a way to do something like a bulk update in SQLite3, but it seems there is none... Feel free to give me some pointers on how to optimize it and reduce the cost of all those inserts if you have the knowledge. I'm fairly new to programming, and I know that doing inserts in a loop is costly, but it's the only way I know... – plosxh Jan 09 '17 at 21:06

0 Answers