I have a JSON file containing hundreds of JSON objects.
My postgresql table was created like this:
CREATE TABLE collections(
id serial,
collection json);
I can add one object at a time into the table using INSERT INTO collections (collection) VALUES (json_object);
but that's tedious and not sustainable. What would be a better way to do this?
One solution I found (as explained by this StackOverflow answer) was to (1) create a temporary table and bulk-copy the JSON data into it, then (2) create columns corresponding to the keys and extract the values, like so:
create temporary table temp_json (values text) on commit drop;
copy temp_json from 'C:\SAMPLE.JSON';
-- remove this comment to insert records into your table
-- insert into tbl_staging_eventlog1 ("EId", "Category", "Mac", "Path", "ID")
select values->>'EId' as EId,
values->>'Category' as Category,
values->>'Mac' as Mac,
values->>'Path' as Path,
values->>'ID' as ID
from (
select json_array_elements(replace(values,'\','\\')::json) as values
from temp_json
) a;
but this defeats the whole purpose of NoSQL. I just want to store an auto-incrementing id and a JSON object in each row.
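For reference, here is a minimal sketch of what I'm imagining the simpler approach might look like, adapting the temporary-table trick from the answer above but skipping the per-key extraction. It assumes the file holds one JSON object per line (newline-delimited JSON), since COPY reads one row per line; the column name doc is my own placeholder:

```sql
-- a sketch, assuming C:\SAMPLE.JSON contains one JSON object per line
create temporary table temp_json (doc text) on commit drop;
copy temp_json from 'C:\SAMPLE.JSON';

-- cast each line to json and insert it whole; id fills in via the serial column
insert into collections (collection)
select doc::json
from temp_json;
```

If the file is instead a single JSON array, I assume something like json_array_elements would be needed to unnest it before inserting, as in the StackOverflow answer. Is this on the right track, or is there a more idiomatic way?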