
I have a file with a simple XML structure that I want to load into a Postgres table.

<rows>
<field name="id">1</field>
<field name="age">75-84</field>
<field name="gndr">F</field>
<field name="inc">32000-47999</field>
</rows>

Is there an easy way of doing this?

  • Are you trying to throw it into a blob, or parse it out into a normalized table structure? What tool do you want to use (Perl, Java, etc.)? – jjanes Mar 31 '14 at 22:12
  • Trying to parse into a normalized table. If PostGres doesn't have a function to take care of the parsing, Python would be my first choice. – ADJ Mar 31 '14 at 22:20
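Since the asker names Python as their tool of choice, here is a minimal sketch that parses a file with this structure into a plain dict using only the standard library. The table and column names in the commented insert are assumptions, and psycopg2 is an assumed driver, not something the question specifies.

```python
import xml.etree.ElementTree as ET

SAMPLE = """<rows>
<field name="id">1</field>
<field name="age">75-84</field>
<field name="gndr">F</field>
<field name="inc">32000-47999</field>
</rows>"""

def parse_row(xml_text):
    """Parse one <rows> element into a {name: value} dict."""
    root = ET.fromstring(xml_text)
    return {f.attrib["name"]: f.text for f in root.iter("field")}

row = parse_row(SAMPLE)
print(row)

# Inserting with psycopg2 (assumed driver) would look roughly like:
# cur.execute(
#     "INSERT INTO mytable (id, age, gndr, inc) VALUES (%s, %s, %s, %s)",
#     (row["id"], row["age"], row["gndr"], row["inc"]),
# )
```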

1 Answer


Well, it can be done. I wrote a complete function for the same purpose a while back.

I use pg_read_file() to read the whole file into a variable in plpgsql and proceed from there. This carries a few restrictions:

Only files within the database cluster directory and the log_directory can be accessed. Use a relative path for files in the cluster directory, and a path matching the log_directory configuration setting for log files. Use of these functions is restricted to superusers.

But I listed alternatives.

If you can convert the XML file to JSON, it gets simpler. There are a number of tools available for that.

Starting with Postgres 9.3 you get pretty good support for the json data type.
Better yet, use the jsonb data type in Postgres 9.4+.
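The XML-to-JSON route can even be sketched with the Python standard library alone, no external converter needed. The `raw_rows` table and `doc` column in the comment are hypothetical names for illustration:

```python
import json
import xml.etree.ElementTree as ET

xml_text = """<rows>
<field name="id">1</field>
<field name="age">75-84</field>
<field name="gndr">F</field>
<field name="inc">32000-47999</field>
</rows>"""

# Flatten the <field name="...">value</field> pairs into one JSON object.
root = ET.fromstring(xml_text)
doc = {f.attrib["name"]: f.text for f in root.iter("field")}
payload = json.dumps(doc)
print(payload)

# The resulting string can then go straight into a json/jsonb column,
# e.g. (hypothetical table and column):
#   INSERT INTO raw_rows (doc) VALUES ('<payload>'::jsonb);
```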

– Erwin Brandstetter