
I have a problem importing a big XML file (1.3 GB) into MongoDB in order to find the most frequent words in a map & reduce manner.

http://dumps.wikimedia.org/plwiki/20141228/plwiki-20141228-pages-articles-multistream.xml.bz2

Here I enclose an XML excerpt (the first 10,000 lines) cut out from this big file:

http://www.filedropper.com/text2

I know that I can't import XML directly into MongoDB. I have used some tools and some Python scripts to do so, and all of them have failed.

Which tool or script should I use? What should the key & value be? I think the best structure for finding the most frequent word would be this:

(_id : id, value: word )

Then I would sum all the elements as in the docs example:

http://docs.mongodb.org/manual/core/map-reduce/
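For illustration, a rough sketch of what that counting step could look like from pymongo, assuming the collection already holds one document per word occurrence in the shape above (database and collection names are made up; newer MongoDB/pymongo versions replace map_reduce with the aggregation pipeline):

from pymongo import MongoClient
from bson.code import Code

words = MongoClient().wiki.words  # hypothetical database/collection names

# emit a 1 for every stored word, then sum the 1s per word
mapper = Code("function () { emit(this.value, 1); }")
reducer = Code("function (key, values) { return Array.sum(values); }")

counts = words.map_reduce(mapper, reducer, out="word_counts")
for doc in counts.find().sort("value", -1).limit(10):
    print(doc["_id"], doc["value"])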

Any clues would be greatly appreciated, but how do I import this file into MongoDB to get a collection like this?

(_id : id, value: word )

If you have any idea please share.

Edit: after some research, I would use Python or JS to complete this task.

I would extract only the words in the <text></text> section, which sits under <page><revision>, exclude entities such as &lt; and &gt;, and then split the words and upload them to MongoDB with pymongo or JS.

So there are several pages, each with a revision and its text.
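For illustration, a minimal sketch of that plan with xml.etree.ElementTree.iterparse and pymongo (database/collection names are made up): iterparse streams the dump element by element, so the 1.3 GB file never has to fit in memory, and the parser already unescapes entities such as &lt; and &gt;.

import re
import xml.etree.ElementTree as ET
from pymongo import MongoClient

words = MongoClient().wiki.words  # hypothetical database/collection names
word_re = re.compile(r"\w+", re.UNICODE)

for event, elem in ET.iterparse("plwiki-20141228-pages-articles-multistream.xml"):
    # MediaWiki dumps put element names in a namespace, hence endswith()
    if elem.tag.endswith("text") and elem.text:
        batch = [{"value": w.lower()} for w in word_re.findall(elem.text)]
        if batch:
            words.insert_many(batch)  # use insert() on pymongo < 3.0
    elem.clear()  # drop finished elements so memory stays bounded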


user2980480
  • Does anyone know how to convert such a big file (the text section) into CSV or JSON? – user2980480 Jan 08 '15 at 23:02
  • The problem of big files can be solved with `fileinput`, because you load only one line at a time instead of the whole file into memory; then you decide when to write to another file (CSV or JSON). A minimal sketch follows after this comment thread. – Abdelouahab Jan 09 '15 at 00:38
  • Can you give me an example? – user2980480 Jan 09 '15 at 00:41
  • I made this, since the resulting file will be really big and using `open` would eat all the memory: https://github.com/abdelouahabb/kouider-ezzadam/blob/master/hasher-comparer.py – Abdelouahab Jan 09 '15 at 01:15
  • I tried doing that and also http://stackoverflow.com/questions/19286118/python-convert-very-large-6-4gb-xml-files-to-json and got a memory error... – user2980480 Jan 09 '15 at 09:16
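A minimal sketch of the `fileinput` idea from the comments above (file names are just examples): the dump is read one line at a time and only the lines between <text> and </text> are copied to a smaller file, so memory use stays flat regardless of the input size.

import fileinput

inside_text = False
with open("text_only.txt", "w") as out:
    for line in fileinput.input("plwiki-20141228-pages-articles-multistream.xml"):
        if "<text" in line:
            inside_text = True
        if inside_text:
            out.write(line)
        if "</text>" in line:
            inside_text = False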

2 Answers


To save all this data, store it in GridFS.
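A minimal sketch of that with pymongo, assuming a local MongoDB instance (file and database names are examples); GridFS splits the file into small chunks, so the 1.3 GB dump can be stored even though a single document is limited to 16 MB:

import gridfs
from pymongo import MongoClient

db = MongoClient().wiki  # hypothetical database name
fs = gridfs.GridFS(db)

with open("plwiki-20141228-pages-articles-multistream.xml", "rb") as dump:
    file_id = fs.put(dump, filename="plwiki-20141228.xml")
print("stored under _id", file_id)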

And the easiest way to convert the XML is to use this tool to convert it to JSON and save it:

https://stackoverflow.com/a/10201405/861487

import xmltodict

doc = xmltodict.parse("""
<mydocument has="an attribute">
  <and>
    <many>elements</many>
    <many>more elements</many>
  </and>
  <plus a="complex">
    element as well
  </plus>
</mydocument>
""")

doc['mydocument']['@has']
# u'an attribute'
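Note that calling parse() on the whole 1.3 GB dump would build the entire document in memory. xmltodict also has a streaming mode (item_depth plus a callback) that hands over one <page> dict at a time; a rough sketch, with made-up database/collection names and a simplified revision/text lookup:

import xmltodict
from pymongo import MongoClient

pages = MongoClient().wiki.pages  # hypothetical database/collection names

def handle_page(path, page):
    # at item_depth=2, `page` is the dict built from a single <page> element
    text = (page.get("revision") or {}).get("text")
    if isinstance(text, dict):  # attributes push the body under '#text'
        text = text.get("#text")
    pages.insert_one({"title": page.get("title"), "text": text})
    return True  # returning True tells xmltodict to keep parsing

with open("plwiki-20141228-pages-articles-multistream.xml", "rb") as dump:
    xmltodict.parse(dump, item_depth=2, item_callback=handle_page)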
Abdelouahab
  • Thank you for your help, but it doesn't work. I have even installed both xmltodict modules (the one you included and the official one), but I get "object has no attribute parse..." I think I should extract and prepare the data before uploading. Something like: http://stackoverflow.com/questions/18595791/parse-xml-file-to-fetch-required-data-and-store-it-in-mongodb-database-in-python – user2980480 Jan 07 '15 at 22:16
  • I just tested it, and it works. Did the example work? – Abdelouahab Jan 07 '15 at 22:39
  • Yes, it works. I have a different idea of how to import into MongoDB. Could you take a look at http://stackoverflow.com/questions/27841981/python-extract-words-from-xml, please? If this is resolved, I will handle it. – user2980480 Jan 08 '15 at 16:10

The XML file I'm using is structured like this:

<labels>
     <label>
          <name>Bobby Nice</name>
          <urls>
               <url>www.examplex.com</url>
               <url>www.exampley.com</url>
               <url>www.examplez.com</url>
          </urls>
     </label>
     ...
</labels>

and I can import it into MongoDB using xml-stream.

See: https://github.com/assistunion/xml-stream

Code:

var fs = require('fs');
var XmlStream = require('xml-stream');
// `db` is assumed to be an already-connected database handle from the
// MongoDB Node.js driver (obtained elsewhere, e.g. via MongoClient.connect)

// Pass the ReadStream object to xml-stream
var stream = fs.createReadStream('20080309_labels.xml');
var xml = new XmlStream(stream);

var i = 1;
var array = []; // also collects every parsed label in memory (not required for the inserts)
xml.on('endElement: label', function(label) {
  array.push(label);
  db.collection('labels').update(label, label, { upsert:true }, (err, doc) => {
    if(err) {
      process.stdout.write(err + "\r");
    } else {
      process.stdout.write(`Saved ${i} entries..\r`);
      i++;
    }
  });
});

xml.on('end', function() {
  console.log('end event received, done');
});
OpSocket