
Possible Duplicate:
Is there a built-in way to handle multiple files as one stream?

I have an extremely large amount of consecutive data split across multiple files. XmlDocument and LINQ to XML are not options, because the files are huge and loading 500 GB into memory is not feasible.

Therefore I must use XmlReader.

The files I have are fragmented in an XML sense, e.g.:

File 1:

...
<Person>
  <FirstName>John</FirstName>
  <LastName>Doe</LastName>

File 2:

  <Email>john@doe.com</Email>
</Person>
...

One potential solution is to set up a stream, e.g.:

using (XmlReader reader = XmlReader.Create(stream)) { ... }

such that the stream feeds the XmlReader each file consecutively: when it reaches the end of one file, it automatically continues with the next. How would I accomplish that, so that to the XmlReader it looks like it is iterating over one large stream, even though that stream is actually composed of multiple consecutive files?
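Something along these lines is what I have in mind, though I don't know if it's the right approach. A rough sketch (ConcatenatedStream is just a placeholder name I made up; it assumes the files carry no per-file XML declaration, so that only the combined text has to be well-formed):

using System;
using System.Collections.Generic;
using System.IO;
using System.Xml;

// Sketch: a read-only, forward-only Stream that presents several
// files as one continuous sequence of bytes.
class ConcatenatedStream : Stream
{
    private readonly Queue<string> paths;
    private Stream current;

    public ConcatenatedStream(IEnumerable<string> filePaths)
    {
        paths = new Queue<string>(filePaths);
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        while (true)
        {
            if (current == null)
            {
                if (paths.Count == 0)
                    return 0;                          // no files left: end of stream
                current = File.OpenRead(paths.Dequeue());
            }

            int read = current.Read(buffer, offset, count);
            if (read > 0)
                return read;                           // hand back whatever this file gave us

            current.Dispose();                         // this file is exhausted, move on
            current = null;
        }
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { throw new NotSupportedException(); }
        set { throw new NotSupportedException(); }
    }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }

    protected override void Dispose(bool disposing)
    {
        if (disposing && current != null)
            current.Dispose();
        base.Dispose(disposing);
    }
}

Then the usage would look like one big file to the reader:

string[] files = { "file1.xml", "file2.xml" };   // in the correct order

using (Stream stream = new ConcatenatedStream(files))
using (XmlReader reader = XmlReader.Create(stream))
{
    while (reader.Read())
    {
        // process nodes exactly as if it were one large document
    }
}

Since the reader only ever calls Read, none of the seek-related members should matter.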

Thanks,

Allison

  • I hate to vote to close, since this is a really well-written question, but my answer would just be a copy and paste of the linked one. – Bobson Jan 23 '13 at 16:38

1 Answer


LINQ to XML will eat massive XML files for breakfast.

Are you aware of http://msdn.microsoft.com/en-us/library/system.xml.linq.xnode.readfrom.aspx ?
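For example, you can let an XmlReader do the forward-only scanning and use XNode.ReadFrom to materialise one element at a time as an XElement you can query with LINQ to XML. A rough sketch (the Person and Email element names come from the question; the helper name is mine):

using System;
using System.Collections.Generic;
using System.Xml;
using System.Xml.Linq;

static class PersonStreamer
{
    // Yields one <Person> element at a time; only the current element
    // is ever held in memory.
    public static IEnumerable<XElement> StreamPersons(XmlReader reader)
    {
        reader.MoveToContent();
        while (!reader.EOF)
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == "Person")
            {
                // ReadFrom consumes the whole element and leaves the
                // reader positioned on the node that follows it.
                yield return (XElement)XNode.ReadFrom(reader);
            }
            else
            {
                reader.Read();
            }
        }
    }
}

// Usage:
// using (XmlReader reader = XmlReader.Create("people.xml"))
// {
//     foreach (XElement person in PersonStreamer.StreamPersons(reader))
//         Console.WriteLine((string)person.Element("Email"));
// }

The same pattern works with whatever stream you pass to XmlReader.Create, including one that stitches your files together the way the question describes.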

Loofer