We are developing a new application on top of RavenDB 3.0 as the data store.
During performance tests we hit a problem with our system.
When we run a query like the one below to fetch all results, and the total exceeds some threshold (roughly 2000 documents, sometimes even fewer),
the query fails with "Query Failed" and an OutOfMemoryException message.
public ICollection<T> Find<T>(Expression<Func<T, bool>> predicate)
{
    var spendTime01 = Stopwatch.StartNew();
    var list = new List<T>();
    var power = 2000;
    RavenQueryStatistics statistics;
    using (DocumentSession)
    {
        // First page: also fetch the total result count via Statistics().
        list.AddRange(DocumentSession.Query<T>()
            .Statistics(out statistics)
            .Where(predicate)
            .Take(power));
    }
    if (statistics.TotalResults > power)
    {
        var toTake = statistics.TotalResults - power;
        var taken = power;
        while (toTake > 0)
        {
            using (DocumentSession)
            {
                // Remaining pages: Skip/Take in chunks of `power`.
                list.AddRange(DocumentSession.Query<T>()
                    .Where(predicate)
                    .Skip(taken)
                    .Take(toTake > power ? power : toTake));
                toTake -= power;
                taken += power;
            }
        }
    }
    //// Alternative I tried: streaming the results of a static index.
    //using (DocumentSession)
    //{
    //    var query = DocumentSession.Query<T>("Activities/All").Where(predicate);
    //    using (var enumerator = DocumentSession.Advanced.Stream(query))
    //    {
    //        while (enumerator.MoveNext())
    //        {
    //            list.Add(enumerator.Current.Document);
    //        }
    //    }
    //}
    spendTime01.Stop();
    Debug.WriteLine($"Raven Find Predicate Elapsed Time: {spendTime01.Elapsed}");
    return list;
}
I tried using a static index and switching to Advanced.Stream(query) (the commented-out block above),
but streaming runs very slowly: for 4000 documents it takes about 20 seconds to enumerate the results and add them to the list.
I have read the blog posts and answers around the net on this topic, but I still have no clear view of what causes the problem, and I'm quite worried about what will happen when the total result count reaches hundreds of thousands.
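For comparison, here is an untested variant of the paging loop I am considering, which opens a fresh session per page instead of reusing the DocumentSession property (the `Store` field for the IDocumentStore and the page size of 1024 are my assumptions). Is something like this the recommended pattern?

    // Sketch only: same Skip/Take paging idea, but each page gets its own
    // short-lived session so no single session accumulates all the results.
    public ICollection<T> FindPaged<T>(Expression<Func<T, bool>> predicate)
    {
        const int pageSize = 1024; // assumed page size; servers may cap it
        var list = new List<T>();
        var skipped = 0;
        while (true)
        {
            // `Store` is assumed to be an IDocumentStore field.
            using (var session = Store.OpenSession())
            {
                var page = session.Query<T>()
                    .Where(predicate)
                    .Skip(skipped)
                    .Take(pageSize)
                    .ToList();
                list.AddRange(page);
                if (page.Count < pageSize)
                    return list; // short page means we reached the end
                skipped += pageSize;
            }
        }
    }

Would this avoid the OutOfMemoryException, or is streaming still the only viable approach for very large result sets?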