Foreach loop through table seems to load every row and run out of memory

entity-framework entity-framework-6 entity-framework-6.1


I am trying to dump a database table using Entity Framework 6.1.1 in C#. I expected the foreach loop to load one row at a time, but instead it gradually loads every row into memory and eventually runs out.

foreach (Quote L in context.Quotes)
    Console.WriteLine(
        "INSERT INTO [dbo].[Quote] ([QuoteId], [BookId], [VerbTenseId], [Form], [Quotation]) VALUES ({0},{1},{2},N'{3}',N'{4}')",
        L.QuoteId, L.BookId, L.VerbTenseId, L.Form, L.Quotation);

In the above code, context is a DatabaseContext that has already been initialized against my database, and Quotes is a valid table of Quote records. I am attempting to dump every record to a file for backup. Obviously (as suggested previously) I could abandon Entity Framework and use a DataReader to iterate over the table, but surely the Framework allows iterating over tables that are too big to fit in memory.

Is there an elegant way (perhaps a flag in the context somewhere) that would permit this?

Normally, the result sets are small (a few thousand rows), but in this case they run into the millions.

There seem to be options that worked with previous versions of the framework, but unless I am misunderstanding, those methods and flags no longer exist.

This is only test code, not a regular database backup. I am not looking for a better way to express my code; I am looking for an elegant way to use EF. If EF is simply missing such support, that is my answer.

Please do not mess with the example; just answer whether there is a way to use EF to enumerate a database table without loading every row into memory and keeping it there. I can code other examples and offer other options for achieving a database backup. This is an example only.

7/20/2014 11:20:34 PM

Popular Answer

I haven't tested it, but try foreach (Quote L in context.Quotes.AsNoTracking()) {...}. AsNoTracking() should keep entities out of the context's change tracker, so I assume they will be collected by the GC once they go out of scope.
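Applied to the question's loop, the suggestion might look like the sketch below. It assumes the asker's DatabaseContext and Quote types; the Console.WriteLine body stands in for whatever dump call the question actually uses.

```csharp
using System;
using System.Data.Entity;  // EF6; AsNoTracking() lives in System.Data.Entity

using (var context = new DatabaseContext())
{
    // AsNoTracking() tells EF not to attach results to the change tracker,
    // so each Quote becomes eligible for garbage collection as soon as the
    // loop moves past it, instead of accumulating in the context.
    foreach (Quote L in context.Quotes.AsNoTracking())
    {
        Console.WriteLine(
            "INSERT INTO [dbo].[Quote] ([QuoteId], [BookId], [VerbTenseId], [Form], [Quotation]) VALUES ({0},{1},{2},N'{3}',N'{4}')",
            L.QuoteId, L.BookId, L.VerbTenseId, L.Form, L.Quotation);
    }
}
```

Note that whether the rows are streamed or buffered client-side still depends on the provider; AsNoTracking() only removes the per-entity tracking overhead that keeps every row referenced by the context.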

An alternative would be to implement batch processing (select the top N entities, process them, then select the next N). In that case, make sure you dispose of the context and create a new one on every iteration, and use a proper OrderBy() in the query.
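The batching idea can be sketched as follows. This is only one way to express "select top N, then the next N" (Skip/Take paging over QuoteId is my assumption; any stable key works), again reusing the question's DatabaseContext and Quote types:

```csharp
using System;
using System.Linq;

const int batchSize = 10000;  // tune to taste
int page = 0;

while (true)
{
    // Fresh context per batch, so tracked entities from earlier
    // batches never accumulate.
    using (var context = new DatabaseContext())
    {
        var batch = context.Quotes
            .OrderBy(q => q.QuoteId)     // deterministic order is required for paging
            .Skip(page * batchSize)
            .Take(batchSize)
            .ToList();

        if (batch.Count == 0)
            break;                       // no rows left

        foreach (Quote L in batch)
        {
            // dump the row here, e.g. the question's INSERT line
        }
    }
    page++;
}
```

One caveat worth noting: Skip/Take gets slower as the page offset grows on very large tables; a keyset variant (WHERE QuoteId > lastSeenId ORDER BY QuoteId) avoids that at the cost of slightly more code.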

7/17/2014 3:29:12 PM

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow