
I want to make a copy of a list of records. The code below only copies references to the records, so changing Data in myRecs1 also changes it in myRecs2. Besides a nested loop, is there an easy way to get a full (deep) copy of myRecs1 into myRecs2?

    static void Main(string[] args)
    {
        List<MyRec> myRecs1 = new List<MyRec>()
        {
            new MyRec() {Id = 1, Data = "001"},
            new MyRec() {Id = 2, Data = "002"},
            new MyRec() {Id = 3, Data = "003"}
        };

        //List<MyRec> myRecs2 = myRecs1.ToList(); // does not work, of course: still copies the same references

        // ugly but works
        List<MyRec> myRecs2 = myRecs1.Select(rec => new MyRec()
        {
            Id = rec.Id,
            Data = rec.Data
        }).ToList();

        myRecs1[2].Data = "xxx";


        foreach (var rec in myRecs2)
        {
            Console.WriteLine(rec.Data);
        }
    }

    public class MyRec
    {
        public int Id { get; set; }
        public string Data { get; set; }
    }
Peter Kellner
  • Serialize and deserialize; my chosen tool is JSON, using Newtonsoft Json.NET – pm100 Feb 19 '16 at 01:14
  • The linked answer uses the older binary serializer that requires an object to implement ISerializable. You can do the same with more modern serializers. – Eric J. Feb 19 '16 at 01:17

1 Answer


Does this work for you?

    List<MyRec> myRecs1 = new List<MyRec>()
    {
        new MyRec() {Id = 1, Data = "001"},
        new MyRec() {Id = 2, Data = "002"},
        new MyRec() {Id = 3, Data = "003"}
    };

    List<MyRec> myRecs2 = JsonConvert.DeserializeObject<List<MyRec>>(
        JsonConvert.SerializeObject(myRecs1));

This way you basically defer all the looping, nesting, and data-type handling to a framework. You can go for binary, XML, or other serialization as well, but those normally have more constraints than JSON, which has worked well for me most of the time.
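As the comments note, more modern serializers can do the same round trip. Here is a minimal sketch of the same idea using the built-in System.Text.Json instead of Newtonsoft; the generic `DeepCopy.Clone` helper name is my own, not part of any library:

```csharp
using System.Collections.Generic;
using System.Text.Json;

public class MyRec
{
    public int Id { get; set; }
    public string Data { get; set; }
}

public static class DeepCopy
{
    // Round-trip through JSON: serializing walks the whole object graph,
    // and deserializing builds brand-new instances, so the copy shares
    // nothing with the original.
    public static T Clone<T>(T source) =>
        JsonSerializer.Deserialize<T>(JsonSerializer.Serialize(source));
}
```

Usage: `var myRecs2 = DeepCopy.Clone(myRecs1);` — after that, mutating `myRecs1[2].Data` no longer affects `myRecs2`.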

Raja Nadar
  • interesting but I'm guessing that would be very slow (real problem is 100,000s of records) – Peter Kellner Feb 19 '16 at 01:16
  • The binary serializer (see linked question) or DataContractSerializer are both pretty fast. I have not measured Json serializer performance. – Eric J. Feb 19 '16 at 01:17
  • @PeterKellner I would just measure the time and see if it is an acceptable value. If not, there are other formatters (binary, XML, data contracts, etc.) and you can measure the time there as well, provided the constraints of Serializable etc. are okay in your system. And if everything else fails, you could even try a one-off looped copy and measure its time (too tedious, manual, and error prone). The idea is that serialization/deserialization is an out-of-the-box solution for such problems. – Raja Nadar Feb 19 '16 at 01:21
  • 1
    @EricJ. good point.. nothing like measuring it and making a decision based on the acceptable values in your system/application. – Raja Nadar Feb 19 '16 at 01:22
  • I added a funky loop I don't like. I was really wondering if there is language support for what I'm trying to do. Seems like this would be a very common thing to do and I can't think of an easy way to do it. – Peter Kellner Feb 19 '16 at 01:22
  • `I was really wondering if there is language support for what I'm trying to do` Check out the linked question for some discussion. Deep copies are commonly needed, indeed. – Eric J. Feb 19 '16 at 01:32
  • I'm good with this is a dup. If I could close it I would – Peter Kellner Feb 19 '16 at 01:36