
Which of these two models would be faster as the database grows much larger? The rows don't need to talk to their foreign-key counterparts, because a loop will send an email from the table and then delete the entry, so the data is never viewed. Is it better to keep the foreign keys and have it read the related tables each time, storing only a fraction of the data, or to store the data again in this table so it never has to read other tables?

using System;
using System.ComponentModel.DataAnnotations;

namespace Linkofy.Models
{
    // Denormalized model: everything needed to send one email is stored on the row itself.
    public class AutoSending
    {
        public int ID { get; set; }

        [Display(Name = "Recipient Name")]
        public string receiptName { get; set; }

        [Display(Name = "Recipient Email")]
        public string receiptMail { get; set; }

        [Display(Name = "Sender Name")]
        public string senderName { get; set; }

        [Display(Name = "Email Address")]
        public string emailAddress { get; set; }

        [Display(Name = "Password")]
        public string password { get; set; }

        [Display(Name = "Send subject")]
        public string subject { get; set; }

        [Display(Name = "Send body")]
        public string Body { get; set; }

        [Required(ErrorMessage = "Send Time")]
        public DateTime sendDate { get; set; }

        public int autoListID { get; set; }

        public int? UserTableID { get; set; }
        public virtual UserTable UserTable { get; set; }
    }
}


using System;
using System.ComponentModel.DataAnnotations;

namespace Linkofy.Models
{
    // Normalized model: the recipient, sending account and template live in their own
    // tables and are reached through the foreign keys below.
    public class autoList
    {
        public int ID { get; set; }

        public int? OutreachNamesID { get; set; }
        public virtual OutreachNames OutreachNames { get; set; }

        public int EmailAccountID { get; set; }
        public virtual EmailAccount EmailAccounts { get; set; }

        public int TemplateID { get; set; }
        public virtual Template Templates { get; set; }

        [Required(ErrorMessage = "Emails Sent")]
        public int sent { get; set; }

        [Required(ErrorMessage = "Total to Send")]
        public int total { get; set; }

        [Required(ErrorMessage = "Start Date")]
        public DateTime startDate { get; set; } // was int; a date should be a DateTime like endDate

        [Required(ErrorMessage = "End Date")]
        public DateTime endDate { get; set; }

        public int? UserTableID { get; set; }
        public virtual UserTable UserTable { get; set; }
    }
}
Lucie

1 Answer


If I understand what you mean by this:

storing the data again and not having it read other tables

Yes, storing all of the data in one table will always be faster. How much faster is what you should try to determine. Modern relational databases are very fast. In an email-sending scenario, I doubt the database will be your bottleneck.
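
To put a rough number on "how much faster", you could time each read path over a realistic number of rows. Below is a minimal sketch, assuming an Entity Framework context; `ApplicationDbContext`, the `AutoSendings`/`autoLists` set names, and `CompareReads` are hypothetical placeholders for whatever the project actually uses.

using System;
using System.Data.Entity;      // EF6; for EF Core use Microsoft.EntityFrameworkCore instead
using System.Diagnostics;
using System.Linq;
using Linkofy.Models;

public static class SendLoopTiming
{
    // ApplicationDbContext is a hypothetical EF context exposing
    // DbSet<AutoSending> AutoSendings and DbSet<autoList> autoLists;
    // substitute the context and set names the project actually defines.
    public static void CompareReads(ApplicationDbContext db)
    {
        var sw = Stopwatch.StartNew();

        // Denormalized path: every row already carries what the send loop needs.
        var flat = db.AutoSendings
                     .Where(a => a.sendDate <= DateTime.UtcNow)
                     .ToList();
        Console.WriteLine($"Flat table: {flat.Count} rows in {sw.ElapsedMilliseconds} ms");

        sw.Restart();

        // Normalized path: pull the related rows in via joins before sending.
        var joined = db.autoLists
                       .Include(l => l.EmailAccounts)
                       .Include(l => l.Templates)
                       .Include(l => l.OutreachNames)
                       .Where(l => l.startDate <= DateTime.UtcNow)
                       .ToList();
        Console.WriteLine($"Joined tables: {joined.Count} rows in {sw.ElapsedMilliseconds} ms");
    }
}

Run a comparison like this against row counts similar to what you expect in production; join cost depends on indexes and data volume, so numbers from a near-empty development database won't tell you much.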

In general, this sounds like you might be falling into the trap of premature optimization (fixing an assumed performance issue without first confirming that the code you're worried about actually has a performance problem worth fixing).

McGuireV10
  • Ahh great, thanks :) Yeah, I just didn't want to have to rebuild it all later when it's done, so I thought it was best to ask now! How would you go about testing something like that? – Lucie Feb 03 '18 at 12:13
  • That depends on a lot of things. If you're truly sending a lot of mail (thousands or more), if this isn't some internal enterprise system, you should probably look at bulk mail services (personally I like [mailgun](https://www.mailgun.com)). In general though, you'll need a way to send lots of test email messages so you can time that API. I can almost guarantee that will be the bottleneck. A lot of the bulk services also have ways to manage your mail address books, too -- and they'll even manage and track things like unsubscribe requests. – McGuireV10 Feb 03 '18 at 12:47
  • I see you recently asked a question about Azure deployment. Azure _requires_ you to use a third-party mail service. They tend to recommend SendGrid but we found mailgun to be cheaper (and I like their API better). – McGuireV10 Feb 03 '18 at 12:50
  • Ah, I see! Well, I am using this code to send out through Gmail, as the SMTP provider I'll predominantly be using is Gmail. Will this code be able to handle it? https://stackoverflow.com/questions/32260/sending-email-in-net-through-gmail (a sketch of that approach follows these comments). But quite right, I would like to integrate it more, such as reading mail etc. – Lucie Feb 05 '18 at 09:42
  • You can't use any SMTP service from Azure (which is what `System.Net.Mail` would do). Microsoft blocks all the normal mail ports since Azure has been heavily abused in the past by spammers. You have to choose a service provider with an API-based mail solution. Also if you tried to send any large volume through gmail, Google would quickly throttle you (or shut you down completely). – McGuireV10 Feb 05 '18 at 11:18
  • That's great to know, I didn't even know it was a problem! Thanks. Though people would add their Gmail accounts and then it will send out per chosen account, so I've put parameters in place of the sending email and password. So it would have to be Gmail (as most outreach is done via that). Would that have to be the Gmail API then, and would it be worth not using Azure? – Lucie Feb 05 '18 at 11:44
  • I can't tell you whether or not Azure makes sense, but in any high-volume mail service, I'd go with a third-party provider, regardless of hosting. If you're sending direct to the gmail API, they're going to charge you for that and I'd bet a dedicated provider will be cheaper, too. Time to do some research! – McGuireV10 Feb 05 '18 at 15:48
  • Thanks for your help on this! It's something I had no clue on! I do a ton of Gmail outreach and find you can send around 80 a day on a Gmail address (@gmail) and up to 200 on a domain address (@company), so it would be limited to that daily send. Though obviously as my client base grew it would get far larger overall. I will look into it though, thanks for your help!! – Lucie Feb 05 '18 at 16:14
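
For reference, here is a minimal sketch of the SMTP approach from the question linked in the comments, wired to the `AutoSending` fields above. As noted above, this goes through `System.Net.Mail`, so it will not work from Azure, and Gmail throttles bulk sending; the `GmailSender` class name is just a placeholder, and the account will typically need an app password (or "less secure app" access) rather than its normal login password.

using System.Net;
using System.Net.Mail;
using Linkofy.Models;

public static class GmailSender
{
    // Sends one AutoSending row through Gmail's SMTP endpoint
    // (smtp.gmail.com on port 587 with SSL).
    public static void Send(AutoSending item)
    {
        using (var message = new MailMessage(
            new MailAddress(item.emailAddress, item.senderName),
            new MailAddress(item.receiptMail, item.receiptName)))
        {
            message.Subject = item.subject;
            message.Body = item.Body;

            using (var client = new SmtpClient("smtp.gmail.com", 587))
            {
                client.EnableSsl = true;
                client.UseDefaultCredentials = false;
                client.Credentials = new NetworkCredential(item.emailAddress, item.password);
                client.Send(message);
            }
        }
    }
}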