I'm developing an application which scrapes HTML feeds after it authenticates. These websites only support email/password authentication, but some integrations may require additional information. So my question is: how specific should I get when creating domain events and aggregates?

I've worked with Prooph and have created simple aggregates and domain events which only deal with a single entity type. But now I'm wondering whether aggregates and domain events should be specific to these third-party website scrapers. Should there be an event per feed scraper, or is creating a generic event and aggregate better? The properties of each aggregate could differ.
class WebsiteA extends AggregateRoot
{
    private $id;
    private $email;
    private $password;

    public static function initiate($id, $email, $password)
    {...}
}
class WebsiteB extends AggregateRoot
{
    private $id;
    private $email;
    private $password;
    private $accountIds = [];
    private $userSalt;

    public static function initiate($id, $email, $password, $accountIds, $userSalt)
    {...}
}
Then something similar for the domain events:
class WebsiteAWasInitiated extends AggregateChanged
{
    public static function withUser($id, $email, $password) {}
}
class WebsiteBWasInitiated extends AggregateChanged
{
    public static function withUser($id, $email, $password, $accountIds, $userSalt) {}
}
Or would it be better to create a single aggregate and domain event that both websites use? Keep in mind that the list of supported websites will grow.
class WebsiteScraper extends AggregateRoot
{
    private $id;
    private $credentials;

    public static function initiate($id, $credentials)
    {...}
}
class WebsiteScraperWasInitiated extends AggregateChanged
{
    public static function withUser($id, $credentials) {}
}