I have code where I log different categories of information. Each piece of code has a specific tag assigned to it (e.g. database.message, database.odbc, event.cevent, ...).
I've also written a function that reads a JSON file whose values map each tag to its corresponding severity filter:
{
    "database": {
        "message": "INFO",
        "odbc": "DEBUG"
    },
    "event": {
        "cevent": "INFO"
    }
}
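(In case it helps, the function loads this file roughly like the sketch below — the function name is just a placeholder, I'm using Boost.PropertyTree only as an example, and the severity strings still get mapped to logging::trivial levels elsewhere.)

#include <boost/property_tree/ptree.hpp>
#include <boost/property_tree/json_parser.hpp>
#include <map>
#include <string>

// Builds "category.tag" -> severity string, e.g. "database.odbc" -> "DEBUG".
std::map<std::string, std::string> read_severity_filters(const std::string& path)
{
    boost::property_tree::ptree pt;
    boost::property_tree::read_json(path, pt);

    std::map<std::string, std::string> result;
    for (const auto& category : pt)                // "database", "event", ...
    {
        for (const auto& entry : category.second)  // "message": "INFO", ...
        {
            result[category.first + "." + entry.first] = entry.second.get_value<std::string>();
        }
    }
    return result;
}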
My problem is that I want to set a "basic filter" (for instance, only log messages that are "INFO" or above) for all records whose tags are not listed in this file.
Right now, I'm adding filters this way:
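(For context, the text_sink typedef and backend used below are set up following the sinks tutorial in the Boost.Log docs — roughly like this, with std::clog just as an example stream:)

#include <boost/core/null_deleter.hpp>
#include <boost/log/attributes/constant.hpp>
#include <boost/log/core.hpp>
#include <boost/log/expressions.hpp>
#include <boost/log/sinks/sync_frontend.hpp>
#include <boost/log/sinks/text_ostream_backend.hpp>
#include <boost/log/trivial.hpp>
#include <boost/log/utility/setup/common_attributes.hpp>
#include <boost/make_shared.hpp>
#include <iostream>

namespace logging = boost::log;
namespace sinks = boost::log::sinks;
namespace expr = boost::log::expressions;
namespace attrs = boost::log::attributes;

typedef sinks::synchronous_sink<sinks::text_ostream_backend> text_sink;

// One ostream backend shared by every sink created below; here it writes to std::clog.
auto backend = boost::make_shared<sinks::text_ostream_backend>();
backend->add_stream(boost::shared_ptr<std::ostream>(&std::clog, boost::null_deleter()));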
logging::core::get()->add_global_attribute("Tag", attrs::constant<std::string>(""));
logging::add_common_attributes();
std::vector<std::string> tags; // suppose it's already filled with the tags read from the json
...
for (const auto& tag : tags)
{
boost::shared_ptr<text_sink> sink(new text_sink(backend)); // same as the doc
auto level = logging::trivial::info; // just an example for more clarity
sink->set_filter(expr::attr<std::string>("Tag") == tag && expr::attr<logging::trivial::severity_level>("Severity") >= level);
logging::core::get()->add_sink(sink);
}
This piece of code works and correctly sets the filters according to the JSON file.
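(For completeness, records are tagged roughly like this — the logger, tag and message are just placeholders; the source-specific "Tag" attribute overrides the global empty one:)

#include <boost/log/sources/severity_logger.hpp>
#include <boost/log/sources/record_ostream.hpp>

namespace src = boost::log::sources;

void log_odbc_example()
{
    src::severity_logger<logging::trivial::severity_level> lg;
    // This record carries Tag == "database.odbc", so it is matched by the
    // corresponding per-tag sink added above.
    lg.add_attribute("Tag", attrs::constant<std::string>("database.odbc"));
    BOOST_LOG_SEV(lg, logging::trivial::debug) << "connecting to ODBC source";
}

The severity_logger stores the level under the "Severity" attribute, which is what the per-tag filters check.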
To add the "basic filter", I also add the following once every per-tag filter has been set:
boost::shared_ptr<text_sink> basic_sink(new text_sink(backend));
auto filter = logging::trivial::severity >= logging::trivial::info;
for (const auto& tag : tags)
{
filter = filter && expr::attr<std::string>("Tag") != tag;
}
basic_sink->set_filter(filter);
logging::core::get()->add_sink(basic_sink);
But it duplicates the messages whose tags are defined in the JSON, even though I thought this would filter out the stored tags. Do you have any ideas on how to avoid this duplication, or do I have to implement a sink like the one mentioned in this post?