I have the following problem with this Microsoft SQL Server query:
    Select AVG(Severity) AS 'Average Severity' from VulnerabilityAlertDocument

which, when executed in Microsoft SQL Server Management Studio, returns this float value: 7,34792844929602
Now I am trying to create a C# method that simply executes this query and returns the obtained value, so I have written something like this:
    public double getVulnerabilitySeverityAverage()
    {
        double vulnerabilitySeverityAverage;

        _strSQL = "Select AVG(Severity) AS 'Average Severity' from VulnerabilityAlertDocument";

        System.Data.Common.DbCommand command;
        command = _connection.CreateCommand();
        command.CommandText = _strSQL;

        vulnerabilitySeverityAverage = command.ExecuteNonQuery();

        return vulnerabilitySeverityAverage;
    }
The problem is that when the previous method is executed, the value stored in vulnerabilitySeverityAverage is not the expected value (7,34792844929602) but the wrong value -1.0.
Why? What is wrong? What am I missing?
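I suspect that ExecuteNonQuery might not be the right call for a SELECT statement and that I should be using something like ExecuteScalar instead. Below is a minimal sketch of what I imagine it could look like (assuming _connection is already open, and I'm not sure the conversion is the right way to handle the result):

    public double getVulnerabilitySeverityAverage()
    {
        _strSQL = "Select AVG(Severity) AS 'Average Severity' from VulnerabilityAlertDocument";

        System.Data.Common.DbCommand command = _connection.CreateCommand();
        command.CommandText = _strSQL;

        // ExecuteScalar returns the first column of the first row as an object,
        // which here should be the single averaged value.
        object result = command.ExecuteScalar();
        return Convert.ToDouble(result);
    }

Is this the right direction, or is there something else I am misunderstanding about how the command should be executed?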
Thanks