I am writing software in C# that measures the in/out octets (bytes) of an interface via SNMP. I need to find out how many bytes pass through in 1000 seconds.
From my research, it seems the counter value times out or gets reset at some point, because some of the results come out negative.
I poll .1.3.6.1.2.1.2.2.1.10 (ifInOctets) for the input stream on interface index 139.
Over 1024 seconds it gives a result of -2.1 MBytes.
How can I get an accurate measurement of the traffic (in or out)?
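My guess is that the counter is a 32-bit Counter32 that rolls over at 2^32, so when it wraps between two polls the raw difference goes negative. Below is a small standalone sketch, with made-up sample values (not real readings from my device), of what I think is happening:

using System;

class WrapDemo
{
    static void Main()
    {
        // Hypothetical sample values chosen only to illustrate the suspected wrap:
        uint previous = 4294000000;  // near the Counter32 ceiling of 4,294,967,295
        uint current  = 1500000;     // counter rolled over past 2^32 and started climbing again

        long naive = (long)current - previous;  // -4,292,500,000 -> looks like "negative traffic"
        long corrected = ((long)current - previous + (1L << 32)) % (1L << 32);  // 2,467,296 octets

        Console.WriteLine($"naive: {naive}, wrap-corrected: {corrected}");
    }
}

If that really is the cause, I assume I need to add 2^32 back whenever the difference between two consecutive readings is negative, as in corrected above.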
EDIT: This is the code I use for the calculation. It takes a reading every second and accumulates the result.
private void timer1_Tick(object sender, EventArgs e)
{
    Cursor.Current = Cursors.WaitCursor;

    SnmpObject objSnmpObject, objSnmpIfSpeed;
    objSnmpObject = (SnmpObject)objSnmpManager.Get(".1.3.6.1.2.1.2.2.1.16.139"); // ifOutOctets, interface 139
    objSnmpIfSpeed = (SnmpObject)objSnmpManager.Get(".1.3.6.1.2.1.2.2.1.5.139"); // ifSpeed, interface 139

    if (GetResult() == 0)
    {
        float value = Int64.Parse(objSnmpObject.Value);
        float ifSpeed = Int64.Parse(objSnmpIfSpeed.Value);
        float Bytes = (value * 8 * 100 / ifSpeed);
        // float megaBytes = Bytes / 1024;
        sum += Bytes;
        tb_calc.Text = (sum.ToString() + " Bytes");
    }

    _gv_timeSec++;
    lb_timer.Text = _gv_timeSec.ToString();
    Cursor.Current = Cursors.Default;
}
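Instead of feeding the raw counter value into that formula, I am considering taking the difference between two consecutive readings each tick. A minimal sketch of what I have in mind is below; it reuses the objSnmpManager, GetResult() and sum members from above, assumes sum now accumulates raw octet deltas, and assumes the counter is a 32-bit Counter32 (the _previousOctets field is a new field I would add):

private long _previousOctets = -1;  // -1 = no sample taken yet

private void timer1_Tick(object sender, EventArgs e)
{
    Cursor.Current = Cursors.WaitCursor;

    // ifOutOctets for interface index 139 (same OID as above)
    var objSnmpObject = (SnmpObject)objSnmpManager.Get(".1.3.6.1.2.1.2.2.1.16.139");

    if (GetResult() == 0)
    {
        long currentOctets = Int64.Parse(objSnmpObject.Value);

        if (_previousOctets >= 0)
        {
            // Difference between two consecutive samples = octets counted during this tick.
            long delta = currentOctets - _previousOctets;

            // If the Counter32 wrapped past 2^32, the raw difference goes negative;
            // add 2^32 back to recover the true count (assumes at most one wrap per tick).
            if (delta < 0)
                delta += 1L << 32;

            sum += delta;
            tb_calc.Text = sum.ToString() + " Bytes";
        }

        _previousOctets = currentOctets;
    }

    _gv_timeSec++;
    lb_timer.Text = _gv_timeSec.ToString();
    Cursor.Current = Cursors.Default;
}

Is this the right direction, or is there a better way to get an accurate byte count over a long period like 1000 seconds?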