I'm currently using C# and the StreamReader class to read StandardOutput from a Python script that retrieves server data in JSON format. The user can enter a search key (serverName, IP, ID, etc.) and it returns the data below. The user can also search for as many server details as they please, sometimes up to a thousand search keys at once.
Example:
[{'Search': 'serverName', 'Name': 'test.testing.com', 'IP':
'10.XXX.XXX.X', 'State': 'OK', 'OS': 'Windows', 'ID':'123456'}]
I'm currently reading the output and adding it to a list within the C# code.
StreamWriter sw = pythonProcess.StandardInput;  // used to send the search keys to the script
StreamReader sr = pythonProcess.StandardOutput; // used to read back the script's JSON output
//Code to read in the search key inputs
returnList.Add(sr.ReadLine()); // blocks until a full line of output is available
The output is all contained within one line (not sure if that's the best way to handle it), so looping isn't needed. It reads smaller outputs fine; however, when the user enters a large list of search keys, the program hangs on the sr.ReadLine() call.
Whenever I run the Python script directly in the terminal, I get the JSON output fairly quickly no matter how large the search key array is. Is there a better/faster way to read a large line of JSON output from the stream without causing it to hang?
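For context, this is roughly the asynchronous, event-based pattern I've been considering instead of the blocking ReadLine(). It's only a sketch of what I might try, not code I'm running today; outputBuffer and the handler wiring are placeholders:

using System.Text;

var outputBuffer = new StringBuilder();

// Collect stdout as it arrives instead of blocking on a single ReadLine()
pythonProcess.OutputDataReceived += (sender, e) =>
{
    if (e.Data != null) // null signals the end of the output stream
    {
        outputBuffer.Append(e.Data);
    }
};

pythonProcess.Start();
pythonProcess.BeginOutputReadLine(); // start asynchronous reads on StandardOutput

// ... write the search keys to pythonProcess.StandardInput here ...

pythonProcess.WaitForExit();
returnList.Add(outputBuffer.ToString());

I haven't switched to this yet because I'm not sure it would actually be faster than reading the single line, or whether the hang is caused by something else entirely.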
Thanks in advance
EDIT:
Here is my Python process setup as well:
pythonProcess = new Process();
try
{
    pythonProcess.StartInfo.FileName = filename;
    pythonProcess.StartInfo.RedirectStandardInput = true;
    pythonProcess.StartInfo.RedirectStandardOutput = true;
    pythonProcess.StartInfo.RedirectStandardError = true;
    pythonProcess.StartInfo.Verb = "runas";
    pythonProcess.StartInfo.UseShellExecute = false;
    pythonProcess.StartInfo.Arguments = "-u \"" + serverPath + "\\" + scriptName + "\"";
    pythonProcess.Start(); // start the process (the python program)
}
catch (Exception e)
{
}
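And for completeness, the part that sends the search keys looks roughly like the sketch below (simplified; searchKeys and the flush/close handling are placeholders, not my exact code):

// Hypothetical sketch of how the keys get sent; the real code is omitted above
foreach (string key in searchKeys)
{
    sw.WriteLine(key);
}
sw.Flush();
// sw.Close(); // not sure whether stdin should be closed before reading the output

returnList.Add(sr.ReadLine()); // this is the call that hangs on large inputs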