I am just trying to get the project of

https://developer.android.com/samples/NetworkConnect/index.html

running. Everything works well until I increase the len variable to roughly above 2000. I just don't receive anything beyond that number of characters from the webpage; it simply stops displaying anything. When I display

result.length()

it gives me my len variable. However, the actual content of the String stops at around 2000 characters. In the example below I set len to 60000 and increased the timeouts. The webpage I am trying to retrieve is

http://www.aviationweather.gov/metar/data?ids=eddm&format=decoded&date=0&hours=8

But the problem also occurs with other "long" pages; they just stop after around 2000 characters. In my opinion the problem must be somewhere in the InputStream handling, because the result length is displayed correctly as 60000 (so it is not readIt).

Any help would be highly appreciated! Here is the code:

    private class DownloadTask extends AsyncTask<String, Void, String> {

        @Override
        protected String doInBackground(String... urls) {
            try {
                return loadFromNetwork(urls[0]);
            } catch (IOException e) {
              return getString(R.string.connection_error);
            }
        }

        /**
         * Uses the logging framework to display the output of the fetch
         * operation in the log fragment.
         */
        @Override
        protected void onPostExecute(String result) {
          Log.i(TAG, result.substring(1500));
        }
    }

    /** Initiates the fetch operation. */
    private String loadFromNetwork(String urlString) throws IOException {
        InputStream stream = null;
        String str ="";

        try {
            stream = downloadUrl(urlString);
            str = readIt(stream, 60000);
        } finally {
            if (stream != null) {
                stream.close();
            }
        }
        return str;
    }

    /**
     * Given a string representation of a URL, sets up a connection and gets
     * an input stream.
     * @param urlString A string representation of a URL.
     * @return An InputStream retrieved from a successful HttpURLConnection.
     * @throws java.io.IOException
     */
    private InputStream downloadUrl(String urlString) throws IOException {
        // BEGIN_INCLUDE(get_inputstream)
        URL url = new URL(urlString);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setReadTimeout(20000 /* milliseconds */);
        conn.setConnectTimeout(25000 /* milliseconds */);
        conn.setRequestMethod("GET");
        conn.setDoInput(true);
        // Start the query
        conn.connect();
        InputStream stream = conn.getInputStream();
        return stream;
        // END_INCLUDE(get_inputstream)
    }

    /** Reads an InputStream and converts it to a String.
     * @param stream InputStream containing HTML from targeted site.
     * @param len Length of string that this method returns.
     * @return String concatenated according to len parameter.
     * @throws java.io.IOException
     * @throws java.io.UnsupportedEncodingException
     */
    private String readIt(InputStream stream, int len) throws IOException, UnsupportedEncodingException {
        Reader reader = null;
        BufferedReader in = new BufferedReader(new InputStreamReader(stream));
//        reader = new InputStreamReader(stream, "UTF-8");
        char[] buffer = new char[len];
        in.read(buffer);
        return new String(buffer);
    }
  • Did you make sure it's not the problem with logcat limit? Like here: [link](http://stackoverflow.com/questions/8888654/android-set-max-length-of-logcat-messages) . Correct me if I'm wrong but basically the length is OK, it's just that you don't see the full content when calling `Log.i(TAG, str)` ? – Poger Jan 26 '16 at 19:44
  • Int nread = in.read(buffer); Check nread! – greenapps Jan 26 '16 at 20:52
  • Why are you using a 60000? You should just read all that is send. Its text isnt it? – greenapps Jan 26 '16 at 20:54
  • @greenapps: good suggestion. nread gives me 1171. By far not enough. 60000 was just a big number to check. Reading "all" would be exactly what I want. Yes, text. – tomseitz Jan 27 '16 at 15:14
  • @poger: in my own implementation I return the result as a string. Same effect, so that can not be the problem (protected void onPostExecute(String result) {myWebsite = result;} – tomseitz Jan 27 '16 at 15:14
  • Then make a loop where you continue reading. You could use a reader and use a readLine() member. – greenapps Jan 27 '16 at 15:22
  • I just tried with some other pages and get different sizes for nread (between 1300 and 2048 - which is, strangely, 2K). But its always the same number for a specific Webpage. @greenapps : are you thinking of a loop in the downloadURL or in the readIt section? – tomseitz Jan 27 '16 at 15:29
  • In readIt() of course. – greenapps Jan 27 '16 at 15:37
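
The loop greenapps suggests can be sketched as follows. This is a minimal, self-contained illustration (not the asker's actual Android code): `read()` is allowed to return fewer characters than requested, so the call must be repeated until it returns -1. The demo class and the `ByteArrayInputStream` input are only stand-ins for the real network stream.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;

public class ReadLoopDemo {

    // Reads until the stream is exhausted instead of relying on one read() call,
    // which may legally return after only ~2K characters.
    static String readAll(InputStream stream) throws IOException {
        Reader reader = new InputStreamReader(stream, "UTF-8");
        StringBuilder sb = new StringBuilder();
        char[] buffer = new char[2048];
        int nread;
        while ((nread = reader.read(buffer)) != -1) {
            // Append only the characters actually read in this pass.
            sb.append(buffer, 0, nread);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10000];
        java.util.Arrays.fill(data, (byte) 'x');
        String result = readAll(new ByteArrayInputStream(data));
        System.out.println(result.length()); // 10000, not capped at 2048
    }
}
```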

1 Answer

Thank you greenapps for your input! For everybody else who is looking for a solution, this is how I did it: I just had to change the readIt method to the following:

    private String readIt(InputStream stream, int len) throws IOException, UnsupportedEncodingException {
        String line = "";
        String site = "";
        BufferedReader buffer = new BufferedReader(new InputStreamReader(stream, "UTF-8"));
        while ((line = buffer.readLine()) != null) {
            if (line.length() > 1) {
                site = site + line;
            }
        }
        return site;
    }
  • That should be `site = site + line + "\n"; `. And you should use a StringBuilder here to collect the lines. – greenapps Jan 27 '16 at 16:57
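Incorporating both of those suggestions, the method might look like the sketch below. This is a standalone illustration (the demo class and the `ByteArrayInputStream` input are stand-ins for the real stream): a `StringBuilder` avoids creating a new String on every iteration, and appending `'\n'` restores the line breaks that `readLine()` strips.

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class ReadItDemo {

    // Variant with StringBuilder accumulation and the newlines restored,
    // as suggested in the comment above.
    static String readIt(InputStream stream) throws IOException {
        StringBuilder site = new StringBuilder();
        BufferedReader buffer = new BufferedReader(new InputStreamReader(stream, "UTF-8"));
        String line;
        while ((line = buffer.readLine()) != null) {
            // readLine() drops the terminator, so add it back explicitly.
            site.append(line).append('\n');
        }
        return site.toString();
    }

    public static void main(String[] args) throws IOException {
        String html = "line one\nline two\n";
        String out = readIt(new ByteArrayInputStream(html.getBytes("UTF-8")));
        System.out.println(out.equals(html)); // true: content and line breaks preserved
    }
}
```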