
Goal: to read values from a text file and store them in variables that I can load into my SQLite database.

Problem: My method is not efficient, and I need help coming up with an easier way.

Right now I am parsing a text file that looks like this:

agency_id,agency_name,agency_url,agency_timezone,agency_lang,agency_phone
1,"NJ TRANSIT BUS","http://www.njtransit.com/",America/New_York,en,""
2,"NJ TRANSIT RAIL","http://www.njtransit.com/",America/New_York,en,""

I split the input every time I read a comma, store that token in a variable, and then use that variable as the value for my database.

This method works but is time consuming; the next text file I have to read in has over 200 lines, and I need to find an easier way.

        AgencyString = readText();
        tv = (TextView) findViewById(R.id.letter);

        tv.setText(AgencyString);

        StringTokenizer st = new StringTokenizer(AgencyString, ",");

        for (int i = 0; i < AgencyArray.length; i++) {
            size = i; // which value I am targeting in the text file,
                      // e.g. 1 would be agency_id, 2 would be agency_name
            AgencyArray[i] = st.nextToken();
        }
        tv.setText(AgencyArray[size]); // the value I am going to store as the database value
    }

    private String readText() {
        InputStream inputStream = getResources().openRawResource(R.raw.agency);

        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();

        int i;
        try {
            i = inputStream.read();
            while (i != -1) {
                byteArrayOutputStream.write(i);
                i = inputStream.read();
            }
            inputStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return byteArrayOutputStream.toString();

    }
The Tokenizer

2 Answers


First, why is this a problem? I don't mean to answer your question with a question, but more context is needed to understand in what way you need to improve the efficiency of what you're doing. Is there a perceptible delay in the application due to the parsing of the file, or do you have a more serious ANR problem because you're running this on the UI thread?

Unless there is some bottleneck in other code not shown, I honestly doubt you'd read and tokenise it much faster than you're presently doing. Actually, you probably could; however, I believe it's more a case of designing your application so that the delays involved in fetching and parsing large data aren't perceived by, or cause irritation to, the user. My own application parses massive files like this, and while it does take a fraction of a second, it doesn't present a problem because of the design of the overall application and UI.

Also, have you used the profiler to see what's taking the time? And have you run this on a real device, without the debugger attached? Having the debugger attached to a real device, or using the emulator, increases execution time by several orders of magnitude.

I am making the assumption that you need to parse this file type after receiving it over a network, as opposed to being something that is bundled with the application and only needs parsing once.
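To illustrate the design point, one common pattern on Android for keeping this kind of work off the UI thread is an AsyncTask. The sketch below is only an illustration and makes assumptions not present in your code: it assumes a parseAgencies() helper that wraps the readText()/StringTokenizer logic from your question, and the tv field from your snippet.

    // Sketch only (not your actual code): assumes a parseAgencies() helper that
    // returns the parsed fields, and a TextView field named tv.
    // Requires: import android.os.AsyncTask;
    private class ParseTask extends AsyncTask<Void, Void, String[]> {
        @Override
        protected String[] doInBackground(Void... params) {
            // Runs on a background thread, so a slow parse cannot freeze the UI
            // or trigger an ANR.
            return parseAgencies();
        }

        @Override
        protected void onPostExecute(String[] agencyFields) {
            // Back on the UI thread once parsing has finished.
            tv.setText(agencyFields[0]);
        }
    }

    // Started from onCreate(), for example:
    // new ParseTask().execute();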

Trevor
  • I should probably have worded my question better. I need to improve efficiency not necessarily in the speed at which it parses the file, but in the way I am coding it. Using this method with a text file of over 200 lines, extracting the data and storing it into variables, will take a long time on my part. I was wondering if there is another way I can code this that would make my job a little easier. – The Tokenizer Aug 17 '12 at 20:04
  • Processing just 200 lines, even with 50 fields each, should take less than a second. – Rajesh J Advani Aug 17 '12 at 21:21

You could just bundle the SQLite database with your application instead of representing it in a text file. Look at the answer to this question.
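As a rough illustration of how a bundled database is typically shipped, you can place the prebuilt .db file in assets/ and copy it into the app's database directory on first run. The file name and details below are assumptions for the sake of the example, not something taken from the question.

    // Sketch only: copies a prebuilt "agency.db" from assets/ into the app's
    // database directory the first time the app runs. The file name is
    // illustrative. Requires java.io.* and android.content.Context imports.
    private void copyDatabaseIfNeeded(Context context) throws IOException {
        File dbFile = context.getDatabasePath("agency.db");
        if (dbFile.exists()) {
            return; // already copied on a previous run
        }
        dbFile.getParentFile().mkdirs();

        InputStream in = context.getAssets().open("agency.db");
        OutputStream out = new FileOutputStream(dbFile);
        byte[] buffer = new byte[4096];
        int length;
        while ((length = in.read(buffer)) > 0) {
            out.write(buffer, 0, length);
        }
        out.flush();
        out.close();
        in.close();
    }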

Rajesh J Advani