
I want to create a program in Java that takes a URL, parses the page it points to for different strings, and then stores them in an Excel file. I have never done anything with web pages or Excel files, so I was wondering:

Where do I start?

I am trying to get an understanding of the overall process, as well as any methods I will need to learn before actually starting the project. I have not come up with much so far, which I suspect is because I am searching for the wrong keywords.

Any help is appreciated.

Edit: I plan to point the program at a URL listing foreclosed homes and have it put the number of beds, number of baths, square footage, price, and location into their respective columns.

Jordan.J.D

2 Answers


You need to either implement a scraper or find an existing one you can use. A scraper extracts data from markup files, or any presentation-layer file, when the interested party does not have access to the underlying data tier.
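As a minimal sketch of the "extract data from markup" idea, using only the JDK's regex support (the HTML snippet and field patterns below are made up to stand in for a real listing page):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ListingParser {
    public static void main(String[] args) {
        // Hypothetical markup standing in for a fetched listing page.
        String html = "<div class=\"listing\"><span>3 beds</span>"
                + "<span>2 baths</span><span>1,450 sqft</span>"
                + "<span>$120,000</span></div>";

        // Pull each field of interest out with a regular expression.
        System.out.println("beds:  " + extract(html, "(\\d+) beds"));
        System.out.println("baths: " + extract(html, "(\\d+) baths"));
        System.out.println("sqft:  " + extract(html, "([\\d,]+) sqft"));
        System.out.println("price: " + extract(html, "\\$([\\d,]+)"));
    }

    // Returns the first capture group of the first match, or null if none.
    static String extract(String text, String regex) {
        Matcher m = Pattern.compile(regex).matcher(text);
        return m.find() ? m.group(1) : null;
    }
}
```

Regexes are fine for a first experiment, but real pages are messy; for anything beyond a toy, an HTML parser library such as jsoup is far more robust.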

You don't say much about how you plan to implement this. There are libraries that will write Excel files for you; Apache POI is the one that seems to get the most use.
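A minimal sketch of writing the fields you mentioned with Apache POI, assuming the poi-ooxml artifact is on the classpath (the file name, sheet name, and sample row values are placeholders):

```java
import java.io.FileOutputStream;
import java.io.IOException;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class ExcelWriter {
    public static void main(String[] args) throws IOException {
        try (Workbook wb = new XSSFWorkbook();
             FileOutputStream out = new FileOutputStream("listings.xlsx")) {
            Sheet sheet = wb.createSheet("Foreclosures");

            // Header row with one column per field you described.
            String[] headers = {"Beds", "Baths", "Sqft", "Price", "Location"};
            Row header = sheet.createRow(0);
            for (int i = 0; i < headers.length; i++) {
                header.createCell(i).setCellValue(headers[i]);
            }

            // One example data row; in the real program these values
            // would come from the parsed page.
            Row row = sheet.createRow(1);
            row.createCell(0).setCellValue(3);
            row.createCell(1).setCellValue(2);
            row.createCell(2).setCellValue(1450);
            row.createCell(3).setCellValue(120000);
            row.createCell(4).setCellValue("Springfield, IL");

            wb.write(out);
        }
    }
}
```

The resulting listings.xlsx opens directly in Excel, with one row per home once you loop over your parsed results.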

Rob

You want to start by issuing an HTTP request and getting the response body back as a string. You can then parse the response string as needed.
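A minimal sketch of that first step, assuming Java 11+ for the built-in java.net.http client (the URL here is just a placeholder for the listings page):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PageFetcher {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; substitute the page you actually want to scrape.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("https://example.com/")).build();

        // Send the request and collect the response body as a String,
        // ready to be handed to the parsing step.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println("Status: " + response.statusCode());
        System.out.println(response.body());
    }
}
```

On older JDKs the same thing can be done with HttpURLConnection and a BufferedReader over the connection's input stream, which is what the linked answer below covers.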

Here is a past answer that can get you started.

How can I get an http response body as a string in Java?

Rob Breidecker