
I want to write a program in C++ that helps manage my hockey pool, and one of the key things I will need to do is read off the schedule for the week ahead. I was hoping to use the NHL website. Is there any way to have the program download the HTML file for a given URL, and then parse it? I suppose that once I have the file downloaded, simple file I/O would do, but I'm not sure how to download the file.

Ben313

3 Answers


I would use a library that provides an HTTP abstraction.

For example:

cpp-netlib

#include <boost/network/protocol/http/client.hpp>
#include <string>
#include <iostream>

int main()
{
    // Build a client and a GET request for the page we want.
    boost::network::http::client client;
    boost::network::http::client::request request("http://www.example.com");
    request << boost::network::header("Connection", "close");

    // Perform the request and print the response body.
    boost::network::http::client::response response = client.get(request);
    std::cout << body(response);
}

I do not think it can get much easier than that.

On GNU/Linux compile with:

g++ -I. -I$BOOST_ROOT -L$BOOST_ROOT/stage/lib my_main.cpp -lboost_system -pthread

QHttp

An example for this could get quite long, since QHttp can only send non-blocking requests (meaning you have to catch signals reporting that the request has finished, etc.). But the documentation is superb, so it should not be a problem. :)

Palmik

Using libcurl is one option. Here is an example of using it with C++ to download the contents of a webpage into a string.

darioo

I finally managed to compile and link it with:

    g++ -I. -I/usr/include main.cpp -lboost_thread -lboost_system \
        -lcppnetlib-client-connections -lcppnetlib-server-parsers \
        -lcppnetlib-uri -pthread

(the source file goes before the -l flags so the linker can resolve its symbols).
user1638291