
Is there any reason why it would be preferable to use sockets (or libraries packaged with the OS) rather than a third-party library such as libcurl?

For example, I have been following a few tutorials, like this one on Winsock, to try to access an HTTP site, and it seems it can do everything I need it to do; but libcurl can do those things too. Is there any reason to use third-party libraries rather than what the OS can supply (I am thinking of factors like execution speed, reliability, etc.)? I know portability is an issue here, and possibly ease of use, but are there other factors that might make one preferable?
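To make it concrete, the Winsock tutorials end up with code along these lines (a minimal, untested sketch; error handling is trimmed and `example.com` is just a placeholder host):

```c
#include <winsock2.h>
#include <ws2tcpip.h>
#include <stdio.h>
#include <string.h>
#pragma comment(lib, "Ws2_32.lib")

int main(void)
{
    WSADATA wsa;
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0)
        return 1;

    /* Resolve the host and open a TCP connection on port 80. */
    struct addrinfo hints = {0}, *res;
    hints.ai_family   = AF_INET;
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo("example.com", "80", &hints, &res) != 0)
        return 1;

    SOCKET s = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    connect(s, res->ai_addr, (int)res->ai_addrlen);
    freeaddrinfo(res);

    /* The request line and headers are built by hand... */
    const char *req = "GET / HTTP/1.1\r\nHost: example.com\r\n"
                      "Connection: close\r\n\r\n";
    send(s, req, (int)strlen(req), 0);

    /* ...and the status line, headers, chunked encoding, etc.
       would all have to be parsed by hand as well. */
    char buf[4096];
    int n;
    while ((n = recv(s, buf, sizeof(buf) - 1, 0)) > 0) {
        buf[n] = '\0';
        printf("%s", buf);
    }

    closesocket(s);
    WSACleanup();
    return 0;
}
```

All of the request building and response parsing is done by hand, which is what made me wonder whether a library like libcurl is the better choice.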

Xantium
  • If you're working with a proprietary/obscure TCP/IP-based protocol, you'd need to use sockets. – 1.618 Dec 21 '17 at 23:13
  • `libcurl` isn't a general purpose socket library, it can only access HTTP and FTP servers (maybe a few other types). If you need to do things that it can't do, you'll need to write custom code. You also need to use native libraries if you're writing a library like `libcurl`. – Barmar Dec 21 '17 at 23:14
  • Is there any case when it *wouldn't?* – user207421 Dec 21 '17 at 23:39
  • @EJP Unnecessary functions, updates, execution speeds for example. – Xantium Dec 22 '17 at 00:14

1 Answer


HTTP is a complicated protocol, and so are most other important network protocols. If you try to implement it yourself using the native, low-level socket interfaces, you'll most likely miss some important details (e.g. chunked encoding). If there's a high-level library that implements it, you're almost always better off using that.
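For comparison, fetching a page with libcurl's easy interface looks roughly like this (a minimal sketch; `example.com` is a placeholder URL). Redirects, chunked decoding, and header parsing all happen inside the library:

```c
#include <stdio.h>
#include <curl/curl.h>

int main(void)
{
    curl_global_init(CURL_GLOBAL_DEFAULT);

    CURL *curl = curl_easy_init();
    if (!curl)
        return 1;

    /* example.com is a placeholder; the body goes to stdout by default. */
    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);

    /* Chunked transfer decoding, header parsing, redirects, etc.
       are handled inside the library. */
    CURLcode res = curl_easy_perform(curl);
    if (res != CURLE_OK)
        fprintf(stderr, "curl_easy_perform() failed: %s\n",
                curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```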

It's unlikely that there will be a significant difference in performance. Networks are orders of magnitude slower than CPUs, so the bottleneck is in the data communications, not the processing code.

Barmar
  • Thank you for your answer. So in terms of efficiency it really doesn't make that much difference, but what about the unnecessary functions that come with a high-level library? Is it worth writing a miniaturized one to streamline things as much as possible? – Xantium Dec 21 '17 at 23:24
  • There is generally no need to do so. Your compiler and linker are only going to include needed code and will likely optimize away any code that is not used or referenced (and even some that is, if it serves no purpose). – David C. Rankin Dec 21 '17 at 23:27
  • @DavidC.Rankin You don't know where I could find out more information on that do you? – Xantium Dec 21 '17 at 23:49
  • @Simon, sure this link discusses various compilers and caveats [Do unused functions get optimized out?](https://stackoverflow.com/questions/6215782/do-unused-functions-get-optimized-out) – David C. Rankin Dec 22 '17 at 00:51
  • @DavidC.Rankin. Thanks. I never knew that. – Xantium Dec 22 '17 at 00:56
  • I look at it as a portability issue. If you are just developing on M$, then the winsock lib is fine. However, if you want your code to be portable (without huge sections of preprocessor conditionals based on OS), then an available library like libcurl is fine as long as it does what you need it to do. It takes a bit of forward thinking. Nothing is more painful than spending a couple of months to develop your "killer app", only to have to rewrite 10,000 lines of code because you used a non-portable, OS-specific library `:)` – David C. Rankin Dec 22 '17 at 01:00
  • @Simon Reliability is the far more important issue. Do you really want to study 100's of pages of HTTP protocol specification to make sure you handle all the details right? Do you expect to support all the possible encodings, reply codes? Can you really take weeks or months to implement your application, to ensure you get this all right? – Barmar Dec 22 '17 at 01:03
  • @Barmar. I guess that answers it really. No I wouldn't. – Xantium Dec 22 '17 at 01:07
  • @DavidC.Rankin Good point. That really is an encouraging factor to use a library. – Xantium Dec 22 '17 at 01:08