In my program I am using the timeval structure from time.h in a TCP/IP socket program, in which the client waits for a timeout value specified by this structure. The structure is initialized as below:
struct timeval tv;
tv.tv_sec = 10;
tv.tv_usec = 0;
and I set the socket options accordingly. Since recv() is a blocking call, I've put a timeout on it:
setsockopt(sock, SOL_SOCKET, SO_RCVTIMEO, (char *)&tv, sizeof(struct timeval));
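For completeness, the same call with its return value checked would look roughly like this (just a sketch; as shown above, my actual code does not check the result):

#include <stdio.h>

/* Sketch: same SO_RCVTIMEO setup, but with the return value checked */
if (setsockopt(sock, SOL_SOCKET, SO_RCVTIMEO, (char *)&tv, sizeof(struct timeval)) < 0) {
    perror("setsockopt(SO_RCVTIMEO)");   /* report why the option could not be set */
}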
I then receive the data using the recv() function. To verify whether the delay is okay, I used two variables, start and stop, of type time_t:
time_t start = clock();
BytesRcvd = recv(sock, CacheBuffer1, sizeof(CacheBuffer1), FLAG);
time_t stop = clock();
time_t difference = difftime(stop, start);
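For comparison, a wall-clock measurement around the same call would look roughly like this (a sketch, assuming a POSIX system where clock_gettime() is available; sock, CacheBuffer1 and FLAG are as in my code above):

#include <time.h>

/* Sketch: measure elapsed wall-clock time around recv() with a monotonic clock */
struct timespec t0, t1;
clock_gettime(CLOCK_MONOTONIC, &t0);
BytesRcvd = recv(sock, CacheBuffer1, sizeof(CacheBuffer1), FLAG);
clock_gettime(CLOCK_MONOTONIC, &t1);
double elapsed = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;  /* seconds */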
As per the definitions, what I expect is that recv() waits for a maximum of 10 seconds until data is received via the socket. From the server side I didn't send anything. Upon calculating the difference, the value I obtained is 10, but I didn't feel a 10-second delay in reception; it was only in the range of milliseconds, so I assume it took only about 10 milliseconds.
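For reference, this is roughly how I plan to detect the timeout case (a sketch; it assumes recv() reports a timeout by returning -1 with errno set to EAGAIN or EWOULDBLOCK):

#include <errno.h>
#include <stdio.h>

/* Sketch: distinguish "timed out" from other recv() errors */
BytesRcvd = recv(sock, CacheBuffer1, sizeof(CacheBuffer1), FLAG);
if (BytesRcvd < 0) {
    if (errno == EAGAIN || errno == EWOULDBLOCK)
        printf("recv() timed out, no data within tv\n");
    else
        perror("recv");
}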
What might be the issue? Any thoughts?
[update from comment]
My socket is non-blocking; that's why I used the setsockopt() function. I want to wait for a timeout of 10 seconds, i.e. if no data is received within 10 seconds, I have to exit from the recv() function...
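To illustrate the intent, here is the behavior I want, written as a select()-based sketch (assuming a POSIX non-blocking socket; this is only to show what I'm trying to achieve, not what my code currently does):

#include <stdio.h>
#include <sys/select.h>
#include <sys/socket.h>

/* Sketch: wait up to 10 seconds for the socket to become readable, then recv() */
fd_set readfds;
struct timeval timeout;

FD_ZERO(&readfds);
FD_SET(sock, &readfds);
timeout.tv_sec  = 10;
timeout.tv_usec = 0;

int ready = select(sock + 1, &readfds, NULL, NULL, &timeout);
if (ready > 0)
    BytesRcvd = recv(sock, CacheBuffer1, sizeof(CacheBuffer1), FLAG);  /* data available */
else if (ready == 0)
    printf("no data within 10 seconds\n");                             /* timeout elapsed */
else
    perror("select");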