The server constantly sends data to the client at intervals ranging from 1 second to 2 minutes. Data arrives, but after about 15-20 minutes the client stops receiving messages. At that point rv is equal to 1, n != 0, and the server is still sending data (I checked).
while (1) {
    // The fd_set and the timeout must be re-initialized on every
    // iteration: select() modifies both on Linux.
    FD_ZERO(&set);
    FD_SET(sockfd, &set);
    timeout.tv_sec = 30;
    timeout.tv_usec = 0;

    rv = select(sockfd + 1, &set, NULL, NULL, &timeout);
    std::cout << "rv Read: " << rv << std::endl;

    if (rv == -1) {
        printf("ERROR select\n");
        restartSocket();
    }
    else if (rv == 0) {
        printf("READ timeout\n");
    }
    else {
        bzero(buffer, 256);
        printf("Start read\n");
        // Read at most 255 bytes so the buffer stays null-terminated
        // for the printf("%s") below.
        int n = read(sockfd, buffer, 255);
        if (n < 0) {
            printf("ERROR reading from socket\n");
            restartSocket();
        }
        // Note: n == 0 (the peer closed the connection) is not handled here.
        printf("read: %s\n", buffer);
    }
}
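Not part of the original post: the loop above treats only n < 0 as a failure. A minimal sketch of the read branch that also treats read() returning 0 as the peer having closed the connection, reusing sockfd, buffer and restartSocket() from the snippet above, might look like this:

    // Sketch only, assuming the same sockfd, buffer and restartSocket() as above.
    int n = read(sockfd, buffer, 255);
    if (n < 0) {
        // Real error; errno says why.
        perror("ERROR reading from socket");
        restartSocket();
    }
    else if (n == 0) {
        // read() returning 0 means the peer performed an orderly shutdown:
        // the connection is gone and select() would keep reporting the socket
        // as readable, so reconnect instead of spinning.
        printf("Server closed the connection\n");
        restartSocket();
    }
    else {
        buffer[n] = '\0';   // make sure the data is printable as a C string
        printf("read: %s\n", buffer);
    }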
After another 10-15 minutes, read() returns -1, restartSocket() re-establishes the connection, and everything works normally again for the same 15-20 minutes; then the whole cycle repeats.
How can I make the client always receive data from the server? Or how can I make read() or select() recognize that the connection to the server has actually been lost and trigger a socket reset? (But reconnecting every 5-10 minutes feels wrong to me.)
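For context (not from the original post), one way to let the kernel notice a dead peer on an otherwise idle TCP connection is to enable keepalive probes on the socket. The option names below are the Linux ones, the helper name enableKeepalive is hypothetical, and the timing values are only illustrative:

    #include <netinet/in.h>
    #include <netinet/tcp.h>
    #include <sys/socket.h>

    // Sketch only: enable TCP keepalive so a dead peer is eventually reported
    // as an error by read()/select(). The timing values are illustrative.
    static void enableKeepalive(int fd)
    {
        int enable = 1;    // turn keepalive on
        int idle = 60;     // seconds of inactivity before the first probe
        int interval = 10; // seconds between probes
        int count = 3;     // failed probes before the connection is declared dead

        setsockopt(fd, SOL_SOCKET, SO_KEEPALIVE, &enable, sizeof(enable));
        setsockopt(fd, IPPROTO_TCP, TCP_KEEPIDLE, &idle, sizeof(idle));
        setsockopt(fd, IPPROTO_TCP, TCP_KEEPINTVL, &interval, sizeof(interval));
        setsockopt(fd, IPPROTO_TCP, TCP_KEEPCNT, &count, sizeof(count));
    }

Calling something like enableKeepalive(sockfd) right after the connection is (re)established means that, once the probes fail, select() reports the socket as readable and read() returns -1 (typically with errno set to ETIMEDOUT), so the existing restartSocket() path runs without waiting for application data.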