hi all,
I have a server and a client application. The server application uses a
datagram socket to accept requests from clients, and it uses the same socket
to send the response back to the client. More than one client may connect to
the application. The receiving and sending processes are done by two threads,
which are not synchronised. When I run the client and server in a LAN (with a
leased-line connection), it works fine and takes only a few milliseconds for
communication. But the problem starts (case 1) when the machine on which the
server runs is connected to the Internet through a dialup connection: it takes
around one minute for the datagram packet from the client to reach the server
(the dialup connection is used for fetching pages), and vice versa. Why does
this delay happen when the server machine is connected to the Internet?
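To make the setup concrete, here is a minimal sketch of what I'm describing: one UDP socket shared by a receiving thread and a sending thread that are not synchronised with each other. The names and the "ack:" reply format are just illustrative, not my actual code:

```python
import socket
import threading
import queue

def run_server(host="127.0.0.1", port=0):
    """Start a UDP server that receives and sends on the same socket,
    using two separate, unsynchronised threads. Returns the bound port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    outgoing = queue.Queue()  # requests handed from receiver to sender

    def receiver():
        while True:
            data, addr = sock.recvfrom(1024)
            if data == b"stop":          # illustrative shutdown message
                outgoing.put(None)
                break
            outgoing.put((b"ack:" + data, addr))

    def sender():
        while True:
            item = outgoing.get()
            if item is None:
                break
            payload, addr = item
            sock.sendto(payload, addr)   # same socket used for replies

    threading.Thread(target=receiver, daemon=True).start()
    threading.Thread(target=sender, daemon=True).start()
    return sock.getsockname()[1]
```

A client would then simply `sendto` a datagram to that port and `recvfrom` the reply on its own socket.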

Case 2: Even if I'm using the LAN for the communication between client and
server, with the server connected to the Internet using a dialup connection
(but not used for fetching pages), the problem persists.

What is the reason for this?