How fast can information be sent through socket connections?
October 27, 2005 1:59 AM

LatencyFilter: I'm making a multiplayer flash game, where players communicate with each other via a socket connection to a server. How fast can information be sent like this?

The game is going to be kind of like a variant of snake, where each player controls a snake or worm, and a player dies if he crashes his snake into someone else.

For this to work each player has to send the current coordinates of the head of his snake to the server, and the server then has to send it out to every other player in the game. I need this to happen as often as possible.

Currently, the game looks acceptably good at 10 frames per second. As I understand it, at that frame rate the time between one player sending his coordinates and another player receiving them can't be any more than 100 ms. Would that be achievable, as long as all players are connected to the server through an always-open socket connection?

How much would it matter that the server is able to process the information quickly?

The information I need to send has to have three digits for both the x and y coordinates. If x = 400 and y = 500, I imagine what would be sent would be a string like "400500" followed by a zero byte.

Also, I haven't started creating the server-side program that is going to deal with this. Does it matter what programming language I use for it?
posted by cheerleaders_to_your_funeral to Computers & Internet (5 answers total)
Lots of questions! I did something like this with an interactive art work.

1) Don't get bandwidth and latency confused. Your data needs look negligible for broadband bandwidth, a few players, and 10fps (format it "1,400,500;2,532,9;..." i.e. player_number,x,y; or in XML - the bandwidth will take it).
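For what it's worth, a rough sketch of that player_number,x,y; wire format in Python (the helper names are my own invention, not from any library):

```python
def encode_positions(positions):
    """positions: dict of player_number -> (x, y). Emits "1,400,500;2,532,9;"."""
    return "".join(f"{p},{x},{y};" for p, (x, y) in sorted(positions.items()))

def decode_positions(message):
    """Parse the same format back into a dict of player_number -> (x, y)."""
    positions = {}
    for record in message.split(";"):
        if not record:  # skip the empty string after the trailing ";"
            continue
        p, x, y = record.split(",")
        positions[int(p)] = (int(x), int(y))
    return positions
```

At roughly a dozen bytes per player per update, even ten players at 10fps is only around a kilobyte per second.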

The key issue is the time it takes for a message to reach the server, the time it takes the server to do something with it, and the time it takes for messages from the server to be returned. This time is known as latency, and it depends on how many hops sit between the client and the server. You can measure it with a ping command, and above 100ms is not unusual over the internet. It's completely unrelated to frame rate.
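A minimal way to measure that round trip yourself, assuming you run a server that simply echoes bytes back (the host and port here are placeholders):

```python
import socket
import time

def measure_rtt(host, port, payload=b"ping\0", samples=5):
    """Average round-trip time in milliseconds to an echo server."""
    rtts = []
    with socket.create_connection((host, port)) as sock:
        for _ in range(samples):
            start = time.perf_counter()
            sock.sendall(payload)
            sock.recv(len(payload))  # block until the echo comes back
            rtts.append((time.perf_counter() - start) * 1000.0)
    return sum(rtts) / len(rtts)
```

Run it against your actual game server's location, not localhost, or the number will be meaninglessly small.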

2) Also an issue you haven't mentioned is synchronisation. If for instance, something special happens when one snake bumps into another, the server has to be in charge of deciding - things happening at slightly different times on clients will mean that each client might calculate time-dependent things differently.

3) Programming language - I suggest python or ruby for speed, power and coolness.
posted by cogat at 2:47 AM on October 27, 2005

You need a packet every 100ms? It'll work if they're on a LAN. It just might work in North America. It won't work around the world.

Don't make your game graphics update synchronously with the network. Draw things at 10fps, and only update world state when you get something from the network. One easy way to do this is have the server send out starting positions, then only send out updates when a player presses a key. Otherwise each client runs a local simulation of what the server state should be. This is how all modern network games work.
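A minimal sketch of that idea - the client steps its local simulation every frame, and only an event (a local key press or a server update) ever changes a snake's direction. All the names here are illustrative:

```python
DIRECTIONS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

class LocalSnake:
    """Client-side simulation of one snake; the server remains authoritative."""

    def __init__(self, x, y, direction="right"):
        self.x, self.y = x, y
        self.direction = direction

    def tick(self):
        # Advance one cell per frame in the current direction.
        dx, dy = DIRECTIONS[self.direction]
        self.x += dx
        self.y += dy

    def apply_event(self, new_direction):
        # Only called when a key press or server message arrives -
        # nothing is sent or received on ordinary frames.
        self.direction = new_direction
```

Between events, every client (and the server) computes identical positions from the same rules, so there's nothing to transmit.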

It doesn't much matter what you implement your server in unless you need one server to support thousands of clients. I'd use Python up until the moment it's too slow.
posted by Nelson at 3:34 AM on October 27, 2005

I've done some work using java and flash that used the same method that Nelson mentioned -- pushing actions between the two clients so that the respective states are updated when the data is available. That way, latency is less of an issue.

I used a great tutorial that helped me make a multi-player game of pong that utilized this method. Unfortunately, I am having trouble googling the page right now. If I get home before someone figures it out for me, I'll post a link here.
posted by sleslie at 7:39 AM on October 27, 2005

have you thought about using unity?
posted by judith at 9:25 AM on October 27, 2005

Best answer: Nelson has it. You want to maintain the "master state" in the server. This is the definitive source that determines everything: whether or not a player has collided, whether he turned away in time or not, etc. Make all such decisions on the server, none on the client.

This server state is updated from messages received from the clients. They should be programmed to only send updates when the user presses a button or otherwise changes state. Having each client report "I'm at 1!" "I'm at 2!" "I'm at 3!" constantly is completely superfluous if the rules of the simulation mandate that that's what happens anyway barring any change of state.

So the server will keep state, updated by the rules of the game and client updates. It will periodically send out updates to all connected clients as well. So when user A changes direction or collides with user C, user B is notified. These updates should all have their own timestamp - don't rely on the network. In other words, on the server you have a master clock, and every event has a timestamp, and that timestamp is included in the state message sent out to the clients. Don't just say "player A turned left", say "player A turned left at t=5.221s".
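A tiny sketch of what those timestamped server messages might look like, assuming JSON over the socket (the field names are made up):

```python
import json
import time

class MasterClock:
    """Server-side master clock; every event is stamped with this time."""

    def __init__(self):
        self.start = time.monotonic()

    def now(self):
        return time.monotonic() - self.start

def make_event(clock, player, action):
    # Produces e.g. {"t": 5.221, "player": "A", "action": "turn_left"}
    return json.dumps({"t": round(clock.now(), 3),
                       "player": player,
                       "action": action})
```

The point is that "when" travels inside the message itself, so clients never have to infer event times from packet arrival times.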

On the client end you essentially run a parallel simulation. The client knows the rules of the game, i.e. each worm continues going in the direction it was previously going, unless there is a state change. So in the client you just keep a local version of the game state. However - and this is the important part - the client state is ALWAYS superseded by the server state. The job of the client is to update its local version of the state to match what the server says it is. And you use the timestamps included in the messages, not the times that the network actually delivered the packets.

In some cases this can even mean things seeming to happen in reverse - for example, player B was moving to the left in the local simulation, but he had actually turned right a short time ago, so when you receive the state update from the server your local version has player B in the wrong position. So you back up your local clock to the timestamp indicated in the server message, apply the state change, and then step your local clock forward (applying the rules of the simulation) until you catch back up with real time.
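That rewind-and-replay step can be sketched like this, assuming a fixed 10fps simulation tick and some way of snapshotting past state (all the names here are hypothetical):

```python
TICK = 0.1  # assumed fixed simulation step: 10 fps

def replay(state_at, event_time, apply_event, step, local_time):
    """Correct the local state after a late-arriving server event.

    state_at(t)    -> snapshot of local state at time t (from a history buffer)
    apply_event(s) -> state after applying the authoritative change
    step(s)        -> state advanced by one simulation tick
    Returns the corrected state at local_time.
    """
    state = state_at(event_time)   # rewind to the event's timestamp
    state = apply_event(state)     # apply the server's change there
    t = event_time
    while t < local_time:          # re-simulate forward to the present
        state = step(state)
        t += TICK
    return state
```

In a real client you'd keep a short ring buffer of recent states so that state_at can rewind a few hundred milliseconds, which is all the latency you should ever need to cover.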

This is the critical point. The client's own local view of the state of the game is just an approximation. It can often be wrong for very brief periods, and it is always trumped by what the server says. The server is the final decision of everything: scoring, when two players collide, etc. The local client can go ahead and try to calculate these things, but if it does so it's only for display purposes so that it can try to draw the correct graphics - but you don't actually record a score or notify the player that they died until the server says so. The server maintains the one true gamestate.

For just a simple game of worms, this probably won't matter so much... but imagine if you were coding an FPS game, and how you would reconcile all those players running around, jumping, shooting at each other, projectiles in the air, etc. The only way this can work with the latencies of the internet is the method above, where the server maintains the true state and every client runs a local simulation of the action based on the last known state.
posted by Rhomboid at 9:09 AM on October 28, 2005
