Well, I can only speculate, since I have yet to code an MMORPG, so here is what I think, based on a few assumptions. Let's take a style like RuneScape, for example, where you click a destination and your character moves along a plotted course to it.
Assume TCP is being used, and that headers and packet identification come to 45 bytes per packet.
Assume each movement will use an average of 5 waypoints.
Assume there are 25 players in a single area that all see each other.
When the player wants to move, they click a destination. The client sends that destination to the server, which plots a course and replies with 5 sets of X/Y coordinates. Storing each coordinate as an 8-byte double makes that 80 bytes, plus 2 bytes for the character's ID, for 82 bytes per moving player.
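To make that reply format concrete, here is a minimal sketch in Python of how such a course might be packed. The 2-byte unsigned-short ID, the network byte order, and the pack_course name are my own assumptions for illustration, not anything an actual game necessarily does:

    import struct

    # Hypothetical wire layout for one plotted course:
    # a 2-byte character ID followed by 5 waypoints of two 8-byte doubles each.
    def pack_course(char_id, waypoints):
        payload = struct.pack("!H", char_id)        # 2 bytes: character ID
        for x, y in waypoints:
            payload += struct.pack("!dd", x, y)     # 16 bytes per waypoint
        return payload

    course = pack_course(7, [(1.0, 2.0), (3.5, 2.0), (6.0, 4.5), (8.0, 4.5), (9.0, 7.0)])
    print(len(course))  # 82 bytes: 2 + 5 * 16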
Every update period (let's assume 1 second) the server gathers the data of the players who have requested to move since the last period and puts it all into a single packet sent to every player in the area. If half of the players (12) have moved, each packet is 12 × 82 + 45 = 1029 bytes. Sending it to all 25 players who can see each other costs about 25 KB; if all 25 players have moved, it costs about 51 KB. Of course this isn't an entirely sensible example, since you wouldn't expect a 5-waypoint course to be completed within a single second, but you get the idea. Let's just pretend the players can't decide where to go and pick new destinations every second.
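Continuing the sketch above, the batching and its cost might look like this; build_update_packet, broadcast_cost, and the constants are again purely illustrative:

    HEADER_BYTES = 45                 # assumed TCP/IP header + packet identification
    BYTES_PER_COURSE = 2 + 5 * 16     # 2-byte ID + 5 waypoints of two 8-byte doubles

    def build_update_packet(courses):
        # One batched payload: every moved player's course back to back.
        # The 45-byte header is counted separately, so only the payload is joined here.
        return b"".join(courses)

    def broadcast_cost(players_moved, players_in_area):
        # The same packet goes to every player in the area.
        packet_size = players_moved * BYTES_PER_COURSE + HEADER_BYTES
        return packet_size * players_in_area

    print(broadcast_cost(12, 25))  # 25725 bytes, roughly 25 KB
    print(broadcast_cost(25, 25))  # 52375 bytes, roughly 51 KB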
Here is that rather crude equation:
CostInBytes = ((Waypoints * 16 + 2) * PlayersMoved + 45) * PlayersInArea
KBps = CostInBytes / (UpdatePeriodInSeconds * 1024)
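Plugging the example numbers in (this is just the equation above as a function, with the update period assumed to be in seconds):

    def kbps(waypoints, players_moved, players_in_area, update_period_s=1.0):
        cost_in_bytes = ((waypoints * 16 + 2) * players_moved + 45) * players_in_area
        # One such broadcast happens every update_period_s seconds.
        return cost_in_bytes / update_period_s / 1024

    print(round(kbps(5, 12, 25), 1))  # ~25.1 KB/s
    print(round(kbps(5, 25, 25), 1))  # ~51.1 KB/s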
As I said before, this is just speculation, so take these values with a pinch of salt (or an entire salt shaker, if you forget to screw the lid on properly).