On 11/09/12 20:52, stinga wrote:
> G'day all,
>
> I need to test a webserver but I need to slow down the data to about
> 1Mb/s. I am on FTTC so I am not seeing real-world speeds.
> Any ideas how to do this simply?

mod_bandwidth, if it is Apache, is probably the easiest option.

I would say "why bother": you can get a feel for page performance simply by comparing relative timings and the size of the data downloaded. The emulation would be flawed anyway, as you would also need to constrain uplink performance (at least in theory) to mimic real life.

Also, "why?". If the website is built well and performing efficiently, all that rate limiting will do is make the big chunks of data take longer to download. Real end users will also have different round-trip times, so you may need to throw in an arbitrary delay of several tens of milliseconds. I have not seen software to do that, but I bet the Linux IP stack can be tweaked to do it if you really want.

There are companies that specialise in these sorts of measurements from around the world; if you are big enough, and the content valuable enough, use them. Otherwise, using a typical 3G phone connection might be the easiest way to get realistic slow downloads.

--
The Mailing List for the Devon & Cornwall LUG
http://mailman.dclug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/listfaq
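For what it's worth, the "tweak the Linux IP stack" idea is exactly what the iproute2 `tc` tool's netem qdisc is for: it can inject artificial delay, and chaining a token bucket filter under it caps the rate. A minimal sketch, assuming a Linux server where eth0 (an assumption, substitute your real interface) carries the test traffic, run as root:

```shell
# Assumed interface name; check with `ip link` and substitute yours.
# Add roughly 50ms of one-way delay with the netem qdisc.
tc qdisc add dev eth0 root handle 1: netem delay 50ms

# Chain a token bucket filter beneath it to cap throughput at 1Mbit/s.
tc qdisc add dev eth0 parent 1: handle 10: tbf rate 1mbit burst 32kbit latency 400ms

# Inspect the shaping, and remove it again when testing is done.
tc qdisc show dev eth0
tc qdisc del dev eth0 root
```

Note this only shapes the outbound side on that box; as said above, a faithful emulation would also need the uplink constrained.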