

Re: [LUG] And you wonder why the tubes are blocked ...

 

Gordon Henderson wrote:
One of my clients - a web design company - has taken on two existing 
sites in recent months. These sites have a sort of interactive bit 
to fetch some time-specific data off the website - and it turns out 
that interactive bit sits in your browser and runs some asynchronous 
JavaScript to poke a query back to the server... once a second.
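The in-browser pattern described above can be sketched like this; the URL, interval, and callback here are illustrative assumptions, not the sites' actual code:

```javascript
// Sketch of the client-side probe: a timer that re-queries the server
// on a fixed interval. fetchFn, intervalMs, and onData are hypothetical
// names standing in for whatever the real pages use.
function startPolling(fetchFn, intervalMs, onData) {
  const timer = setInterval(async () => {
    try {
      onData(await fetchFn());   // hand the fresh text to the page
    } catch (e) {
      // ignore transient errors; the next tick retries anyway
    }
  }, intervalMs);
  return () => clearInterval(timer);  // call this to stop polling
}
```

In a page this might be wired up as `startPolling(() => fetch('/latest.txt').then(r => r.text()), 1000, t => el.textContent = t)` - one request per second per open browser tab, which is exactly how 80 idle visitors turn into a constant load.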
It came to my attention some months back as the server was being 
beaten up a bit, and I tracked it down to the actual MySQL query, 
which was none too efficient; with 80 or so remote people having this 
thing sit in their browser, it was having a significant effect on an 
already busy (shared) server... so they changed it to do the 
query just once a minute and stuff the value in a file (it's returning 
literally 3 or 4 words of text), which the 1-second remote probes 
accessed. Much happiness.
(The data is pre-loaded into the database and is time specific, but 
only changes every 2-3 minutes)
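The fix described above - run the expensive query at most once a minute and let every probe in between read the cached value - can be sketched as a small time-to-live cache. The query function and the 60-second TTL are assumptions; the real site apparently writes to a file, but the principle is the same:

```javascript
// Sketch of the server-side fix: hit the database only when the cached
// value is older than ttlMs; every other probe gets the cached copy.
// queryFn and ttlMs are hypothetical stand-ins for the site's real code.
function makeCache(queryFn, ttlMs, now = Date.now) {
  let cached = null;
  let fetchedAt = -Infinity;
  return function get() {
    if (now() - fetchedAt >= ttlMs) {
      cached = queryFn();   // the expensive query runs only when stale
      fetchedAt = now();
    }
    return cached;          // cheap for the 1-second probes
  };
}
```

With a 60-second TTL, 80 clients probing once a second produce roughly 4,800 reads per minute but only one database query - the probes still arrive, but the inefficient query no longer runs per request.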
However, what I didn't realise was that these 1-second probes were 
still happening - I thought they'd have done the sensible thing and 
reduced the frequency of these probes. It seems not, and I didn't spot 
it as that server is quite a busy one, so they got lost in the 
noise of thousands of other HTTP requests going on...
Recently the client took on a third site of the same ilk, and this one 
is on a brand new server - it seems to have gone live yesterday, 
because I noticed the traffic on it go from zero to lots - then did 
some investigating, as you do... and found this third site doing 
exactly the same thing. Then I groaned as I realised those 1-second 
probes were still happening on the other server too (which is hosting 
two of these sites).
I calculated/extrapolated that at the peak time, the server would 
exchange over 60GB of data a month just servicing these requests if 
they were 24/7. Obviously they die down overnight, but not by much.
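The 60GB figure checks out on the back of an envelope. The ~300 bytes per request/response exchange is my assumption (HTTP headers dwarf a 3-or-4-word payload); the 80 clients polling once a second are from the post:

```javascript
// Back-of-envelope check of the monthly traffic, assuming ~300 bytes
// of headers-plus-payload per exchange and round-the-clock polling.
const clients = 80;
const probesPerSecond = 1;     // one probe per client per second
const bytesPerExchange = 300;  // assumed: HTTP overhead + tiny payload
const seconds = 60 * 60 * 24 * 30;  // a 30-day month
const gb = clients * probesPerSecond * bytesPerExchange * seconds / 1e9;
console.log(gb.toFixed(1) + ' GB/month');  // prints "62.2 GB/month"
```

So "over 60GB a month" is entirely plausible from headers alone, before the actual data is even counted.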
Now I know 60GB isn't a lot by today's standards, but if this is 
typical of what today's modern programmer is doing to the interwebs 
then there's no hope for us. Programs running in the background of the 
browser all the time, constantly exchanging data via a web server, 
aren't really helping anything at all.
But what if this is being run on someone's mobile device and they're 
paying for data? It's unlikely this particular site would be used by 
someone on a mobile, but with the proliferation of 3G dongles, you 
never know...
Is it me, or are we doomed to this lazy programmer approach? (Or is 
that "just the way it is" with this new-fangled web2oreah?)
Gordon

This is the MS effect. "Computers are easy!"
There's been some talk about the NHS IT failures, and it's down to the same thing: "Computers are easy", so we can pay for young programmers straight out of college to do the work cheaply. They know nothing - it takes a good 10 years to become a programmer, simply because you have to learn to do everything that is required, and that's not simple, not easy, and getting decent results takes experience.
The problem Gordon described is a simple one to solve:
check for a dirty bit; don't run the whole stored procedure (and if it's not a stored procedure, ask why!) - just find out whether the data has been updated, and return an empty recordset if it hasn't. Seriously easy, even in the worst AJAX I've seen. It's not web2oreah - that is just a feature of MSoreah poisoning another part of the web.
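Tom's dirty-bit idea can be sketched as a versioned store: the probe sends the version it already has, and the server replies with nothing at all unless the data has actually changed. All the names here are hypothetical illustrations, not code from the sites in question:

```javascript
// Sketch of the dirty-bit approach: bump a version number only when the
// data really changes, and let probes ask "anything newer than v?".
// If not, reply with null - the "empty recordset" - and do no real work.
function makeVersionedStore(initialValue) {
  let value = initialValue;
  let version = 1;
  return {
    update(newValue) {            // called when fresh data is loaded
      if (newValue !== value) { value = newValue; version++; }
    },
    getIfNewer(clientVersion) {
      if (clientVersion >= version) return null;  // nothing new: empty reply
      return { value, version };  // client remembers this version
    },
  };
}
```

Since the data only changes every 2-3 minutes, the overwhelming majority of the 1-second probes would get the empty reply, and the query (stored procedure or otherwise) would run only when something has actually changed.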
Tom te tom te tom

--
The Mailing List for the Devon & Cornwall LUG
http://mailman.dclug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/linux_adm/list-faq.html