Google Analytics and Webmaster Tools ~ how crap?!

Jazajay

Active Member
Right, I installed Google Analytics the other day after reading a post from Emma. Now, I was getting on average 1.5 second page downloads on an unprimed cache, and less than a second on some pages thereafter (primed cache), right.

Now I check out their suggestions on page performance, and I mean OMG ~ :mad:

# The first one is to reduce my 2 CSS sheets into 1.
1 CSS SHEET FOR THE ENTIRE SITE? That's insane, impractical and would actually slow the site down.

# Add compression to the following file: http://www.google-analytics.com/ga.js. That's not even my file, it's theirs!!!

# Reduce DNS lookups from the following domains ~ http://www.google-analytics.com/ga.js
WHAT?! How can I do that if I want to use their program? That makes no sense.

Now don't get me wrong, reducing DNS lookups does make sense, but I can't reduce that one. Reducing CSS sheets also makes sense if you have 3 or 5 per page, but 2? Combining 2 into 1 will actually slow it down, not speed it up, as you end up with a longer CSS file that isn't needed on most pages and therefore takes longer to download; and if you create a separate CSS file for each page, then each one has to be downloaded individually. The best way is to get them to cache long term: put all your main rules into one compressed sheet, then add separate sheets for the changes certain areas need. That way you have a maximum of say 5 or 6 even for big sites, 2 for small sites; these then cache, and as they are smaller they take milliseconds to download.
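To show what I'm on about (the file names here are made up purely for the sake of example), each page just pulls in the shared sheet plus one small section sheet, both served compressed and with long expiry headers so repeat visitors get them straight from cache:

<!-- main.css = all the site-wide rules, compressed and cached long term -->
<link rel="stylesheet" type="text/css" media="screen" href="/css/main.css" />
<!-- articles.css = the handful of overrides this section of the site actually needs -->
<link rel="stylesheet" type="text/css" media="screen" href="/css/articles.css" />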

Right, now even by their own figures I have the site running 84% faster than the average website ~ yeah, no crap ~ however, just by adding their tracking they have slowed it down by 0.1 of a second on average, so it's not helping.

So I can't wait to finish my own tracking and add it to this site. It may be overkill, but at least I will be following their own advice by deleting their uncompressed JS file and removing a DNS lookup to their site.

Right, now on to Analytics.
Just going on the 6 terms they have given me and their "apparent positions", I'm doing a pretty bad job at SEO. Err... wrong! Why? Because I checked them all last night, and a few this morning, and I know they are on page 1, as I manually checked with web history off. If I was following that for SEO metrics and didn't bother checking, well, I wouldn't be doing myself any favours.

That said, out of the 14 terms that I know are on page 1, going on Google Analytics I only have 1 term in the top 10 ~ err... WRONG! ~ and it's one I didn't even know about, mainly because I did my keyword research very thoroughly beforehand and know pretty much all the terms I should be hitting, 22 in total, working on the other 8 at the mo. However, their one isn't even relevant to my client's site, and if I'm actually getting traffic from it, jeez, that's going to seriously bugger up the metrics and the bounce rate. But hey, if Google say it's relevant it's got to be right, right?

SIGH. Yet Google's amazing ~ still have no idea why people say that. :mad:

Right, rant over, and now my question:
Can anyone point out any good features, or bad ones for that matter, of using either? I'm struggling to find many good ones that I can't do better myself, finding loads of crap ones though. :down:

Jaz
 
I've added it to mine and to be honest I can't say I've noticed a slowdown. Where did you stick the JS code? Mine's right at the very bottom.
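For reference, mine is just the standard snippet Google hands out, sat right before the closing </body> tag ~ roughly this (the UA number is obviously a placeholder, not a real account):

<script type="text/javascript">
// standard synchronous ga.js snippet, more or less as Google supplies it
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
try {
var pageTracker = _gat._getTracker("UA-XXXXXXX-X"); // placeholder account ID
pageTracker._trackPageview();
} catch(err) {}
</script>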
 
I'd say something more than analytics is at fault if you're noticing a slow-down…
 
Yeah, it's at the bottom; hate to think what it would be like if I stuck it at the top.
No, I'm only noticing a 0.1 of a second slowdown, something I wouldn't normally even bother with, but check out the following file ~
http://www.google-analytics.com/ga.js

I would personally expect more of a slowdown ~ that's a lot of code that has to execute, and seeing as it's not even compressed, well, they are not following their own advice.

But the slowdown's not really my point. It's more that, firstly, their advice is awful and would actually slow the site down a bit, and secondly, it's advice I can't act on if I use their program.

It's like handing a child some cigarettes and then grounding them for having cigarettes on them. They suggest I cut out a DNS lookup to speed up the site, but I have to keep that DNS lookup in, because it's the lookup for their own script ~ and then they tell me that's what's slowing the site down.
Now how can I take it out if I want to use it?

I can't, unless I stop using their webmaster tools ~ it's pointless advice.
 
Here is their exact advice ~
Details: Save up to 13.6 KB, 1 requests, 1 DNS lookups
Enable gzip compression
Compressing the following resources with gzip could reduce their transfer size by 13.6 KB:
http://www.google-analytics.com/ga.js (13.6 KB)

Minimize DNS lookups
The domains of the following URLs only serve one resource each. If possible, avoid the extra DNS lookups by serving these resources from existing domains:
http://www.google-analytics.com/ga.js
Cite: Google's Webmaster Tools.

That extra 13.6 KB will slow the site down, as it is another 13.6 KB that has to be downloaded; if THEY added compression to it, which is what their own advice is telling me to do, then it really wouldn't.

In fact that 1 JS script is larger than most of the site's HTML combined. :mad:
 
Sorry to wade in late on this one, but I have to agree with you, Jay.

I have also been *advised* by Google on how to improve my loading speed, however in 90% of cases the suggested solution cannot be done, as it's Google's own scripts that need to be compressed!

Now, I'm fully aware, as I'm sure you are, that site loading time/speed is going to be the big buzzword for SEO over the next few months, starting from next year. But it's going to be a bit rich if Google penalise your site on loading times when they themselves don't provide compressed scripts.

:angry:
 
They probably have compressed scripts but only use them for their own sites :(
 
Hey peeps, right, I've done some research into it, and Google does compress it when it gets loaded by the browser; however it shows up as uncompressed because their bot gets the uncompressed version, and they will be changing this in the new year. So it is actually compressed when the user gets it ~ down to something like 6 KB from what I can remember in testing. So it is currently an issue with Analytics that will be fixed soon, according to a Google spokesman.

The DNS lookup though is still crap advice. :)

@sunburn,
I know, but how much grief do I get for my minimalist approach to coding? Less code = less to download = faster page load = (as of 2010) better rankings. But hey, I'm a nut. :D
 
Well, one of the plugins I'm currently using extensively is the Google Maps one, and this weighs in at a shocking 440 KB, which really makes me feel ill; however it does provide the solution the END USERS want. So as long as Google can waive any penalty for using their own scripts, I'll be happy come the new year.

As for general site speed vs. code, I'm very much in favour of optimising web code for faster processing and bandwidth savings, as long as it improves the end-user experience.

For me this is an instant mash scenario... yes, instant mash is quicker and easier to make, hell, even a lot less messy, but does it taste as good as normal mash? NO!
 
I can get a 700 KB page to load in between 3-5 seconds on average, so what I think will happen is that a new job title will open up: site performance specialist. Getting such a huge page to load that fast takes a geek who understands saving HTTP headers, how browsers work internally, and how to squeeze every last KB out of what the web server sends.

Some techniques only give you a reduction of a few KB ~ put code minimisation in there ~ but add them all together and you get huge performance benefits. Then if you benchmark your server scripts you can work out where to optimise server code and squeeze even more out, and of course there's image optimisation: the right file formats, PNGs taken down to 24 or 8 bit and exported properly.

400 KB though, TBH, I could still get to load in a few seconds.

But yeah, with everything, in my book it goes: end user, best practice and semantics, then the search engines. At the end of the day, sites that load quickly have a better chance of improving their bounce rate, the only true metric on any site. :)
 
Why don't you compress and host the ga.js file yourself?

I've noticed a real problem with slowdown on page loads, because jQuery won't initialise until all the JavaScript is loaded, and the ga.js file does take ages sometimes.

Maybe this is a mistake on my part, I don't know.
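One thing I might try is the asynchronous snippet I believe Google have just started offering ~ it queues the tracking calls in _gaq and loads ga.js without holding up jQuery or anything else (again, the UA number below is just a placeholder):

<script type="text/javascript">
// async ga.js snippet: commands queue in _gaq until the script arrives,
// so the download doesn't block the rest of the page's JavaScript
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-X']); // placeholder account ID
_gaq.push(['_trackPageview']);
(function() {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();
</script>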
 
Well, it is compressed; it's just that their checker doesn't see that it is, and thus flags it up as an issue when there isn't one. They are getting round to fixing it so it doesn't show as an issue, but knowing how fast Google move on any problem that only annoys web devs, it may be a while.

On top of that, them flagging up 1 DNS lookup as an issue is again laughable ~ 2 or 3, yes, fine, but 1? That's going to be more of a benefit than a problem.
That's because a separate hostname means browsers can download an extra few HTTP responses in parallel, so it is actually a benefit having it on their servers. And besides, why should I add more strain to my own server just to use their tool? Which I have given up on anyway, as the geo-location still doesn't register that I've visited the site from Leicester, even when I have my proxy turned off.

Webmaster Central, or whatever it's called, I only use to check the page speed Google has registered for the average page on the site.

Otherwise, TBH, it's pointless ~ my custom solutions track SEO stats a hell of a lot better, more precisely and in more detail. I mean, it still says the site's top position is 13 for a term it ranks both 1st and 2nd for (I don't use web history), and it also shows a position of 8th for one that's been verified by several users. Honestly, don't rely on it giving you the right positions ~ manually check them.
 