Jazajay
Active Member
Right, I installed Google Analytics the other day after reading a post from Emma. I was averaging 1.5-second page downloads on an unprimed cache, and under a second on some pages with a primed cache.
Now I checked out their page performance suggestions, and I mean, OMG ~
# The first one is to reduce my 2 CSS sheets into 1.
ONE CSS SHEET FOR THE ENTIRE SITE? That's insane, impractical, and would actually slow it down.
# Add compression to the following file: http://www.google-analytics.com/ga.js. That's not even mine, it's theirs!
# Reduce DNS lookups from the following domains: http://www.google-analytics.com/ga.js
WHAT?! How can I do that if I want to use their program? That makes no sense.
Now don't get me wrong, reducing DNS lookups does make sense, but I can't reduce that one. Reducing CSS sheets also makes sense if you have 3 or 5 per page, but 2? Combining 2 into 1 will actually slow it down, not speed it up: you end up with one longer CSS file, most of which won't be needed on most pages, so it takes longer to download. And if you create a separate CSS file for each page, then each one has to be downloaded individually. The best approach is to get them to cache long term: put all your main rules into one compressed file, then add separate files for the changes to certain areas. That way you have a maximum of say 5 or 6 even for big sites, 2 for small sites. These then cache, and as they are smaller they take milliseconds to download.
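The compression half of that strategy is easy to sanity-check yourself. As a rough illustration (the stylesheet below is made up, just repetitive text standing in for a real CSS file), gzip typically shrinks CSS to a fraction of its original size, which is why a small number of compressed, long-cached files beats either extreme:

```python
import gzip

# Hypothetical stylesheet content standing in for a site's main CSS file;
# real CSS is similarly repetitive (selectors, property names, units).
css = ("body{margin:0;font-family:Arial,sans-serif;}"
       "a{color:#0066cc;text-decoration:none;}") * 50

raw = css.encode("utf-8")
compressed = gzip.compress(raw)

# Repetitive text compresses very well, so the gzipped payload that
# actually goes over the wire is far smaller than the raw file.
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

The exact ratio depends on the file, but the point stands: serve the CSS compressed and with far-future cache headers, and after the first visit it costs almost nothing.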
Right, now, as they themselves say, I have the site running 84% faster than the average website. Yeah, no crap. However, just by adding their tracking they have slowed it down by 0.1 of a second on average, so it's not helping.
So I can't wait to finish my own tracking and add it to this site. It may be overkill, but at least I will be following their own advice by deleting their uncompressed JS file and removing a DNS lookup to their site.
Right, now on to analytics.
Just going by the 6 terms they have given me and their "apparent positions", I'm doing a pretty bad job at SEO. Err... wrong! Why? Because I checked them all last night, and a few again this morning, and I know they are on page 1; I manually checked with web history off. If I was relying on that for SEO metrics and didn't bother checking, well, I wouldn't be doing myself any favours.
That said, out of the 14 terms I know are on page 1, going by Google Analytics I only have 1 term in the top 10. Err... wrong! And that one term I didn't even know about, mainly because I did my keyword research very thoroughly beforehand and know pretty much all the terms I should be hitting, 22 in total, working on the other 8 at the mo. However, their term isn't even relevant to my client's site, and if I'm actually getting traffic from it, jeez, that's going to seriously bugger up the metrics and the bounce rate. But hey, if Google says it's relevant it's got to be right, right?
Sigh. Yet "Google's amazing"? I still have no idea why people say that.
Right, rant over, and now my question:
Can anyone point out any good features, or bad ones for that matter, of using either? I'm struggling to find many good ones that I can't do better myself, though I'm finding loads of crap ones. :down:
Jaz