Speed... Website speed is kind of a buzzword these days, and I decided to jump into this world to try to pull up my speed rank on this site. So I dug into Google's PageSpeed site and plugin and started making modifications to improve my site's performance. After a bit of trial and error I have finally achieved a speed rank of 96 out of 100. Yes!
Smaller Main Page
The first thing I did was reduce the size of the main page. Originally I had all my blog posts on a single page. That wasn't a huge deal, but Hyde has a paginator plugin, `hyde.ext.plugins.paginator.PaginatorPlugin`, that splits the posts up and adds `/page2`, etc., for the additional pages. This reduced the amount of HTML loaded for the main page. Commit
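Enabling the plugin looks roughly like this. The dotted plugin path is from the post itself, but the `site.yaml` layout is only my recollection of Hyde's usual plugin configuration, so treat it as a sketch:

```yaml
# site.yaml -- a sketch; the paginator path is from the post, the
# surrounding "plugins" list follows Hyde's usual configuration style
plugins:
  - hyde.ext.plugins.paginator.PaginatorPlugin
```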
Uglify | Minify
For the CSS I was using LessCSS, which lets me use variables and mixins, so I wasn't sure if I could minify the CSS. Sure enough, LessCSS has a compression argument that does just that. After figuring out how to get it to work with Hyde, I now have compressed CSS for speed. Commit
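For reference, the compression argument on the `lessc` command line is `--compress` (historically also `-x`). A hedged sketch of what the compile step looks like outside of Hyde; the file paths here are placeholders, not the ones actually used on this site:

```shell
# Compile Less to minified CSS with lessc's --compress flag
# (paths are hypothetical examples)
lessc --compress media/css/site.less > media/css/site.css
```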
Caching
Next I added caching. I set up the Cache-Control items in the HTTP headers by adding these rules:
```apache
# 480 weeks
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf|svg)$">
  Header set Cache-Control "max-age=290304000, public"
</FilesMatch>

# 2 weeks
<FilesMatch "\.(js|txt|css)$">
  Header set Cache-Control "max-age=1209600, must-revalidate"
</FilesMatch>

# 1 day
<FilesMatch "\.(xml)$">
  Header set Cache-Control "max-age=86400, must-revalidate"
</FilesMatch>

# 2 hours
<FilesMatch "\.(html|htm)$">
  Header set Cache-Control "max-age=7200, must-revalidate"
</FilesMatch>
```
This gave me caching on the primary files while allowing the items that change most often to be renewed fairly soon. Commit
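The `max-age` values above are just the durations converted to seconds, which is easy to sanity-check:

```shell
# Sanity-check the max-age values used in the Cache-Control rules
echo $((480 * 7 * 24 * 3600))   # 480 weeks -> 290304000
echo $((2 * 7 * 24 * 3600))     # 2 weeks   -> 1209600
echo $((24 * 3600))             # 1 day     -> 86400
echo $((2 * 3600))              # 2 hours   -> 7200
```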
Compression
Apache can compress the files it delivers, and almost all modern browsers support this, so I needed to add a handler to compress most of my files. Using Apache's `mod_deflate.c` I was able to add compression:
```apache
<IfModule mod_deflate.c>
  <FilesMatch "\.(js|css|html|htm)$">
    SetOutputFilter DEFLATE
  </FilesMatch>
</IfModule>
```
The only issue I have found is that my Google PageSpeed score will sometimes drop to 86/100 because it thinks I'm not using compression. It turns out my host, Bluehost, dynamically turns compression on and off based on CPU usage, so at any given time I may or may not have compression working. I'm not sure how I feel about that. Commit
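To get a feel for how much DEFLATE can save on text assets, here's a small local illustration using gzip (which is built on the same DEFLATE algorithm) against a repetitive CSS-like payload:

```shell
# Compare raw vs. compressed size of a repetitive CSS-like payload.
# CSS and HTML compress very well because they repeat so much.
raw=$(yes 'div { color: red; }' | head -n 200 | wc -c)
zipped=$(yes 'div { color: red; }' | head -n 200 | gzip -c | wc -c)
echo "raw: $raw bytes, gzipped: $zipped bytes"
```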
Character Encoding
Finally, I added character encoding to the HTTP headers:
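The exact directive isn't shown in the post; the usual way to do this in Apache is something like the following, which is my assumption rather than necessarily what was used here:

```apache
# Assumed directive: serve text responses with an explicit
# UTF-8 charset in the Content-Type header
AddDefaultCharset utf-8
```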
According to Google this helps... eh. Commit
What I'm Missing and What's Left
I'm finally left with items I cannot control. Most of this is funny, because Google is the one recommending these speed improvements, and most of the remaining suggestions are caused by Google itself. It's sad that they cannot follow their own performance rules.
First, we have a cache validator problem with my Google Fonts. Apparently Google doesn't specify one. Hehe.
Finally, it suggests removing unused CSS, which I'm not going to do, and using more efficient CSS selectors, which I'm not going to modify because of LessCSS.
So that's my site's speed bump. What do you think? Is there anything further I should do, or do you know of a way to force Bluehost to always compress?