Chances are your website has some reference-type posts that are refreshed every year. In personal finance, there are plenty: IRA contribution limits, income tax brackets, and so on. One great strategy that can help you accumulate more links is to make the post URLs dateless and shuffle the content around each year.

For example, let’s say you have a post on 2009 IRA contribution limits with the following URL:
http://www.yourblog.com/2009-IRA-contribution-limits.html

In 2010, I recommend that you use this URL for the 2010 version of the post:
http://www.yourblog.com/IRA-contribution-limits.html

When 2011 rolls around, create a new page:
http://www.yourblog.com/2010-IRA-contribution-limits.html

Move all of the IRA-contribution-limits.html content into the new 2010-IRA-contribution-limits.html page and update IRA-contribution-limits.html with the 2011 information. The titles of the pages themselves should retain the year; that helps the reader know exactly which year the information applies to.

This lets the dateless URL accumulate links, strengthening its link profile, while keeping the data fresh and accurate. It also gives people a way to find the archived reference information if they need it. This won’t pay huge dividends at first, but after a few iterations you will see good results. At the very least, your site won’t be competing with itself.

I’ve been writing a lot lately about optimization, be it improving click through rate or site responsiveness, because it’s something I’ve been focused on the last few weeks. Getting more traffic is harder than optimizing your site so that it offers the best information to the traffic you’re already getting. Boosting your conversion rate by making your content more appealing is a lot easier than trying to get more eyeballs!

So in some of my posts I’ve been sharing what has worked well for me, and there may be a temptation to blindly implement those changes on your site. I advise against it. In fact, I advise against taking results from anywhere and blindly implementing them without your own data, because you don’t know how that change will perform on your site.

Let’s take a prime example: Amazon.com. It’s a well-known fact that Amazon.com tests almost every aspect of its site. With their traffic and their optimization brains, they are probably very close to “perfect” when it comes to the buying experience. The start of that buying experience is the product page, and the pivotal action on that page is clicking the Add to Shopping Cart button.

Amazon Orange

The orange “Add to Shopping Cart” button with the navy blue text is the highest converting button for their layout and that’s after years of optimization and study. Does that mean you should apply it on your site blindly?

Nope. To test this theory, I tried a few buttons on Bargaineering, including their orange button, and my results were different. I had four options: one text link and three buttons. Each button had the same text and sat in the same place, and I simply calculated CTR (I ran this study long ago, so I don’t have more data than this to offer):

  • Text – 1.71%
  • Green Button – 6.44%
  • Blue Button – 5.08%
  • Orange Button – 3.38%

The Amazon orange button performed better than the text link but worse than both the green and blue buttons. If I had to guess, I’d say the predominantly blue layout of Bargaineering made the green and blue buttons more appealing. In a head-to-head test, green outperformed blue.

The one color I didn’t test was red, because the general consensus was that red is a “stop” color: it catches the eye, but it implicitly tells people to stop. However, based on my own advice, I should try red and see if it performs better.

There are two lessons here: always test ideas for your site with real data, and always use a button. :)
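If you want to run a similar test yourself without a dedicated testing tool, here is a minimal sketch of the approach: serve one variant at random per pageview, log impressions and clicks, and compute CTR per variant from the log. The variant names, file path, and tracking URL are all hypothetical, not the code I actually used:

<?php
// Pick a button variant at random for this pageview and log an impression.
// File path and variant names are placeholders.
$variants = array('text', 'green', 'blue', 'orange');
$variant  = $variants[array_rand($variants)];
file_put_contents('button-test.log', date('c') . "\timpression\t$variant\n", FILE_APPEND);

// The rendered button should link through a tracking URL (e.g. /click.php?v=green)
// that appends a "click" line to the same log before redirecting to the real target.

// Later, tally the log and compute CTR per variant.
$counts = array();
foreach (file('button-test.log') as $line) {
    list(, $event, $v) = explode("\t", trim($line));
    if (!isset($counts[$v])) $counts[$v] = array('impression' => 0, 'click' => 0);
    $counts[$v][$event]++;
}
foreach ($counts as $v => $c) {
    $ctr = $c['impression'] ? 100 * $c['click'] / $c['impression'] : 0;
    printf("%s: %d impressions, %d clicks, %.2f%% CTR\n",
           $v, $c['impression'], $c['click'], $ctr);
}
?>

In practice you’d split the logging and the reporting into separate files, but the moving parts are the same.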

Combining Multiple CSS Files

by jim on January 1st, 2010

When it comes to optimizing your site’s load times, you can lower the amount of data that gets sent (that’s what gzipping your site accomplishes), or you can lower the number of HTTP requests. Every HTTP request takes a small fraction of time to complete and a request is executed every time the browser needs to load another file. If you look in YSlow, under Statistics, it’ll show you both the Total Weight and the number of HTTP requests.

One easy way to cut down the number of requests is by combining, or suturing, multiple CSS files. With a little PHP, you can dynamically combine all of your stylesheets so that the browser only makes one request.

httpd.conf

First, add the following line to your httpd.conf file so that .css files are run through the PHP interpreter:
AddHandler application/x-httpd-php .css

You can usually find httpd.conf in the /etc/httpd/conf/ directory. You’ll have to restart your server for the change to take effect.

Combining .css files

From there, create a master .css file that sets the CSS content type and uses PHP to include each of the individual stylesheets (the header() call belongs here, at the top of the file, now that PHP is processing it):
<?php
  // Because of the AddHandler line above, this .css file is run through PHP.
  header('Content-type: text/css'); // otherwise PHP would send text/html
  include("stylesheetA.css");
  include("stylesheetB.css");
?>

Your pages then reference just this one combined stylesheet instead of the individual files, which is what actually cuts the number of requests. This tip is taken from the O’Reilly book Website Optimization, by Andrew King, in the section on suturing CSS and JavaScript files.

Don’t Optimize Google Adsense eCPM

by jim on December 31st, 2009

On numerous occasions, when I’ve talked to other bloggers about monetization, the topic of optimizing Google Adsense almost always comes up. For many new bloggers, Adsense is a quick, low maintenance, and non-intimidating way to monetize a blog. Blog readers are comfortable with Adsense because they see it everywhere and likely don’t find that it jars their experience.

Adsense Metrics

In talking about Adsense, there are really only a few metrics to look at: impressions, clicks, clickthrough rate (clicks divided by impressions), eCPM (effective CPM, or earnings per 1,000 impressions), and revenue. And as with any metric, we are always trying to improve them through testing, right?

Unfortunately, too many bloggers focus on the wrong metric. They recognize that revenue is a product of the other metrics and that impressions are a function of how many ad placements you run. After that, you are really left with clicks, CTR, and eCPM. Clicks and CTR are intertwined, so working with CTR alone will get you the results you want.

Why is eCPM a red herring?

If you think about the math, eCPM comes after revenue, which we’ve already determined is itself controlled by other factors. Revenue is the number of clicks times the value of each click, which is never revealed on a click-by-click basis (though you can approximate it if you track multiple channels). eCPM is then calculated by dividing revenue by the number of thousands of impressions.

What this means is that it’s clicks, and consequently CTR, that you should be focusing on. When optimizing, focus on how to improve each individual block’s CTR (or decide it’s so low that you’d rather remove the block to improve the user experience). You can increase eCPM simply by removing under-performing (low CTR) Adsense blocks, but that could cut into your earnings, and it keeps your attention on the wrong metric.
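To make that concrete, here’s a quick worked example. The numbers are made up purely for illustration and assume a flat value per click, which real Adsense clicks don’t have:

<?php
// Two hypothetical ad blocks shown on the same 10,000 pageviews.
$impressions     = 10000;   // impressions per block
$value_per_click = 0.40;    // assumed flat earnings per click (illustration only)

$blocks = array(
    'sidebar' => 100,   // clicks -> 1.0% CTR
    'footer'  => 10,    // clicks -> 0.1% CTR
);

$total_revenue = 0;
$total_impressions = 0;
foreach ($blocks as $name => $clicks) {
    $revenue = $clicks * $value_per_click;
    $ecpm    = $revenue / ($impressions / 1000);
    printf("%s: %.1f%% CTR, $%.2f revenue, $%.2f eCPM\n",
           $name, 100 * $clicks / $impressions, $revenue, $ecpm);
    $total_revenue     += $revenue;
    $total_impressions += $impressions;
}
printf("both blocks: $%.2f revenue, $%.2f site-wide eCPM\n",
       $total_revenue, $total_revenue / ($total_impressions / 1000));
?>

Run with both blocks, the site earns $44.00 at a $2.20 site-wide eCPM. Drop the low-CTR footer block and eCPM doubles to $4.00, yet revenue falls to $40.00. Optimizing for eCPM told you to do something that cost you money.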

Adsense eCPM does have value: it tells you how much a placement is worth to you as a publisher. If a company wants to buy that placement for a month, you have a good idea of how much the spot generates each month.

Just don’t focus your optimization efforts with that metric in mind.

How to GZIP Your Site’s Content

by jim on December 30th, 2009

GZipping your content is the easiest and best way to speed up your site. For a full discussion, I recommend this page on gzipping content (the code snippets I present below are taken from that site). If you just want the nitty-gritty and some real-world results, without the technical explanations, let’s jump right in.

The easiest way to gzip your content, if your site’s server supports it, is to throw one of these lines in your .htaccess file:
# compress all text & html:
AddOutputFilterByType DEFLATE text/html text/plain text/xml

# Or, compress by file extension:
<Files *.html>
SetOutputFilter DEFLATE
</Files>

Be sure to change the .html to .htm if you use that as your HTML file extension.

Or, if you don’t have access to your site’s .htaccess file, throw this PHP snippet at the top of every page (in WordPress, throw it in your header.php file):
<?php if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler"); else ob_start(); ?>

I was helping Jeff Rose of Good Financial Cents navigate the various “tips” in YSlow and the first one we knocked out was GZipping all of his content. With that one change, we trimmed his homepage size from 805.5K to 706K (unprimed cache). A 12.4% reduction in page size with the addition of one line… not bad!

After you’ve made the change, check to see that it worked using this site’s gzip test.
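If you’d rather verify it yourself than rely on a third-party tester, here’s a minimal sketch using PHP’s curl extension (assuming it’s installed): request a page while advertising gzip support and look for a Content-Encoding: gzip header in the response. The URL is a placeholder:

<?php
// Check whether a URL is being served gzipped (requires the PHP curl extension).
$url = 'http://www.yourblog.com/';   // placeholder: use your own URL

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);                       // hand back the response
curl_setopt($ch, CURLOPT_HEADER, true);                               // include response headers
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Encoding: gzip')); // claim gzip support
$response    = curl_exec($ch);
$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
curl_close($ch);

$headers = substr($response, 0, $header_size);
echo (stripos($headers, 'Content-Encoding: gzip') !== false)
    ? "gzip is working\n"
    : "response was not gzipped\n";
?>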

How to GZip CSS Files

by jim on December 29th, 2009

One of the easiest ways to trim the weight of your website is to gzip your content and one of the best candidates for this is your site’s cascading style sheets, or CSS.

The process for gzipping your CSS files is very simple. First, we will download a .php file that will process your CSS files, gzipping the output if the visitor’s browser can handle it (most can). Then, we will tell .htaccess to send all .css files through this .php file before they are sent out to the visitor.

Download CSS Gzip Processing File

You will need to download this csszip.php file (right click and save the file locally as a .php file, click to read the code) and put it on your server. As with any file you download and install on your server, read through it and make sure you understand everything that happens in it. Don’t take my word on what it actually does!

This file was adapted from this post on gzipping CSS files.
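If you’d rather write the processor yourself than download one, here’s a minimal sketch of what such a file typically does; this is not the actual csszip.php from the post above, just the general idea. It reads the requested stylesheet from the file parameter (matching the RewriteRule below), refuses anything that isn’t a .css file under your document root, and gzips the output when the browser supports it:

<?php
// Minimal CSS gzip processor: a sketch, not the actual csszip.php linked above.
$file = isset($_GET['file']) ? $_GET['file'] : '';

// Only serve .css files that actually live under the document root.
$root = realpath($_SERVER['DOCUMENT_ROOT']);
$path = realpath($_SERVER['DOCUMENT_ROOT'] . '/' . $file);
if ($path === false || strpos($path, $root) !== 0 || substr($path, -4) !== '.css') {
    header('HTTP/1.0 404 Not Found');
    exit;
}

// Gzip the output if the browser says it can handle it.
if (isset($_SERVER['HTTP_ACCEPT_ENCODING']) && substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) {
    ob_start('ob_gzhandler');
} else {
    ob_start();
}

header('Content-type: text/css');
readfile($path);
?>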

Modify .htaccess

The next step is to tell your server to send every .css file through this processor to be gzipped. By doing it this way, you don’t have to add gzip-related code to every .css file on your server. Your web server does the heavy lifting, and every .css file, even the ones you forgot about, gets gzipped if the visitor’s browser supports it.

RewriteEngine On
RewriteRule ^(.*)\.css$ /csszip.php?file=$1.css [L]

The RewriteEngine line just makes sure mod_rewrite processing is switched on. If you didn’t put csszip.php in the same directory as your .htaccess file, modify the /csszip.php portion of the rule to reflect where the file actually lives.

Once you save your .htaccess file, test that everything is in order. First, load up your website to make sure everything still works properly. You will want to clear your server cache if you have one (which you definitely should be using if you’re on WordPress) and then clear your local cache; that way you’re sure you get a fresh copy of everything from the server.

If anything looks wrong, disable the rule by putting a # (hash) at the start of the line and check your work. If all looks OK, check that the CSS is actually gzipped. If you are using YSlow, just run it on your site: one of the grading factors is “Compress components with gzip,” so look to see whether any of your .css files are on the list of text files not being compressed.

Real World Savings

By gzipping the .css file on Bargaineering, its size dropped from 28.2K to 7.3K, a 74.1% savings. Response time for that file went from 53 ms to 39 ms, a 26.4% improvement. Regardless of the exact savings, it’s a no-brainer move because faster is better, especially when it can be achieved with so little extra work.

Site Speed May Soon Affect Search Rankings

by jim on December 29th, 2009

Matt Cutts, who for those who don’t know heads Google’s webspam team and is seen as the oracle for all things Google, recently sat down for an interview with Web Pro News. At around 2:38 in the video he discusses the new Google site speed tools you can use to make your site faster. The toolkit is more extensive than YSlow, but the end goal is the same: identify the slow points in your site and help you fix them.

The gem in that interview was when Matt revealed that speed may be used in the future as a ranking metric, as it’s already integrated into Adwords. It’s in line with Google’s desire to continually improve the user experience, and whether or not it becomes a ranking factor is almost irrelevant: you want a fast site anyway.

How to Determine Display Ad Placements

by jim on December 23rd, 2009

When it comes to ads on your site, something you shouldn’t consider until you have a few hundred (or at least a thousand) visitors a day, it’s very important that you are selective and tactical when it comes to ad placements. If you put too few placements, you might be leaving money on the table. If you put too many, you’re creating a bad user experience. You want to find the sweet spot and the only way to do that is by testing.

For the purposes of this discussion, let’s assume everything is a cost per click (CPC) ad like Google Adsense.

Create a Good User Experience

Goal number one should always be to create a good user experience, because without your readership, your site will languish. Without comments, the site will feel lonely. So in all of your testing, keep this question in mind: “If I came to this site for the first time, would the ads turn me off?” If you don’t feel you can answer that yourself, ask your friends. Ask someone who doesn’t know you run the site whether they find the ads annoying. If you aren’t sure, chances are the ads are annoying. If they clash, they’re annoying. If they’re everywhere, they’re annoying. If there’s a spot on the site where you expect content and instead there’s an ad, that’s annoying.

Avoid these at all costs.

One of the reasons I recommend waiting until you have a thousand visitors per day is that you need that much traffic to have enough data to analyze. If you only have 100 visitors a day, it will take much longer to collect enough data to make a meaningful decision. Testing at that stage is less effective.

Test Placements & Track CTR

CTR stands for click through rate and it’s the number one metric you need to be aware of when it comes to cost per click ads on your site. Whenever you put an ad up with Google Adsense, make sure you assign it a channel so you can track the Page CTR. Then try a bunch of different placements independently and see how each of them performs. You should see CTR values all over the map with some placements performing better than others.

The left sidebar skyscraper (120×600 and 160×600) has always performed the best on every site I’ve ever tested it on. It beats the right skyscraper, a right 300×250 rectangle, a 468×60 banner in the header (to the right of the logo), and a 468×60 at the end of a post. The only placement I haven’t tried is a 300×250 rectangle underneath the post title but before the post content; I think that violates rule #1 of creating a good user experience (though I hear it’s good for generating revenue).

Test placements independently, because the first ad block on a page will usually get the most targeted and profitable ads. By having only one ad per page, you keep everything but the placement constant. If you run multiple ads at once, you risk one block getting a better-targeted ad and thus a higher CTR because of relevancy rather than placement.

Pick Two, Test Some More

Once you’ve played around with some ad placements, put ads in the top two performing spots. After you run that for a while, review their CTRs and consider dropping one of the two. You may even consider dropping both if the CTRs are so low or the revenue is so low that you prefer to keep the real estate for more valuable content.

On Bargaineering.com, the homepage does not have a left skyscraper ad placement because, in reviewing the data, I saw that its CTR was low relative to the CTR on individual posts. My guess is that most visitors to the homepage are regular readers who aren’t interested in ads, and the low CTR bears that out. Why show ads to people who don’t want them? I took them off so that I could show more content on the homepage.

With data, specifically CTR data as a proxy for interest, you can make an informed decision on where to put or remove ads.

I’ve been discussing some optimization topics with Mike at Four Pillars, a personal finance blogger in Canada, and he asked how many impressions I will need before I determine one version is better. He specifically said “It seems like some [people] make changes and then immediately draw conclusions about the effects when they really should be doing a longer test.”

Yes!

Far too many times people make changes, immediately see a clear cut winner across a small number of impressions and swap versions. Huge mistake.

Here’s data from Bargaineering from a test I kicked off on December 15th:
Early results can be misleading

Through the first two days, it’s clear that Blue (the original) is kicking Orange’s (the alternative’s) ass big time. Through day three the gap narrows, but across the first three days it’s obvious that Blue is better, right?

Then suddenly on day four the script flips, and the alternative performs better for the next four days. According to Website Optimizer, the alternative has a 70.1% chance to beat the original, with a 2.51% improvement in CTR (that is to say, it would lift the original’s 39.8% to 40.8%).

How is this possible? The number of impressions across eight days isn’t a lot: roughly 330 a day split across the two versions, or about 165 each. With 165 impressions, a single click is worth about 0.6 percentage points of CTR, so it doesn’t take many clicks to move the needle. That’s why it’s important to let the test run and not trust early results.

How many impressions do I need? In the case of Google Website Optimizer, I let it tell me when the results are statistically significant. If I’m running my own split test, I want to see 5,000 impressions per version (10k split across the two) before I’m sure, unless the CTR difference is very large by the time each version has seen 2,500. If you can’t get that on a particular test within a week, I’d focus on other aspects of the site.

I know the best way is to decide beforehand what your level of statistical significance is, but considering I’m the final arbiter anyway (and it’s arbitrary!), I just wing it.
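If you’d rather not wing it, a textbook two-proportion z-test is a quick sanity check you can run on raw impression and click counts. This isn’t what Website Optimizer does internally, just the standard formula, and the counts below are rough placeholders loosely based on the test above:

<?php
// Two-proportion z-test: is the difference in CTR between two versions real?
// Replace these with your own counts (rough placeholders here).
$imp_a = 1320; $clicks_a = 525;   // original, ~39.8% CTR
$imp_b = 1320; $clicks_b = 538;   // alternative, ~40.8% CTR

$p_a = $clicks_a / $imp_a;
$p_b = $clicks_b / $imp_b;
$p_pooled = ($clicks_a + $clicks_b) / ($imp_a + $imp_b);

// Standard error of the difference assuming there is no real difference.
$se = sqrt($p_pooled * (1 - $p_pooled) * (1 / $imp_a + 1 / $imp_b));
$z  = ($p_b - $p_a) / $se;

printf("CTR A: %.1f%%, CTR B: %.1f%%, z = %.2f\n", 100 * $p_a, 100 * $p_b, $z);
// Rough guide: |z| >= 1.96 is significant at the 95% level, |z| >= 1.64 at 90%.
?>

With counts like these, z comes out around 0.5, nowhere near significant, which squares with Website Optimizer’s lukewarm 70.1% figure: keep the test running.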

When you focus on optimization efforts, what are your criteria?

Improve Site Speed with YSlow

by jim on December 23rd, 2009

YSlow is a great Firefox add-on that is integrated with the Firebug extension, which is itself a great extension for web development.

First, download Firefox. Then visit the YSlow website to get Firebug and YSlow. You’ll have to restart but once you do, look for the little bug icon in the lower right.
YSlow and Firebug in Firefox

Once the Firebug panel comes up, find the YSlow tab (likely the rightmost one) and click it.

Running YSlow

I use the “Small Site or Blog” ruleset when looking at blogs, but I recommend running all three to see how the rules differ for each “type” of site. Then click on “Run Test.”

YSlow Results

From here, you will probably discover a laundry list of items you can optimize on your site. The results I show above are from Bargaineering, and they’re an “A” because I spent a few hours implementing all the optimizations YSlow recommended. When I first ran it, the overall grade was a D or an F (I forget which, but at that point it doesn’t really matter, does it?).

If some of their recommendations don’t make sense from the brief explanation given, clicking “Read More” will send you to their Best Practices for Speeding Up Your Web Site page for more detailed information.

Low Hanging Fruit

If you’re looking to fix only a few things and want the most bang for your buck, I recommend jumping over to the Components tab and looking at all of your images. You can try optimizing their size using Smush.it: click the smush.it link to the right and a pop-up will show you the size savings from smushing each image. This only works on images hosted locally, because you won’t be able to modify images you may be hotlinking (a bad practice anyway). If you want to keep a smushed version, just click “Click here to view or save the result image” and overwrite your existing image.

Another quick win? Tell your server to gzip content from your site. “Gzipping generally reduces the response size by about 70%. Approximately 90% of today’s Internet traffic travels through browsers that claim to support gzip.” That’s a pretty substantial savings for very little work.

If you’re using WordPress, here’s what I throw at the top of my theme’s header.php:
<?php
// Gzip the output if the browser advertises gzip support; otherwise buffer normally.
if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler"); else ob_start();
?>

As always, test after any change to ensure you didn’t frak something up.