Category Archives: SEO

Awesome heat mapping tools to increase your conversions.

Considering that this is the year the web concentrated on chasing real-time services, it’s no wonder we’ve seen web analytics packages jumping on the bandwagon. Watching your visitors explore your site as it happens can highlight problems with landing pages, and give you a better understanding of the experience your site visitors are having.

Heat mapping tools pick up where many analytics services leave off – highlighting the popular sections of a web page, and in some cases highlighting where people are trying to get to. Heat mapping tools are often used within the field of usability testing, as well as providing additional business intelligence.

Whilst many site owners are probably on the Google Analytics bandwagon (considering it’s free free free!) – there is value in supplementing this package with a decent heat mapping tool. This post showcases some of the best out there on the web, both commercial and open source.

ClickDensity

URL: http://www.clickdensity.com/

Pricing: Freemium model – the first 5,000 clicks are free.

ClickDensity easily distinguishes between heat, click and hover actions from the user’s mouse. ‘Heat’ is determined by combining both clicks and hover actions – portraying both user hesitation and actual intent. You can also break the page down into the two separate elements.

Showing just clicks can highlight problematic parts of the page e.g. elements of the UI that look clickable – but aren’t.

Showing just hover actions can show you elements of the UI which feel particularly ‘unsafe’ to the user – e.g. a delete button which looks like it may delete everything rather than one particular element.

CrazyEgg

URL: http://www.crazyegg.com/

Pricing: Starting from $9 a month

Whilst CrazyEgg doesn’t have any free options for trial purposes, it does have a demo allowing you to see both the interface and reporting functionality.

Functionally, CrazyEgg offers four different ways to analyse your click data: overlay view, list view, heatmap view and confetti view. I found list view particularly useful, as it quickly highlights the percentage of clicks each element receives, and lets you export to Excel for further cross referencing against other data.

Heatmap and overlay view are similar in nature, as both are visual displays of click data. The heatmap graphics are superior to the way that ClickDensity shows the data – the “dots” are joined to create much more visually appealing heat “maps”, as opposed to “heat dots”.

Confetti view is also a great feature, as it lets you see visually how different website referrers behave. This could let you see if advertising is working, or if a particular referrer is less likely to click on your advertising.

ClickTale

URL: http://www.clicktale.com/

Pricing: Freemium model – starting at 400 pageviews a month

ClickTale came to my attention as one of the only web analytics packages that can actually record mouse movement in real time, and play the activity back to you. I even had a bit of fun working out how they went about doing it.
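
Out of curiosity, here’s a minimal sketch of roughly how that sort of recording could work in the browser. To be clear, this is my own guess at the general technique, not ClickTale’s actual code – the sample shape and the /collect endpoint are entirely hypothetical.

```typescript
// Hypothetical sketch: buffer mouse positions, then ship them off in batches
// so a server could replay the session later.
interface MouseSample {
  x: number; // page coordinates
  y: number;
  t: number; // milliseconds since recording started
}

const samples: MouseSample[] = [];
const start = Date.now();

document.addEventListener("mousemove", (e: MouseEvent) => {
  samples.push({ x: e.pageX, y: e.pageY, t: Date.now() - start });
});

// Every few seconds, flush the buffer to a (hypothetical) collection endpoint.
setInterval(() => {
  if (samples.length === 0) return;
  const batch = samples.splice(0, samples.length);
  navigator.sendBeacon("/collect", JSON.stringify(batch));
}, 5000);
```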

In addition to this functionality, ClickTale offers two types of heatmap. Firstly, their scrolling heatmaps allow you to see where people pause on a page, and the impact that the fold has on your content. I’d highly recommend giving this a test run before you engage in a redesign of a site – understanding how visitors engage with the content on your site can help you immensely with the information architecture work. It also lets you better decide what should stay above the fold, and what really matters to your audience.

As with the other packages reviewed here, ClickTale also offers a “click” heatmap. Their link analysis helps you to discover not only where visitors click, but also where they hover, and for how long. If you really want to get into the nitty gritty of how to improve conversions – using data rather than gut instinct – this is a must-see package for web marketers.

ClickHeat

URL: http://www.labsmedia.com/clickheat/index.html

Pricing: Open Source

ClickHeat is an open source heatmap tool that runs on the Unix platform. For my site, it was a breeze to install, with nothing to do other than upload the files to the server and insert the required JavaScript.

There’s also a ClickHeat plugin ready to roll with the system, which adds the necessary code for WordPress. The feature set is, as you would expect, pretty minimal, but it gets the job done. It does, however, allow you to break the results down by browser widths and versions – which can highlight patterns that you wouldn’t otherwise have seen with the results averaged out.

If you are on a tight budget, and have a client requiring heatmapping functionality, it may just be what you are looking for.

Other Links

http://www.jnathanson.com/blog/client/jquery/heatcolor/index.cfm

http://www.rogerstringer.com/projects/wpclickmap

http://blog.corunet.com/english/the-definitive-heatmap

The Tortoise or the Hare approach to blogging?

If you are blogging for traffic and increased exposure, you’ll know that the more you write, the better your website will perform. But how often should you post on your blog?

From talking to (and reading) a few other bloggers, some people choose to publish often, but with shorter posts (the hare approach); others take a slightly longer, more substantial approach (the tortoise) that takes a few days to either research or write up blog posts. There are benefits and pitfalls to each methodology from a site traffic perspective, with the end goal being the same: increased visitors, increased subscribers, and ultimately more business as a result.

The Hare Approach

Benefits

You are likely to accumulate a greater percentage of subscribers faster than bloggers who leave it a while before they post, and your statistics are likely to reflect a greater number of repeat visitors from people hunting for your freshest content.

Negatives

Every post that you add is an extra post in Google’s index. You can play the numbers game and churn out a couple of posts every week to increase long tail searches, but ultimately to get higher traffic (and hit a few short tail keywords) you are going to need incoming links to those posts. Collections of links are unlikely to obtain many incoming links compared to, say, a short snappy newsworthy piece, but will provide visitors with entertainment.

Prior to the Google update that SEO folk termed “the Florida update”, having lots of pages on your site was a perceived indicator of authority. Once this was out of the bag, spammers abused the privilege and churned out buckets of auto generated content, with the result that it wasn’t long before Google canned this in the algorithm. Having a large site no longer means that you will generate more organic traffic from the search engines, so posting for the sake of it doesn’t help you anymore. A good thing for the web on the whole.

It’s an extremely slow process to try and increase traffic through the long tail by just playing the numbers and increasing the number of posts you have indexed in Google, so persistence and commitment are the key here. (BTW, to find out how many you currently have, do a “site:domain.com” search on Google.) Eventually, you will stumble across a few decent keywords that generate a nice level of traffic for your site.

Maximising your posts if you post short and snappy.

If you are a discoverer or a collector, you’ll need to supplement existing finds with either summary posts (that people are keen to bookmark or share on social bookmarking sites) – or do some lists of the best finds for the year.

For example, if I ran a blog in the “fashion trend” niche and collected items from various retailers daily that I thought my readers would enjoy – a summary post would consist of something like “Winter fashion for 2009”, including a collection of the garments that go well with each other to create an outfit. Likewise, if a collection of posts are on a similar topic, pulling them together into one place can make their combined value much more linkable.

The Tortoise Approach

Benefits

You are much more likely to become authoritative in your niche if you are a tortoise. The reason being that a comprehensive post which covers everything on a particular topic is linked to and referenced more than five smaller, disjointed ones. Incoming links are statistically more plentiful on a post which has been worked on longer, researched and polished.

Negatives

Slower posting patterns can put some subscribers off. With social media being used to promote many blog posts now, the slower you post, to a degree, the fewer people you’ll be getting in front of. Writing long posts also takes more time that may be better spent working on your business itself.

Maximising your posts if you post long and lengthy.

I’ve spoken before about the need to pimp your shit yourself. If you run a small site, even a linkbait article is unlikely to get links unless you promote it around the web yourself. Once you’ve got an audience, this ball starts rolling itself.

There are plenty of places to do this and seed the content initially. Your goal should be to get your content to go viral. This will give you the greatest reach, and in turn more incoming links.

Overview

Mixing up the different styles allows you to benefit from both, with some short posts some weeks, mixed in with longer, more comprehensive posts the next. Knowing the differences between these styles will help you to become a better blogger, and hopefully whack you in the side of the head if you are stuck in a rut only using one technique or the other.

What do you guys do from a technique point of view? Are you a tortoise? Or a hare?

Hangin’ with my home boy Matt Cutts.

I got a pleasant surprise this week, as one of my questions on SEO was answered by the head of the spam team at Google – Matt Cutts. Matt frequently video blogs Q&A sessions for the webmaster community over on YouTube. I’ve been finding for some time that certain sites get a bit of a boost in the SERPs at inception (most likely for commercial reasons on Google’s part). Whilst this video doesn’t confirm or deny this theory, it certainly backs up its likelihood.

If you think about it, with splogs being used daily by spammers, it’s not terribly difficult for Google to footprint a particular piece of software, and use that to help in its algorithm. E-commerce sites, for example, will have certain elements that uniquely identify them – add to cart buttons, view shopping pages, and that sort of thing. In much the same way that spamming software can find blogs and vBulletin forums, Google’s spiders can do exactly the same thing.

This video confirms that they do have a go at doing so, but whether that is for an enhanced user experience or for algorithmic advantage remains to be seen.

Out of the box link building ideas.

Links are the currency of the web. If you really want to see growth in the number of visitors to your site, you need to fight tooth and nail to win new incoming links. Not only do the links themselves result in traffic, but more importantly they see you getting more traffic from keywords in Google.

Growing your authority is one way to help increase your exposure, but for many, getting to that point is difficult. In addition, even good content can sit without attracting links if you are small and don’t promote your content. So you’ve already started doing that – what else can you do? How do others achieve white hat links?

Themes

Think of the number of software packages out on the web running people’s websites, and the number of them that require graphically based themes. WordPress, Magento, Blogger, Tumblr.

If you are a dab hand at Photoshop, and can make something beautiful, you’re already positioned to get new links to your site. If you are not, consider commissioning someone who is.

WordPress is probably the main contender for themes online; you only have to do some keyword research to realise that the number of searches per month is absolutely mental. As well as potentially capitalising on this traffic, if your theme is in any way half decent it will get picked up by larger sites when they are compiling a list of themes. Free themes = link porn.

Icons

I’ve seen this over and over again. Free icons are another link magnet. Not convinced? Ever heard of Fam Fam Fam? Course you have. They turn up on a ridiculous number of searches for free icons. Have a skinny at their link profile on Yahoo Site Explorer. For a site that hasn’t been updated since 2006, there are over 2 million backlinks to it. That’s insane.

There’s a boat load of action to get in on there. Many professional icon designers give away a small subset for free to attract people to their site, in a bid to make them a customer of their commercial icons – which isn’t a bad tactic. With software products not generally having loads of time to create their own, this is clever online marketing.

Data

Data is a secret weapon for many sites. It’s providing that content in an easy to consume, newsworthy way that’s the problem. Campaign Monitor have it nailed with their email client popularity reports. I think they also released data near the start of the year on the words which generated the most email opens in their email marketing.

Information which is related to their product, yet helps ordinary folk make decisions, is gold dust. If you are storing data on visitors to your site at all, look at it from an outsider’s perspective, and instead of using it just to power your site, analyse, compile and present it in a beautiful way to help people make informed decisions. Bottom line? Do everything in your power to become the Google Trends of your industry.

Time Savers

There’s a whole subset of the online community obsessed with saving time and becoming more productive. Lifehacker and Zen Habits pretty much have this niche tied up from a content perspective.

That said, there are still people hungry for new desktop applications that save time. If you are a programmer and want to attract links, there’s nothing quite like giving away a software product for free. It attracts links like bees to honey.

Think in terms of quicker deployment for developers, quicker downloads for users, batch saving of files. Anything, really, that you’ve grokked as part of your day job to speed a task up is a candidate for fresh, relevant links.

Overall, there’s nothing like a bit of creative thinking to help you build fresh and relevant links to your site. You just have to keep at it, and over time, the rewards will come.

10 useful resources to generate local traffic and leads.

Relevant traffic is important for small businesses for a variety of reasons. If you own a business that deals primarily with local people, and are looking for your website to increase sales, there’s absolutely no point in having a shed load of visitors who have a quick read of your content, then disappear without actually converting into sales – unless of course they are providing you with either backlinks or user generated content which will help grow your site. This is especially important for the folks who are providing services or products at a local level. For some businesses it’s better to have 100 visitors from the local area than to have 10,000 from around the world.

If you are providing a service or product that can be quantified via your analytics package, then conversion rate should be the metric you are watching – not the quantity of visitors to your site. So how do you go about getting additional localised traffic?

Google Local Search

URL: http://www.google.com/local/add

Google Local allows you to add your business to the Google map listing, which is starting to come into play more and more for localised searches. To illustrate, here’s an example. Let’s say that I’m at home, hungry and fancy a pizza. I might type “pizza” into Google, but that will mean (to Google) that I may be looking for a definition of what pizza is, I may be looking for a pizza image or video, or indeed looking to become a pizza franchise owner. Because the search term is not qualified to a locale, and has no verbs, the listing has to be broad. If however I include a regional qualifier – e.g. “pizza in Belfast” or “pizza in Dunmurry” – this denotes that I’m looking to find businesses in an area that provide that service. Simply put, try a noun followed by ‘in’ followed by a location, and chances are you’ll come across a local listing. “cars in Newcastle”, “boats in London” – you get the picture.

It follows then that local traffic will come if you can integrate these regional qualifiers in your copy and website title, and take care of your on-page ranking factors. Google also has its own proprietary algorithm for determining who gets the first listing, but getting reviews is thought to help. It can also have a positive impact on the number of click throughs you receive from the SERPs.

To actually get listed in Google Local, you have a couple of options. You can add a telephone number and an address to your website copy, and hope that Google finds you and adds you. Some people who were added in directories such as Yell automagically got added into Google Local. Or you can just visit the link above and follow the on-screen instructions – it is of course free and easy. ;o)

Bing Local Search

URL: http://www.marketlocation.com/changereq/

Update: URL: http://www.118information.co.uk/contact/

URL: http://www.bingforbusiness.com/

New on the scene yesterday, Bing is Microsoft’s new offering. They haven’t as yet sorted out their local offering fully, instead pulling data from other services for their organic local listings. The search engine for local results also sits outside of the main index on a different URL, in comparison to Google, which implements the full shebang inside the results. Bing Local is powered by Multimap, which in turn pulls data from Marketlocation.com – a bit of a round the houses approach, which will undoubtedly change over time. For that reason, to get listed you’ll need to go to Market Location at the URL above.

The offering is more detailed than Google’s, and each local listing in Bing provides additional snapshot information – including directions. However, they do go down the “service” then “location” route – with two input boxes to help users perform the search more accurately and put them in that mindset.

Yahoo Local Search

URL: http://search.infoserve.com/are_you_listed.asp?action=NoBusinessFound

Yahoo local search gets its data from Infoserve, and also feeds data to Cityvisitor amongst others. Yahoo supplements its data with user based reviews of products and services. It’s also worth getting positive reviews from happy customers here. Don’t be tempted to fill in fake reviews. This is the web, and you’ll be taken to the cleaners by whoever spots your self-serving reviews. With social media being what it is today, it only takes one small snowball to start rolling to ruin your credibility.

Thomson Local

URL: http://www.thomsonlocal.com/free-listing.aspx

Thomson Local is one of the better directories to get listed in, as they have an already well established online and offline presence. Thomson began operating in 1980 and quickly established itself as one of the leading local directory publishers. Today, Thomson produce 173 editions of the directory, distributing some 22 million copies. All this information, and more, is available and now easily searchable on ThomsonLocal.com. As well as commercial advertising options, it is possible to get listed in this local directory for free. They have a wide range of partner sites which use their data and ad network, so getting included is useful for regional traffic.

Yell.com

URL: http://www.yelldirect.com/internetadvertising/standardlistings/

I’ve mentioned the Yellow Pages before as a business the web changed forever. Whilst I did give them a bit of a pasting, they do offer free listings for businesses. If you are listed, you are likely to at the very least appear in some additional searches, and whilst your website traffic may not improve as a result, you may get some additional sales leads off the back of it. Yell do however offer their business database as Yell Data – which they can do what they want with, including selling it to third parties for telemarketing purposes.

Scoot

URL: http://www.scoot.co.uk/advertise/free-listing.html

Scoot is a wholly owned subsidiary of ITV plc, acquired to strengthen its online advertising arm. They have a free listing available, which includes a website URL and, interestingly, even a Twitter account ID. They offer a mobile site for those of you who access the web with your phone. They also have a couple of additional fields, which denote whether you can sell online and what time your business opens, amongst others. Obviously the more you give, the more you get.

Other Local Resources

URL: https://www.touchtarget.com/product/search?skip_confirmation=0

Touch Local at the time of writing has over 6 million monthly searches for businesses across the UK and Ireland.

URL: http://thephonebook.bt.com

No doubt you are already in there, but just in case. The phone book may also lead to free listings in other places.

URL: http://www.bizwiki.co.uk/

Feeds into numerous other directories – including Townpages.

URL: http://www.city-visitor.com/products/payment_free.html

Ba-da bing? Or not.

Today, Microsoft relaunched their latest addition to the search space: Bing. They tout it as a serious competitor to Google, but then we’ve heard it all before with Live Search. Since that launch, MS have acquired Powerset with an aim of integrating natural language features (and Wikipedia data mining) into Live Search. Bing is the result.

Microsoft are marketing Bing, not as a search engine as such, but instead as a decision engine. Wolfram Alpha yes. That’s a decision engine. Bing. Not so much. To me it’s yet another clutch at straws and smells of competitive insecurities. The main part of the decision engine appears to have been implemented as predictive search – a feature Google rolled out from labs to the main engine some time ago.

Feature Set

The feature set is, as you would expect, pretty similar to what Live.com was, with a few added extras. Firstly, one of the things I noticed was that it offers RSS results. Hooray for anyone looking to build an application from the results in any way, and if you are into brand monitoring and reputation management, RSS will be a welcome addition. Although you can (with a bit of work) do the same thing in Google, this is much more transparent and offers a good bit more scope.

SiteLinks for Strong Domains

Microsoft have also implemented one-word results for domains which are particularly strong. Try doing a search for something like “Digg” and you’ll get sitelinks and one result. I’m not sure if this is a clever move or not; I guess it will take more refined search terms such as “Microsoft finances” to get a deep search down into the particular domain in question.

To see who and what everyone’s searching for most, in comparison to Google’s QDF algorithm, Bing offers xRank. See more here on the documentation for xRank. If you are a bit of a celebrity hunter, or fancy yourself as the next Perez Hilton, Bing’s xRank browses the web for recent hot topics on certain people. Incidentally, as I was writing this article I did a quick check for the term “celebrity blogger” on both engines. Perez Hilton came top on Google. Bing got it way wrong – and brought back a MySpace result first. Not good.

Shopping Searches

A shopping search on Bing brings you off site to ciao.co.uk – another MS owned web property filled to the brim with adverts. Google’s shopping search trumps this site ten times over – and will continue to do so until MS integrate inline shopping results algorithmically into their engine.

Image Searches

Something that did impress me a bit more, however, was image search. It seems as good as, if not better than, Google at bringing back relevant images for my terms.

Overall

Overall, Bing seems like another rushed job. With Wolfram offering something different from Google, and the already failed attempts of Live Search to impress, Bing really had it all to do. Algorithmically, from what I can see it is Live Search with a few bolt-ons that really make no dent in Google’s dominant position in search. And you don’t get a second chance to make a first impression.

Bookmark these SEO tools.

SEO is a wide topic – one minute you are researching, the next managing someone’s reputation in the SERPs. Luckily some of this can be automated, and over the years some sites have positioned themselves as leaders in providing tools that every SEO needs. Hopefully this roundup summarises some of them. Feel free to bookmark. ;o)

404 / 500 detection

Detecting when a link is either broken or giving the wrong status code is an important part of the diagnostic phase of an SEO campaign. Time to pull out the automated tools to find the links that are potentially not indexed in Google, or worse, passing incorrect status codes. (If you fancy rolling your own instead, there’s a rough sketch after this list.)


HTTP Status code checker – check you are getting the correct status code for pages.

Xenu’s Link sleuth – great for finding 404’s on your website.

Online Broken Link checker – online version of Xenu – doesn’t crawl as deep though.

WordPress Broken Link checker. plugin checks your posts and alerts you to problems.

W3C 404 Link checker tool.

1000 links checked by the Link Tiger.
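
As promised above, here’s a rough sketch of rolling your own checker – assuming a runtime with the Fetch API, such as Node 18+; the URLs are placeholders:

```typescript
// Check what status code each URL answers with, without following redirects.
const urls = [
  "https://example.com/",
  "https://example.com/some-old-page",
];

async function checkStatuses(): Promise<void> {
  for (const url of urls) {
    try {
      // redirect: "manual" reports a 301/302 instead of silently following it
      const res = await fetch(url, { method: "HEAD", redirect: "manual" });
      console.log(`${res.status} ${url}`);
    } catch (err) {
      console.log(`FAILED ${url}: ${err}`);
    }
  }
}

checkStatuses();
```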

Sitemaps

A sitemap can help to get your website further indexed in the search engines. There’s no point in having bucket loads of content if the Big G can’t see it. Each of the search engines now shares the sitemap protocol, so you can just generate one and submit it to the right place – the format itself is simple XML, as the sketch after these links shows.


Generate Sitemap online – XML Sitemap crawler

Google XML Sitemap generator plugin for WordPress

Dagon Design – alternative site map generator plugin to the above.

Submit Sitemap to Google here, MSN here, and Yahoo here.
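
As mentioned, the format is simple XML – a minimal sketch of generating one by hand (the page URLs are placeholders):

```typescript
// Build a bare-bones sitemap.xml: a <urlset> containing one <url> per page.
const pages = [
  "https://example.com/",
  "https://example.com/about",
  "https://example.com/blog",
];

const sitemap =
  `<?xml version="1.0" encoding="UTF-8"?>\n` +
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
  pages.map((u) => `  <url><loc>${u}</loc></url>\n`).join("") +
  `</urlset>\n`;

console.log(sitemap); // save as sitemap.xml, then submit via the links above
```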

Keyword Research Tools

Keyword research is an important part of site development and promotion as well. I’ve mentioned it before. So here are some of the tools out there that can hopefully help you in finding out what keywords to chase. Some of these tools can also help brainstorm secondary phrases and terms, as it’s important to mix it up in your copy and in external link text.


Priority Submit’s research on keywords

Webmaster Toolkit Keyword Research Tool.

Google Adwords Research Tool.

Keyword suggestion tool from SEO Chat.

Keyword forecast tool from Microsoft Adcenter

Highposition Keyword research Tool

Keyword Discovery – free api access.

Google Trends – keep an eye on seasonal trends.

Aaron Wall’s keyword tool, powered by Wordtracker, is great.

Google Sets tool finds related terms

MSN clustering tool

Rank Checking Tools

Rank checking allows you to see progress when you’ve started building up authority and links. Some of these require you to log in and add API keys that collect data over time; others you can visit at various intervals to quickly look up results at set times.


Market Leap’s Rank checking Tool

The infamous Digital Point rank checker

Search Engine Genie Rank Checker.

Mike Ranking Reports across multiple engines.

Keyword Envy Tool

Page Rank bot – determines PR across datacenters.

Keyword Density Tools

Keyword density on page is a bit 1999, from when the search engines were a bit stupid. I don’t really stress much about keyword density – I’d rather concentrate on off-page SEO – but a few people still feel that it is relevant. Read here for other SEO opinions on keyword density. For those of you who want to, knock yourselves out.


Analyse on page keyword density with Ranks spider

Rankquest Keyword density tool

SEO Chat Keyword density tool

EvrSoft Keyword density.

SEO Plugins WordPress

WordPress has become one of the more popular blogging platforms out there, and with a few SEO tweaks can rank really well organically. Below are some SEO plugins for WordPress that can help it along in the right direction.


Joost De Valk’s Meta Robots WordPress plugin

HeadSpace Plugin.

SEO Title tag plugin

All in one SEO Plugin for WordPress

301 Redirection plugin – useful for moving blog posts

Another WordPress Redirect plugin.

Removes common words from your URL slugs.

Search Meter – Record internal search terms.

Page Strength Tools

The ‘moz have pretty much wrapped it up with their awesome Trifecta tool, which can measure social media, estimate traffic and be used to find out where your weaknesses lie. PopUri.us pulls in data from a couple of services to give a feel for site popularity – but isn’t really that serious a tool.


SEOMoz Trifecta – Domain blog and page strength tool.

URL Analysis Tool.

PopUri.Us – Quickly check popularity

Backlink Tools

Backlink tools can let you see where your competition are picking up links, and if there is low hanging fruit, you’ll be able to pick it up too. They are also useful for sitting back and admiring your link building efforts. ;o)


Yahoo Site Explorer – retrieves backlinks to your site.

LinkHounds Yahoo Backlinker tool – filters the results into unique IPs

Domain Backlinks Checker – sorts by first and second level domains

Anchor Text Backlinks Analysis – finds the backlink text that you’ve been linked with.

Smart Backlinks

Analyse Backlinks

Link Diagnosis

Backlink Watch

New Page Inlink Checker.

Link Analysis Tool from Majestic SEO.

Page Speed Tools

Page speed isn’t directly linked to SEO, but a fast site is a site with a lower bounce rate. Visitors get a better experience, and leave your site with a positive opinion. It also helps to encourage the spiders to crawl your site faster. If you get crawled faster, you break news faster.


Rankquest’s Web Speed Report

Broadband Speed Tester.

Web Page Speed Report

Priority Submits Page Size Extractor

Smush.it from Yahoo – compresses loads of stuff.

WordPress plugin for Smush.it

Firefox SEO Extensions

Firefox gives talented SEOs the processing power of the desktop and client machines, meaning that extensions can often execute faster than online tools. The following Firefox SEO tools cover on-page highlighting of SEO factors such as link text and nofollow attributes.


SEO analysis of the current page via Firefox

Aaron Wall SEO plugin for Firefox

Search Status Firefox plugin. Includes No-follow highlighting.

SEO Quake for Firefox plugin.

Bruce Clay Firefox SEM Toolbar.

Rank Checker for Firefox.

User Agent Switcher to detect cloaked content.

SEO Workers Analysis Tool for FF.

RedFly’s Google Global Tool for country specific results.

Pagerank tools

Pagerank is only one of the 200 or so factors Google may use to rank you in the SERPs. If you concentrate too hard on it, you are missing the other important things to take into account. Ultimately it should only be used to identify sites that are worth getting links from and likely to be generating decent traffic – rather than as a bragging right.


DigPageRank – checks over 700 datacenters.

Another online pagerank checker.

Page Rank Checker – checks Alexa rating / DMOZ / domain age amongst others.

Check the Pagerank of your backlinks

Reputation Management

Reputation management allows you to keep an eye on your brand and brand names across the web. The following tools should help you do that, and may also show interesting patterns and opinions that you’d ordinarily miss.


Who tweeted your URLs? Backtweets.

Find out where your competitors comment.

Alert Rank

DIY reputation management with Netvibes.

Track conversations on the Web with UberVU

buzzstream monitoring

Twitter brand monitoring tool SplitTweet

Scout Labs

Who’s Talking about you?

Facebook Lexicon – Monitor Facebook

Domain Monitoring Tool

BuzzAgent

AddictoMatic Inhale the Web.

Buzzding.

Analytics / Usability Testing

I’ve lumped analytics and usability tools together here as there is some overlap. Some of the tools below also track users’ mouse movements and clicks on a website in real time. This can be really insightful, as it shows things such as site objects which potentially look like buttons and are getting unnecessary clicks. It can also highlight other potential problems which are preventing site goals from being met.


Google of Course.

Stuffed Tracker.

Reinvigorate – Another real time Analytics package

Clicktale

Woopra – Real time tracker.

Userfly – Records mouse movement on a site, and plays back to you.

Robot Replay. Similar to Userfly only FREE!

CrazyEgg – Heat Mapping Analytics package

SilverBack – Usability testing for Macs from the guys at Clearleft.

Bad Neighbourhood Detection

A bad neighbourhood is a section of the web which is potentially known by Google to be spammy. You should avoid obtaining links from these places to avoid penalties in the SERPs.


Queries Google’s Safe Browsing API.

Neighbourhood checker from Majestic SEO

Compilations

Some sites offer a variety of tools, both commercial and free for SEO research. The following are the sites that I know offer a variety of different tools, and are relatively well known in SEO circles.


Randfish and the SEOMoz Team’s Tools

Joost de Valk’s (Yoast) Tools

Bruce Clay’s List of Tools.

Khrindo Free SEO Tools compilation

SEO Consultants Tools.

Aaron Wall’s SEO Tools.

Raven John’s SEO Toolkit

Information Tools

These are just general tools which may be useful for webmasters and developers alike. Still worthy of a mention, as some research requires digging out other information.


Domain information – Whois

Visitor IP address / browser info – great for web devs too.

Is Google down or is it just me?

A variety of domain lookups etc.

Check your Mx Records online (for mail diagnostics).

Reverse IP Lookup – find other domains on the same IP Address.

What’s that server running? Plus uptime monitoring, at Netcraft.

Estimate Web Traffic with Web Traffic 24

More competitor analysis tools over here. (shameless plug)

Piggybacking off authority sites to maximise traffic

If you’ve been writing content online for any length of time, you’ll recognise that the SERPs tend to be more competitive in certain areas than others. In order to gain traction, you’ll have to come up with something that brings visitors in through the back door. This is where the technique of piggybacking comes in.

Piggybacking is the practice of using an existing website’s authority to drive traffic to your own; used wisely, it can also spread your content, like butter, further than it normally would go.

Here’s how it works.

Instead of thinking in traditional terms and solely promoting content on your website, the key here is to write your content on your site first, then use these secondary websites to do the promotion for you in the SERPs. Because they have existing authority, they rank much higher by default on some competitive phrases. The visitor finds the secondary website first, then bounces along to your website pretty much instantaneously, giving you a welcome boost in traffic.

There are a number of different types of websites that let you do this, all with varying levels of control over the information published to them. Normally these secondary sites have quite good on-page SEO already – especially if they are a big player online – and whilst you don’t get to tinker with the source code as such, the page can sometimes aid in your pursuit of major keyword domination.

Before I start, it’s worth pointing out that if you are planning on copying and pasting information around the web – go ahead. See what happens. It won’t be pretty, and you will crash and burn. Instead, concentrate on distributing complementary content on the sites listed below to reinforce your brand, your website and ultimately your traffic.

Content Hubs

There are shed loads of websites out there with pretty good pagerank and authority that allow you to submit actual content to them. I tend to keep a list of them bookmarked for future reference when doing site promotion. You can think of these as sales / landing pages that should convince a visitor to click through to your website. Do however make your content unique on these to avoid a duplicate content penalty.

Agglom, Squidoo, HubPages, Google Sites, Google Knols, Wikipedia, AboutUs.org – and more recently Google profiles. All of these sites allow you to create content, and indeed link to another site from them. The majority of these websites are user generated, and screaming out for new content – if you provide this for them (and ultimately the search engines), you win.

Social bookmarking sites

These sites can provide additional presence in the SERPs, and many of them pass pagerank into the bargain. Here’s a really good list of social bookmarking sites that do pass pagerank. However, this isn’t a vital part of the process. The key here is to work out which sites are consistently trusted and likely to appear high in the SERPs. Digg would be another to add to the list, as it is frequently crawled by Googlebot and often appears in the index higher than your own site.

You should work on alternative titles for distribution, as this will have you effectively ranking (indirectly) on multiple phrases. You should be examining pagerank, size and authority when selecting these sorts of sites – and keep a list of them available for the next time you write. Monitor their effectiveness through Google Analytics to avoid tying up resources the next time around.

Video Sites

The addition of YouTube video in the SERPs was an important development on Google’s part. Video has already received an artificial inflation – especially if it is on the YouTube platform. After all it makes sense, as it is EXTREMELY engaging for the end user, and another one of Google’s online properties. The more traffic it receives, the more Adsense exposure Google gets, etc.

Anyway – if you market a video, and it has both your website branding embedded within it and a link to your website from the description, you will get traffic off the back of it. As far as I’ve seen, other video sites don’t count as much as YouTube – e.g. MetaCafe or Vimeo – but it’s only a matter of time before one of the major players buys them as well, and integrates their content in the results. Take your time when working out what description and title to use. You may want to engage in some keyword research prior to promoting – also, as I’ve stated before, it pays to tag your content properly, as it will get better internal search exposure.

Ownership Sites

These sites allow ownership of keyword phrases both within the domain name (known as a vanity URL) and somewhere on page. Examples would be Twitter, WordPress (hosted), or any of these sites over here. This is an increasingly popular implementation with web applications, as vanity URLs help site traffic for that particular application, but many fail to take advantage of it.

Powerpoint Clone Websites

There are a couple of online PowerPoint presentation sites that rank well for phrases. SlideShare comes to mind, as does SlideBoom. There are probably loads of others out there, but if you’ve done a presentation at a conference or otherwise, you should be using that content to maximum effect, by distributing it around the various web 2.0 powerpointish websites and piggybacking off their traffic.

Article Marketing Sites

Article marketing sites were traditionally a key part of an SEO strategy. It seemed like the ideal way to gain new backlinks quickly. Give away articles for free, embed links inside them, and voila! Inbound links.

Well, you can pretty much throw that out the window now; Google is no longer stupid – its algorithm can recognise duplicate content, and where an article originally came from. However, article marketing sites can still be used as content hubs, in much the same way as I’ve stated previously – particularly the larger ones such as EZine Articles and Article Depot.

Overview

Essentially, there are a plethora of websites out there that can be used to maximise your search profile, and ultimately (indirectly) increase your traffic. The key is to source the authorities and filter out the crap. If you’ve got a particularly challenging search phrase that you are trying to rank for, it may be more worthwhile to concentrate on secondary satellite sites on which you can place both links and content, than to try to gain links to your own site.

If you took one article on your site and reworked it to submit to the sites above, along with amended titles – what difference would it potentially make to your traffic?

The psychology of the backlink.

Why do people link? If you can work out the answer to that question, you are well on your way to website success. I’ve my own thoughts on the different types of content that work online, and why they work. People link to web content for a variety of reasons, but for any webmaster, backlinks should be your number one goal. They will define you in front of the eyes of the almighty that is Google, and provide you with an additional stream of visitors that you have an opportunity to convert into regulars.

You can think of bloggers and other webmasters as the people who are most likely to link to you – probably less than 40% of the web community at large. Everyone else is a part of the sharing process, and a potential passer of the link virus to a webmaster.

This got me thinking. Why do people share and link to content? What provokes the response? Here’s a bit of background reading on the psychology of the backlink, and how you can apply it to your site to further your exposure.

Resource

Content isn’t king. That’s a tired mantra. Resource is king. If you can become a useful resource around a subject matter, you’ll gain new links. Resources could be roundup posts, listbait or a collection of useful tools. Essentially, combining a resource post with the correct timing for a particular user will almost certainly result in a backlink.

Teach

Tap into the desires of your audience, and work out what it is they want to achieve. How can you further their goal, and what skills can you reflect through your blog that establish authority? If, for example, you run a blog about music, you should have posts that teach someone how to play guitar. The web for many represents a learning mechanism that is both vaster and deeper than any library – the key is to work out which shelf of reference material you wish to sit on.

Review

Think about any major product that you’ve bought online, and you have inevitably been in the position where you have needed an unbiased review of that product prior to purchase. Become the one-stop shop for unbiased review-type searches that need an instant answer, and you’ll gain links. For example: WordPress versus Movable Type.

Research

Take a more academic approach to a blog post, and do some research. Have you come across the State of the Blogosphere from Technorati, or the A List Apart survey? Course you have. Why? Because they are magnets for links, and go the extra mile in delivering something that people want to see year on year. If you can apply the same thought process to your niche, you will be onto a winner. Wondering how to conduct your survey? Hop over here for some web 2.0 survey tools.

Entertain

People love to be entertained, and to entertain others. You can learn a lot from figuring out what people enjoy watching or reading. Ask yourself the same question of the content you share with others within your chosen business area. How can you recreate a similar experience, or provide one as rewarding? Images and video content are two ways of quickly and easily satisfying a user’s visual needs – work out how you can use them effectively within your message. I would rarely publish a blog post without an image somewhere in it, simply because it helps draw a reader in and keep them hooked.

Give Back

As much as people enjoy taking online, you can win by giving, and being philanthropic. If you are providing your content for free on your website, you are already giving something back to the community, which is a start. Competitions, free wallpaper, screensavers and the like all attract links. However, people will also link heavily to good causes and charitable organisations – especially if they feel an emotional connection to the cause. In much the same vein, you have to give links out to get links back in, which is especially important for younger sites that aren’t yet on the radar. At the very least, an outbound link will get you noticed by that blogger within an analytics package. Remember as well that .org and .edu domains are gold as far as search engine optimisation is concerned – a common element with charity websites – and will further increase your profile.

The better seeded your content is, the more of a chance it stands of going viral, and you can give it a fighting chance against the numerous other sites and blogs out there by promoting it effectively via the social web and getting it in front of as many eyeballs as possible.

If you can meet at least a couple of these goals in your content, you stand a fairly good chance of receiving incoming links. Some people manage to combine all six of these into content which just screams “link to me”. This is where the big wins are to be had. What do you guys think? Why do you link to or share something?

A glossary of SEO terms and jargon

Within any professional industry there are in-house terms and phrases that need definition and clarity for the uninformed. Search engine optimisation is no different. I’ve written some posts in the past that mention some of these without elaborating on them – this post will hopefully serve as a reference as time goes on, educating those of you who aren’t familiar with the terminology. It’s very easy to include some of these phrases within my writing without realising that some people may not have a clue what I’m on about – so, for completeness, here’s my SEO glossary.

Long Tail / Short tail

Long tail keywords are Google keywords which have multiple words or phrases in them. For example, “greengrocers who sell pink apples” would be long tail. They generally generate lower traffic, but are easier to rank for (less competition). Short tail keywords, on the other hand, are one or two word phrases like “greengrocers” – they generate lots of traffic, but are difficult to rank for (more competition).

Blackhats / Whitehats

Blackhat generally refers to a search engine optimiser who doesn’t play by the rules, and is constantly pushing the boundaries of what the major search engines will allow. If it is well known, for example, that links are important to rank well, a blackhat will perhaps spam or create automated programs to put links around the web. I think the term initially came from some connection with wizardry – search engine optimisers being the wizards of the web? Black being evil, white representing pure or good. Whitehats, on the other hand, are search engine optimisation professionals who play by the rules and follow things like Google’s guidelines. A whitehat will err on the side of caution, and takes care not to trip any Google penalties; a blackhat is normally involved with naughty tactics on throwaway domains, and uses this knowledge to bolster his whitehat efforts.

Sitewide

A sitewide link is one which is available throughout your site, for example in the footer or on a menu which exists on every page within your website. A sitewide link will commonly not pass as much link juice as other links.

Nofollow

A link which carries the rel=”nofollow” attribute. This was a measure introduced to combat spam, and is implemented within common platforms such as WordPress to avoid comment spam. A link with the rel=”nofollow” attribute doesn’t carry any weight, and doesn’t count towards your overall pagerank.

Dofollow

A standard link which hasn’t been Nofollowed. This type of link will pass pagerank to another website or page on a website.
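
If you want to eyeball which links on a page are nofollowed, here’s a quick sketch of mine (nothing official) that you can paste into a browser console:

```typescript
// List each link on the current page and whether its rel attribute
// contains "nofollow" (i.e. whether it passes link juice or not).
document.querySelectorAll("a[href]").forEach((a) => {
  const rel = (a.getAttribute("rel") ?? "").toLowerCase().split(/\s+/);
  const kind = rel.includes("nofollow") ? "nofollow" : "dofollow";
  console.log(`${kind} ${a.getAttribute("href")}`);
});
```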

Link juice

Commonly used with the term Pagerank – Link juice can be defined as the amount of influence or Pagerank being passed between webpages.

Pagerank

PageRank is Google’s algorithmic way of deciding a page’s importance. It is determined by the number of inbound links pointing at the page, and used (as one of many factors) in determining the results in the SERPs.

On Page techniques

Google uses both on page and off page factors to determine the results in the SERPs – on page techniques refer to changes made to the web page code which help align the page with relevance for a particular search phrase or term.

Off Page techniques

Off page techniques are used to enhance the relevance of a webpage by gaining links. They may include traditional link building or link baiting.

Traditional link building

Traditional link building refers to the practice of obtaining links from third party websites. This will commonly be performed by a human: asking for link exchanges from other webmasters, submitting websites to relevant directories, or commenting on blogs which are dofollow.

Link Exchanges

Link exchanges or link swaps are used between two webmasters to share traffic and link juice between two websites. They are frowned upon generally as the two links cancel each other out, and look spammy. If in doubt don’t exchange links – you may end up getting into a bad neighbourhood.

Search phrase

A search phrase is the keywords someone uses to find your website in Google. Some search phrases are in more competitive niches than others, and are harder to rank for.

Link graph

When used in context a link graph represents the network of links that connect sites together. It is the overall picture of how a site is linked to and from.

Link bait

Link bait is content written solely for the purpose of gaining new links and a high influx of new traffic, simply from the nature of the content. I’ve written a post on link bait over here for further reading.

List bait

List bait is the same as link bait but takes the format of a list. e.g. 10 amazing widgets or 25 top tips for x y or z.

Competitive Niche

A hard to enter market online which caters for a particular niche, but is tough to rank for – for example pharmaceuticals or mortgages. As the price paid per click is high for Adsense adverts, many people chase terms such as mortgages.

Keyword Research

Keyword research is the practice of working out how much potential traffic a keyword gets. See more over here. It may also cover Adsense keyword research (to figure out how much Adsense income a potential keyword may bring a website owner).

Keyword Density

Used within on page techniques, keyword density refers to the distribution of a keyword within a web page. Each search engine favours its own keyword density. To figure out the keyword density, get the word count of your webpage (WC), count the occurrences of a particular keyword (OCC) in it, and divide the latter by the former. Want to know what the major search engines favour? Have a peek at some of this keyword research.

(OCC / WC) * 100 = Keyword Density for a phrase
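
As a quick worked example: a 500-word page that mentions “pizza” 10 times has a keyword density of (10 / 500) * 100 = 2%. Here’s the same thing as a small sketch, with made-up numbers:

```typescript
// Worked example of the formula above: (OCC / WC) * 100.
function keywordDensity(text: string, keyword: string): number {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const occ = words.filter((w) => w === keyword.toLowerCase()).length; // OCC
  return (occ / words.length) * 100; // words.length is WC
}

// 10 occurrences of "pizza" in a 500-word page -> a density of 2%.
const page = "pizza ".repeat(10) + "filler ".repeat(490);
console.log(keywordDensity(page, "pizza")); // 2
```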

Internal Links

Internal links are any links which link (internally) to other pages on that website. A good internal linking structure is imperative for good SEO results. Linking back and forth with good link text can help Google determine what your website is about, and thus increase the likelihood of you being found for particular terms. Internal links also help to improve the overall number of page views your website receives, as the pages are well linked together.

Outbound Links

Simply put – outbound links are links which link to other websites, and are sometimes known as external links.

Trust Rank

Trust Rank is a link analysis technique, which many SEOs believe is present somewhere within Google’s ranking algorithm, that builds on research conducted (PDF link) at Yahoo and Stanford University for identifying spam.

Link Text or Anchor Text

Link Text or Anchor Text are the words which are underlined when a link is created. For example Web Design Ireland – this will aid Google in identifying what a site is about. SEO’s commonly use the link text of a link to increase relevance for keywords or phrases.

Spider

A spider, in the context of SEO, is an automated program which is used to collect and/or mine data. In order for Google to find out about your website, it has to spider (or crawl) the web, clicking on link after link until it finds your website. The more links to your site, and the more frequently you update your content, the more frequently you will get crawled. The information Google picks up from your webpage content is collated in Google’s databases and, after your ranking has been decided, shown in the SERPs. A search engine spider is also sometimes known as a robot.

Robots.txt

A robots.txt file contains instructions for spiders or robots on which pages they are allowed to index. If you wish to disallow access to certain parts of your website, and not get listed in Google for those pages, you need to have a robots.txt file in place. This is a good resource on the usage of a robots.txt file.
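
As a quick illustration of the file’s role, here’s a sketch of mine (assuming Node 18+ for the Fetch API; the domain is a placeholder) that pulls out the paths a site asks robots not to crawl:

```typescript
// Fetch a site's robots.txt and list every Disallow rule it declares.
async function disallowRules(site: string): Promise<string[]> {
  const res = await fetch(new URL("/robots.txt", site));
  return (await res.text())
    .split("\n")
    .filter((line) => line.trim().toLowerCase().startsWith("disallow:"))
    .map((line) => line.slice(line.indexOf(":") + 1).trim());
}

disallowRules("https://example.com").then((rules) => console.log(rules));
```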

Backlink

A backlink is a link obtained from a third party website to a page on your website. The more of these that exist, the more traffic you will receive. In addition, multiple backlinks have the added advantage of boosting your pagerank and increasing your relevance in Google. Backlinks are sometimes referred to as inbounds.

301 Redirect

There are a variety of status headers that SEOs need to be aware of, but one of the most important is the 301 redirect. This allows search engines to determine when a page has moved from one part of a site to another, or if a site has migrated from a subdomain to a main domain – or indeed changed completely. You can check the headers a particular webpage returns, but if you are migrating, ask an expert.
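
Checking the header yourself is straightforward; here’s a hedged sketch (Node 18+, placeholder URL) that verifies a moved page answers with a 301 and shows where it points:

```typescript
// Request a URL without following redirects, and report the Location header.
async function checkRedirect(url: string): Promise<void> {
  const res = await fetch(url, { redirect: "manual" });
  if (res.status === 301) {
    console.log(`301 in place -> ${res.headers.get("location")}`);
  } else {
    console.log(`Expected a 301, got ${res.status}`);
  }
}

checkRedirect("https://example.com/old-url");
```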

Bad Neighbourhood

Bad neighbourhoods are areas of the web that have been penalised by Google in the past, having engaged in dubious linking practices or cloaking. Gaining an inbound link from, or more importantly adding outbound links to, bad neighbourhoods can hamper your SEO efforts at best, or at worst get you completely banned from Google.

Cloaking

Cloaking refers to the practice of sending human visitors one copy of a page, and the search engine spiders another. It is commonly performed by detecting the user agent of a browser or spider, and sending alternative content to Google. The perceived advantage of this is that keyword densities, link structures and other search engine ranking factors can be manipulated further without worrying about the readability of a page, or the navigation for humans. It is generally a serious faux pas to engage in cloaking of any kind, and it is well known to be a blackhat technique.

Deep Linking

Deep linking refers to obtaining inbound links to content which is buried (deep) inside your site. Generally, the majority of links to your website will hit the homepage. It is better to achieve links to content off the home page, which will improve the pagerank distribution across multiple pages. These are known as deep links.

Black Hole

A black hole site is created when a large tier 1 authority site stops providing outbound links – or, if it does provide outbound links, they are made nofollow. If another source is needed, another page is provided on the site for the citation, and as a result all inbound link juice is retained within the black hole. A great example of this would be Wikipedia, which tends to dominate the SERPs for some keywords. A few newspaper sites have started to create black holes to try and retain their link juice.

Rank

Rank is used to describe whereabouts a particular site appears within the SERPs for particular keywords. Sometimes you will hear SEO professionals talking about outranking someone else. This simply means that they have overtaken them in the SERPs.

SERPS

SERP stands for search engine results page. It is the first page you see after you hit search on any major search engine, and lists the results for your particular search query.

Google Sandbox

The Google Sandbox is conceptually a place where new domains sit for a while once they are launched, with an algorithmically lowered pagerank. No one knows if the sandbox exists or not, and many dispute its existence. Matt Cutts has stated in an interview, however, that “there are some things in the algorithm that may be perceived as a sandbox that doesn’t apply to all industries”.

Domain Age

Domain age refers to how long a particular domain has been registered for. It is thought to be a ranking factor inside the Google algorithm, with older domains thought to be more likely to be relevant than new ones. Again, this is something that is fraught with hearsay.

User Agent

The user agent is the client application identifier that is used to access a web page. For example, if you access a web page using Internet Explorer, the user agent will contain a string that pertains to it – e.g. MSIE 9.0 beta. If a search engine spider is browsing a web page, it will likely identify itself as a bot of some sort; Google, for example, identifies its search engine spider as Googlebot. This is used by web analytics software such as Google Analytics.
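
As a rough illustration (my own snippet – the patterns are illustrative, not an exhaustive bot list), server-side code might classify a user agent like so:

```typescript
// Crude user agent classification, as an analytics package might do it.
function classifyUserAgent(ua: string): "bot" | "browser" {
  return /Googlebot|bingbot|Slurp/i.test(ua) ? "bot" : "browser";
}

const googlebotUA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
console.log(classifyUserAgent(googlebotUA)); // "bot"
```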

Bait and Switch

The practice of attracting links for a term, then, once ranking has been achieved for it, switching the content on the page. May be used for a landing page for a service or product.

Duplicate Content

Webpage content that is an exact duplicate of another piece of content elsewhere on the web. Duplicate content doesn’t rank as well as unique, original content, for obvious reasons. In other words – stop copying and pasting articles, and start writing your own!

MFA (Made for Adsense)

Made for Adsense sites (or MFA sites) are created with the sole purpose of monetising the site using advertising programs – typically Google Adsense. They commonly scrape content (often ignoring copyright licenses) and rely on Google rankings to make money.

Crawl rate

The speed at which a search engine robot returns to visit your website. Frequently changing content encourages robots to return to your website to check for other content. Googlebot crawls the web in pagerank order (descending), so the higher your pagerank, the more frequently you are crawled.