Thoughts on building a quality website.

July 21, 2011

With the Panda update hitting many websites earlier this year, Google have been forced to be much more vocal about what constitutes a quality website. Back in May, they published these guidelines, which helped give us a glimpse into the thought processes at the Googleplex, and how quality is determined.

I’ve decided to share my thinking on each of the points mentioned in the original Google post. Some ideas are left-field, others based on information already in the public domain. Take much of this with a pinch of salt, as it is of course only speculative thinking, but ultimately Google are, and always have been, an algorithmic company. As a result, anything they do to improve their results has to scale across millions of web pages.

Have a read of the original, and come back here for my thoughts.

Would you trust the information presented in this article?

How can Google determine trust? Links of course. The more links, the more trust, and this has always been the backbone of Google’s algorithm.
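
At its simplest, trust via links is the PageRank idea. Here's a toy sketch of that calculation; the link graph is invented for illustration, and real PageRank has many refinements this ignores:

```python
# Toy PageRank: rank flows along links, so well-linked pages score higher.
def pagerank(graph, damping=0.85, iterations=50):
    """graph maps each page to the list of pages it links to."""
    pages = list(graph)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outlinks in graph.items():
            for target in outlinks:
                # Each page shares its rank equally among its outlinks.
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

links = {
    "home": ["article", "about"],
    "article": ["home"],
    "about": ["home"],
}
print(pagerank(links))  # "home", with two inbound links, scores highest
```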


Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?

Is the article placed on a website which has written on this topic several times before? Is it placed in a category related to this topic, and has the author received links to pages on this topic previously? ‘Shallow’ may be a reference to those articles which chase a topic that has become popular recently, yet have no authority on it. Think hot ‘Google Trends‘ type queries.


Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?

Self-explanatory really. Don’t use article spinning software that chases keyword variations and matches the content of another article too closely.
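
Near-duplicate detection of this kind is well understood. A minimal sketch using word shingles and Jaccard similarity; the shingle size and example texts are purely illustrative, not Google's actual approach:

```python
# Near-duplicate check: overlapping word shingles, compared with Jaccard.
def shingles(text, size=3):
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "ten tips for writing quality articles that readers love"
spun = "ten tips for creating quality articles that readers love"
print(jaccard(original, spun))  # high overlap suggests a spun variant
```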


Would you be comfortable giving your credit card information to this site?

This may be a hint that visuals make a difference. For e-commerce sites, what is the bounce rate / time on site like? Do users come, puke and leave? This will vary across verticals though, so do take that into consideration.


Does this article have spelling, stylistic, or factual errors?

Make sure you raise the bar here by checking spelling and grammar of your articles. I’m a big fan of this plugin, which helps to counter this for publishers using WordPress. It uses parts of the open source Open Office spell check and grammar engine, and works really well to improve that part of the article writing process.
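
At its crudest, a spelling pass is just a lookup against a word list; the engine behind that plugin is far more sophisticated, but a toy sketch (with a deliberately tiny dictionary) shows the principle:

```python
# Flag words that don't appear in a reference word list.
KNOWN_WORDS = {"make", "sure", "you", "check", "the", "spelling", "of", "your", "articles"}

def misspellings(text):
    return [w for w in text.lower().split() if w.isalpha() and w not in KNOWN_WORDS]

print(misspellings("Make sure you chekc the speling of your articles"))
# ['chekc', 'speling']
```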


Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?

Automation is a no-no, and using content that is generated auto-magically, or chasing popular keywords of the day, is not in the best interests of a visitor. Have a think about the low-quality copy-paste content that may be putting you at risk.


Does the article provide original content or information, original reporting, original research, or original analysis?

Being the only one on the web writing about a particular topic separates you from the crowd anyway. Ensuring that you aren’t copy-pasting or syndicating from others will ensure a healthy flow of traffic from Google.


Does the page provide substantial value when compared to other pages in search results?

If you think about a particular keyword phrase or term, and work out what you are trying to rank for, then it makes sense to examine the competition. Google can easily take the top ten results for a particular query, build a quality score for them, and compare that against your site. So improving the depth and quality of your content helps you rank.
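
A hypothetical sketch of that comparison. The scoring function is my own crude stand-in (length weighted by vocabulary richness); nobody outside Google knows what their real quality score looks like:

```python
# Compare a page against the average "quality" of the current top ten.
def quality_score(text):
    words = text.lower().split()
    if not words:
        return 0.0
    vocabulary_richness = len(set(words)) / len(words)
    return len(words) * vocabulary_richness  # depth proxy, nothing more

def beats_top_ten(my_page, top_ten_pages):
    baseline = sum(quality_score(p) for p in top_ten_pages) / len(top_ten_pages)
    return quality_score(my_page) > baseline
```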


How much quality control is done on content?

This comes back to the grammar, spelling and stylistic points above, but Google may also be able to see that errors are corrected over time, and that your points are made succinctly.


Does the article describe both sides of a story?

Impartial reporting – not sure how they manage to determine this algorithmically. Answers on a postcard. One thing to consider is linking out to two opposing views within the article.


Is the site a recognized authority on its topic?

Has the site received links to pages on this topic previously? Backlink text analysis showing repeated use of the topic in question would help immensely here. It also shows why you need a niche.
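
A sketch of what that backlink text analysis might look like; the anchor list and topic terms are invented for illustration:

```python
# What share of inbound anchor text repeats the site's topic terms?
from collections import Counter

backlink_anchors = [
    "best coffee grinders", "coffee grinder reviews",
    "click here", "coffee grinding guide",
]
topic_terms = {"coffee", "grinder", "grinders", "grinding"}

counts = Counter(word for anchor in backlink_anchors for word in anchor.split())
topical = sum(n for word, n in counts.items() if word in topic_terms)
print(topical / sum(counts.values()))  # higher share = stronger topical signal
```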


Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?

Battery hen content shouldn’t rank in Google, so the speed of output from a site may be a consideration, and the depth of the articles it publishes will reflect whether it has been designed with quality in mind. Sometimes it makes more sense to sit back, reflect and create truly link-worthy content that adds analysis to something, than to race to be first.


Was the article edited well, or does it appear sloppy or hastily produced?

Third time that this was mentioned in the original article, so you can pretty much guarantee that Google are looking at the small print when it comes to your content style, grammar and context.


For a health related query, would you trust information from this site?

A lot of people picked up on this (slightly strange) guideline. Personally I think it is again referring to how comfortable the site makes you feel as a user, and is akin to the credit card point (4) above.


Would you recognize this site as an authoritative source when mentioned by name?

Does the site receive much brand traffic? In other words do people return to it by typing its name into Google, or link to it using the brand name that it uses?


Does this article provide a complete or comprehensive description of the topic?

How much ‘learning’ content are you providing on the topic area? Would it be useful to someone coming to the page for the first time without prior knowledge of the subject area? Would they leave feeling they not only found what they came for, but learned something about the topic itself?


Does this article contain insightful analysis or interesting information that is beyond obvious?

Do Google want us to add opinion to pieces, to add value? This could be determined easily enough. Using words such as ‘In my opinion’ or ‘perhaps’ would be indicative of an opinion in text. Interesting information could be subjective, but again capable of being determined algorithmically by taking a baseline on the topic, and seeing whether it is exceeded by the article in question.
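
A sketch of how trivially those opinion markers could be counted; the marker list is my own guess:

```python
# Count tell-tale opinion phrases as a crude "analysis present" signal.
OPINION_MARKERS = ("in my opinion", "perhaps", "i think", "i believe", "arguably")

def opinion_signals(text):
    lowered = text.lower()
    return {m: lowered.count(m) for m in OPINION_MARKERS if m in lowered}

print(opinion_signals("Perhaps the update rewards depth. In my opinion, it does."))
# {'in my opinion': 1, 'perhaps': 1}
```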


Is this the sort of page you’d want to bookmark, share with a friend, or recommend?

Social metrics coming into play here. Have a read of this article if you haven’t already done so. Have a glance at the Postrank section, and you’ll get a feel for the sort of metrics Google are taking notice of. Delicious bookmarks are being counted, as are other social platforms and the activity found on them.


Does this article have an excessive amount of ads that distract from or interfere with the main content?

Again, from a design perspective, follow the rules of quality sites in terms of where their advertising blocks are placed. Hint: it’s not slap bang in the middle of an article; it’s generally around the edges of a site, making advertising an optional part of the site experience, not a necessary one. This guideline does, however, sit awkwardly with Google’s own internal quest to generate revenue. The more clicks they get on ad units the better, surely? Bizarre.


Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?

Quality content is generally substantial content, so content length must be a factor in determining where to place articles in particular query spaces.
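
At its bluntest, that is just a word count. A sketch; the 250-word threshold is a guess for illustration, and no official cutoff has been published:

```python
# Flag pages that are too short to say anything substantial.
def is_thin(text, minimum_words=250):
    return len(text.split()) < minimum_words

print(is_thin("A short, unsubstantial stub."))  # True
```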


Are the pages produced with great care and attention to detail vs. less attention to detail?

This is the fourth reference to style, spelling and grammar. It would obviously help weed out articles which have been mass-produced at low cost, and detect non-native speakers, as there will likely be more grammar errors. Article spinning software also does a pretty poor job of grammar checking, so you can imagine this type of automation will be hit.


Would users complain when they see pages from this site?

Google have already mentioned that the ‘Hide this result‘ button on their main SERPs has been used to help their algorithm learn what to show and what not to.

What other factors could determine user experience or quality?

Let’s imagine for a second that you are examining a website, trying to determine quality. Considering that everything has to be algorithmic, attention to detail when a website is being built would be one potential quality factor, at a site level rather than a page level. Have a think about how many of these Google would be able to test for.

What things improve the experience for a visitor?

Mobile version – A mobile version would be trivial for Google to detect. With the mobile bot fast at work indexing the mobile web, comparing that to the contents fetched by regular Googlebot would be a walk in the park. Catering for mobile browsers could be a sign of a quality site.
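
A sketch of that detection: fetch the same URL with a desktop and a mobile user agent and see whether the responses differ. The UA strings are just examples:

```python
# Does the site serve different content to mobile browsers?
import urllib.request

DESKTOP_UA = "Mozilla/5.0 (Windows NT 6.1)"
MOBILE_UA = "Mozilla/5.0 (iPhone; CPU iPhone OS 4_3 like Mac OS X)"

def fetch(url, user_agent):
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request) as response:
        return response.read()

def serves_mobile_version(url):
    return fetch(url, DESKTOP_UA) != fetch(url, MOBILE_UA)
```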


Sitemap / Navigation – Clear navigational structures improve the experience for visitors navigating between pages. Your site should never have black holes, for example, where you ‘herd’ all your traffic into one final page. Googlebot will be able to determine how clear your site navigation is simply by following the links on your pages. An HTML sitemap shows a clear path through your site for bots, and it follows that a score for how quickly Google can index your site would be linked to how easy it is for users to navigate.
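
A sketch of measuring that: a breadth-first walk over the internal link graph, scoring each page by how many clicks it sits from the homepage. The graph here is hand-made; a real check would parse the links out of fetched HTML:

```python
# How many clicks is each page from the homepage?
from collections import deque

def click_depths(graph, start="/"):
    depths, queue = {start: 0}, deque([start])
    while queue:
        page = queue.popleft()
        for link in graph.get(page, []):
            if link not in depths:
                depths[link] = depths[page] + 1
                queue.append(link)
    return depths

site = {
    "/": ["/category", "/sitemap"],
    "/category": ["/article"],
    "/sitemap": ["/article", "/category"],
}
print(click_depths(site))  # deep, hard-to-reach pages are a navigation smell
```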


Links – Links are the backbone of the web, but commercial sites typically tend not to give them liberally. I’m a big believer in linking out. Additional information you can’t provide yourself is good for users, and separates you from the crowd. Particularly in the e-commerce space, I could count on the fingers of one hand the retailers who recognise the importance of providing additional product information in the form of links. They are a massive opportunity to improve rankings.


Breadcrumbs – Related partially to navigation, breadcrumbs give a competitive advantage in that when Google uses them in a SERP you gain additional real estate for people to click on. They also allow users to navigate easily up and down your site structure, improving navigation and in turn user experience.


Speed – Google have placed significant focus on speed, pretty much since day one, when they realised it was one of their killer features. The cynical amongst you will observe that Google lose money on information retrieval (i.e. the smaller and faster your site is, the better for their storage capacity), but I think that’s a side issue. Speed is an important part of the user experience overall.
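
The basics are easy to measure. A sketch timing a simple fetch; real measurement of render time and resource weight is far richer, and Google's own Page Speed tooling covers it:

```python
# Rough response time for a single page fetch.
import time
import urllib.request

def fetch_seconds(url):
    start = time.time()
    urllib.request.urlopen(url).read()
    return time.time() - start

# print(fetch_seconds("http://example.com/"))  # uncomment to try it live
```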


Attention to Detail – Meta descriptions different from the page content. Titles present. Page well marked up. Tidy code. All things which separately couldn’t provide any insight into site quality, but combined begin to form a picture of the site owner’s care for their visitors.
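
A sketch of a few such hygiene checks (title present, meta description present and not simply copied from the body), using only the standard library:

```python
# Hygiene checks: title present, meta description present and distinct.
from html.parser import HTMLParser

class HygieneCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title, self.description, self.body = "", "", []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        if tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.body.append(data)

def hygiene_report(html):
    checker = HygieneCheck()
    checker.feed(html)
    body_text = " ".join(checker.body)
    return {
        "has_title": bool(checker.title.strip()),
        "has_description": bool(checker.description.strip()),
        "description_differs": bool(checker.description.strip())
        and checker.description.strip() not in body_text,
    }

page = ("<html><head><title>Guide</title>"
        "<meta name='description' content='A short guide.'></head>"
        "<body>The full article text lives here.</body></html>")
print(hygiene_report(page))
```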

What aspects of a site do you suppose Google are potentially using to determine quality?

Filed in: Website Promotion

About the Author

Paul is a regular 30-year-old web bloke / programmer with a penchant for online marketing. This blog is a personal outlet, with an eclectic mix of articles.

Comments (5)


  1. JEREMY says:

    Nice article. With regards to a mobile version of your website, should it be the same website, just displaying to suit a mobile device, or should you have a completely separate website? e.g. http://m.webdistortion.com

    I notice The Telegraph go with the first option while the BBC (http://www.bbc.co.uk/mobile) and Wikipedia (http://mobile.wikipedia.org) do the latter. Any thoughts?

    I recently came across this – http://www.wolf-howl.com/seo/dangers-multiple-website/

  2. lubos says:

    Nice article, Paul. My little comment is that I wish Google added some ranking methodology directly into the browsing experience. Instead of simply guessing which sites are valuable or not based on some metrics, users should be able to submit feedback. Google already does this partially with its +1 button, but I really wish there was also the opposite -1 option to flag spammy sites. It seems to be the rule now that regardless of what topic you search for, one of the top results will be some site containing only super short, unsubstantial snippets surrounded by a plethora of ads. But I guess as long as Google also keeps selling those ads, it’s probably not in their best interest to penalize sites like those, since they likely drive much ad revenue…

  3. Paul Anthony says:

    @Jeremy – yeah, totally agree with that concept. Not keen on the subdomain issue at all. The two things shouldn’t be separate, especially considering the link equity built up on a primary domain.

    For sites such as Wikipedia and the BBC with established brand traffic, it’s not so much of an issue. The BBC has its mobile content in a different subdirectory, with a mobile sitemap as well.

    @lubos – take a glance at the paragraph marked ‘Hide this result’ and follow the link- think that’s what you are keen to see?

  4. JEREMY says:

    Should the full website redirect to the mobile website if you visit on a mobile device?

  5. Paul Anthony says:

    @Jeremy – yeah, typically that is an accepted way to do it, based on the UserAgent. Might do an article on mobile SEO just for the purposes of clearing this up.
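
    A minimal sketch of that kind of redirect, written as a plain WSGI app; the substring list is illustrative, and real user-agent detection is fuzzier:

```python
# Redirect mobile browsers to the mobile site based on the User-Agent header.
MOBILE_HINTS = ("iphone", "android", "blackberry", "opera mini")

def app(environ, start_response):
    user_agent = environ.get("HTTP_USER_AGENT", "").lower()
    if any(hint in user_agent for hint in MOBILE_HINTS):
        # m.example.com stands in for your mobile hostname.
        location = "http://m.example.com" + environ.get("PATH_INFO", "/")
        start_response("302 Found", [("Location", location)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"desktop site"]
```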
