Web Site Architecture Factors

URL Structure – the golden rule when it comes to URLs is to keep ‘em short, pretty and include the page-level keyword. Never build on a platform that doesn’t allow URL rewriting: ugly URLs aren’t just ugly, they are also a mistake SEO-wise. Short, sexy and search-friendly URLs make it easy for a user to share a page with their social networks or link to it – not to mention how much easier a logical URL structure makes website management!
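If your platform supports Apache’s mod_rewrite, turning an ugly dynamic URL into a short, keyword-rich one can take just a few lines of .htaccess. A minimal sketch, assuming a hypothetical product.php?slug= scheme:

    # map the pretty URL /products/blue-widget to the real script
    RewriteEngine On
    RewriteRule ^products/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]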

Website structure accessibility – inaccessible navigation is a real headache when it comes to SEO. A navigation wrapped in JavaScript is bad and a menu built in Flash is worse. Now I bet you are thinking “JavaScript makes a website more user friendly because it creates things like drop-down menus, helping the user make better sense of the page options.” That might be true, but we need to balance usability with search engine friendliness. Firstly, we shouldn’t forget that a slick-looking menu/navigation bar could render a website unusable on certain devices and in certain browsers (try switching off JavaScript or Flash), but from a strictly SEO perspective it could mean that pages deep within your site aren’t being indexed because the only links to them are in a menu wrapped in code that the spiders can’t decipher.
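The safest pattern is to start with plain HTML links and layer the fancy behaviour on top. A small sketch, with hypothetical page names, of a menu spiders can crawl whether or not JavaScript loads:

    <nav>
      <ul id="main-menu">
        <li><a href="/services/">Services</a></li>
        <li><a href="/portfolio/">Portfolio</a></li>
        <li><a href="/contact/">Contact</a></li>
      </ul>
    </nav>
    <!-- JavaScript can enhance #main-menu into a drop-down;
         the links above still work (and get crawled) without it -->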

Considered use of JavaScript – following on from the point above… whatever Google says, there is clear evidence that the search engine struggles to handle JavaScript. Reams and reams of unreadable code could mean Googlebot heads somewhere else rather than crawling any deeper into your site. It might also cause other issues like crawl errors and damage your website’s crawl rate, neither of which are good things!
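To make that concrete, here is the difference between a link that exists only inside script and one a spider can follow (the /products/ URL is just an illustration):

    <!-- risky: the destination is hidden inside JavaScript -->
    <span onclick="window.location.href='/products/'">Products</span>

    <!-- safe: a plain, crawlable anchor -->
    <a href="/products/">Products</a>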

Canonical URLs – use the rel="canonical" attribute to specify your preferred URL for a page. This is useful in situations where almost identical pages appear at different URLs because of something like a category choice or a session ID being appended. It is important to tell Google and Bing which page is the one they should index and pass all relevant link juice and authority to. Failure to implement canonical URLs can mean duplicate content issues but, more crucially, a loss of rankings as search engines divide link juice and page authority between the copies of the page – something that could have been avoided if the correct page had been stated in the rel=canonical tag.
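As a quick illustration (example.com and the /widgets/ URL are hypothetical), every variant of the page carries the same tag in its <head>:

    <!-- on /widgets/?sessionid=123, /widgets/?sort=price, etc. -->
    <link rel="canonical" href="http://www.example.com/widgets/" />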

Unique meta titles and descriptions – to many, on-page optimization is just about changing a meta title here or there… hopefully this list will show you otherwise. Whilst making meta title and description changes might feel like SEO from 1997, in my experience it is still a part of the bigger on-page optimization jigsaw. In my mind, it is quite a simple step in the on-page optimization process: a unique title and description for every page, front-loading page-level keywords in a natural, non-spammy way. There are of course other meta tags you can include, e.g. ‘keywords’, and whilst I am sure some people will disagree with me on this, I only see value in optimizing the titles and descriptions; tags like the keywords meta have been abused to the point of being almost a complete waste of time. Google might not always use the title and description you give a page, but at least you’ve told the search engines what the page is about, and if Google does decide to use them, you have some influence over encouraging a user to choose your website over the other options in the SERPs.
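For instance (the page and keyword here are invented), a unique, front-loaded title and description might look like this:

    <title>Blue Widgets for Small Gardens | Example Widget Co</title>
    <meta name="description" content="Hand-made blue widgets sized for small gardens, with free delivery and a 10-year guarantee.">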

Robots.txt file – a good starting point for robots.txt best practice is this guide from SEOmoz. It is always worthwhile ensuring a robots.txt file doesn’t contain any unwanted directives for the search bots; even if you haven’t added anything yourself, someone or something working on the site before you might have.
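A typical minimal file (the /admin/ path and sitemap location are hypothetical) lets every bot crawl everything except a private area and points them at the sitemap:

    User-agent: *
    Disallow: /admin/
    Sitemap: http://www.example.com/sitemap.xml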

XML Sitemap – fairly common practice nowadays but still worth a mention. An XML sitemap should always be available. It helps make the search engines aware of all the pages on your website and increases the likelihood of faster inclusion in the index for newer pages.
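A bare-bones sitemap, again with made-up URLs, is just a list of <url> entries in the standard sitemaps.org format:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/widgets/</loc>
      </url>
    </urlset>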

Website speed – I’m sure this issue is right at the fore of your mind when it comes to building websites because it is a really hot topic right now. Google recently enabled all webmasters to monitor page loading speed directly from their Google Analytics dashboard; if they’ve made it that easy for you, you can bet they are using this data as part of their calculation of where to rank your website. Google loves to improve user experience, and since a fast-loading page is definitely a better user experience, I can see this playing an increasing role in the SEO of the future, particularly in competitive markets. Also, Amazon.com conducted a study and found that for every 100 milliseconds added to page load time, their sales decreased by 1%, so the reasons for improving page speed go way beyond just SEO! There are multiple ways to improve site speed, so I won’t go through them all here; all I will say is code responsibly, choose a good host and set up a CDN (content delivery network) if your client is targeting users worldwide.
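Two of the quickest wins, if you happen to be on an Apache host with mod_deflate and mod_expires available (an assumption, so check with your host), are compression and browser caching via .htaccess:

    <IfModule mod_deflate.c>
      # compress text-based responses before sending them
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>
    <IfModule mod_expires.c>
      # let browsers cache images for a month instead of re-downloading
      ExpiresActive On
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 month"
    </IfModule>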
