Search Engine Optimization (SEO)
Ten SEO Mistakes Made on Database Driven Websites
"Search engine friendly" is one of those often-heard phrases, both from web development companies and from their clients. Everyone knows it is important to have, and yet it is one of the things most often overlooked in practice.
Search engine optimisation companies spend a lot of their time analysing websites and removing barriers that stop the search engines ranking a site highly. At the web development level, it is possible to build a site that is perfectly search engine friendly from the start. One of the hardest types of site to get right, though, is the database driven website. Listed below are ten of the most common issues that are created, often unknowingly, during the development of a dynamically generated site.
1. Pages with duplicate content - not enough distinct areas within the pages, so that only small portions change from page to page. It is essential that enough of the page text changes for the search engines to see an appreciable difference between one page and the next.
2. Pages with duplicate page titles - the page title is a great indicator to the search engines of the primary content of the page. Whilst this is often unique on sites such as e-commerce websites, it is often overlooked in other sites, particularly where small areas of the site are generated from a database, such as news pages.
3. Pages with duplicate meta descriptions - again, it is easy to overlook this and set a single global or category-level meta description. Duplicate descriptions give the search engines too little information to distinguish your pages, so creating a unique meta description for every page is an essential SEO task.
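Points 2 and 3 can be sketched in code. Below is a minimal, hypothetical example of generating a unique title and meta description from each database record rather than setting one global value; the field names (name, category, summary) and the site name are assumptions for illustration only.

```python
# Hypothetical sketch: build a unique <title> and meta description per
# page from database fields. Field names are illustrative assumptions.

def build_page_meta(product: dict) -> dict:
    """Return a unique title and meta description for one product record."""
    title = f"{product['name']} - {product['category']} | Example Widgets Ltd"
    # Trim the description to a sensible length for search result snippets.
    description = f"{product['name']}: {product['summary']}"[:155]
    return {"title": title, "description": description}

page = build_page_meta({
    "name": "Navy Blue Widget",
    "category": "Blue Widgets",
    "summary": "A hard-wearing navy widget for indoor and outdoor use.",
})
print(page["title"])
```

Even a simple template like this guarantees every generated page carries its own title and description, which is the point being made above.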
4. Using auto-generation of pages as a shortcut instead of creating good content. This is linked quite closely to point 1, where it is possible to create pages that have only a tiny percentage difference between them. Databases are fantastic ways of storing information, but you still need to put the work in to fill them with content. Unique information about the subject of the page will immensely help both the long tail and the ability of the search engines to determine that a page is valuable.
5. Creating pages that are hidden behind form submissions or javascript postbacks that cannot be accessed by a search engine crawler. This is far more common than is generally realised. For instance, .NET creates postback links by default instead of plain anchor links - potentially making huge sections of a site unreachable. Likewise, it is easy to hide lovely content-rich areas of your site behind a drop-down selector in a form, meaning certain areas of the site are never visible to the search engines.
6. Too many query strings - this is a common bugbear of the professional SEO, where complicated database selections create deep levels of pages, but with seven or eight &id= type parameters. Additionally, bad development practice can leave null query strings that appear in every URL but don't do anything. The answer to this is generally URL rewriting, creating much more search engine friendly and user-friendly URLs!
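The kind of mapping a URL rewrite performs can be sketched as follows. This is a hypothetical illustration: the parameter names (cat, id), the lookup tables and the slugs are all invented for the example, and a real site would do this in the web server or framework routing layer.

```python
# Hypothetical sketch of a URL rewrite: turning a query-string-heavy URL
# into a readable, keyword-bearing path. All names here are illustrative.
from urllib.parse import urlparse, parse_qs

CATEGORIES = {"7": "blue-widgets"}       # category id -> category slug
PRODUCTS = {"42": "navy-blue-widget"}    # product id -> product slug

def rewrite_url(ugly_url: str) -> str:
    """Map /product.php?cat=7&id=42 to /blue-widgets/navy-blue-widget/."""
    params = parse_qs(urlparse(ugly_url).query)
    category = CATEGORIES[params["cat"][0]]
    product = PRODUCTS[params["id"][0]]
    return f"/{category}/{product}/"

print(rewrite_url("/product.php?cat=7&id=42"))
# -> /blue-widgets/navy-blue-widget/
```

The rewritten path is shorter, carries no redundant parameters, and puts real words in front of both users and crawlers.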
7. Putting query strings in different orders when the same page is linked from different places - this can create duplicate content issues, which can seriously harm rankings.
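One simple defence, sketched below, is to generate every internal link through a single helper that always emits parameters in one canonical order, so ?id=42&cat=7 and ?cat=7&id=42 can never both appear on the site. This is an illustrative sketch, not a prescribed implementation.

```python
# Hypothetical sketch: emit query parameters in a fixed, sorted order so
# the same page always has exactly one URL.
from urllib.parse import urlencode

def canonical_query(params: dict) -> str:
    """Return a query string with keys in sorted order."""
    return urlencode(sorted(params.items()))

a = canonical_query({"id": "42", "cat": "7"})
b = canonical_query({"cat": "7", "id": "42"})
print(a, a == b)  # cat=7&id=42 True
```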
8. Not using user language to generate automated pages - if you are going to create a database driven website that uses words in the query strings (or, better, in rewritten URLs), make sure you use words that will help your SEO. If you sell widgets, make sure the word widgets appears somewhere in the URL instead of just product= or id=; keyword research can assist with this.
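Generating those keyword-bearing URLs usually means turning a product or category name into a "slug". A minimal sketch of such a function, assuming names are plain English text:

```python
# Hypothetical sketch: build a keyword-rich URL slug from a product name,
# so the words users search for actually appear in the URL.
import re

def slugify(name: str) -> str:
    """Turn 'Navy Blue Widgets!' into 'navy-blue-widgets'."""
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower())
    return slug.strip("-")

print(slugify("Navy Blue Widgets!"))  # navy-blue-widgets
```

A URL like /widgets/navy-blue-widgets/ tells both the visitor and the search engine what the page is about before it even loads.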
9. Not allowing the meta data and title to be edited easily after the site is built. It is possible to hardcode the generation of meta information in a way that cannot be edited later. Building an editing mechanism from the start helps everyone at a later stage when the information needs changing, without shoehorning it into an already developed structure.
10. Creating keyword stuffed pages by using auto-generation. Once upon a time, search engines quite liked pages with high densities of your keywords, but now these are likely to get you marked down rather than up. So be aware when creating pages that long pages listing lots of your products can create too high a density. For instance, listing blue widgets, light blue widgets, navy blue widgets and sky blue widgets is going to create a page with a very high density for the phrase "blue widgets".
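As a rough illustration of the density problem described above, the sketch below counts what fraction of a page's words are taken up by one phrase. The function and any threshold you might apply to it are illustrative assumptions; search engines publish no exact limit.

```python
# Hypothetical sketch: a rough keyword-density check for auto-generated
# pages. No official density threshold exists; this only illustrates why
# a list of near-identical product names scores so high.

def phrase_density(text: str, phrase: str) -> float:
    """Fraction of the page's words taken up by occurrences of the phrase."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return hits * len(phrase_words) / len(words) if words else 0.0

page = "blue widgets light blue widgets navy blue widgets sky blue widgets"
print(round(phrase_density(page, "blue widgets"), 2))  # 0.73
```

On the example page from the point above, nearly three quarters of the words belong to the phrase "blue widgets" - exactly the kind of page that now gets marked down.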
These are just ten of the most common optimisation pitfalls when creating dynamic websites. There are many more facets to producing a great database driven site, including user friendliness, speed, performance and security, but they all add together to make the best solution for your needs.
About the Author: Mark Stubbs is a freelance writer who specialises in internet marketing and web site development. For more information on database driven websites he suggests that you visit www.obs-group.co.uk.
Source: Entireweb Newsletter * July 29, 2008 * Issue #461