Tag Archive: Search Engine Optimization


Tables used to be the “in” thing when it came to designing web sites. In simple terms, tableless web design is a technique whereby page layout is achieved without the use of HTML tables. Instead, text and other elements on a page are arranged using CSS (Cascading Style Sheets). CSS is the brainchild of the W3C (World Wide Web Consortium) and was designed to improve web accessibility and to return HTML to semantic purposes rather than presentational ones.
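As a rough sketch of the idea, the markup below puts a two-column layout together with CSS floats instead of a table. The element ids and widths are illustrative only, not taken from any particular site:

```html
<!-- Structure lives in the HTML; presentation lives in the CSS. -->
<style>
  #sidebar { float: left; width: 25%; }
  #content { margin-left: 27%; }
</style>
<div id="sidebar">Navigation links</div>
<div id="content">Main page copy</div>
```

Because the same visual result once required nested `<table>`, `<tr>`, and `<td>` tags, the CSS version leaves the HTML much closer to the meaning of the content.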

Continue reading

The one cardinal rule that all search engines insist on is that your Web site be designed primarily for humans, not for search engines. The easiest way to do this is to bring your site into W3C (World Wide Web Consortium) compliance. The W3C is the standards body that defines development standards for Web technologies, and making your site W3C-compliant will give you a boost in the rankings with at least one of the search engines. There are many validation tools you can use to find out how compliant (or non-compliant) your Web site is. A basic validator for individual Web pages is available at http://validator.w3.org.
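The validator at http://validator.w3.org is the authoritative check, but as a rough first pass you can scan a page locally for common accessibility slips such as images with no `alt` text. The sketch below uses only Python's standard library and is an assumption-laden pre-check, not a substitute for the W3C validator:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Rough local pre-check: count <img> tags missing an alt attribute.
    This catches only one common validation error; the W3C validator
    checks the full markup specification."""
    def __init__(self):
        super().__init__()
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt += 1

checker = AltTextChecker()
checker.feed('<p><img src="logo.gif"><img src="photo.jpg" alt="Our office"></p>')
print(checker.missing_alt)  # the first image lacks alt text
```

A script like this can be pointed at every template on a site before the pages ever reach the online validator.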

Bringing your site into W3C compliance is hard. The upside, though, is that when compliance is combined with CSS (Cascading Style Sheets), you get a much cleaner and better-performing site. In addition, from an SEO perspective, the actual content moves higher up in the page code and is therefore treated as more valuable by the search engines.

From a pure SEO perspective, compliance is not the holy grail for climbing up the SERPs. The objective behind becoming compliant is to ensure that your copy is marked up so that it is “clear” to search engines. Achieving compliance also ensures better compatibility with mobile devices, with fewer chances of the code breaking in mobile browsers. In any case, the point is to minimize errors rather than to seek full compliance overnight.

Often, search engines do not crawl very deep into your site; you can remedy this by submitting your site and its deeper pages directly. Also, create a site map based on the XML Site Map standard, which is supported by the big three search engines and others. There are many free tools that can quickly generate an XML site map of your site. Place the site map XML file in the root folder of your Web site (often named “public_html” on shared hosts) and submit the URL of the XML file to the search engines. This way, the spider will crawl the entire site and index all the pages. One added benefit is that when your site comes up as the first result in the SERPs, it will automatically show internal links under the main result heading, making it easier for users to reach your internal pages quickly.
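If you would rather not rely on a third-party tool, a minimal site map can be produced with a few lines of Python. The sketch below follows the XML Site Map protocol published at sitemaps.org; the example.com URLs are placeholders for your own pages:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the Sitemap protocol (sitemaps.org).
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal <urlset> document listing each page URL."""
    ET.register_namespace("", NS)
    urlset = ET.Element("{%s}urlset" % NS)
    for page in urls:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/deep/page.html",  # deep pages the crawler may miss
])
print(sitemap_xml)
```

Write the resulting string to a file such as sitemap.xml in your Web root, then submit that file's URL to the search engines so the spider can reach every listed page.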