
Latent Semantic Indexing

In some of my previous posts, I discussed the factors that help get a site optimized to the highest rank on the Google search pages. Today we will discuss the ranking algorithms that Google uses to find relevant results for your search query, the latest of which is Latent Semantic Indexing, or LSI. Initially, search engines looked solely at the frequency of the search query text in web pages, but with LSI, the bots ascertain the true theme of a webpage and the particular data segment of the document being accessed.

Have you recently noticed a marked change in the relevance of your search results while Googling around?

That is thanks to the powerful Google LSI engine, which does all that work for you. Many of you may already have come across LSI and its mechanics, but for those who have not, let us take a look.

Simple indexing is a process whereby a keyword is searched across an entire page and the results are presented. The frequency of occurrence of a particular keyword further streamlines the results. Thus, ensuring proper syntax or a linear search policy is enough to produce the requested search results. However, the question that naturally arises is, “How relevant are they for your search context?”

Latent Semantic Indexing (LSI) technology adds semantics to the existing search mechanism, thereby improving the relevance of the search results. During the indexing process, the Google bot not only verifies keywords against the document being indexed, but also takes a holistic approach in determining which other documents can be linked with those keywords. Keywords may be semantically close or distant. LSI utilizes a knowledge base of related keywords and some intelligent computing to differentiate semantically related words, and accordingly improves its indexing. LSI thus allows a search engine to determine the overall information a page wants to project by looking for intelligent patterns in the content, making the page more available for a particular keyword.
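Under the hood, LSI-style systems are typically built on a singular value decomposition (SVD) of the term-document matrix. The toy sketch below (invented documents, and certainly not Google's actual implementation) shows the core idea: after projecting into a low-rank “concept” space, a query for “engine” matches a document that only ever says “motor”:

```python
import numpy as np

# Tiny made-up corpus: three car documents, two fruit documents.
docs = [
    "car engine repair",
    "car motor repair",          # no literal "engine" here
    "motor engine maintenance",
    "fresh fruit juice",
    "fruit juice recipes",
]
terms = sorted({w for d in docs for w in d.split()})

# Term-document matrix: rows = terms, columns = documents.
A = np.array([[d.split().count(t) for d in docs] for t in terms], float)

# Truncated SVD keeps only the k strongest latent "concepts".
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk = U[:, :k], s[:k]
doc_vecs = Vt[:k].T          # each row: one document in concept space

def fold_in(words):
    """Project a query into the same concept space."""
    q = np.array([words.count(t) for t in terms], float)
    return (q @ Uk) / sk

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

sims = [cos(fold_in(["engine"]), d) for d in doc_vecs]
# sims[1] is high although doc 1 never mentions "engine";
# sims[3] (a fruit document) is ~0.
```

Plain frequency matching would score doc 1 zero for “engine”; the latent space recovers the relationship because “engine” and “motor” co-occur with the same surrounding terms.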

You will be able to leverage the enormous semantic power that LSI has to offer for your organization’s website. Improving the lexical arrangement of keywords and their related tags for an object on your page can now yield noticeably better page ranks. Not only this, the future looks good, with a completely new Web 3.0 on the horizon and Google offering services that are ready to get you started right now.

Should Google Earth be censored?

Should Google Earth be censored? This question has been going around for quite a few years, and now, after the terrorist attacks in Mumbai last month, a petition has been submitted to the country’s high court by legal advocates to blur sensitive areas, such as the Bhabha Atomic Research Centre, in Google Earth.

According to the TIMES report, the petition says that Google Earth “aids terrorists in plotting attacks” and offers “absolutely no control to prevent misuse or limit access” to the services. The investigation into the Mumbai terror attacks and the interrogation of the only terrorist captured alive reveal that the terrorists used GPS and other high-tech tools to plot, execute and then monitor the attacks with ease.

This is not the first time Google Earth has been asked to blur satellite images of sensitive areas. In 2005, Australian officials asked Google to remove pictures of their only nuclear reactor, Lucas Heights, from Google Earth. In 2006, Bahraini officials barred Google Earth, and China banned websites that sold unapproved images. Some countries, like Holland, have entered into agreements to block or censor sensitive areas, especially their military bases. Some have even concealed sensitive military bases by putting them underground. According to a USA Today report, some countries, like India, can detect when a satellite passes overhead and schedule sensitive military activities accordingly, so that satellites do not capture images of those activities. Even in the USA, Google mappers and Google Street View photographers were banned from accessing the Pentagon and other military bases.

Google Earth has revolutionized the way the internet works today. However, one should also understand that gifts for some might be curses in disguise for others. When it comes to sensitive issues like national security, Google should co-operate with the respective authorities.

Websites as Web Services

As we step forward into the next generation of computing, the internet is experiencing a major revolution. The web is slowly morphing from Web 2.0 into a whole new Web 3.0. However, Web 3.0 still has a long way to go before it can be implemented for real.

“So what is the big deal?”

Web 2.0 revolutionized the way websites present their content to users. Structured layouts, the use of layers to present information, and the buzzwords of the day replaced flashy banners, sleazy elements, pop-ups and so on. The message was loud and clear: if you wanted to increase your website traffic and do business, your design had to be “user friendly”.

Today, Web 2.0 has achieved whatever it set out to achieve. Users now see terabytes of information laid out on a palette in a precise, structured and presentable manner. This, more or less, is your perspective of the internet today. However, what does this information convey to its machine counterparts?

Of course, we are talking about all the servers and end systems that collectively participate in delivering your requested information to your browser. With the increased focus on presentation, the information content has become more abstract for machines to process. A machine views a web page simply as information stacked between HTML tags, formatted in ways that hide the intrinsic meaning of the information. This makes it difficult for machines to further process and mine that information.

This is exactly the gap that Web 3.0 is being built to fill. However, it is not here to replace Web 2.0; it will add a parallel layer on top of it. The new layer will add semantics to the information transferred over the web. Machine intelligence can then be applied to process as well as mine this information, so that websites are no longer dull pages of text and graphics.

Welcome to the world of web services and the domain of the new Web 3.0, where smart, intelligent services revolutionize what simple websites have to offer. Websites will not only present information, but also expose methods that allow you full or partial access to the information repository of the service. This will help an organization offer its services to its B2B (Business to Business) and B2C (Business to Consumer) clientele on an open platform or framework. What is ultimately being aimed at here is structuring information for machines so that they can communicate among themselves and present information to you that is more customized and relevant. Unstructured information shall pave the way for more structured and relevant content, and a new open framework will slowly evolve for more intelligent computing.
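To make the idea concrete, compare a presentation-only snippet with the same facts exposed as machine-readable structured data (the product and figures here are invented for illustration):

```html
<!-- Web 2.0: the machine sees only text stacked between tags -->
<p><b>Acme Widget</b> - $19.99, in stock</p>

<!-- A Web 3.0-style parallel layer: the same facts, with semantics
     a machine can process and mine -->
<script type="application/json">
  { "type": "Product", "name": "Acme Widget",
    "price": 19.99, "currency": "USD", "inStock": true }
</script>
```

A human reads both the same way; only the second gives a machine fields it can query, compare and aggregate.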

More on this later. Till then, let us keep an eye on how things are likely to evolve!

Using Blogs for SEO: RSS and Internal Links

Many businesses wonder: what is the purpose of having a blog? Depending on your stance and marketing objective, blogs (short for web logs) are great open source tools (known as content management systems) that, if wielded properly, can be instrumental to SEO and organic search engine positioning.

Blogs implement topical pooling of link flow through a platform of internal linking that makes it crystal clear to search engines what each sub folder is about.

Keep in mind that most sub folders are still viewed by search engines as entirely different sites, so by strategically cross-linking pages or posts from one sub folder to another, the synergy created can produce spikes of link flow which translate into rankings.

Another great advantage blogs offer over static or legacy content management systems (CMS) that lack SEO prowess is the ability to ping and promote their own content through RSS feeds and send search engine spiders deeper into a site.

The logic is simple, if you create a relevant page with a specific ranking objective (such as creating another layer to secure a competitive series of keywords, a hub page or a landing page), you can sculpt which page gets the most link weight from other pages based on site architecture and internal linking.

Blogs take the content, make it instantly accessible to search engines, and put your page in the queue for crawling via RSS feeds. Once the bots arrive, if you have any links on the page, they follow and ping those links as well (which in turn schedules them for crawling).

This chain reaction can revive pages that have tapered off and fallen to less-than-favorable positions due to neglect or a lack of revision or links. Blogs ensure (a) that more link flow is captured from the RSS and the activity from syndication, and (b) that the pages linked to from them have a new opportunity to make a second impression for search engine relevance.

Once a page is in the search engines’ index, it can start aging and maturing. Based on how it is linked to, the value of its content in the context of other related information on your site, and the authority it builds internally over time, each page is an asset that can eventually be leveraged for SERP domination.

Out of site, out of mind applies to search engine spiders. If your content was fresh (way back in 2007) and you really haven’t done anything since to stand out or target additional phrases, then you give the spiders no reason to return or pay attention to your website.

With the constant pressure of competition vying for coveted market share (which is divided up by rankings earmarked by keywords), your pages are either a relevant result or they are not. Part of that relevance is determined by post frequency (how often your site contributes new pages or revises older content); the other part is how you link to yourself and how others link to you.

Without a solid foundation, any links you acquire from other websites have only a limited shelf life. If your site has a solid architecture (like a blog) that can move link flow around within the site to the pages that need it most, the potency of each link can be sustained longer with less effort.

Two things to consider are the waning factor and the tendency for information to get archived in blogs; both can cause drops in position. The remedy is deep linking (linking to pages other than the homepage) with specific anchor text, continuity and volume from other areas of the site.

For example, if I have a page tucked away deep in the site (3 sub folders away from the root folder) and I am expecting to drive traffic there and have that page rank well in search engines, the fact that it is so far away from the primary navigation presents a challenge.

Think of it as a pyramid of glasses: pour champagne into the top glass and the spillover funnels down from glass to glass, so that even the glasses on the bottom are full.

Link flow works in this manner. However, in this application, links are the fluid moving from one vessel (page) to another, and once a page holds more than 50% fluid, it can garner rankings in search engines (since it has passed the threshold of internal relevance).
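The champagne-pyramid analogy can be put in code. The model below is purely illustrative (the page names, the even split and the 85% damping factor are assumptions, not any search engine's actual formula): it pours a unit of link flow from the homepage downward, splitting each page's share evenly among the pages it links to:

```python
# Hypothetical internal link graph: homepage -> hubs -> posts.
links = {
    "home":   ["hub-a", "hub-b"],
    "hub-a":  ["post-1", "post-2"],
    "hub-b":  ["post-2", "post-3"],
    "post-1": [], "post-2": [], "post-3": [],
}

def pour(page, amount, flow, damping=0.85):
    """Pour link flow into a page, then pass a damped share downward."""
    flow[page] = flow.get(page, 0.0) + amount
    if links[page]:
        share = damping * amount / len(links[page])
        for target in links[page]:
            pour(target, share, flow, damping)
    return flow

flow = pour("home", 1.0, {})
# "post-2" is linked from both hubs, so it collects twice the flow
# of "post-1" or "post-3" -- the glass fed by two streams fills first.
```

Cross-linking a deep page from several hubs is exactly how you steer more of the fluid into the vessels that need it.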

The point being, before you can acquire rankings, your pages need to be indexed. The more pages you have indexed on a topic, the better. More important still, the more pages internally linked on a topic, the easier it becomes for the aggregate, collective coherence of those pages to appear for multiple keywords.

Pages with stronger internal links require fewer external links to rank higher. So, to answer the question posed at the beginning of this post, the purpose of having a blog is to develop an authority website.

By reinforcing your topic (theming and siloing), your site is deemed an authority (which is the real reason to start a blog), in addition to the ease of posting updates or new content.

The short-term benefits will be obvious: the ability to create spikes in rankings for keywords in the titles and topics of your posts. The long-term benefits are immeasurable, as your site gains momentum and, if managed with purpose, becomes a market-share-devouring ranking juggernaut.

Thank you for reading the SEO Design Solutions Blog, if you haven’t already, sign up for our RSS feed for more useful SEO tips and Techniques.


Let’s get lost!

Thanks to Google Maps, getting lost need not worry you. Satellite-based imagery, point-to-point detailing of terrain and route planning: that is what Google Maps is all about.

Imagine you go on a hiking trip: you are not sure of the terrain, and you wish to explore. Google Maps shows you the terrain and plans the route. It is a free web mapping service application from Google that powers numerous map-based services.

Google created the Google Maps API to help developers integrate Google Maps into their own websites with their own data points. Developers just need to apply for an API key from Google, which is bound to the website and directory specified while creating the key. To use it on your own web page, you include Google’s JavaScript in the page; its functions then let you add points to your mapping application.
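As a rough sketch of that pattern (the key, element id and coordinates are placeholders, and this reflects the classic v2-era API rather than anything guaranteed current):

```html
<!-- Load the Maps API; YOUR_API_KEY is bound to your site -->
<script src="http://maps.google.com/maps?file=api&amp;v=2&amp;key=YOUR_API_KEY"
        type="text/javascript"></script>

<div id="map" style="width: 500px; height: 300px"></div>

<script type="text/javascript">
  // Create a map in the div, centre it, and plot one of your own points
  var map = new GMap2(document.getElementById("map"));
  var point = new GLatLng(37.42, -122.08);
  map.setCenter(point, 13);            // centre and zoom level
  map.addOverlay(new GMarker(point));  // add a marker overlay
</script>
```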

Initially, the Google Maps API lacked a geocoder: one had to add points, with latitude and longitude, manually. This has since been rectified.

Among the earliest adopters of the API were real estate mashup sites. Today, even weather stations use it for local weather predictions.

The Google Maps API has now been integrated with the Google AJAX API Loader, which creates a common namespace for loading and using multiple AJAX APIs. This allows you to use the optional google.maps.* namespace for all classes, methods and properties in the Google Maps API, replacing the existing G namespace. The framework lets you load one API key for all supported Google AJAX APIs and gives each API a common namespace, allowing different Google APIs to work together.
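A sketch of that loader pattern (again, YOUR_API_KEY and the coordinates are placeholders):

```html
<!-- One loader script and one key for all supported AJAX APIs -->
<script src="http://www.google.com/jsapi?key=YOUR_API_KEY"
        type="text/javascript"></script>

<script type="text/javascript">
  google.load("maps", "2");  // load the Maps API through the loader
  google.setOnLoadCallback(function () {
    // google.maps.* replaces the older G namespace (GMap2, GLatLng, ...)
    var map = new google.maps.Map2(document.getElementById("map"));
    map.setCenter(new google.maps.LatLng(37.42, -122.08), 13);
  });
</script>
```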

Generally, people familiar with JavaScript, object-oriented programming and Google Maps will be able to work with the Google Maps API.

The Google Maps API also adds support for Mapplets (Maps + Gadgets), which allow developers to embed externally hosted applications within Google Maps. Mapplets run within their own iframes, which helps developers mix code from multiple sites safely.

The Google Maps API uses the browser’s preferred language setting when it displays textual information.

20 Ways a Company can Retain Website Visitors

A company’s website plays an important role today. Companies have active strategies to attract customers to their websites, and often spend vital resources to maintain a professional website and to market it. It is hence important for a company to ensure visitors remain on the website long enough to generate sales or leads for the business.

A conversion rate refers to the percentage of visitors who take or complete a desired action on a company’s website. A company often uses the conversion rate to measure the performance of its website. It is not standardised and varies from company to company: it can include anything from a direct sale on the website to a lead, customer enquiry or online registration. Achieving a higher conversion rate is one of the key objectives of an effective website strategy. Retaining visitors for longer has a direct impact on conversion rates: the longer visitors stay on the website, the greater the chances of conversion.
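As a quick worked example of the arithmetic (the figures are hypothetical):

```python
# Conversion rate = completed desired actions / total visitors * 100.
# The numbers below are made up for illustration.
visitors = 4000
conversions = 120  # e.g. sales, leads or registrations

conversion_rate = conversions / visitors * 100
print(f"Conversion rate: {conversion_rate:.1f}%")  # → Conversion rate: 3.0%
```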

There are a number of factors that affect conversion rate of a website. A website’s usability is important as it ensures users remain on the website and are able to find the information they are looking for. This article discusses the important factors that can improve a website’s conversion rate.

There are many ways for a company to attract visitors to its website. Studies show that a majority of the visitors will navigate away and leave the website. Retaining visitors on a website is thus crucial for a company in order to increase conversion and generate sales or leads.

Here are some essential tips a company can use to keep visitors on its website for longer:

1. Simple, professional design of the website. A simple but professionally designed website encourages visitors to stay and read its content. The trend today is for simple, appealing and minimalist designs. Attention to detail is important to build trust among customers. The design of the website should reflect the market it targets; different designs may suit different situations and contexts.

2. Quality of content. A simple objective of a successful company website is to keep visitors on the website long enough to generate sales or leads for the business. The internet today is full of websites that offer poor content of little value to readers. In order to encourage users to remain on the site for longer, it is vital for the website to offer quality content that is well researched and presented.

A combination of media such as text, images, videos or podcasts can be useful to present content effectively and make it interesting for visitors. Simple issues such as spelling and grammar can easily be overlooked, yet have a negative impact on conversion.

3. Offer up-to-date, fresh information. Visitor loyalty is important for the success of any website; very few websites rely solely on new visitors. If you want customers to visit the website again, it is important to update it frequently and offer new, up-to-date content. After all, a visitor will find little value or interest in reading the same content again and again. Adding new articles and content to your website will keep existing users interested whilst attracting new ones.

4. Provide detailed information on products or services. Promoting products and services is one of the key objectives of a successful company website. Research suggests that offering detailed descriptions and images of a product tends to improve the chances of a sale via the website. This is especially important on the internet, as customers have to decide based solely on the product information available to them. It is better to provide more information rather than only the essentials; if there is too much information, it can easily be separated into brief and detailed descriptions for each product.

5. Offer free resources or benefits. Offering resources or other benefits for free on the website is another effective method to attract visitors and convert your prospects into online customers. Offering free tips and articles related to your business can make your company’s website more appealing to customers who can get additional information related to your products and services. Offering related tips and articles can supplement the information on your products.

6. Offer simple navigation. The navigation or menu on the website plays an important role in usability and ensuring visitors remain on the site and are able to find what they are looking for. Many studies have shown that poor or complex navigation drives visitors away to competing websites and thereby negatively impacts conversion rates.
Navigation should be clear and prominent. It is also important for navigation to be consistent across all pages so users do not have difficulty in finding it.

7. Provide pictures where appropriate. Pictures can speak a thousand words. From a design point of view, pictures make the site appealing to users and thereby encourage them to stay longer. However, graphics should not be overused, as they can have a negative impact. If you are selling a product, then offering product pictures is crucial to promote the product and encourage sales.

8. Add a blog. This is related to some of the points discussed earlier. Blogs can benefit a website in more than one way: they offer an easy way to add new content and encourage participation from users. Blogs are also very easy to add. A web design company can add a bespoke blog to your website quite easily, or you can create one free within minutes using one of the many free blog platforms such as Blogger or WordPress.

9. Provide Testimonials. Testimonials are effective in building customer confidence in the company. Confident customers are more likely to be retained or buy products or services.

10. Engage the customer. A successful website takes a number of steps to ensure visitors remain on the site and spend more time there. There can be few better ways than to allow customers to interact with the website and the company. Blogs, forums and other Web 2.0 areas have proved effective. Simple techniques such as a customer feedback form or guestbook are also effective in involving visitors. If you are selling a product, then allowing customers to review the products is another useful method.

11. Offer a variety of information. It has proven effective to offer a variety of information on the website, and not just information about your company. As much as you may want to promote your products and services directly, your website visitors will not remain on the site to read pages and pages of information about your company and products. If you want visitors to stay long and revisit the website, your company’s website should offer a mix of information. It is a common approach for successful company websites to offer a range of independent information related to their market. For a website to be successful and build a community, it should not only offer self-promoting articles but also independent, well-balanced information that adds value for visitors.

12. Encourage opt-in email registration. Collecting the email addresses of your website visitors can be an effective marketing tool for your company. This can be achieved easily by offering an email or newsletter registration feature in a prominent place on the website. The list can then be used for a variety of purposes, including newsletters or direct marketing.

13. Allow visitors to bookmark your website. Providing a simple link or graphic in a prominent place on the website can have a surprisingly great impact on the number of visitors bookmarking the website. This is very easy to do and will promote repeat visits to your website.

14. Offer something new. An average internet user comes across a wide range of information on a day-to-day basis. There are over 40 million websites today, and new ones are launched every day. Whilst it may not be possible for every aspect of your company’s website to be completely unique and innovative, there should be something new or unique on it. Offering something new to your users can be important for conversions, as it adds interest for your users and encourages them to visit again.

The internet today is full of marketing gimmicks and piles of hyped-up content dressed up to look attractive to users but ultimately failing to deliver something of genuine value to the end user. Offering new information or services that have genuine value for users will build confidence in your customers and prove an effective branding technique, even if the information is not directly selling your product or service.

15. Provide a site map. It can be quite difficult to find information on large websites, and any website can grow large with time. Providing a sitemap clearly accessible from all pages of the website promotes usability, prevents users from getting lost, and so prevents them from leaving the site. A sitemap also promotes better indexing of pages on search engines, which can in turn lead to improved conversion rates.

16. Respond to statistics. Many free website statistics and reporting tools, such as Google Analytics, provide useful information not only on visitors but also on the content of the website. This information can be used to monitor and test the effectiveness of the various website elements, and to determine which features appeal to users and which do not.

As we have discussed in this article, once your website starts receiving visitors, it is important to retain them on the site and generate sales and leads for the business.

Remaining points to be added.


In the two previous parts of this post, I mostly discussed the general factors that help get a site optimized to the highest order of ranking on the Google search pages. In this last part, I will concentrate on some of the technical factors.

Images used in a site should have a separate “alt” attribute: After reading this sub-head, those who are not technically sound might ask, what is this “alt” attribute? To be precise and simple, the “alt” attribute holds the alternative text for the images used in a site. Suppose a user is viewing your site in a browser that does not support images; the user will be unable to see the images you have used. In this case, the description in the “alt” attribute gives the user information about the image. Apart from this, you can even use an image as a link, in which case the content of the “alt” text functions much as anchor text does. If you plan to do this, always fill the “alt” text with a brief, simple description of the image. This helps Google better understand the page to which the image link is directing. However, it is advisable not to use too many images as links unnecessarily, especially when the same purpose can be fulfilled with simple anchor text or text links.
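For example (the file names and descriptions below are made-up placeholders):

```html
<!-- A plain image: the alt text describes the image itself -->
<img src="golden-gate.jpg" alt="Golden Gate Bridge at sunset"/>

<!-- An image used as a link: here the alt text works much like
     anchor text, describing the page the link points to -->
<a href="gallery.html">
  <img src="thumb.jpg" alt="Photo gallery of San Francisco landmarks"/>
</a>
```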

Proper use of robots.txt: Using robots.txt properly helps the search engines know whether they can access a particular part of a site. The robots.txt file is normally placed in the root directory of a site. You may not want some parts of your site to be crawled by the search engines, perhaps because they would not help users if shown in the search results. In that case, you create a robots.txt file saying so. There are other ways to keep parts of your site out of the search results: you can add “noindex” to the robots meta tag, or use the Google Webmaster Tools to remove parts that have already been crawled.
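A minimal robots.txt might look like this (the paths are hypothetical examples):

```
# Placed at the root of the site, e.g. http://www.example.com/robots.txt
User-agent: *          # applies to all crawlers
Disallow: /admin/      # keep the admin area out of the crawl
Disallow: /search/     # internal search results rarely help users
```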

Use of rel=”nofollow” for links: You might have a site with a blog that is open to public commenting, and blog comment pages are at high risk of spamming. To avoid this, nofollowing the links added by commenters ensures that your site’s reputation is not harmed. If you are prepared to vouch for the user-added links, you can avoid using “nofollow”. However, be extra careful while doing so, because you might link to sites that Google considers spam, and as a result, your site’s reputation might be at stake.
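In HTML, this is done with the rel attribute on the link (the URL here is a placeholder):

```html
<!-- A commenter-supplied link marked nofollow, so it passes on
     none of your site's reputation -->
<a href="http://example.com/" rel="nofollow">commenter's site</a>
```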

These are the factors that I find most important about SEO from Google’s perspective. I hope by now you have a clear idea about how to go about optimizing your website to the highest order of ranking in the Google search results. So, do not waste any more time, and get going with your SEO plans!