Search Aggregation

For a lot of people the thought of starting a ‘Search Engine Optimisation’ program can be daunting. All we hear is “Get me to Number 1 on Google!” But it’s time to face reality – it just ain’t that straightforward.

For a practical understanding of ‘some’ of the steps involved in optimising your website for search engines, I asked Zaw Win to put together a checklist of the steps he and the team follow when working on a program for a new client.

If you’ve ever been down this path, you’ll see this covers the basics, and if you’re an expert in this area you’ll know there are probably a hundred more things we could have listed. But we wanted to keep this list snappy. For the more extensive version of what’s involved, ‘google’ any of this information on Google :) Or watch one of Matt Cutts’ videos in Google’s Webmaster Tools.

So to kick off, here’s a summary of the steps we follow at the beginning of a Search Engine Optimisation program:


Things we will need from you:

  • A minimum of 2 x competitor websites
  • At least 3 x main keywords you want to rank for


Activities we will initiate:

*1. Install Google Analytics tracking code*

This provides fresh insights into how visitors use your site, how they
arrived on your site, and how you can keep them coming back.
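As an illustration, the classic asynchronous tracking snippet (the ga.js version of Google Analytics) looks something like the following, placed in your page’s head, with the placeholder ‘UA-XXXXX-X’ replaced by your own account ID:

```html
<script type="text/javascript">
  // Queue of commands for Google Analytics (classic, asynchronous ga.js)
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']); // replace with your account ID
  _gaq.push(['_trackPageview']);            // record a pageview for this page

  // Load the ga.js library asynchronously so it doesn't block page rendering
  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```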

*2. Set up a Google Webmaster Tools account to find and fix any errors that may be preventing search engines from crawling your site effectively*

Google Webmaster Tools will provide detailed reports on your pages’ visibility on Google.

This will enable us to see how Google crawls and indexes your site, and to learn about specific problems it may be having accessing it.

In addition, we can find out which Google search queries are driving traffic to
your site.

*3. Keyword research*

Keyword research is one of the most important, valuable, and high return
activities in the search marketing field. Ranking for the “right” keywords
can make or break your website.

It’s not always about getting visitors to your site, but about getting the
right kind of visitors.

Those keywords can be used when writing copy for new pages or when editing existing pages.

*4. Rewriting meta tags with targeted keywords for main pages*

When you search for something in Google, you will see a list of search results. Each result has a title and a short description – these come from the page’s meta tags.

A meta title tag tells both users and search engines what the topic of a
particular page is.

A page’s description meta tag gives Google and other search engines a
summary of what the page is about.
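For example, the head of a product page might carry a title and description like the following (the page and business names here are made up for illustration):

```html
<head>
  <!-- Shown as the clickable headline in search results -->
  <title>Handmade Leather Wallets | Example Store</title>
  <!-- Often shown as the snippet beneath the headline -->
  <meta name="description" content="Browse our range of handmade leather wallets, crafted in Australia and shipped worldwide.">
</head>
```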

*5. Image optimisation*

Image-related information can be provided by using the “alt” attribute.

The “alt” attribute allows you to specify alternative text for an image if it cannot be displayed for some reason.

From an SEO perspective, the alt text for that image will be treated similarly to the anchor text of a text link.
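For example (the file name and wording are placeholders), a descriptive alt text looks like this:

```html
<!-- The alt text is shown if the image can't load, and read by search engines -->
<img src="red-leather-wallet.jpg" alt="Red handmade leather wallet">
```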

*6. Implement ‘h1’ tags on each page to emphasise important text*
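As a simple illustration (the heading text is a placeholder), the page’s main topic goes in an h1 – generally one per page, matching what the page is about:

```html
<!-- The single main heading for the page, reinforcing its topic -->
<h1>Handmade Leather Wallets</h1>
```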

*7. Apply the ‘nofollow’ attribute*

This simply tells search engines that certain links on your site shouldn’t be followed, or shouldn’t pass your page’s reputation to the pages they point to (i.e. not giving any ranking credit to the linked page).
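In markup this is just a `rel` attribute on the link (the URL below is a placeholder):

```html
<!-- Tells search engines not to pass ranking credit to this link -->
<a href="http://example.com/untrusted-page" rel="nofollow">Example link</a>
```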

*8. Generate an XML sitemap and submit it to Google to educate them about your site*

To share the information about your site with search engines we can inform them of your pages with an XML sitemap file. This can be submitted through Google’s Webmaster tools, which in turn makes it easier for Google to discover the pages on your site.
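A minimal sitemap file, using the standard sitemaps.org format, looks like this – the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <!-- ...one <url> entry per page you want discovered... -->
</urlset>
```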

*9. Create an image sitemap file and submit it to Google*

Sitemaps are an invaluable resource for search engines. They can highlight
the important content on a site and allow crawlers to quickly discover it.
Images are an important element of many sites and search engines could
equally benefit from knowing which images you consider important.
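An image sitemap uses Google’s image extension to the standard sitemap format – a sketch, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/products</loc>
    <!-- Each image:image entry points Google at an image on this page -->
    <image:image>
      <image:loc>http://www.example.com/images/red-wallet.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```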

*10. Set up robots.txt*

A “robots.txt” file tells search engines whether they can access, and therefore crawl, parts of your site. We use it to restrict crawling where it’s not needed.
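A small example robots.txt, placed at the root of the site (the disallowed paths here are hypothetical):

```text
User-agent: *
Disallow: /admin/
Disallow: /search-results/

Sitemap: http://www.example.com/sitemap.xml
```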

*11. Generate search engine friendly URLs*

Having a human readable URL helps search engines to understand more
about your page.
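As an illustration (placeholder URLs), compare a parameter-driven URL with a human-readable one:

```text
Before: http://www.example.com/index.php?id=123&cat=7
After:  http://www.example.com/products/leather-wallets
```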


To continuously improve your online presence and generate more enquiries and online sales, there are a number of activities that can be undertaken. Most important is getting the foundations in place, so that ongoing activities become easier to maintain and roll out on a daily, weekly or monthly basis.

Ranking strongly within search engines is a zero-sum game. You want to know who your competitors are and which keywords they rank strongly for, so that you can either outperform them or remain in a highly ranked position.

Over recent months, and across 2013, there has been a heavy emphasis on providing original, fresh content and ‘social’ sharing. Being seen as an ‘authoritative site’, with link-backs from other highly reputable sites, remains an important factor, but Google have introduced a number of changes focused on rewarding sites that are seen to be adding true value back to the website visitor and online Google ‘searcher’.

The goal is to be focused and consistent, speaking the language of your end users (customers) in a manner that is well structured for search engines.

Here are a range of activities to include in an ongoing Search Engine Optimisation plan:

- Link building (this can be achieved using different ‘white hat’ strategies)
- Content creation (creating a blog, providing original content)
- Promoting your website through other mediums (e.g other blog sites, social media, newsletters etc.)
- Local + Global Directory submission (identifying priority sources)
- Blog commenting (commenting on other blogs with a reference back to you as the original source)
- Forum participation (participating in online forums)
- Continuing to write unique page titles and descriptions (maintenance of good SEO practices)
- Page building (adding new & fresh sources of content)


Search Aggregation

Posted September 14th, 2009

What is search aggregation you ask? Well here’s the official, unofficial version from Wikipedia, and see if you can figure it out.

“A search aggregator typically allows users to select specific search engines ad-hoc to perform a specified query. At the time the user enters the query into the Search Aggregator, it generates the required URL “on the fly” by inserting the search query into the parameterized URL for the search feed.”

Did you get that? Alternatively, why don’t you have a look at the job aggregator that Howie built 7 years ago – he was quite a pioneer back then, whilst still in school (ahh…no, he was a bit older than that!).

As you can see the alljobs aggregator searches for information from each of the various job sites. Alljobs is a ‘real-time’ search engine, which has benefits for the end-user (the job searcher gets to see job postings as instantaneously as they are posted on each of the contributing sites) and benefits for the provider (‘real-time’ search downloading puts less pressure on servers, as requests are downloaded throughout the day rather than in one large download at a certain point in time).

A number of the search aggregator sites people may be familiar with each use various forms of search aggregation.

In recent months we’ve given alljobs a facelift (thank you marketing), and whilst the area of search aggregation is becoming more competitive, there are still a high number of untapped opportunities.