6 SEO Hacks for Dynamic Websites


The search industry is moving at a faster pace than it did five years ago. Search giant Google makes monthly changes to its algorithm, and it seems that around every corner of the Internet there is a startup looking to innovate in the search market.

Competition is quite healthy but makes the life of an SEO very difficult. Let's just say the days of keyword stuffing, blog commenting, and low-quality content production are over. This is a new search economy, a white hat economy. Here are some tips to get you started in optimizing your site.


Information Architecture

Information architecture is the process of categorizing all the content on your website so it can easily be crawled by the search spiders. Having a proper structure in place allows older posts and articles to stay indexed for longer and continue to contribute as traffic sources.


Let's use an example to better illustrate the concept of content architecture. Say you are the webmaster of a programming website that discusses various types of programming languages. At a category level, you might label them Linux, C++, Java, etc. At a sub-category level, you might break Linux into sections such as Ubuntu, Fedora, Debian, etc. Once the categories and sub-categories are identified, you would begin creating content around each sub-category. For example, you might write a post about How to Install Gentoo or Top Linux Blogs on the Web.

Interlinking with Silos

Since links are roads for search engine crawlers, it is important to place links in new articles pointing to related articles that were posted a while back. Here are two interlinking tips that should be implemented immediately. The simplest way to interlink is placing a link between two articles in the same silo. This means you would place a link in your How to Install Gentoo article to an older article such as The Pros and Cons of Gentoo. Linking out to older articles on your website counts as a new link pointing to the older piece, signaling to the search engines that it is still relevant.
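As a minimal sketch of such a silo link, using the hypothetical Gentoo articles above (the URL path here is made up for illustration):

```html
<!-- Inside the body of the "How to Install Gentoo" article -->
<p>Before installing, you may want to review
  <a href="/linux/gentoo/pros-and-cons-of-gentoo/">The Pros and Cons of Gentoo</a>.
</p>
```

Using descriptive anchor text like this, rather than "click here", also tells the crawler what the older article is about.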


The second technique that can be used for interlinking is related articles. The majority of publishing platforms and CMSs have plugins that automatically select related articles and place them at the bottom of the post. This is a quick and easy way to get the interlinking done. If, for whatever reason, you do not have access to a plugin, you can always do it manually. Although this takes a lot of time, the improvement in search engine results will be worth it.

Keyword Hierarchy

Keyword hierarchy may require a little more skill, as it involves matching specific keywords with the proper section. This requires proper keyword research and the know-how to strategically place those keywords in page titles and descriptions, as well as tailoring the content to contain a fair number of keywords.


When placing keywords, it is important to note which of your pages are the strongest. Usually these are the homepage or category pages. Once you identify the strongest pages, place the most competitive (usually short tail) keywords on them and work your way down the information architecture. By the time you get down to creating content, you might be optimizing pages for long tail keywords such as "linux gentoo download guide".

Another reason longer tail keywords should be targeted on lower-level pages is that they are typically easier to rank for. This means a page will require less authority to rank for your targeted long tail keyword.

Duplicate Content

Duplicate content is when two URLs display identical or similar page content. This is viewed as problematic by Google and can get your website penalized. Here are three methods that can be used to remove duplicate content from the indexes.


Robots.txt is a file where URL patterns can be listed and blocked from the search engine crawlers. For example, let's say you originally launched your website with the parameter /c-prog/ and you wanted to change it to /c-programming/; this would be a good opportunity to place the old parameter in the robots.txt file. This will block every URL containing that parameter with only one line of code. Note: ensure that all the URLs you want to keep are 301 redirected to the new URLs before placing the parameter in the robots.txt file.
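As a sketch, using the hypothetical /c-prog/ parameter from the example above, the robots.txt entry would look like this:

```
# Block all crawlers from the retired /c-prog/ section
User-agent: *
Disallow: /c-prog/
```

Keep in mind that Disallow only stops crawling; it does not by itself remove already-indexed URLs, which is why the 301 redirects mentioned above matter.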


For a quick fix, the noindex tag works great to resolve duplicate content issues. The code is placed in the page's <head> section and is as follows:

<meta name="robots" content="noindex">


The canonical tag acts in a very similar manner to the noindex tag and does a great job removing duplicate content. The code for adding a canonical tag is as follows:

<link rel="canonical" href="..."/>

URL Rewrites

URL rewrites should only be done in extreme cases where a URL cannot be processed, and therefore cannot be indexed, by the search engines. If pages are not being crawled by the spiders, they are not being indexed and will not bring any traffic. To resolve this issue, URLs should be reduced to a reasonable length and should contain keywords where possible. The best way to go about this is to have the URL mirror the hierarchy of the site. For example, https://designpress.com/freebies/twitter-tools-to-increase-your-productivity/ is a clean URL that informs both search engines and users what the page is about.
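As a sketch of how such a rewrite might be configured, assuming an Apache server with mod_rewrite enabled (the script name and parameter names here are hypothetical), a dynamic query-string URL can be served under a clean, keyword-rich path:

```
# .htaccess sketch — assumes Apache with mod_rewrite enabled
RewriteEngine On
# Serve the clean URL /freebies/twitter-tools/ from the
# dynamic script index.php?cat=freebies&post=twitter-tools
RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/?$ index.php?cat=$1&post=$2 [L,QSA]
```

The clean path is what visitors and search engines see, while the CMS still receives its usual query-string parameters.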

These tips are a great way to get any webmaster started in the SEO field. I would suggest following some SEO blogs, as well as Matt Cutts of Google, to stay on top of any changes that may be implemented.

Nisha is the head blogger for Slodive.com. She loves tattoos and inspirational quotes. Check her out on Google Plus: https://plus.google.com/u/0/116437517919411097994.


  1. On the first one, "Information Architecture": does this mean that if I have a dynamic homepage it will hurt my SEO, since it's always changing? An example is a classified ads website where the home page has a "just listed" section that changes every day. Can this hurt my SEO?

  2. I am not that much into SEO, but this article was quite interesting. I had never heard of a couple of the tips you mentioned, and I think they could work. But it is all about persistence: if you are persistent enough with your SEO tactics, I believe the results will come.

