Small Business Ideas Forum


puti1 16th April 2008 09:09 PM

Dynamic Pages Vs Static Pages For Backlinking
Hi everyone,
I have been told that static pages are much better than dynamically generated pages for the search engines to pick up. But I also noticed that when I did a search online, some of the top results are pages from forums and blogs.
That somehow challenged my belief that static pages are better.

Does anyone have any views on this?
For example:
between these 3, which is better?

2) abc.com/aspx=3!1gfakfmlmhslnmhsmh

I have the understanding that option 3 is always the best. But it seems to me that this may not necessarily be the case, as some of the top results do reflect sites with the format of option 1 or 2. If options 1 and 2 do get indexed, does it also make sense to get a link from these sites? (I know it should, but I wanted to clarify this point.)


calevans 16th April 2008 10:01 PM


You have asked two questions:
Dynamic vs. Static
Spiders cannot tell the difference between a static page and a dynamically generated page. There are clues a spider can pick up in the URL, but by the time it gets the contents of the page, it's straight HTML/CSS/JavaScript.

The exceptions are pages that use JavaScript to dynamically load content. Since a spider does not parse and execute JavaScript, it can only see the original page's content. That's why AJAX is bad for search engines.

So to answer your question from a purely technical standpoint, the spider cannot tell the difference.
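To make that concrete, here is a minimal sketch (all names and markup are made up for illustration): the same HTML served from a static file and from a dynamic handler is byte-identical to a crawler, and a crude non-JavaScript-executing "spider" never sees script-injected text.

```python
import re

# Hypothetical example: the markup below is invented for illustration.
PAGE = "<html><body><h1>Widgets</h1><p>Our catalog.</p></body></html>"

def serve_static() -> str:
    # e.g. a web server reading widgets.html straight off disk
    return PAGE

def serve_dynamic() -> str:
    # e.g. a script assembling the same markup at request time
    title, body = "Widgets", "Our catalog."
    return f"<html><body><h1>{title}</h1><p>{body}</p></body></html>"

AJAX_PAGE = (
    "<html><body><div id='c'></div>"
    "<script>document.getElementById('c').innerHTML='Loaded later';</script>"
    "</body></html>"
)

def visible_text(html: str) -> str:
    """Crudely mimic a spider that does not execute JavaScript:
    drop <script> blocks entirely, then strip the remaining tags."""
    html = re.sub(r"<script>.*?</script>", "", html, flags=re.S)
    return re.sub(r"<[^>]+>", "", html)

# A spider fetching either URL receives identical bytes...
assert serve_static() == serve_dynamic()
# ...and never sees text that only exists after JavaScript runs.
assert "Our catalog." in visible_text(PAGE)
assert "Loaded later" not in visible_text(AJAX_PAGE)
```
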

Some SEO experts (disclosure: my feelings on SEO are a matter of public record on my blog) feel that search-engine-friendly (SEF) URLs are important. Therefore, in your example, #3 would be the best if you were selling tests.

I am firmly ambivalent on SEF URLs, as my testing has shown, to my satisfaction, that they make little or no difference these days.

However, from a human-readability standpoint:
is much better than

If it's not too much trouble to set up, then it pays off, because your users can actually read and make sense of the URL. (In WordPress and Joomla! it's not difficult at all to get user-friendly URLs.)
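As a sketch of what WordPress-style "pretty permalinks" do (this `slugify` is illustrative only, not WordPress's actual implementation), the readable path is just a sanitized version of the post title:

```python
import re

def slugify(title: str) -> str:
    """Illustrative sketch: lowercase the title, strip punctuation,
    and collapse whitespace into hyphens to form a readable URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9\s-]", "", slug)
    return re.sub(r"[\s-]+", "-", slug).strip("-")

print(slugify("Dynamic Pages Vs Static Pages For Backlinking"))
# -> dynamic-pages-vs-static-pages-for-backlinking
```

The CMS then maps the readable path back to the same database lookup internally, so the page is still generated dynamically; only the URL the user (and the spider) sees changes.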


puti1 17th April 2008 03:19 AM

Hi calevans,
Thanks for the explanation!
You've given me some important clues to what the SEs can do nowadays.
But from a human-readability point of view, I am still for SEF URLs.

Tim aka puti1

Logan 17th April 2008 11:39 AM

Great explanation Cal!!

Just to add on with some of what Google is doing in particular: see the following post on the Google Blog, which describes how they are now testing following HTML forms.


While I agree with Cal's statement that search engines won't parse and execute JavaScript, note that this blog post from Google also states:

"We already do some pretty smart things like scanning JavaScript"

calevans 17th April 2008 12:06 PM

Hi Logan!

Yeah, when I typed that I knew I should have excluded GoogleBot from it. Google does download JavaScript and while it doesn't execute it, it does parse it. They started this a while back because people were using JavaScript to game the system.

Those guys/gals @ Google are pretty sharp. :)
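As a guess at the kind of "parsing without executing" described above (the script text and function name here are invented for illustration), a crawler can pull URL-looking strings out of JavaScript with simple pattern matching, no execution required:

```python
import re

# Hypothetical script text; nothing in it is ever executed.
SCRIPT = """
function go() { window.location = '/catalog/widgets'; }
var next = "https://example.com/page2";
"""

def urls_in_script(js: str) -> list:
    """Find quoted strings that look like absolute URLs or site paths."""
    return re.findall(r"""["'](https?://[^"']+|/[^"']+)["']""", js)

print(urls_in_script(SCRIPT))
# -> ['/catalog/widgets', 'https://example.com/page2']
```
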


cpr 29th May 2008 02:46 AM

I don't think it would matter. I checked the Lynx view and the source code for sites generated both ways, and they look identical. I don't think Google's bots could differentiate between the two, although I am continuously impressed with their capabilities...
