Cre8asiteforums Internet Marketing
and Conversion Web Design


Advice on site structure... Thanks.


30 replies to this topic

#1 callback

    Gravity Master Member

  • Members
  • 122 posts

Posted 04 August 2004 - 05:00 PM

Hi there!!

I am going to launch a new site that sells US domestic and international toll free services. This is basically the expansion of my current site
http://www.bestcalls...e.com/tollfree/ -- I want a top-level domain site for this toll-free section. I will have about 20 pages in total initially, so I am wondering whether I should use a two-level site structure (root plus subdirectories) or just put every page at the top level without subdirectories.

My toll free service will have 3 products (Follow-Me, Virtual PBX, and Unlimited Voicemail). I don't know if it makes sense to have each product in a separate directory, because there are not many pages I can create. Please advise.

And please, if you could, advise what percentage of this brand-new site's content I should make unique to avoid it being considered a duplicate of its parent site.

Thanks a lot!

#2 o0MattE0o

    Ready To Fly Member

  • Members
  • 37 posts

Posted 04 August 2004 - 05:40 PM

Not sure, sorry.

You could place them in different directories for if/when you get more pages; I have uploaded my site many times, and directories make it easy to find stuff when editing...

It depends on whether it's the same company or not. If so, I would not make it too different (maybe a change of colour at most); otherwise, change it all...

#3 sanity

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 6889 posts

Posted 04 August 2004 - 06:34 PM

I'd say it depends on how much content you have. If I'm designing a small site I'll have the content straight under the root. However if the site/section has a lot of content it's easier to organise it into separate directories.

Think about how much content you have and whether it will increase over time and use that to guide your decision. Which will be easier to maintain?

#4 thirstymoose_2000

    Mach 1 Member

  • Members
  • 268 posts

Posted 04 August 2004 - 06:47 PM

I am overly overly organized and always have subdirectories for the pages as well as to organize the different images. This also sets you up for future expansion. On the top level I normally only have my index page, subdirectories and maybe global CSS and javascript files (although I will normally put those in a subdirectory too). It just keeps everything cleaner.

#5 Ron Carnell

    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 2065 posts

Posted 04 August 2004 - 06:49 PM

Where you put them will affect no one but yourself, so you should probably do what seems easiest or makes you the most comfortable.

If you put them in the root and the site later grows to much more than originally envisioned, you'll eventually feel the need to better organize. Moving the pages later WILL affect more than just yourself. If you put them in folders now and the site never grows, you will have lost little beyond the time required to create the folders (little = you'll be adding a few bytes of file size to each page by referencing a longer URL).

#6 sanity

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 6889 posts

Posted 04 August 2004 - 06:52 PM

thirstymoose_2000, I'm glad I'm not the only one who is overly organised. Heck, I even hate messy code. So much easier to work with if everything is tabbed out properly, eh?

#7 callback

    Gravity Master Member

  • Members
  • 122 posts

Posted 05 August 2004 - 08:48 AM

Where you put them will affect no one but yourself...

So, for the sake of search engine spidering, either approach should not make much difference. Is that correct? I definitely agree that placing files in two levels is better for management and expansion.

When I woke up this morning and found so many replies, I couldn't be more excited. Thanks and thanks!

#8 Ron Carnell

    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 2065 posts

Posted 05 August 2004 - 10:54 AM

So, for the sake of search engine spidering, either mode should not make much difference. Is that correct?

In my opinion, yes, that's correct.

Not everyone, however, will necessarily agree. A lot of people have noticed that PageRank seems to decrease as pages are moved into deeper and deeper subdirectories, but I honestly think they have misconstrued the cause and effect relationship. On most servers, a search engine spider can't even see folders. Request a folder URL directly and all you'll get is a Forbidden Access error. That's exactly what the spider would see, too. On a properly configured server, the only way to look inside a folder is to use FTP and a password. You can ask for pages, including default pages where they exist, but a request for a folder just results in being denied access.
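Ron's point can be sketched with a toy request handler. This is purely illustrative Python (the path names and the behavior are assumptions about a typical Apache-style setup, not any real server's code): a request for a page returns the file, while a bare request for a folder with no default index page is refused, which is all a spider would ever see of the directory itself.

```python
import os

def respond(doc_root, request_path, serve_dir_listings=False):
    """Crude sketch of how a web server resolves a request path."""
    fs_path = os.path.join(doc_root, request_path.lstrip("/"))
    if os.path.isdir(fs_path):
        index = os.path.join(fs_path, "index.html")
        if os.path.isfile(index):
            return 200, index       # default page exists: serve it
        if not serve_dir_listings:
            return 403, None        # the "Forbidden Access" a spider sees
    if os.path.isfile(fs_path):
        return 200, fs_path
    return 404, None
```

With a (hypothetical) /tollfree/pbx.html on disk, requesting the page gives 200, while requesting /tollfree/ itself gives 403 because no index.html exists there.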

Most of us, however, base our directory structure on our link structure -- and THAT is what a spider is going to follow. The decrease in PR that some see has nothing to do with directories, but rather is a reflection of how many clicks it takes to get from the front page (assuming that is your most important page) to the page in question. It is merely a coincidence of human habit that a page several clicks from the home page is likely going to also be several directories deep.

In short, IMO, it doesn't matter to any search engine where you put your pages. It DOES, however, matter how many clicks it takes to get to them. You can bury important pages several directories deep and the spiders will still recognize them as important if you keep them a single click from the home page.
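The click-depth idea is easy to make concrete: measure the shortest click path from the home page over the link graph, ignoring directories entirely. A minimal sketch (the URLs and link graph below are made up for illustration):

```python
from collections import deque

# Hypothetical link graph: each page mapped to the pages it links to.
# Note the deeply nested URL that is linked straight from home.
links = {
    "/": ["/cars/ford/mustang.html", "/about.html"],
    "/about.html": ["/contact.html"],
    "/cars/ford/mustang.html": [],
    "/contact.html": [],
}

def click_depth(start, graph):
    """Breadth-first search: clicks from `start` to every reachable page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

d = click_depth("/", links)
# /cars/ford/mustang.html sits three directories deep but one click from
# home; /contact.html sits in the root but two clicks away.
```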

#9 Grumpus

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 6351 posts

Posted 05 August 2004 - 11:17 AM

I agree with "most" of what's going on above. As Ron says, people will tell you that PR goes down the deeper you get in the directory tree, but that's not at all accurate. When Google "guesses" at a page's PR (i.e. during the first month or six weeks that the page is in the index), that "guess" is made (in part) by seeing how deep it is in the tree, because the tree should pretty much tell you how many links there are between the front page (which has a known PR) and the page in question. But after six weeks or so, it doesn't matter at all.

The decision to use a directory structure or put everything into one directory doesn't come down so much to the quantity of content, but rather to the quantity of individual "concepts" the site is going to deal with. If everything on the site deals with the same basic concept, then it should all be together. (The trick, I suppose, is in defining a "concept" - it may be as simple as "company info" and "products" being treated as separate concepts, or it may be more complicated than that.)

When you group "concepts" together in subdirectories, there are certain "assumptions" that a search engine can make when ranking the page. If you've done your job right, this can help give the page an early "boost" in relevant searches. If you do it really well, this early boost far outweighs the downside of maybe showing a "guessed" PR that's one lower than it should be. (Though the guessed PR is still usually pretty close to accurate, based upon the nature of the web - it's more likely to go DOWN after the math is done if you have the page in the same directory as the root, since the guess is more likely to be on the high side.)

All of this comes from my experimenting with setting up SE-friendly dynamic sites. The rules apply to static sites as well, though.

Let's say we're selling cars. We've got a site with a root page and everything about the company itself is in the /AboutUs/ directory and all of your cars are in the /Cars/ directory. Then you break down the cars by manufacturer /Cars/Ford/ and /Cars/Ferrari/

As the spiders go through the pages, they see that everything in the Cars directory (and those directories below it) is all about "Cars". In the Ford directory, it's all "Ford Cars", and the other is "Ferrari Cars". After a few months, all of the math is done and the SEs pretty much know that everything in your "Ford" directory is "Ford <some model> Cars", and the concept is clear. So, if you add a new page for the "Ford Mustang (Cars)" this week, and the words are there, then simply by the page's location the engine can assume that the page is about "Ford <something> Cars", and all it has to figure out is what the model is. Two of the three words that define what the page is about are already taken care of and confirmed by the existing patterns.

And the same is true if you add a new make: /Cars/BMW/. The engine knows, even before it crawls, that the pages are going to be about some kind of "Cars", so now it only has to find out what kind.

Basically, organizing your site this way means there's less math to be applied, because the engine can compare the page to math that's already been done on pages sharing the directory. If the comparison algo works, it doesn't need to go through the complicated process of figuring things out from scratch. And your page gets represented well several weeks (or more) sooner than it might if everything was just clumped together in one place.

Now - you have to have 100% consistency in your methods, keep the names of the directories as short as possible, and have a 100% consistent linking structure for all of this to work, but over the long run, it does work. Bear in mind, your site needs to have a good while behind it and show those consistent patterns for a while before the engines will start to make these assumptions (six months, anyway), so if you're not interested in a long term plan, then this won't be as effective as you'd like it to be.
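One way to picture the "assumption" described above: treat each directory segment above a page as an already-confirmed topic term, so only the page's own words remain to be worked out. This is a thought-experiment sketch of the theory in a few lines of Python, not how any engine is documented to work:

```python
def context_terms(url_path):
    """Topic terms a crawler could inherit from an established branch:
    every directory segment above the page, leaving only the page's own
    words (e.g. the model name) to be figured out."""
    segments = url_path.strip("/").split("/")[:-1]  # drop the filename
    return [s.lower() for s in segments]

# A new page added under an established branch inherits its context:
context_terms("/Cars/Ford/mustang.html")   # ['cars', 'ford']
context_terms("/Cars/BMW/index.html")      # ['cars', 'bmw']
context_terms("/index.html")               # [] -- root pages inherit nothing
```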

G.

#10 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9023 posts

Posted 05 August 2004 - 12:09 PM

It's probably very foolish to try to fight two big guys, R and G, at once... :lol:

However, from over here in the corner, let me just express my counter-view. I'm really only talking about websites that have fewer than, say, 50 pages, since above that you would certainly get confused, without folders, about where you've put things.

I don't know how any of this stuff can be proved one way or the other since it is impossible to do two versions of the exact same website and see how the same web page ranks if it sits deep in a folder structure rather than in the root directory.

I go along with the thought that what really counts is the link structure among all the pages and subpages. This should be identical whether you have a folder structure or whether you put all pages and subpages in the root directory. You can also have 'bread crumbs' at the top of every web page to show an apparent folder structure either way. That's the kind of line you see in the DMOZ directory:
Home > Web Page > Web Subpage > Web Sub-subpage
In this structure each element is a clickable link to allow you to "move up" the tree structure.
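Generating such a trail is straightforward whichever way the files are actually stored. A small sketch (capitalizing the folder name as the label is a simplification; a real site would map folder names to proper titles):

```python
def breadcrumbs(url_path, home_label="Home"):
    """Build a DMOZ-style breadcrumb trail as (label, link) pairs,
    one per level above the page itself."""
    trail = [(home_label, "/")]
    parts = url_path.strip("/").split("/")
    for i, part in enumerate(parts[:-1]):       # skip the page itself
        link = "/" + "/".join(parts[:i + 1]) + "/"
        trail.append((part.capitalize(), link))
    return trail

breadcrumbs("/produce/vegetables/rutabagas/recipes.html")
# [('Home', '/'), ('Produce', '/produce/'),
#  ('Vegetables', '/produce/vegetables/'),
#  ('Rutabagas', '/produce/vegetables/rutabagas/')]
```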

I'm not sure whether the analysis that Grumpus describes is done by the Search Engine looking at the link structure or by looking at the folder structure. Given that all interlinking is the same, the question we are asking is whether a web page will rank higher if its URL is
www.mydomain.com/produce/vegetables/rutabagas/recipes.html rather than
www.mydomain.com/producevegetablesrutabagasrecipes.html

My judgement is that the second might rank very slightly higher. That's based on the view that if a web page is in the root directory, it is likely to be more authoritative in the eyes of the website creator than if they buried it deep in a folder structure. Remember we're talking about a web page with exactly the same content but put either in the root directory or deep in the folder structure. If I was building a search engine, that might well be one of the secondary factors in my long list of over 100 factors considered in my algorithm.

The only factual thing I can point to for support is that the MSN Advanced Search does allow you to look only at web pages in the root directory or in pages at level 2 and above, or 3 and above down to 5 and above. Perhaps that may signal something about their thinking.

So since it can't do me any harm and it might just possibly do me some good, I put all webpages in the root directory.

Now we return you to regular programming ... :)

#11 Grumpus

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 6351 posts

Posted 05 August 2004 - 12:45 PM

In your example, the first one would definitely rank higher simply because producevegetablesrutabagasrecipes is a huge word. For various reasons, huge words don't sit so well with the search engines.

Another reason why the first will rank better is that the keywords are in the URL. True, Google is experimenting with extracting separate words from run-on words, but they're not very good at it yet, and they do it only by hindsight rather than foresight. By this, I mean that the word doesn't get extracted from the run-on word until the search term has been entered by the searcher. (Otherwise they'd have to compute so many combinations of possible words that it'd be a little rough - "tables" is in there, but the page isn't about tables...) So there's definitely not as big a bonus there, if any, for having the keyword in the URL, even though it is getting highlighted.
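The difference is easy to see if you try to recover keywords from the two URL styles: a hyphenated name splits cleanly, while a run-on name stays one huge word unless the engine guesses at substrings (which is how "tables" turns up inside the rutabagas example without the page being about tables). An illustrative sketch:

```python
def url_keywords(slug):
    """Naive keyword extraction from a URL filename: hyphens split
    cleanly; a run-on name yields a single huge token."""
    name = slug.rsplit("/", 1)[-1].split(".")[0]   # strip path and extension
    return name.split("-") if "-" in name else [name]

url_keywords("produce-vegetables-rutabagas-recipes.html")
# ['produce', 'vegetables', 'rutabagas', 'recipes']
url_keywords("producevegetablesrutabagasrecipes.html")
# ['producevegetablesrutabagasrecipes']  -- one huge word

# ...and the substring trap that makes hindsight extraction risky:
"tables" in "producevegetablesrutabagasrecipes"   # True
```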

And I'm not sure I'd ever go three subdirectories deep - now you start running into problems with the URL being long. I'd definitely never go more than three deep. At some point, it makes sense to break out into a whole new branch. So, if I were doing the site, I would probably have a page called "produce.html" that took care of the produce section (thus eliminating the need for that directory). Then produce.html would link to "/Vegetables/index.html" as well as "/Dairy/index.html". Really, if only one page would go into any given directory along your tree, it doesn't require a top-level directory of its own - a page will suffice. In my mind, if you don't have at least two pages to put into a given directory, then the directory isn't worth having.

Long run, though: mathematically, site.com/1/2/3.htm and site.com/123.htm should rank the same if they were identical and linked to identically (at least in Google, and from what I know about Yahoo's ranking, though the jury is still out); it shouldn't make a lick of difference. Not sure about MSN, but I'm also not sure they're off to a very good start anyway. There's no decent clustering going on, and a lot of the searches I tried on their demo site showed me 6-8 pages from the SAME SITE in the top 10 slots. So, if I go to the one from site1.com that's listed first for my term and it's not about what I want, and it doesn't look like it's going to link to what I want, I shouldn't have to scan past 7 more pages that I've already decided aren't relevant just to get to the next site that may have my answer... Time will tell, but if that's their finished product, they're in for a long winter.

G.

#12 callback

    Gravity Master Member

  • Members
  • 122 posts

Posted 05 August 2004 - 05:34 PM

have a page called "produce.html" that took care of the produce section (thus eliminating the need for that directory). Then produce.html would link to "/Vegetables/index.html" as well as "/Dairy/index.html".


If I name it Produce.htm instead of .html, will that make a difference in "taking care of that directory"? If I have tomato and broccoli pages under Produce.html, should I name them tomato.htm rather than tomato.html?

This might be a stupid question, but I just want to be clear about it.

Didn't realize my post would spark so much hot discussion.

Thanks everyone.

#13 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9023 posts

Posted 05 August 2004 - 05:38 PM

htm and html work equally well AFAIK. It's just a matter of personal preference, I believe.

#14 Grumpus

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 6351 posts

Posted 05 August 2004 - 09:25 PM

I don't think I'd mix and match "htm" and "html" on the same site, though. As Barry says, it doesn't make a difference which one, but produce.htm and produce.html would be different files as far as any human or search engine is concerned. It could inspire confusion on either side, as well.

I should also point out that in Barry's example above (which I said wouldn't be true), it probably WOULD be true if the filename were shorter and the site weren't huge. Maybe not for the exact reasons given, but several of my "debunking" points were misleading in that I took them out of the context of the actual example. Then again, machines aren't particularly good at identifying the context of a situation. My examples give a way to help the SEs do that - the layout is set up to match the way a machine can be taught to "think" - and thus it is useful. (And it'll become more useful if more people design that way.)

End of the day, no matter what you are doing in just about anything in life - consistency is the key. If you do the wrong thing consistently, it's probably better for you (when dealing with robots, anyway) than doing the right thing once in a while.

G.

#15 ac

    Light Speed Member

  • Members
  • 917 posts

Posted 05 August 2004 - 09:52 PM

Barry, fight R & G? I don't think even Larry & Sergey would try that. Don't worry, Barry, no one will even try. Remember, they scare the hell out of us common members!

You keep those guys straight for the rest of us Barry.

#16 Ron Carnell

    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 2065 posts

Posted 06 August 2004 - 12:33 AM

LOL. See, I told ya not everyone would agree. Structure for SEO is almost as controversial as structure for SEMPO these days. :)

G, I've read your theories on link structure several times in the past year, including your sequential numbering advice, and have never had sufficient time to respond in depth. Unfortunately, I still don't have time, at least not tonight. Suffice it to say, I think you are crediting software with far more intelligence than it has. It would certainly not be difficult, however, to determine one way or another. Simply throw an optimized page about Camaros into your well established Ford directory, and wait to see what the search engines deduce. If your theory is correct, the SE will think the page is still about a Ford. I'm betting they would know it's about a Chevy. (It occurs to me that if you are right, throwing a Camaro page into the Ford directory amounts to what humorists call "the sudden twist." Which gives rise to an insane image of Googlebot suddenly bursting into laughter when it gets the punch line.)

Barry, I'm going to have to play a bit with that MSN feature that filters by directory level. It throws a kink in my theory that search engines don't even bother to parse or think in terms of directories. Frankly, I agree with G on this one, in that I would see that as a weakness, not a strength. For example, you said:

That's based on the view that if a web page is in the root directory, it is likely to be more authoritative in the eyes of the website creator than if they buried it deep in a folder structure.

If this were true, no web site on Geocities or Tripod would stand much chance of ever ranking for anything. Yet, they do (though not as often as they once did).

I would almost agree with you on this if you changed it from directories to clicks, though. Sites that don't begin in a traditional root, such as on free servers, still have a "home page" in the sense that one page in a site always terminates a link structure. That's why you can put dozens of sites on the same domain, or thousands as in the case of Geocities, and still have each one be recognizable as a distinct site. The link structure reveals identity.

Not incidentally, back in the mid- to late-Nineties, search engines like Excite and AV did give a tremendous boost in relevance to home pages. So much so that it was hard to get any other pages in a site to rank well at all. This led to some wild tricks as we all tried to optimize one page for everything relevant to our whole site. I am NOT sorry to see those days pass into history!

I also would like to clarify that my post comparing directory versus link structures was about PageRank, not relevance. How many clicks you are from your most important page (home) will affect PR simply because of the way PR naturally flows. It should NOT, however, have any appreciable effect on relevance. On the contrary, I think for a well structured and optimized site, any type of hierarchical query such as "brand product model action" will likely result in a deeply buried page -- because that's where the relevant information will be found.

Good discussion, guys! :)

#17 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9023 posts

Posted 06 August 2004 - 06:50 AM

Yes, Ron, I think we're all pretty close in our thinking but with a few small differences.

I've done tests, Grumpus, and I agree that the fact that a word is bolded as part of a composite domain name in a SERP does not confirm that the composite name has been parsed within the Search Engine algorithm. This can be checked by looking at the Cached version of a page from any SERP. For example, one of my clients, a company called Frank Ralphs, comes up as #1 in a search for its name, 'Frank Ralphs'. Its domain name in the SERP appears as www.frankralphs.com. However if you check the Cached version, this does not parse the word FrankRalphs into its two parts within the text of the page. The words Frank and Ralphs only appear in links to the page.

So that leaves me with only one question: are we sure that URLs play any part in the search engine algorithm? Has anyone done any testing on that?

#18 Tony

    Gravity Master Member

  • Members
  • 116 posts

Posted 06 August 2004 - 10:30 AM

if a web page is in the root directory, it is likely to be more authoritative in the eyes of the website creator than if they buried it deep in a folder structure.


But do search engines agree or follow that logic?
The all-in-root or folders/directories for static pages is an interesting topic, I think.

On 2 sites I have different reviews of the same film:
http://www.zone-sf.c...rotherwolf.html
http://www.videovist...rotherhood.html
Yet Google gives the 1st one a higher PR.

I mention this only in case it's relevant to, or an example of how views stated above may be right or wrong.

In this particular case, I'm wondering if linking these 2 reviews (with "read another review") would boost or change their PR?

#19 Grumpus

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 6351 posts

Posted 06 August 2004 - 11:22 AM

If your theory is correct, the SE will think the page is still about a Ford.


Not if the word isn't on the page. But it would help the page to be in that directory with the more general term of "cars". It's not structure alone that works here; structure is just one tool that can be used to give it an extra punch. The biggest effect is visible when a page is "new" to the index: it's got more power if structure (both linking and site-wise) all matches up, because that cuts back on the math needed. The more a page ages, the more the playing field levels out.

G.

#20 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9023 posts

Posted 06 August 2004 - 11:34 AM

Tony, it's difficult to be sure in the example you give. We are talking about how high in the SERP for a search on "Brotherhood of the Wolf" each of the two web pages would rank if they were identical and in the same domain. Now of course they're not.

I did a little bit of testing, but I'm not sure what I learned from it.

Your first web page has a PR 3 and no backlinks. Your second web page has a PR 2 and 2 backlinks. However, you can never put too much faith in either the Google Toolbar PageRank or the Google link: search.

Your second web page turned up at # 134 but I couldn't find your first in the first 400. As I said, it's difficult to see what we learn from all this. ;)

#21 Tony

    Gravity Master Member

  • Members
  • 116 posts

Posted 06 August 2004 - 12:05 PM

Thank you for that, Barry.
;)

it's difficult to see what we learn from all this.


Yes. It's certainly a problem, especially if (like me) you're trying to get to grips with SEO ideas and whatnot, and finding these matters confused (and apparently confusing even for those with some expertise!).
:roll: :|

#22 Grumpus

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 6351 posts

Posted 06 August 2004 - 04:21 PM

Remember - only links of PR4 or higher "show" as backlinks, but links that have a lower PR still count. Have enough links from PR1 pages and you could (technically) reach a PR9 or 10 - and Google would still report zero backlinks. (Actually, I'm not sure you could reach PR9 or 10 with only PR1 links - mathematically there probably aren't enough PR1 pages on the web to do it even if you had a link from every one of them... But it is "technically" possible...)
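For reference, the published PageRank formula is PR(A) = (1 - d) + d * (PR(1)/C(1) + ... + PR(n)/C(n)), where d is the damping factor (commonly 0.85) and C(i) is the number of outbound links on linking page i. A one-step sketch shows how many low-PR links can in principle outweigh one strong link. (Real PR is computed iteratively over the whole web, and toolbar PR is on a roughly logarithmic scale, so these raw values are not toolbar numbers; the link counts below are made up.)

```python
def pagerank_share(incoming, damping=0.85):
    """One iteration of the published PageRank formula for a single page.

    `incoming` is a list of (PR, outbound_link_count) pairs, one per
    page that links here."""
    return (1 - damping) + damping * sum(pr / out for pr, out in incoming)

# One link from a PR6 page with 10 outbound links...
one_big = pagerank_share([(6.0, 10)])           # 0.15 + 0.85*0.6 = 0.66
# ...versus fifty links from PR1 pages with 10 outbound links each:
many_small = pagerank_share([(1.0, 10)] * 50)   # 0.15 + 0.85*5.0 = 4.40
```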

G.

#23 Respree

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 5901 posts

Posted 06 August 2004 - 04:29 PM

only links of PR4 or higher "show" as backlinks,


I believe there have been some recent changes to the filter.
http://forums.search...hread.php?t=668

#24 Grumpus

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 6351 posts

Posted 06 August 2004 - 04:42 PM

Yup. Good catch. I usually write the above sentence like this:

only links of PR4 or higher "show" as backlinks (last I checked)

I was too lazy to do it this time and I got caught. :D

G.

#25 DigitalRoad

    Ready To Fly Member

  • Members
  • 27 posts

Posted 07 August 2004 - 10:18 AM

So that leaves me with only one question: are we sure that URLs play any part in the search engine algorithm? Has anyone done any testing on that?


Absolutely, keywords within a URL are a factor in the G algo. I've tested using keywords that appeared only within the URL and were not found in anchor text or on the page. I also tested based on occurrence within a string versus using hyphens. Within-string occurrence (above two words) provided no algo boost, as you and Grumpus point out, although it's clear there is after-the-fact recognition. That's why I believe it's just a matter of time before that factor is integrated into the algo. Frankly, it doesn't make sense not to have it in there already. I'm currently testing within-string occurrence for two keywords, as Ron suggested non-hyphenated URL parsing might be occurring as an algo factor at the two-keyword level.

more authoritative in the eyes of the website creator than if they buried it deep in a folder structure... If I was building a search engine, that might well be one of the secondary factors in my long list of over 100 factors considered in my algorithm.


I can't convince you that authoritative/importance is relative to a search query eh? ;)


Jon

#26 bwelford

    Peacekeeper Administrator

  • Site Administrators
  • 9023 posts

Posted 07 August 2004 - 10:27 AM

I can't convince you that authoritative/importance is relative to a search query eh?

I don't need convincing. I believe that too. ;)

#27 kelvinbrown

    Unlurked Energy

  • Members
  • 5 posts

Posted 07 August 2004 - 05:36 PM

This being my first post, I am not trying to make a lot of noise.

I do have a site with lots of categories / folders.

As someone mentioned earlier, do what is and will be easier regardless of what any particular search engine is doing. I noticed a lot of the previous comments reference Google and PageRank. If you intend an especially large site to be around for a while, it will be much better to organize it now.

Who's to say Google will still be doing things the same way two years from now? Build a good and easy structure now, and it will likely work for any dominant search engine in the future.

Just my 3 cents worth.

Kelvin

#28 BillSlawski

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 15644 posts

Posted 07 August 2004 - 05:50 PM

I think that's an excellent suggestion, Kelvin.

Good to have you aboard. Welcome to the forums.

#29 A.N.Onym

    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 4003 posts

Posted 10 January 2006 - 03:31 AM

Hello there

As stated in this thread, I'm working on a website and pondering whether I should bother with a folder structure or place all the files in the root directory (I guess the latter will be more cumbersome in the long run, though).

Has anything changed regarding this issue?

Is it still preferable to use a folder structure to improve usability and ease the stress on the webmaster, at the cost of SEO promotion?

The site I'm working on is rather small, but will be gaining pages rapidly over several years, to 1-3k pages I believe. The company offers several services, so creating a basic services folder might be reasonable: instead of creating pages like keyword1-keyword2-keyword3-keyword4-keyword5-services.shtml, create keyword1-keyword2-services/keyword4-keyword5.shtml and keyword1-keyword2-services/keyword6-keyword7.shtml instead.

What are your all views on this issue?

Any help will be appreciated. Thank you in advance.

P.S. Would Google be uncomfortable with me using numerous hyphens? Not that I use them to improve SEO, but to make URLs readable for both visitors and the webmaster (me, currently).

#30 DianeV

    Honored One Who Served Moderator Alumni

  • Hall Of Fame
  • 7216 posts

Posted 10 January 2006 - 08:12 AM

Those URLs don't sound too terribly horrible, and one has to name them something anyway.

With search engines, algorithms do change. It's probably best not to plan your strategy based on what's happening *today only*.

That said, I'd say that I, for one, don't want to look through a folder with 1-3,000 files in it. I think that would get tiresome the first time around.

#31 A.N.Onym

    Honored One Who Served Moderator Alumni

  • Invited Users For Labs
  • 4003 posts

Posted 17 January 2006 - 11:22 PM

Well, I decided to go with the folder structure, just because there would be files with the same keywords but in a different keyword order. Not that it would be a huge problem, but it'd be messy in the root folder.

The site structure, most likely, will be the following:
/
/services/ - basic pages about services (service pages)
/services/keyword1-keyword2/ - articles and additional information


