I am specifically thinking about a site that is a bit like a directory.
It relies on people finding pages via a category menu, but each page could belong to four or five categories.
Clicking on any category archive brings up a list of page excerpts, so the archives would inevitably share quite a lot of duplicate content with one another.
So my question is whether to index all the category archives and hope that Google doesn't penalise the duplication, or to noindex them, in which case there will be very little left for Google to index: by the nature of the site (a sort of directory) the pages themselves are quite thin, and any attempt to pad them with extra content would be irrelevant to the user.
The point of the site:
For people to find audio tracks they can license. Each track has a page with the audio file and track info.
What I need is for all pages in any one category to display as excerpts on the archive page, so users can immediately play the audio and compare.
They can either search the site by keyword or browse a category menu (a category could be a genre, an era or a mood). The thing is, a track could fall under several overlapping categories, so the category archives may share a lot of duplicate content.
For example, a slow blues track might be in all of these categories: blues, 40s, slow, sad, sleazy.
There could be about fifty categories, plus a catch-all category that everything is in.
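To make the overlap concrete, here is a minimal sketch of why the archives duplicate each other: each track is listed once per category it belongs to, so archives for related categories end up showing mostly the same excerpts. The track titles (other than the categories taken from the blues example above) are made up for illustration.

```python
# Hypothetical tracks mapped to their categories (titles are invented;
# the categories for the first track follow the slow-blues example).
tracks = {
    "Slow Blues No. 3": ["blues", "40s", "slow", "sad", "sleazy"],
    "Smoky Lament": ["blues", "slow", "sad"],
}

# Build category -> list of track excerpts, mirroring the archive pages.
archives = {}
for title, cats in tracks.items():
    for cat in cats:
        archives.setdefault(cat, []).append(title)

# Both tracks appear on the "blues", "slow" and "sad" archives, so those
# three pages share most of their content.
overlap = set(archives["blues"]) & set(archives["slow"]) & set(archives["sad"])
print(sorted(overlap))
```

With about fifty categories and every track in four or five of them (plus the catch-all), this overlap multiplies across many archive pairs, which is the duplicate-content worry.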
Edited by Pete, 31 May 2016 - 06:05 AM.