If you care about your website being found in search, you’ll want to help the search crawler do its job, or at the very minimum remove any obstacles under your control that can get in its way. The more efficiently the search engine bot crawls your site, the more of its content is likely to end up in the index. Here are some suggestions for file- and page-related site architecture issues and their solutions:
- Use descriptive file and directory names – avoid underscores as word separators; use hyphens instead (e.g., blue-widgets.html rather than blue_widgets.html).
- Limit directory depth – make your site’s directory structure shallow, no deeper than four child directories from the root.
- Limit physical page file size – keep your individual webpage files under 150 KB each.
- Use 301 (permanent) redirects for moved pages to avoid losing the ranking “link juice” those pages have already earned. Follow the guidelines for Windows Server Internet Information Services (IIS) or the Apache HTTP Server.
- Implement custom 404 pages – give visitors who follow a dead link a helpful error page with navigation back into your site, rather than the server’s default error message.
- Avoid other crawler traps – the search engine bot doesn’t see the Web as you do, so avoid frames on your website, forms, and authentication pages. To keep the search engine bot from going places that might trip it up, use the “noindex” robots meta tag to prevent indexing of whole pages.
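On Apache, the 301 redirect, custom 404 page, and noindex suggestions above can be sketched in a single `.htaccess` file. This is a minimal sketch assuming Apache 2.4 with mod_alias and mod_headers enabled; the file names and paths are placeholders, not taken from the text:

```apache
# Hypothetical .htaccess sketch – all paths below are placeholders.

# 301 (permanent) redirect for a page that has moved,
# so earned "link juice" follows the page to its new URL.
Redirect 301 /old-page.html /new-page.html

# Serve a custom 404 page instead of the server default.
ErrorDocument 404 /custom-404.html

# Keep a specific page out of the index via an HTTP header,
# equivalent in effect to a "noindex" robots meta tag on the page.
<Files "members-only.html">
    Header set X-Robots-Tag "noindex"
</Files>
```

Alternatively, the noindex directive can go directly in the page itself as `<meta name="robots" content="noindex">` inside the `<head>`; the header approach shown here is useful when you can’t edit the page markup.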