One of the problems large websites encounter is getting new or recently revised pages into the index. Large websites also tend to be very deep, which compounds the problem: links to these pages are often buried two, three or more levels down, making them harder to index and rank.
If you have an existing large site but aren’t adding or updating pages often, the easiest approach is to link to these pages directly from your home page. Adding a “what’s new” and a “recently updated” section, each with 5-10 links using descriptive anchor text, is usually enough to jump-start the indexing process. However, if you are adding or revising a significant number of pages, this strategy won’t work: you’ll be swapping out the links before the pages have had a chance to be properly indexed.
For sites with a higher volume or frequency of updates, a more effective strategy is to use mini sitemaps. Create a mini sitemap page for both “what’s new” and “recently updated”, and add links to them in the common navigation template. The sidebar or footer is generally the preferred location, but the top or masthead navigation will work as well. A plain list of links will do; however, you can make the pages more aesthetically pleasing and useful to visitors by including a snippet of text from each page.
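As a rough sketch of what such a mini sitemap page might look like when generated, the snippet below renders a list of links with short text excerpts. The page data (URLs, titles, snippets) is hypothetical; in practice it would come from your CMS database.

```python
# Sketch: render a "recently updated" mini sitemap as an HTML list.
# Page data here is hypothetical placeholder content.
from html import escape

def render_mini_sitemap(pages):
    """Return an HTML <ul> of links, each followed by a short snippet."""
    items = []
    for page in pages:
        items.append(
            '<li><a href="{url}">{title}</a><p>{snippet}</p></li>'.format(
                url=escape(page["url"], quote=True),
                title=escape(page["title"]),            # anchor text carries the keywords
                snippet=escape(page["snippet"][:160]),  # keep the excerpt short
            )
        )
    return '<ul class="mini-sitemap">\n' + "\n".join(items) + "\n</ul>"

pages = [
    {"url": "/widgets/new-model", "title": "New Widget Model",
     "snippet": "Our latest widget adds several improvements..."},
]
print(render_mini_sitemap(pages))
```

The same function can render both the “what’s new” and “recently updated” pages; only the query feeding it differs.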
The final part of this strategy is to use dedicated XML sitemaps with a frequent ping schedule. Sign up for webmaster central accounts at Google and Bing. After verifying your account, create one XML sitemap for new pages and another for recently updated pages, and submit both to each service.
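A minimal sketch of building such an XML sitemap with only the standard library follows; the URLs and dates are placeholders, and the format follows the sitemaps.org protocol.

```python
# Sketch: build an XML sitemap for new or recently updated pages.
# Entries below are placeholders -- real ones would come from your CMS.
import xml.etree.ElementTree as ET

def build_sitemap(entries):
    """entries: list of (loc, lastmod) tuples -> sitemap XML string."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod  # W3C date format (YYYY-MM-DD)
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    ("http://www.example.com/new-page.html", "2011-05-01"),
]))
```

Write the result out as, say, `sitemap-new.xml` and `sitemap-updated.xml`, and register both files in each webmaster account.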
This is going to create a maintenance point, and depending on how much and how often you update, it could be a large one. To make the process as easy as possible, look into automated solutions. Generate the mini sitemaps from the database: most CMS systems track when content is created and edited, so use that information to build the pages. Use cron jobs to rebuild the XML sitemaps daily and to issue pings to each of the services.
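The ping half of that cron job can be sketched as below. The sitemap URL is a placeholder, and the ping endpoints shown are the commonly documented ones for Google and Bing; verify the current URLs in each service’s webmaster documentation before relying on them.

```python
# Sketch: ping search engines after a sitemap rebuild, meant to be run
# from cron, e.g.:
#   15 3 * * * /usr/bin/python3 /var/www/scripts/ping_sitemaps.py
# SITEMAP_URL is a placeholder; the rebuild step itself is assumed to
# run earlier in the same job.
import urllib.parse
import urllib.request

SITEMAP_URL = "http://www.example.com/sitemap-updated.xml"

PING_ENDPOINTS = [
    "http://www.google.com/ping?sitemap=",
    "http://www.bing.com/ping?sitemap=",
]

def ping_urls(sitemap_url):
    """Build one ping URL per service; the sitemap URL must be escaped."""
    return [base + urllib.parse.quote(sitemap_url, safe="")
            for base in PING_ENDPOINTS]

def ping_all(sitemap_url):
    """Issue the pings; a 200 response means the service accepted it."""
    for url in ping_urls(sitemap_url):
        urllib.request.urlopen(url)

if __name__ == "__main__":
    ping_all(SITEMAP_URL)
```

Run it after the sitemap files are regenerated so the services are told about a file that already reflects the latest changes.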