Just a short, simple blog for Bob to share his thoughts.
28 February 2014 • by Bob • URL Rewrite, SEO, Classic ASP
In December of 2012 I wrote a blog titled "Using Classic ASP and URL Rewrite for Dynamic SEO Functionality", in which I described how to use Classic ASP and URL Rewrite to dynamically generate Robots.txt and Sitemap.xml or Sitemap.txt files. I received a bunch of requests for additional information, so I wrote a follow-up blog this past November titled "Revisiting My Classic ASP and URL Rewrite for Dynamic SEO Functionality Examples", which illustrated how to limit the output from the Robots.asp and Sitemap.asp files to specific sections of your website.
That being said, I continue to receive requests for additional ways to stretch those samples, so I thought that I would write at least a couple of blogs on the subject. With that in mind, today I want to show how you can add additional content types to the samples.
Here is a common question that I have been asked by several people:
"The example only works with *.html files; how do I include my other files?"
That's a great question: additional content types are really easy to implement, and the majority of the code from my original blog remains unchanged. Here's the file-by-file breakdown of the changes that need to be made:
| Filename | Changes |
|---|---|
| Robots.asp | None |
| Sitemap.asp | See the sample later in this blog |
| Web.config | None |
If you are already using the files from my original blog, no changes need to be made to your Robots.asp file or the URL Rewrite rules in your Web.config file because the updates in this blog will only impact the output from the Sitemap.asp file.
My original sample contained a line of code which read "If StrComp(strExt,"html",vbTextCompare)=0 Then", and that line restricted the sitemap output to static *.html files. For this new sample I need to make two changes: first, define a constant that holds a delimited list of the file extensions to include; second, replace the StrComp() test with an InStr() test that checks whether each file's extension appears in that list.
Note: I define the constant near the beginning of the file so it's easier for other people to find; I would normally define that constant elsewhere in the code.
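Here are the two changed lines, excerpted from the full listing that follows:

```vbscript
' Change 1: a delimited list of the file extensions to include in the sitemap.
Const strContentTypes = "htm|html|asp|aspx|txt"

' Change 2: match each file's extension against the list instead of testing for "html" only.
If InStr(1,strContentTypes,strExt,vbTextCompare) Then
```

And here is the complete, updated Sitemap.asp file: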
```vbscript
<%
    Option Explicit
    On Error Resume Next

    Const strContentTypes = "htm|html|asp|aspx|txt"

    Response.Clear
    Response.Buffer = True
    Response.AddHeader "Connection", "Keep-Alive"
    Response.CacheControl = "public"

    Dim strFolderArray, lngFolderArray
    Dim strUrlRoot, strPhysicalRoot, strFormat
    Dim strUrlRelative, strExt
    Dim objFSO, objFolder, objFile

    strPhysicalRoot = Server.MapPath("/")
    Set objFSO = Server.CreateObject("Scripting.Filesystemobject")

    strUrlRoot = "http://" & Request.ServerVariables("HTTP_HOST")

    ' Check for XML or TXT format.
    If UCase(Trim(Request("format")))="XML" Then
        strFormat = "XML"
        Response.ContentType = "text/xml"
    Else
        strFormat = "TXT"
        Response.ContentType = "text/plain"
    End If

    ' Add the UTF-8 Byte Order Mark.
    Response.Write Chr(CByte("&hEF"))
    Response.Write Chr(CByte("&hBB"))
    Response.Write Chr(CByte("&hBF"))

    If strFormat = "XML" Then
        Response.Write "<?xml version=""1.0"" encoding=""UTF-8""?>" & vbCrLf
        Response.Write "<urlset xmlns=""http://www.sitemaps.org/schemas/sitemap/0.9"">" & vbCrLf
    End If

    ' Always output the root of the website.
    Call WriteUrl(strUrlRoot,Now,"weekly",strFormat)

    ' --------------------------------------------------
    ' The following section contains the logic to parse
    ' the directory tree and return URLs based on the
    ' files that it locates.
    ' --------------------------------------------------
    strFolderArray = GetFolderTree(strPhysicalRoot)

    For lngFolderArray = 1 to UBound(strFolderArray)
        strUrlRelative = Replace(Mid(strFolderArray(lngFolderArray),Len(strPhysicalRoot)+1),"\","/")
        Set objFolder = objFSO.GetFolder(Server.MapPath("." & strUrlRelative))
        For Each objFile in objFolder.Files
            strExt = objFSO.GetExtensionName(objFile.Name)
            If InStr(1,strContentTypes,strExt,vbTextCompare) Then
                If StrComp(Left(objFile.Name,6),"google",vbTextCompare)<>0 Then
                    Call WriteUrl(strUrlRoot & strUrlRelative & "/" & objFile.Name, objFile.DateLastModified, "weekly", strFormat)
                End If
            End If
        Next
    Next

    ' --------------------------------------------------
    ' End of file system loop.
    ' --------------------------------------------------
    If strFormat = "XML" Then
        Response.Write "</urlset>"
    End If

    Response.End

    ' ======================================================================
    '
    ' Outputs a sitemap URL to the client in XML or TXT format.
    '
    ' tmpStrFreq = always|hourly|daily|weekly|monthly|yearly|never
    ' tmpStrFormat = TXT|XML
    '
    ' ======================================================================

    Sub WriteUrl(tmpStrUrl,tmpLastModified,tmpStrFreq,tmpStrFormat)
        On Error Resume Next
        Dim tmpDate : tmpDate = CDate(tmpLastModified)
        ' Check if the request is for XML or TXT and return the appropriate syntax.
        If tmpStrFormat = "XML" Then
            Response.Write " <url>" & vbCrLf
            Response.Write " <loc>" & Server.HtmlEncode(tmpStrUrl) & "</loc>" & vbCrLf
            Response.Write " <lastmod>" & Year(tmpLastModified) & "-" & Right("0" & Month(tmpLastModified),2) & "-" & Right("0" & Day(tmpLastModified),2) & "</lastmod>" & vbCrLf
            Response.Write " <changefreq>" & tmpStrFreq & "</changefreq>" & vbCrLf
            Response.Write " </url>" & vbCrLf
        Else
            Response.Write tmpStrUrl & vbCrLf
        End If
    End Sub

    ' ======================================================================
    '
    ' Returns a string array of folders under a root path
    '
    ' ======================================================================

    Function GetFolderTree(strBaseFolder)
        Dim tmpFolderCount,tmpBaseCount
        Dim tmpFolders()
        Dim tmpFSO,tmpFolder,tmpSubFolder

        ' Define the initial values for the folder counters.
        tmpFolderCount = 1
        tmpBaseCount = 0

        ' Dimension an array to hold the folder names.
        ReDim tmpFolders(1)

        ' Store the root folder in the array.
        tmpFolders(tmpFolderCount) = strBaseFolder

        ' Create file system object.
        Set tmpFSO = Server.CreateObject("Scripting.Filesystemobject")

        ' Loop while we still have folders to process.
        While tmpFolderCount <> tmpBaseCount
            ' Set up a folder object to a base folder.
            Set tmpFolder = tmpFSO.GetFolder(tmpFolders(tmpBaseCount+1))
            ' Loop through the collection of subfolders for the base folder.
            For Each tmpSubFolder In tmpFolder.SubFolders
                ' Increment the folder count.
                tmpFolderCount = tmpFolderCount + 1
                ' Increase the array size.
                ReDim Preserve tmpFolders(tmpFolderCount)
                ' Store the folder name in the array.
                tmpFolders(tmpFolderCount) = tmpSubFolder.Path
            Next
            ' Increment the base folder counter.
            tmpBaseCount = tmpBaseCount + 1
        Wend

        GetFolderTree = tmpFolders
    End Function
%>
```
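For reference, each entry that WriteUrl() emits in the XML-formatted output looks something like the following; the URL and date shown here are illustrative:

```xml
<url>
 <loc>http://www.example.com/sales/default.asp</loc>
 <lastmod>2014-02-28</lastmod>
 <changefreq>weekly</changefreq>
</url>
```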
That's it. Pretty easy, eh?
I have also received several requests about creating a sitemap which contains URLs with query strings, but I'll cover that scenario in a later blog.
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
20 November 2013 • by Bob • Classic ASP, IIS, SEO, URL Rewrite
Last year I wrote a blog titled Using Classic ASP and URL Rewrite for Dynamic SEO Functionality, in which I described how you could combine Classic ASP and the URL Rewrite module for IIS to dynamically create Robots.txt and Sitemap.xml files for your website, thereby helping with your Search Engine Optimization (SEO) results. A few weeks ago I had a follow-up question which I thought was worth answering in a blog post.
Here is the question that I was asked:
"What if I don't want to include all dynamic pages in sitemap.xml but only a select few or some in certain directories because I don't want bots to crawl all of them. What can I do?"
That's a great question, and it wasn't tremendously difficult to update my original code samples to address this request. First of all, the majority of the code from my last blog remains unchanged; here's the file-by-file breakdown of the changes that need to be made:
| Filename | Changes |
|---|---|
| Robots.asp | None |
| Sitemap.asp | See the sample later in this blog |
| Web.config | None |
So if you are already using the files from my original blog, no changes need to be made to your Robots.asp file or the URL Rewrite rules in your Web.config file, because the question only concerns the files that are returned in the output for Sitemap.xml.
The good news is, I wrote most of the heavy-duty code in my last blog; there were only a few changes that needed to be made in order to accommodate the requested functionality. The main difference is that the original Sitemap.asp file had a section that recursively parsed the entire website and listed all of the files that it found, whereas this new version moves that section of code into a separate function to which you pass the name of a folder to parse recursively. This allows you to specify only those folders within your website that you want in the resulting sitemap output.
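For example, the sample below restricts the output to three folders by calling that function once per folder:

```vbscript
' Output only specific folders.
Call ParseFolder("/marketing")
Call ParseFolder("/sales")
Call ParseFolder("/hr/jobs")
```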
With that being said, here's the new code for the Sitemap.asp file:
```vbscript
<%
    Option Explicit
    On Error Resume Next

    Response.Clear
    Response.Buffer = True
    Response.AddHeader "Connection", "Keep-Alive"
    Response.CacheControl = "public"

    Dim strUrlRoot, strPhysicalRoot, strFormat
    Dim objFSO, objFolder, objFile

    strPhysicalRoot = Server.MapPath("/")
    Set objFSO = Server.CreateObject("Scripting.Filesystemobject")

    strUrlRoot = "http://" & Request.ServerVariables("HTTP_HOST")

    ' Check for XML or TXT format.
    If UCase(Trim(Request("format")))="XML" Then
        strFormat = "XML"
        Response.ContentType = "text/xml"
    Else
        strFormat = "TXT"
        Response.ContentType = "text/plain"
    End If

    ' Add the UTF-8 Byte Order Mark.
    Response.Write Chr(CByte("&hEF"))
    Response.Write Chr(CByte("&hBB"))
    Response.Write Chr(CByte("&hBF"))

    If strFormat = "XML" Then
        Response.Write "<?xml version=""1.0"" encoding=""UTF-8""?>" & vbCrLf
        Response.Write "<urlset xmlns=""http://www.sitemaps.org/schemas/sitemap/0.9"">" & vbCrLf
    End If

    ' Always output the root of the website.
    Call WriteUrl(strUrlRoot,Now,"weekly",strFormat)

    ' Output only specific folders.
    Call ParseFolder("/marketing")
    Call ParseFolder("/sales")
    Call ParseFolder("/hr/jobs")

    ' --------------------------------------------------
    ' End of file system loop.
    ' --------------------------------------------------
    If strFormat = "XML" Then
        Response.Write "</urlset>"
    End If

    Response.End

    ' ======================================================================
    '
    ' Recursively walks a folder path and return URLs based on the
    ' static *.html files that it locates.
    '
    ' strRootFolder = The base path for recursion
    '
    ' ======================================================================

    Sub ParseFolder(strParentFolder)
        On Error Resume Next

        Dim strChildFolders, lngChildFolders
        Dim strUrlRelative, strExt

        ' Get the list of child folders under a parent folder.
        strChildFolders = GetFolderTree(Server.MapPath(strParentFolder))

        ' Loop through the collection of folders.
        For lngChildFolders = 1 to UBound(strChildFolders)
            strUrlRelative = Replace(Mid(strChildFolders(lngChildFolders),Len(strPhysicalRoot)+1),"\","/")
            Set objFolder = objFSO.GetFolder(Server.MapPath("." & strUrlRelative))
            ' Loop through the collection of files.
            For Each objFile in objFolder.Files
                strExt = objFSO.GetExtensionName(objFile.Name)
                If StrComp(strExt,"html",vbTextCompare)=0 Then
                    If StrComp(Left(objFile.Name,6),"google",vbTextCompare)<>0 Then
                        Call WriteUrl(strUrlRoot & strUrlRelative & "/" & objFile.Name, objFile.DateLastModified, "weekly", strFormat)
                    End If
                End If
            Next
        Next
    End Sub

    ' ======================================================================
    '
    ' Outputs a sitemap URL to the client in XML or TXT format.
    '
    ' tmpStrFreq = always|hourly|daily|weekly|monthly|yearly|never
    ' tmpStrFormat = TXT|XML
    '
    ' ======================================================================

    Sub WriteUrl(tmpStrUrl,tmpLastModified,tmpStrFreq,tmpStrFormat)
        On Error Resume Next
        Dim tmpDate : tmpDate = CDate(tmpLastModified)
        ' Check if the request is for XML or TXT and return the appropriate syntax.
        If tmpStrFormat = "XML" Then
            Response.Write " <url>" & vbCrLf
            Response.Write " <loc>" & Server.HtmlEncode(tmpStrUrl) & "</loc>" & vbCrLf
            Response.Write " <lastmod>" & Year(tmpLastModified) & "-" & Right("0" & Month(tmpLastModified),2) & "-" & Right("0" & Day(tmpLastModified),2) & "</lastmod>" & vbCrLf
            Response.Write " <changefreq>" & tmpStrFreq & "</changefreq>" & vbCrLf
            Response.Write " </url>" & vbCrLf
        Else
            Response.Write tmpStrUrl & vbCrLf
        End If
    End Sub

    ' ======================================================================
    '
    ' Returns a string array of folders under a root path
    '
    ' ======================================================================

    Function GetFolderTree(strBaseFolder)
        Dim tmpFolderCount,tmpBaseCount
        Dim tmpFolders()
        Dim tmpFSO,tmpFolder,tmpSubFolder

        ' Define the initial values for the folder counters.
        tmpFolderCount = 1
        tmpBaseCount = 0

        ' Dimension an array to hold the folder names.
        ReDim tmpFolders(1)

        ' Store the root folder in the array.
        tmpFolders(tmpFolderCount) = strBaseFolder

        ' Create file system object.
        Set tmpFSO = Server.CreateObject("Scripting.Filesystemobject")

        ' Loop while we still have folders to process.
        While tmpFolderCount <> tmpBaseCount
            ' Set up a folder object to a base folder.
            Set tmpFolder = tmpFSO.GetFolder(tmpFolders(tmpBaseCount+1))
            ' Loop through the collection of subfolders for the base folder.
            For Each tmpSubFolder In tmpFolder.SubFolders
                ' Increment the folder count.
                tmpFolderCount = tmpFolderCount + 1
                ' Increase the array size.
                ReDim Preserve tmpFolders(tmpFolderCount)
                ' Store the folder name in the array.
                tmpFolders(tmpFolderCount) = tmpSubFolder.Path
            Next
            ' Increment the base folder counter.
            tmpBaseCount = tmpBaseCount + 1
        Wend

        GetFolderTree = tmpFolders
    End Function
%>
```
As you can see, the code is largely unchanged from my previous blog.
One last thing to consider: I didn't make any changes to the Robots.asp file in this blog. That being said, when you do not want specific paths crawled, you should also add rules to your Robots.txt file to disallow those paths. For example, here is a simple Robots.txt file which allows crawling of your entire website:
```
# Robots.txt
# For more information on this file see:
# http://www.robotstxt.org/

# Define the sitemap path
Sitemap: http://localhost:53644/sitemap.xml

# Make changes for all web spiders
User-agent: *
Allow: /
Disallow:
```
If you were going to deny crawling on certain paths, you would need to add the specific paths that you do not want crawled to your Robots.txt file like the following example:
```
# Robots.txt
# For more information on this file see:
# http://www.robotstxt.org/

# Define the sitemap path
Sitemap: http://localhost:53644/sitemap.xml

# Make changes for all web spiders
User-agent: *
Disallow: /foo
Disallow: /bar
```
With that being said, if you are using my Robots.asp file from my last blog, you would need to update the section of code that defines those paths, as in the following example:
Response.Write "# Make changes for all web spiders" & vbCrLf Response.Write "User-agent: *" & vbCrLf Response.Write "Disallow: /foo" & vbCrLf Response.Write "Disallow: /bar" & vbCrLf
I hope this helps. ;-]
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
31 December 2012 • by Bob • IIS, URL Rewrite, SEO, Classic ASP
I had another interesting situation present itself recently that I thought would make a good blog: how to use Classic ASP with the IIS URL Rewrite module to dynamically generate Robots.txt and Sitemap.xml files.
Here's the situation: I host a website for one of my family members, and like everyone else on the Internet, he wanted some better SEO rankings. We discussed a few things that he could do to improve his visibility with search engines, and one of the suggestions that I gave him was to keep his Robots.txt and Sitemap.xml files up-to-date. But there was an additional caveat - he uses two separate DNS names for the same website, and that presents a problem for absolute URLs in either of those files. Before anyone points out that it's usually not a good idea to host multiple DNS names on the same content, there are times when this is acceptable; for example, if you are trying to decide which of several DNS names is the best to use, you might want to bind each name to the same IP address and parse your logs to find out which address is getting the most traffic.
In any event, the syntax for both Robots.txt and Sitemap.xml files is pretty easy, so I wrote a couple of simple Classic ASP pages (Robots.asp and Sitemap.asp) that output the correct syntax and DNS-specific URLs for each domain name, and I wrote some simple URL Rewrite rules that rewrite inbound requests for the Robots.txt and Sitemap.xml files to those ASP pages, while blocking direct access to the Classic ASP pages themselves.
All of that being said, there are a couple of quick things that I would like to mention before I get to the code:
That being said, let's move on to the actual code.
There are three files that you will need to create for this example:

- Robots.asp
- Sitemap.asp
- Web.config
You need to save the following code sample as Robots.asp in the root of your website; this page will be executed whenever someone requests the Robots.txt file for your website. This example is very simple: it checks for the requested hostname and uses that to dynamically create the absolute URL for the website's Sitemap.xml file.
```vbscript
<%
    Option Explicit
    On Error Resume Next

    Dim strUrlRoot
    Dim strHttpHost
    Dim strUserAgent

    Response.Clear
    Response.Buffer = True
    Response.ContentType = "text/plain"
    Response.CacheControl = "public"

    Response.Write "# Robots.txt" & vbCrLf
    Response.Write "# For more information on this file see:" & vbCrLf
    Response.Write "# http://www.robotstxt.org/" & vbCrLf & vbCrLf

    strHttpHost = LCase(Request.ServerVariables("HTTP_HOST"))
    strUserAgent = LCase(Request.ServerVariables("HTTP_USER_AGENT"))
    strUrlRoot = "http://" & strHttpHost

    Response.Write "# Define the sitemap path" & vbCrLf
    Response.Write "Sitemap: " & strUrlRoot & "/sitemap.xml" & vbCrLf & vbCrLf

    Response.Write "# Make changes for all web spiders" & vbCrLf
    Response.Write "User-agent: *" & vbCrLf
    Response.Write "Allow: /" & vbCrLf
    Response.Write "Disallow: " & vbCrLf

    Response.End
%>
```
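Because strUrlRoot is built from the HTTP_HOST server variable, the Sitemap line in the generated output automatically matches whichever host name was requested. For example, with two DNS names bound to the same site, a request to each name would get a host-specific line like one of the following (these host names are purely illustrative):

```
Sitemap: http://www.example.com/sitemap.xml
Sitemap: http://www.example.net/sitemap.xml
```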
The following example file is also pretty simple, and you would save this code as Sitemap.asp in the root of your website. There is a section in the code where it loops through the file system looking for files with the *.html file extension and only creates URLs for those files. If you want other files included in your results, or you want to change the code from static to dynamic content, this is where you would need to update the file accordingly.
```vbscript
<%
    Option Explicit
    On Error Resume Next

    Response.Clear
    Response.Buffer = True
    Response.AddHeader "Connection", "Keep-Alive"
    Response.CacheControl = "public"

    Dim strFolderArray, lngFolderArray
    Dim strUrlRoot, strPhysicalRoot, strFormat
    Dim strUrlRelative, strExt
    Dim objFSO, objFolder, objFile

    strPhysicalRoot = Server.MapPath("/")
    Set objFSO = Server.CreateObject("Scripting.Filesystemobject")

    strUrlRoot = "http://" & Request.ServerVariables("HTTP_HOST")

    ' Check for XML or TXT format.
    If UCase(Trim(Request("format")))="XML" Then
        strFormat = "XML"
        Response.ContentType = "text/xml"
    Else
        strFormat = "TXT"
        Response.ContentType = "text/plain"
    End If

    ' Add the UTF-8 Byte Order Mark.
    Response.Write Chr(CByte("&hEF"))
    Response.Write Chr(CByte("&hBB"))
    Response.Write Chr(CByte("&hBF"))

    If strFormat = "XML" Then
        Response.Write "<?xml version=""1.0"" encoding=""UTF-8""?>" & vbCrLf
        Response.Write "<urlset xmlns=""http://www.sitemaps.org/schemas/sitemap/0.9"">" & vbCrLf
    End If

    ' Always output the root of the website.
    Call WriteUrl(strUrlRoot,Now,"weekly",strFormat)

    ' --------------------------------------------------
    ' The following section contains the logic to parse
    ' the directory tree and return URLs based on the
    ' static *.html files that it locates. This is where
    ' you would change the code for dynamic content.
    ' --------------------------------------------------
    strFolderArray = GetFolderTree(strPhysicalRoot)

    For lngFolderArray = 1 to UBound(strFolderArray)
        strUrlRelative = Replace(Mid(strFolderArray(lngFolderArray),Len(strPhysicalRoot)+1),"\","/")
        Set objFolder = objFSO.GetFolder(Server.MapPath("." & strUrlRelative))
        For Each objFile in objFolder.Files
            strExt = objFSO.GetExtensionName(objFile.Name)
            If StrComp(strExt,"html",vbTextCompare)=0 Then
                If StrComp(Left(objFile.Name,6),"google",vbTextCompare)<>0 Then
                    Call WriteUrl(strUrlRoot & strUrlRelative & "/" & objFile.Name, objFile.DateLastModified, "weekly", strFormat)
                End If
            End If
        Next
    Next

    ' --------------------------------------------------
    ' End of file system loop.
    ' --------------------------------------------------
    If strFormat = "XML" Then
        Response.Write "</urlset>"
    End If

    Response.End

    ' ======================================================================
    '
    ' Outputs a sitemap URL to the client in XML or TXT format.
    '
    ' tmpStrFreq = always|hourly|daily|weekly|monthly|yearly|never
    ' tmpStrFormat = TXT|XML
    '
    ' ======================================================================

    Sub WriteUrl(tmpStrUrl,tmpLastModified,tmpStrFreq,tmpStrFormat)
        On Error Resume Next
        Dim tmpDate : tmpDate = CDate(tmpLastModified)
        ' Check if the request is for XML or TXT and return the appropriate syntax.
        If tmpStrFormat = "XML" Then
            Response.Write " <url>" & vbCrLf
            Response.Write " <loc>" & Server.HtmlEncode(tmpStrUrl) & "</loc>" & vbCrLf
            Response.Write " <lastmod>" & Year(tmpLastModified) & "-" & Right("0" & Month(tmpLastModified),2) & "-" & Right("0" & Day(tmpLastModified),2) & "</lastmod>" & vbCrLf
            Response.Write " <changefreq>" & tmpStrFreq & "</changefreq>" & vbCrLf
            Response.Write " </url>" & vbCrLf
        Else
            Response.Write tmpStrUrl & vbCrLf
        End If
    End Sub

    ' ======================================================================
    '
    ' Returns a string array of folders under a root path
    '
    ' ======================================================================

    Function GetFolderTree(strBaseFolder)
        Dim tmpFolderCount,tmpBaseCount
        Dim tmpFolders()
        Dim tmpFSO,tmpFolder,tmpSubFolder

        ' Define the initial values for the folder counters.
        tmpFolderCount = 1
        tmpBaseCount = 0

        ' Dimension an array to hold the folder names.
        ReDim tmpFolders(1)

        ' Store the root folder in the array.
        tmpFolders(tmpFolderCount) = strBaseFolder

        ' Create file system object.
        Set tmpFSO = Server.CreateObject("Scripting.Filesystemobject")

        ' Loop while we still have folders to process.
        While tmpFolderCount <> tmpBaseCount
            ' Set up a folder object to a base folder.
            Set tmpFolder = tmpFSO.GetFolder(tmpFolders(tmpBaseCount+1))
            ' Loop through the collection of subfolders for the base folder.
            For Each tmpSubFolder In tmpFolder.SubFolders
                ' Increment the folder count.
                tmpFolderCount = tmpFolderCount + 1
                ' Increase the array size.
                ReDim Preserve tmpFolders(tmpFolderCount)
                ' Store the folder name in the array.
                tmpFolders(tmpFolderCount) = tmpSubFolder.Path
            Next
            ' Increment the base folder counter.
            tmpBaseCount = tmpBaseCount + 1
        Wend

        GetFolderTree = tmpFolders
    End Function
%>
```
Note: There are two helper methods in the preceding example that I should call out:

- WriteUrl() - outputs a single sitemap URL to the client in XML or TXT format, including the last-modified date and change frequency.
- GetFolderTree() - returns a string array of the folders under a root path; the main loop uses this array to walk the website's directory tree.
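For instance, if you also wanted Classic ASP pages included in the sitemap, one minimal tweak would be to widen the extension test inside the file loop. This is just a sketch of the idea; the 28 February 2014 post above shows a more flexible approach that uses a delimited list of extensions:

```vbscript
' Include both *.html and *.asp files in the sitemap output.
If StrComp(strExt,"html",vbTextCompare)=0 Or StrComp(strExt,"asp",vbTextCompare)=0 Then
```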
The last step is to add the URL Rewrite rules to the Web.config file in the root of your website. The following example is a complete Web.config file, but you could merge the rules into your existing Web.config file if you have already created one for your website. These rules are pretty simple: they rewrite all inbound requests for Robots.txt to Robots.asp, they rewrite requests for Sitemap.xml to Sitemap.asp?format=XML, and they rewrite requests for Sitemap.txt to Sitemap.asp?format=TXT; this allows requests for both the XML-based and text-based sitemaps to work, even though the Robots.txt file contains the path to the XML file. The last part of the URL Rewrite syntax returns HTTP 404 errors if anyone tries to send direct requests for either the Robots.asp or Sitemap.asp files; this isn't absolutely necessary, but I like to mask what I'm doing from prying eyes. (I'm kind of geeky that way.)
<?xml version="1.0" encoding="UTF-8"?> <configuration> <system.webServer> <rewrite> <rewriteMaps> <clear /> <rewriteMap name="Static URL Rewrites"> <add key="/robots.txt" value="/robots.asp" /> <add key="/sitemap.xml" value="/sitemap.asp?format=XML" /> <add key="/sitemap.txt" value="/sitemap.asp?format=TXT" /> </rewriteMap> <rewriteMap name="Static URL Failures"> <add key="/robots.asp" value="/" /> <add key="/sitemap.asp" value="/" /> </rewriteMap> </rewriteMaps> <rules> <clear /> <rule name="Static URL Rewrites" patternSyntax="ECMAScript" stopProcessing="true"> <match url=".*" ignoreCase="true" negate="false" /> <conditions> <add input="{Static URL Rewrites:{REQUEST_URI}}" pattern="(.+)" /> </conditions> <action type="Rewrite" url="{C:1}" appendQueryString="false" redirectType="Temporary" /> </rule> <rule name="Static URL Failures" patternSyntax="ECMAScript" stopProcessing="true"> <match url=".*" ignoreCase="true" negate="false" /> <conditions> <add input="{Static URL Failures:{REQUEST_URI}}" pattern="(.+)" /> </conditions> <action type="CustomResponse" statusCode="404" subStatusCode="0" /> </rule> <rule name="Prevent rewriting for static files" patternSyntax="Wildcard" stopProcessing="true"> <match url="*" /> <conditions> <add input="{REQUEST_FILENAME}" matchType="IsFile" /> </conditions> <action type="None" /> </rule> </rules> </rewrite> </system.webServer> </configuration>
That sums it up for this blog; I hope that you get some good ideas from it.
For more information about the syntax in Robots.txt and Sitemap.xml files, see the following URLs:

- http://www.robotstxt.org/
- http://www.sitemaps.org/
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/