Just a short, simple blog for Bob to share his thoughts.
28 November 2012 • by Bob • Guitar
Many years ago - more years than I will care to admit - I saw Cheap Trick in concert. (Okay, just to give you an idea of how long ago this was - Cheap Trick was touring to promote their Cheap Trick at Budokan album; you can do the math from there.) At this point in my life, I hadn't been playing the guitar for very long, and my main guitar at the time was a cheap 3/4-size nylon-string acoustic that my dad had bought for me from a store on a military base. Military bases aren't known for keeping great guitars in stock, so it should come as no surprise that I was fascinated by any cool guitar that came along. This made seeing Cheap Trick even more entertaining, because their lead guitar player, Rick Nielsen, used something like 1,000 different guitars throughout the show.
But one particular guitar caught my eye - an Explorer; something about its futuristic shape seemed to me like the coolest guitar ever. Rick Nielsen played an Explorer from Hamer Guitars, but I soon learned that Hamer's Explorer was a copy of the Gibson Explorer, and that became the 'Guitar to Have' for me.
[Image: Rick Nielsen (left) playing a Hamer Explorer onstage with Cheap Trick. (Note: This image is originally from Wikipedia.)]
About this time I was in my first rock band with my good friend Gene Faith. Even though we both actually played the guitar, we liked to create fake instruments for ourselves - I made myself a fake guitar out of scrap wood that looked like an Explorer, even though it was hollow and had strings that were made out of rubber bands. But it was cool - there was no doubt in my mind about that. Once we had some 'instruments' at our disposal, we'd put on a record and pretend to actually play these fake instruments and jump around my dad's living room like we were rock stars. (Hey, don't laugh so hard - I was only 12 or 13 years old.)
My first electric guitar was a cheap copy of a Gibson SG that I purchased at Sears for somewhere around $100. (And believe me - I delivered a lot of newspapers to earn the $100 to buy that guitar.) It was okay as a starter guitar, but I soon found myself wanting a better axe. A year or so later I saved up more of the proceeds from my newspaper route and I bought an Explorer copy from an off-brand company named Seville - it was nowhere near as good as a Gibson, but it was the best that I could do on a paperboy's budget. It had a hideous tobacco sunburst paint job, so I removed the neck and hardware, sanded the body down to the bare wood, stained it with a dark wood color, and then I shellacked the body with a clear finish. When I reassembled the guitar, it looked pretty good. I played that Explorer for a few years, and I eventually sold it to my friend Gene.
Jumping ahead a few decades, another good friend, Harold Perry, was moving from Seattle to San Francisco, so he was parting with a bunch of musical gear. I'm always in the market for seasoned gear that needs a new home, so Harold and I were going through a bunch of his old items while I was deciding what I might want to buy. Harold had bought a 1980 Gibson Explorer II several years earlier as a 'project guitar' - it had been badly treated by a previous owner and needed a lot of repair work. Since Harold was moving, he didn't expect to have time to finish the guitar, and he wanted it to find a good home, so he sold it to me for a great price.
And so my adventure with guitar restoration began as a labor of love.
When I took the guitar home, the first thing that I did was strip all of the remaining hardware off the guitar, leaving nothing but the wood body. I then spent a few hours polishing every inch of the guitar. Whoever had owned the guitar before Harold apparently had some hygiene issues; judging by the voluminous amounts of caked sweat that coated much of the surface, it seemed like he had never cleaned the guitar. What's more, his sweat had corroded all of the stock hardware, so nearly all of it would need to be replaced. With that in mind, I decided that this would be a long-term project and I would take my time with it.
The next thing that I needed to do was to polish the hardware that I intended to keep - which was just the brass nut and frets, all of which looked pretty hideous. I used Mr. Metal to polish the hardware, which seemed a strangely apropos title for a former heavy metal dude.
[Image: Badly-tarnished frets and nut.]
[Image: Dude - it's "Mr. Metal." :-O]
[Image: The pile of used cotton patches after I finished polishing.]
[Image: Shiny frets and brass nut!]
Over several months I slowly bought new hardware that I needed. I'll spare you most of the details, but suffice it to say that it took a long time for me to locate and purchase all of the right replacement parts that I wanted. I primarily bought the hardware from Stewart McDonald, Musician's Friend, and Guitar Center, and I had the guys at Parson's Guitars create a new truss rod cover to replace the original that had been lost before the guitar had found its way to me. In the end, I replaced the bridge, tailpiece, volume & tone potentiometers, tuning machines, strap locks, toggle switch, and speed knobs. (The folks at Parson's Guitars thought that replacing the stock Gibson parts was a sacrilege, even though I explained that keeping the stock parts left the guitar unplayable.)
Before I started wiring the guitar, I lined the inside of the routing cavities with copper tape - this is supposed to reduce EMI on the guitar. I had never used it before, so it was something of an experiment. In any event, lining the cavities took several hours to complete; time will tell if it was worth it.
The next part of the project was to install the new guitar tuning machines. Oddly enough, Gibson won't sell their inline-6 set of tuners for an Explorer to customers, so I had to buy tuning machines from another company. I eventually decided on tuning machines from Gotoh, which I was able to order through Stewart McDonald. The trouble is, once I mounted them on the headstock, I discovered that the screw holes for the tuning machines were off by a little over a millimeter. (If you look at the image, you can see that the screw holes are angled slightly downward on the right side of the machines, but they needed to be perpendicular to the machine shafts.)
After doing some additional research, I discovered that the only Gotoh tuning machines that Stewart McDonald sells are Gotoh's SG381 tuning machines, and I needed their SG360 tuning machines for my Explorer. After a quick call to Stewart McDonald, I verified that they could not order Gotoh's SG360 tuning machines for me, so I searched the Internet until I found a distributor in Australia who could ship them to me. It took several weeks for the tuners to make the journey to the United States, but when they arrived they were a perfect fit.
Once I had the right tuning machines installed, I started the long process of wiring and soldering the electronics.
Once I completed the wiring, the last hurdles were to re-string the guitar, tune it up, adjust the string height and intonation, and test it out. (Which is the fun part.)
That about sums it up. The guitar looks great and plays great, although I might drop it by the folks at Parson's Guitars and have them give it a quick tune-up for good measure.
Special thanks go to Harold for hooking me up with this guitar; and I also owe a big set of thanks to my wife, Kathleen, for humoring me while I took over one of the rooms in our house for the several weeks that I spent working on this project. ;-)
30 October 2012 • by Bob • General
After two long years of sacrificing my evenings and weekends in order to complete homework assignments, I just received the following in the mail:
This obviously signifies that I have finally earned my Bachelor's Degree. This is traditionally a four-year degree, but I managed to complete my degree in just over 28 years from when I first started college. (So anyone who is currently on a five-year plan for their four-year degree, take my word for it - you could do a lot worse.)
By way of explanation, I had never finished my Bachelor's Degree; I dropped out of college during my freshman year when I got married and we needed the money. Our idea at the time was that I would work full-time while my wife went to school full-time, then we would swap roles when she completed her nursing degree. Unfortunately, our lives didn't work out that way. Shortly after I dropped out of college I joined the US Army, and that put a temporary halt on both of our college aspirations as the military continuously transferred us from one location to another.
After five years in the Army, I was finally at a time and place in my life where I could go to college in the evenings and do my homework during the weekends. Because of this, I received my Associate's Degree around the time that I finished eight years in the military; this meant that I had earned my two-year degree almost 9 years after I first started college.
A few months after I received my Associate's Degree I left the Army, and my plan at the time was to go to school and finish my Bachelor's Degree. But once again, my plans didn't work out that way. Sometime during my first year back in school, Microsoft offered me a job, and that opportunity was simply too good to pass up. This was ultimately a great decision, but it meant that my college goals needed to be put on hold again.
Sometime around my fifteen-year anniversary at Microsoft I decided that I was once again in a time and place in my life where I could go to college in the evenings and weekends, so I enrolled in an online program through Liberty University. (I chose this school because their online programs are very friendly to current and former members of the military.) My declared major was Multidisciplinary Studies, which is a fancy term for a program that allows you to split your major into two or three concentrated subject areas. (I chose Computer Science and Religion.)
Jumping ahead a couple of years, I found myself studying hard to complete all of my upper-division courses while putting three children through college, flying around the world to speak at various technical conferences, surviving the weddings for two of my children, and juggling a work schedule that typically comprised 50 to 60 hours a week.
In the end, I finished all of my courses at Liberty University in just over two years - and I managed to maintain a 4.0 GPA throughout my studies, thereby graduating Summa Cum Laude. (Which is probably Latin for "You really need to get a life.")
So if I do the math correctly, it took me 9 years to get my two-year degree, and it took me an additional 19 years to get my four-year degree. At this pace, I should have my Master's Degree 29 years from now.
;-]
03 October 2012 • by Bob • IIS, Scripting, FTP, Extensibility
I was recently contacted by someone who was trying to use Windows Management Instrumentation (WMI) code to stop and restart FTP websites by using code that he had written for IIS 6.0; his code was something similar to the following:
Option Explicit
On Error Resume Next

Dim objWMIService, colItems, objItem

' Attach to the IIS service.
Set objWMIService = GetObject("winmgmts:\root\microsoftiisv2")

' Retrieve the collection of FTP sites.
Set colItems = objWMIService.ExecQuery("Select * from IIsFtpServer")

' Loop through the sites collection.
For Each objItem in colItems
    ' Restart one single website.
    If (objItem.Name = "MSFTPSVC/1") Then
        Err.Clear
        objItem.Stop
        If (Err.Number <> 0) Then WScript.Echo Err.Number
        objItem.Start
        If (Err.Number <> 0) Then WScript.Echo Err.Number
    End If
Next
The problem that the customer was seeing was that this query did not return the list of FTP-based websites on IIS 7.0 or IIS 7.5 (called IIS7 henceforth), although changing the class in the query from IIsFtpServer to IIsWebServer would make the script work with HTTP-based websites on those versions of IIS.
The problem with the customer's code was that he was using WMI to manage IIS7; this relies on our old management APIs, which have been deprecated, although that model is partially available through the metabase compatibility feature in IIS7. Here's what I mean by "partially": only a portion of the old ADSI/WMI objects are available, and unfortunately FTP is not among the objects that can be scripted through the metabase compatibility feature in IIS7.
That being said, what the customer wants to do is still possible through scripting in both IIS7 and IIS8, and the following sample shows how to loop through all of the sites, determine which sites have FTP bindings, and then stop/start FTP for each site. To use this script, copy the code into a text editor like Windows Notepad and save it with a name like "RestartAllFtpSites.vbs" to your system, then double-click the file to run it.
' Temporarily disable breaking on runtime errors.
On Error Resume Next

' Create an Admin Manager object.
Set adminManager = CreateObject("Microsoft.ApplicationHost.AdminManager")
adminManager.CommitPath = "MACHINE/WEBROOT/APPHOST"

' Test for commit path support.
If Err.Number <> 0 Then
    Err.Clear
    ' Create a Writable Admin Manager object.
    Set adminManager = CreateObject("Microsoft.ApplicationHost.WritableAdminManager")
    adminManager.CommitPath = "MACHINE/WEBROOT/APPHOST"
    If Err.Number <> 0 Then WScript.Quit
End If

' Resume breaking on runtime errors.
On Error Goto 0

' Retrieve the sites collection.
Set sitesSection = adminManager.GetAdminSection("system.applicationHost/sites", "MACHINE/WEBROOT/APPHOST")
Set sitesCollection = sitesSection.Collection

' Loop through the sites collection.
For siteCount = 0 To CInt(sitesCollection.Count)-1
    isFtpSite = False
    ' Determine if the current site is an FTP site by checking the bindings.
    Set siteElement = sitesCollection(siteCount)
    Set bindingsCollection = siteElement.ChildElements.Item("bindings").Collection
    For bindingsCount = 0 To CInt(bindingsCollection.Count)-1
        Set bindingElement = bindingsCollection(bindingsCount)
        If StrComp(CStr(bindingElement.Properties.Item("protocol").Value), "ftp", vbTextCompare) = 0 Then
            isFtpSite = True
            Exit For
        End If
    Next
    ' If it's an FTP site, stop and then start the site.
    If isFtpSite = True Then
        Set ftpServerElement = siteElement.ChildElements.Item("ftpServer")
        ' Create an instance of the Stop method.
        Set stopFtpSite = ftpServerElement.Methods.Item("Stop").CreateInstance()
        ' Execute the method to stop the FTP site.
        stopFtpSite.Execute()
        ' Create an instance of the Start method.
        Set startFtpSite = ftpServerElement.Methods.Item("Start").CreateInstance()
        ' Execute the method to start the FTP site.
        startFtpSite.Execute()
    End If
Next
And the following code sample shows how to stop/start a single FTP site. To use this script, copy the code into a text editor like Windows Notepad, rename the site name appropriately for one of your FTP sites, save it with a name like "RestartContosoFtpSite.vbs" to your system, then double-click the file to run it.
' Temporarily disable breaking on runtime errors.
On Error Resume Next

' Create an Admin Manager object.
Set adminManager = CreateObject("Microsoft.ApplicationHost.AdminManager")
adminManager.CommitPath = "MACHINE/WEBROOT/APPHOST"

' Test for commit path support.
If Err.Number <> 0 Then
    Err.Clear
    ' Create a Writable Admin Manager object.
    Set adminManager = CreateObject("Microsoft.ApplicationHost.WritableAdminManager")
    adminManager.CommitPath = "MACHINE/WEBROOT/APPHOST"
    If Err.Number <> 0 Then WScript.Quit
End If

' Resume breaking on runtime errors.
On Error Goto 0

' Retrieve the sites collection.
Set sitesSection = adminManager.GetAdminSection("system.applicationHost/sites", "MACHINE/WEBROOT/APPHOST")
Set sitesCollection = sitesSection.Collection

' Locate a specific site.
siteElementPos = FindElement(sitesCollection, "site", Array("name", "ftp.contoso.com"))
If siteElementPos = -1 Then
    WScript.Echo "Site was not found!"
    WScript.Quit
End If

' Determine if the selected site is an FTP site by checking the bindings.
Set siteElement = sitesCollection(siteElementPos)
Set bindingsCollection = siteElement.ChildElements.Item("bindings").Collection
For bindingsCount = 0 To CInt(bindingsCollection.Count)-1
    Set bindingElement = bindingsCollection(bindingsCount)
    If StrComp(CStr(bindingElement.Properties.Item("protocol").Value), "ftp", vbTextCompare) = 0 Then
        isFtpSite = True
        Exit For
    End If
Next

' If it's an FTP site, stop and then start the site.
If isFtpSite = True Then
    Set ftpServerElement = siteElement.ChildElements.Item("ftpServer")
    ' Create an instance of the Stop method.
    Set stopFtpSite = ftpServerElement.Methods.Item("Stop").CreateInstance()
    ' Execute the method to stop the FTP site.
    stopFtpSite.Execute()
    ' Create an instance of the Start method.
    Set startFtpSite = ftpServerElement.Methods.Item("Start").CreateInstance()
    ' Execute the method to start the FTP site.
    startFtpSite.Execute()
End If

' Locate and return the index for a specific element in a collection.
Function FindElement(collection, elementTagName, valuesToMatch)
    For i = 0 To CInt(collection.Count) - 1
        Set elem = collection.Item(i)
        If elem.Name = elementTagName Then
            matches = True
            For iVal = 0 To UBound(valuesToMatch) Step 2
                Set prop = elem.GetPropertyByName(valuesToMatch(iVal))
                value = prop.Value
                If Not IsNull(value) Then
                    value = CStr(value)
                End If
                If Not value = CStr(valuesToMatch(iVal + 1)) Then
                    matches = False
                    Exit For
                End If
            Next
            If matches Then
                Exit For
            End If
        End If
    Next
    If matches Then
        FindElement = i
    Else
        FindElement = -1
    End If
End Function
I hope this helps!
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
05 September 2012 • by Bob • IIS 8, IIS News Item, Windows Server 2012
Following up on today's public release of Microsoft Windows Server 2012 and Internet Information Services 8.0, you'll notice some big changes on the IIS.net website.
Over the past few months, we've been working hard with several partners to roll out a brand-new design for the IIS.net website that more closely resembles the look and feel of our websites for Microsoft Azure, Windows Server 2012, and Visual Studio 2012.
04 September 2012 • by Bob • IIS News Item, Windows Server 2012, IIS 8
Microsoft has just released Windows Server 2012! You can find out more about this release on the Official Windows Server 2012 Launch Website (http://www.windows-server-launch.com).
In tandem with the release of Windows Server 2012, the IIS team is happy to announce the general availability of Internet Information Services 8.0. This new version of IIS offers a wealth of new features and improvements, and here are just a few of the enhancements that you can expect in IIS 8.0: Application Initialization, Dynamic IP Address Restrictions, Centralized SSL Certificate Store, CPU Throttling, FTP Logon Attempt Restrictions, Server Name Indication (SNI) Support, Improved SSL and Configuration Scalability, support for Multicore Scaling on NUMA Hardware, and more! Additional information about IIS 8.0 is available in the "What's New in IIS 8.0 for Windows 8?" web page.
If you'd like to try IIS 8.0 for yourself, you can download the evaluation version and start experimenting today!
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
01 September 2012 • by Bob • FTP, IIS, Windows
The folks in the TechEd group have uploaded the video from my "What's New with Internet Information Services (IIS) 8: Performance, Scalability, and Security Features" presentation to YouTube, so you can view the video online.
You can also download the slides and the WMV/MP4 for my presentation at the following URL:
http://channel9.msdn.com/Events/TechEd/NorthAmerica/2012/WSV332
One quick side note: around 38:55 during the video, I had just asked the audience if anyone had used the IIS Configuration Editor, when a tremendous thunderclap resounded outside - this prompted a great laugh from audience members. After the presentation had ended, a couple people came up and jokingly asked how I had managed to stage that so well.
28 August 2012 • by Bob • FTP, Extensibility
I recently received a question from a customer about troubleshooting custom FTP providers, and I recommended using the FTP service's Event Tracing for Windows (ETW) features in order to help troubleshoot the problem. I've helped a lot of customers use this little-known feature of the FTP service, so I thought that it would make a great subject for a quick blog.
By way of explanation, the FTP service in IIS 7.5 and IIS 8.0 allows developers to write their own custom functionality, and over the past several years I have written several walkthroughs and blogs that illustrate how you can create your own custom FTP providers:
That being said, sometimes things go wrong, and when that happens, I use some FTP ETW troubleshooting tricks that I'd like to share.
Several years ago I wrote a blog about FTP and ETW Tracing, where I described how to turn on the FTP service's ETW tracing through a batch file and then use Log Parser to render the output in a datagrid for analysis. In the interests of completeness, here is the batch file again:
@echo off

rem ======================================================================

echo Verifying that LogParser.exe is in the path...
LogParser -h >nul 2>nul
if errorlevel 1 (
   echo.
   echo Error:
   echo.
   echo LogParser.exe was not found. It is required for parsing traces.
   echo.
   echo Recommended actions:
   echo.
   echo - If LogParser is installed then fix the PATH
   echo   variable to include the LogParser directory
   echo.
   echo - If LogParser is not installed, then install
   echo   it from the following location:
   echo.
   echo   http://www.microsoft.com/downloads/details.aspx?FamilyID=890cd06b-abf8-4c25-91b2-f8d975cf8c07
   echo.
   goto :EOF
) else (
   echo Done.
   echo.
)

rem ======================================================================

echo Starting the ETW session for full FTP tracing...
logman start "ftp" -p "IIS: Ftp Server" 255 5 -ets
echo.
echo Now reproduce your problem.
echo.
echo After you have reproduced your issue, hit any key to close the FTP
echo tracing session. Your trace events will be displayed automatically.
echo.
pause>nul

rem ======================================================================

echo.
echo Closing the ETW session for full FTP tracing...
logman stop "ftp" -ets

rem ======================================================================

echo.
echo Parsing the results - this may take a long time depending on the size of the trace...
LogParser "select EventTypeName, UserData from ftp.etl" -e 2 -o:DATAGRID -compactModeSep " | " -rtp 20
When you save and run this batch file, it will display something like the following:
C:\FTP_ETW.cmd
Verifying that LogParser.exe is in the path...
Done.

Starting the ETW session for full FTP tracing...
The command completed successfully.

Now reproduce your problem.

After you have reproduced your issue, hit any key to close the FTP
tracing session. Your trace events will be displayed automatically.
When you see this displayed, you will need to reproduce your problem, and FTP's ETW tracing will record the troubleshooting information.
Once you have reproduced your problem, hit a key to end the ETW session, and you will see the following message displayed:
Closing the ETW session for full FTP tracing...
The command completed successfully.

Parsing the results - this may take a long time depending on the size of the trace...
The batch file will eventually call Log Parser to parse the ETW events, and a dialog like the following will be displayed:
Now that you know how to set up FTP's ETW tracing, let's examine what you should be looking for in the tracing information. In all of the examples in this blog, I am using the XML-based authentication provider that is documented in the How to Use Managed Code (C#) to Create an FTP Authentication Provider using an XML Database walkthrough.
The following illustration highlights several lines that show the FTP service starting its authentication process, loading my custom authentication provider, and ending the authentication process after I have successfully logged in:
This example shows what everything looks like when it works as expected, so now let's look at what happens when something goes wrong.
If I use the same provider, but I enter my username or password incorrectly, I will see the following lines in the trace:
This example informs you that the provider was loaded successfully, but the logon failed. The error code that is returned is 0x8007052E - this hexadecimal 32-bit value can be split into two 16-bit values: the high-order word, 0x8007, indicates a Win32 error, and the low-order word, 0x052E, is 1326 in decimal, which "NET HELPMSG 1326" translates as:
Logon failure: unknown user name or bad password.
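To make that arithmetic concrete, here's a small standalone sketch - written in JavaScript purely for illustration, not anything the FTP service itself uses - showing how a 32-bit error code like 0x8007052E splits into its two 16-bit halves. The high word identifies the error facility, and the low word is the decimal number you can feed to "NET HELPMSG":

```javascript
// The error code from the failed-logon trace above.
var hresult = 0x8007052E;

// High-order 16 bits: 0x8007 indicates a Win32 error.
var highWord = hresult >>> 16;

// Low-order 16 bits: the underlying Win32 error number.
var lowWord = hresult & 0xFFFF;

// Prints "8007" and 1326; "NET HELPMSG 1326" describes the logon failure.
console.log(highWord.toString(16));
console.log(lowWord);
```

The same split works for any of the 0x8007xxxx codes in the examples that follow.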
"If I continue to use the same provider as earlier, and I delete the XML file that my provider uses, then I will receive the following error:
Once again, this example informs you that the provider was loaded successfully, but an error occurred. In this specific case, the error details state that the XML file does not exist; that error is returned by a throw() statement in the provider. The error code that is returned is 0x80070057 - and once again this hexadecimal 32-bit value can be split into two 16-bit values: the low-order word, 0x0057, is 87 in decimal, which "NET HELPMSG 87" translates as:
The parameter is incorrect.
"If I replace the missing XML file for the provider, but I remove all of the permissions to the file, I get the following error:
As in the previous examples, this informs you that the provider was loaded successfully, but an error occurred. You can't look up the 0x80131500 error code by using "NET HELPMSG" from a command prompt, but that doesn't matter, since the error description informs you of the problem: access to the path where the file is located was denied.
If I enter a bad provider name, I get the following error:
Unlike the previous examples, this informs you that the provider was not loaded successfully. The description for this error informs you that it could not load the provider, and it gives you the assembly information. In addition to the error description, the error code that is returned by the FTP service is 0x80070002 - and once again this hexadecimal 32-bit value can be split into two 16-bit values: the low-order word, 0x0002, is 2 in decimal, which "NET HELPMSG 2" translates as:
The system cannot find the file specified.
"So now let's look at a common perplexing problem:
This example shows the same 0x8007052E error code that we looked at in a previous example, but you'll notice that any reference to the provider is conspicuously absent from the trace - this means that the FTP service made no attempt to load the custom authentication provider. In this specific case, even though I had correctly registered my custom FTP authentication provider on the system, I had not added or enabled the custom authentication provider for my FTP site.
In this blog I showed you how to troubleshoot several different errors with FTP custom authentication providers by using FTP's ETW features.
As a parting thought, I should point out that the most common error that I run into when creating my own providers is the one in the last example. Believe it or not, I nearly always miss a step when I am creating a new provider - I forget to add a setting here or there, which causes the FTP service to completely ignore my provider. A perfect example is when I am writing custom home directory providers: I always remember to add the provider to the global list of FTP providers, and I usually remember to add the provider to the list of custom features for my FTP site, but I forget to configure my FTP site to use custom user isolation, and my provider is ignored. (Darn, darn, darn...)
;-]
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
25 August 2012 • by Bob • LogParser, Scripting
In Part 5 of this series, I'll show you how to create a generic script that you can use to add some color to your Log Parser charts. As I mentioned in Part 1 of this series, the default colors for Log Parser charts are really dull and boring. For example, if I parse one month's worth of log files from one of my low-volume websites with the following query:
logparser.exe "SELECT date,COUNT(*) AS Hits INTO HITS.gif FROM *.log GROUP BY date ORDER BY date" -i:w3c -o:CHART -chartType:ColumnClustered -chartTitle:"" -q:ON
Log Parser will create the following ugly daily hits chart:
Here's the background story for this blog: I have a collection of scripts that I use to format my charts, several of which have faithfully served as the fodder for this blog series. With that in mind, I had a situation recently where I was querying logs with a series of data just like this, and of course the resulting charts were kind of hideous to look at. In one of the scripts that I often use, I create an array of colors to use, and then I apply the various colors to the individual data points in the series.
In the past I have always hard-coded the length for the array of colors based on the data that I am working with, but in this situation I had no idea how many data points I would have, so I decided to put together a quick script with an array that would work with a series of any size.
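Before showing the real script, here's a tiny standalone sketch - independent of Log Parser's chart objects, using a made-up three-color palette - of how modulo division lets a fixed array of colors wrap around a series of any length:

```javascript
// A made-up palette of three colors, purely for illustration.
var colors = ["#ff0000", "#00ff00", "#0000ff"];

// Simulate eight data points; x % colors.length wraps the index
// back to 0 whenever it reaches the end of the palette.
var assigned = [];
for (var x = 0; x < 8; ++x) {
    assigned.push(colors[x % colors.length]);
}

// The eight points cycle: red, green, blue, red, green, blue, red, green.
console.log(assigned.join(", "));
```

Because the index wraps, the palette can be any size and the series can be any size - which is exactly the property I needed.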
Here's the resulting script:
// Set a default color for the chart's data.
chart.SeriesCollection(0).Interior.Color = "#ffcccc";

// Define a short array of colors.
var colors = [
    "#ffff99", "#ff99ff", "#ff9999",
    "#99ffff", "#99ff99", "#9999ff",
    "#ffffcc", "#ffccff", "#ffcccc",
    "#ccffff", "#ccffcc", "#ccccff"
];

// Loop through the data points in the series.
for (x = 0; x < chart.SeriesCollection(0).Points.Count; ++x)
{
    // Set the color for the data point based on modulo division of the array length.
    chart.SeriesCollection(0).Points(x).Interior.Color = colors[x % colors.length];
}
That's all there is to the script - it's pretty simple. If I take the above script and save it as "FormatChart.js", I can use it with my Log Parser query from earlier by adding an extra parameter to the command:
logparser.exe "SELECT date,COUNT(*) AS Hits INTO HITS.gif FROM *.log GROUP BY date ORDER BY date" -i:w3c -o:CHART -chartType:ColumnClustered -chartTitle:"" -q:ON -config:FormatChart.js
Now Log Parser will create the following daily hits chart with a great deal more color to it:
Okay - perhaps that's not the best color palette, but you get the idea. It looks even better when I change the query to use 3D charts:
logparser.exe "SELECT date,COUNT(*) AS Hits INTO HITS.gif FROM *.log GROUP BY date ORDER BY date" -i:w3c -o:CHART -chartType:Column3D -chartTitle:"" -q:ON -config:FormatChart.js
The above query creates the following chart:
I'd like to make a quick change to the script in order to make it work a little better with a pie chart:
// Set a default color for the chart's data.
chart.SeriesCollection(0).Interior.Color = "#cccccc";

// Define a short array of colors.
var colors = [
    "#cc3333", "#3333cc", "#33cc33",
    "#33cccc", "#cccc33", "#cc33cc"
];

// Loop through the data points in the series.
for (x = 0; x < chart.SeriesCollection(0).Points.Count; ++x)
{
    // Set the color for the data point based on modulo division of the array length.
    chart.SeriesCollection(0).Points(x).Interior.Color = colors[x % colors.length];
}

// Rotate the chart 180 degrees - just so it looks a little better.
chartSpace.Charts(0).PlotArea.RotateClockwise();
chartSpace.Charts(0).PlotArea.RotateClockwise();
For this query I'd like to see a breakdown by HTTP status, which necessitates a small change to the Log Parser query:
logparser.exe "SELECT sc-status AS Status,COUNT(*) AS Hits INTO HITS.gif FROM *.log GROUP BY Status ORDER BY Status" -i:w3c -o:CHART -chartType:PieExploded3D -chartTitle:"" -q:ON -config:FormatChart.js
The above query creates the following chart:
That wraps it up for this blog - I hope that I've given you some ideas for ways that you can easily add some colors to some dull-looking Log Parser charts.
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
26 July 2012 • by Bob • IIS, PHP, WinCache
The IIS team has officially signed off on the Windows Cache Extension (WinCache) version 1.3 for PHP 5.4, and the files have been uploaded to SourceForge. This version addresses all of the problems that were identified with WinCache 1.1 that customers were seeing after they upgraded their systems from PHP 5.3 to PHP 5.4.
With that in mind, you can download WinCache 1.3 for PHP 5.4 from the following URL:
http://sourceforge.net/projects/wincache/files/wincache-1.3.4/
You can discuss WinCache 1.1 and WinCache 1.3 in the Windows Cache Extension for PHP forum on Microsoft's IIS.net website.
Since WinCache is an open source project, the IIS team has uploaded the pre-release source code for WinCache at the following URL:
http://pecl.php.net/package/WinCache
For instructions on building the extension yourself, please refer to the Building WinCache Extension documentation.
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
21 July 2012 • by Bob • WebDAV, WebMatrix
The other day I was talking with one of my coworkers, Yishai Galatzer, about Microsoft's WebMatrix. By way of introduction, Yishai is one of our senior developers on the WebMatrix project; I'm not sure if you've used WebMatrix, but it's a pretty handy website editor. Here are a few generic screenshots:
![]() |
WebMatrix 2 Splash Screen |
![]() |
WebMatrix 2 Quick Start Screen |
![]() |
Editing QDIG in WebMatrix 2 |
In any event, I was explaining how easy it is to work with WebDAV, and I mentioned that I had written some blogs about working with WebDAV websites programmatically. (See my Sending WebDAV Requests in .NET Revisited blog for an example.) Since WebMatrix 2 has some pretty cool extensibility, Yishai challenged me to write a WebDAV extension for WebMatrix. His idea was just too good for me to pass up, so I stayed up late that night and I wrote a simple WebDAV Website Import extension for WebMatrix 2.
With that in mind, there are a few things that I need to explain in this blog:
The WebDAV Website Importer extension does just what its name implies - it imports a website into WebMatrix over WebDAV. With it, you can download your website to your local computer, where you can make changes to your source files and test them on your local system with IIS Express.
It should be noted that this extension is only designed to create a new local website by downloading a copy of your remote website's files - it is not designed to be a website publishing feature like WebMatrix's built-in FTP and Web Deploy features. (I would like to write a full-featured website import/export/sync extension, but that's another project for another day.)
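At the protocol level, an import like this starts by listing the contents of a remote folder, which WebDAV does with a PROPFIND request (defined in RFC 4918). Here's a sketch of building the XML body for such a request - this is my own illustration of the protocol, not code from the actual extension, and the `buildPropfindBody` function name is hypothetical:

```javascript
// Build the XML body for a WebDAV PROPFIND request (RFC 4918), which a
// client sends to ask for properties of the items in a remote collection.
function buildPropfindBody(props) {
  var propXml = props.map(function (p) {
    return "    <D:" + p + "/>";
  }).join("\n");
  return '<?xml version="1.0" encoding="utf-8"?>\n' +
    '<D:propfind xmlns:D="DAV:">\n' +
    "  <D:prop>\n" +
    propXml + "\n" +
    "  </D:prop>\n" +
    "</D:propfind>";
}

// Ask for the name, size, and resource type (file vs. folder) of each item.
var body = buildPropfindBody(["displayname", "getcontentlength", "resourcetype"]);
console.log(body);
```

The body would be sent with the PROPFIND method and a "Depth: 1" header to list only the immediate children of a folder; the client then walks the folders it finds and issues GET requests to download each file.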
To install this extension, you first need to install WebMatrix. You can find details about installing WebMatrix at the following URL:
Once you have WebMatrix installed, click the Extensions menu on the ribbon, and then click Gallery.
When the Extensions Gallery appears, you will see the WebDAV Website Importer in the list of extensions.
When you click Install, the WebDAV Website Importer details page will be displayed.
When you click Install, the End User License Agreement for the WebDAV Website Importer will be displayed.
When you click I Accept, WebMatrix will download and install the extension.
Once you have downloaded and installed the WebDAV Website Importer extension, it will show up whenever you are creating a new website in WebMatrix.
When you click Import Site from WebDAV, WebMatrix will prompt you for the credentials to your WebDAV website.
Once you enter your credentials and click OK, the extension will import the content from your WebDAV website and save it in a new local website folder.
So - there you have it; this is a pretty simple extension, but it opens up some WebDAV possibilities for WebMatrix. As I mentioned earlier, this extension is import-only - perhaps I'll write a full-featured import/export/sync extension in the future, but for now - this was a cool test for combining WebMatrix extensibility and WebDAV.
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/