Just a short, simple blog for Bob to share his thoughts.
31 January 2013 • by Bob • Scripting, FTP, Extensibility
I was recently creating a new authentication provider using FTP extensibility, and I ran into a weird behavior that I had seen before. With that in mind, I thought my situation would make a great blog subject because someone else may run into it.
Here are the details of the situation: let's say that you are developing a new FTP provider for IIS, and your code changes never seem to take effect. Your provider appears to be working, it's just that any new functionality is not reflected in your provider's behavior. You restart the FTP service as a troubleshooting step, but that does not appear to make any difference.
I'll bypass the rest of the troubleshooting tasks and cut to the chase - if you read my Changing the Identity of the FTP 7 Extensibility Process blog post from a year ago, you will recall that all custom FTP extensibility providers are executed through COM+ in a DLLHOST.exe process. When you restart the FTP service, that should clean up the DLLHOST.EXE process that is being used for FTP extensibility. However, if you are developing custom FTP providers and the FTP service does not terminate that DLLHOST.EXE process, you can end up with a DLLHOST.EXE process in memory that contains an older copy of your provider, and that old copy will not go away until the DLLHOST.EXE process for FTP extensibility has been forcibly terminated.
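If you want to check for an orphaned process by hand, the same TASKLIST filters that I use in the build commands later in this post will show whether a DLLHOST.EXE process still has the FTP extensibility assemblies loaded; the PID in the TASKKILL example below is only a placeholder for whatever process ID TASKLIST reports on your system:

rem List any DLLHOST.EXE processes that have the FTP extensibility assemblies loaded.
tasklist /fi "IMAGENAME eq DLLHOST.EXE" /fi "MODULES eq Microsoft.Web.FtpServer.*"

rem Forcibly terminate the orphaned process by its PID (replace 1234 with the PID that TASKLIST reported).
taskkill /f /pid 1234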
If you have read some of my earlier blog posts or walkthroughs on IIS.NET, you may have noticed that I generally like to use a few pre-build and post-build commands in my FTP projects; usually I add these commands in order to automatically register/unregister my FTP providers in the Global Assembly Cache (GAC).
With a little modification and some command-line wizardry, you can automate the termination of any orphaned DLLHOST.EXE processes that are being used for FTP extensibility. With that in mind, here are some example pre-build/post-build commands that will unregister/reregister your provider in the GAC, restart the FTP service, and terminate any orphaned FTP extensibility DLLHOST.EXE processes.
Note: The following syntax was written using Visual Studio 2010; you would need to change "%VS100COMNTOOLS%" to "%VS90COMNTOOLS%" for Visual Studio 2008 or "%VS110COMNTOOLS%" for Visual Studio 2012.
Pre-build Commands:
net stop ftpsvc
call "%VS100COMNTOOLS%\vsvars32.bat">nul
cd /d "$(TargetDir)"
gacutil.exe /uf "$(TargetName)"
for /f "usebackq tokens=1,2* delims=," %%a in (`tasklist /fi "MODULES eq Microsoft.Web.FtpServer.*" /fi "IMAGENAME eq DLLHOST.EXE" /fo csv ^| find /i "dllhost.exe"`) do taskkill /f /pid %%b
Post-build Commands:
call "%VS100COMNTOOLS%\vsvars32.bat">nul
gacutil.exe /if "$(TargetPath)"
net start ftpsvc
The syntax for the FOR statement is a little tricky, so be careful when typing it or copying/pasting it into your projects. For example, you need to make sure that all of the code from the FOR statement through the TASKKILL command is on a single line in your project's properties.
When you compile your provider, Visual Studio should display something like the following:
------ Rebuild All started: Project: FtpBlogEngineNetAuthentication, Configuration: Release Any CPU ------
The Microsoft FTP Service service is stopping.
The Microsoft FTP Service service was stopped successfully.
Microsoft (R) .NET Global Assembly Cache Utility. Version 4.0.30319.1
Copyright (c) Microsoft Corporation. All rights reserved.
Assembly: FtpBlogEngineNetAuthentication, Version=1.0.0.0, Culture=neutral, PublicKeyToken=426f62526f636b73, processorArchitecture=MSIL
Uninstalled: FtpBlogEngineNetAuthentication, Version=1.0.0.0, Culture=neutral, PublicKeyToken=426f62526f636b73, processorArchitecture=MSIL
Number of assemblies uninstalled = 1
Number of failures = 0
SUCCESS: The process with PID 12656 has been terminated.
FtpBlogEngineNetAuthentication -> C:\Users\dude\Documents\Visual Studio 2010\Projects\FtpBlogEngineNetAuthentication\FtpBlogEngineNetAuthentication\bin\Release\FtpBlogEngineNetAuthentication.dll
Microsoft (R) .NET Global Assembly Cache Utility. Version 4.0.30319.1
Copyright (c) Microsoft Corporation. All rights reserved.
Assembly successfully added to the cache
The Microsoft FTP Service service is starting.
The Microsoft FTP Service service was started successfully.
========== Rebuild All: 1 succeeded, 0 failed, 0 skipped ==========
If you analyze the output from the build process, you will see that the commands in my earlier samples stopped the FTP service, removed the existing assembly from the GAC, terminated any orphaned DLLHOST.EXE processes, registered the newly-built DLL in the GAC, and then restarted the FTP service.
By utilizing these pre-build/post-build commands, I have been able to work around situations where a DLLHOST.EXE process is being orphaned and caching old assemblies in memory.
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
16 January 2013 • by Bob • Ponderings
Perhaps it's because the media is going through yet another season of what seems like a never-ending parade of Hollywood awards programs, but I was thinking the other day about all of the awards that I will never win. For example, I will never win a Golden Globe. I will never win a People's Choice Award. I will never win an Oscar, or a Tony, or an Emmy, or any award that is named after some person who might not be real. And despite a lifetime of playing music, I will never win a Grammy or any other award that the music industry is giving out these days. This may be my reality, but to be perfectly honest, I am never saddened by this, nor do I generally give this concept a second thought.
That being said, the most-recent awards show made me think about the reasons why we even care about those kinds of awards. I can't name who won Best Actor or Actress from any of the awards shows that have taken place in the last several years, and that's really not an issue for me; I'll never meet any of the people who win those awards anyway. What's more, I'm not sure if I would want to meet most of the people who actually win those awards, seeing as how the evening news and morning talk shows are always spinning stories of their latest transgressions. I think the part that gets me the most is how - after throwing their lives away on one selfish pursuit after another - the world eventually calls them "artists," and everyone waxes poetic about how these artists have suffered for their cause; as if they woke up one day and consciously chose to take the road less travelled in Robert Frost's famous poem. When I was younger, I think I bought into that illusion, too. But the older I get, the less I am impressed by their actions - and perhaps I should explain what I mean by that.
If a man whom you knew personally walked out on his wife and family, in most cases you would probably think he was acting like a selfish pig. But if it was a famous actor from Hollywood or a legendary singer from Nashville, you might think to yourself, "Gee, that's too bad...," as if their fame has excused their adverse behavior for some inexplicable reason. You might even go so far as to feel sorry for said person; after all, it's just so sad that their family doesn't understand how hard an artist's life must be.
But why do we feel this way? Why do we put these people on some sort of undeserved pedestal? Is it because they're artists? The more I think about it, I don't believe that they've chosen the road less travelled - I think they've chosen the easy path; they've chosen the path that's all about them. Perhaps that's why they need so many awards shows; they need the constant reassurance that all of the suffering they cause is for a noble purpose. But I just can't bring myself to see it that way.
Let me briefly tell you a true story about my life, and this is difficult for me because it is always dangerous when you open up your life to public scrutiny; you never know what people are going to think. When I was much younger, I faced one of those situations where it seemed like two roads were diverging before me and I had to pick which path I would travel.
I had just celebrated my 19th birthday, and my rock band was starting to do really well. We weren't great by any means, but we were just coming off a series of really great gigs when my fiancée told me that she was pregnant with our child. I had a lot of options before me: we could get married, we could put the baby up for adoption, etc. (My girlfriend had additional concerns: what if I suddenly became some sort of jerk, told her that it was her problem, and left her to face this on her own?) Once the news began to work its way through the grapevine to all our friends and family, I heard a lot of advice from a lot of well-meaning people - all of whom listed off suggestions that were much like the choices that I just mentioned.
But I didn't take anyone's advice. Instead, against everyone else's counsel, I married my girlfriend. We had a baby girl, who is now almost ten years older than I was when I made my choice to keep her. But this decision on my part didn't come without cost; my days of playing long-haired lead guitar for a rock band were over. In fact, my entire youth ended almost overnight - it was time to put aside my personal ambitions and accept the responsibilities that lay before me. My wife and I spent many years in abject poverty as we fought side-by-side to build a home together and raise our children as best we could. Despite the difficult times, my wife and I recently celebrated our 28th anniversary, and we raised three great kids along the way.
However, my life might not have been this way; I could have chosen the other path when I was given the opportunity to do so. I could have chosen something selfish that I wanted just for me, and I could have left my girlfriend to deal with it on her own. Some years later, I could have written a heart-wrenching song about the hard choices that I had to make. Perhaps that could have become a hit, and I could have sold that song to untold scores of fans. Maybe I could have written a book about my life and my admirers might have said, "That's so sad - look at everything he gave up to become who he is."
Every year people walk out on their responsibilities in the hopes that the scenario which I just described will happen to them; they hope they'll be successful despite the pain that they cause to others. What is worse, however, is that popular culture applauds such actions. Songs like Bruce Springsteen's Hungry Heart attempt to spin public opinion in support of egocentric behavior by unapologetically suggesting that a deadbeat dad was simply "following his heart."
Yet in my personal situation this delusion would have been far from the truth; I would have been a selfish punk who left his unwed 18-year-old girlfriend to face the world alone with a newborn baby girl. Perhaps I might have become a successful 'artist' and sent generous child support payments to take care of my daughter's every need, but that's just not the same. Children need parents; they need both a father and a mother to be there to love and raise them.
There is no way that I can say this so it won't sound overly-judgmental, but I think it makes someone a coward when they choose their own selfish desires over their family and their responsibilities. When I chose to become a father, I gave up everything that I wanted for myself; I gave up my personal hopes, dreams, and desires for my life. I sacrificed everything so my daughter would grow up with both a mom and dad. My choice was much harder to live with than I ever could have imagined, but my daughter's life was worth the cost.
So in the end, when I finally shrug off this mortal coil, I will not have won any awards for what I have accomplished in my life, and I'll have no golden statuettes to adorn the shelves in my study. I am sure that I will never win father of the year, but my three children will have had better lives because I chose to be their father. I did not choose the easy path for my life - I chose the road less travelled, and I pray that for my family it has made all the difference.
31 December 2012 • by Bob • IIS, URL Rewrite, SEO, Classic ASP
I had another interesting situation present itself recently that I thought would make a good blog: how to use Classic ASP with the IIS URL Rewrite module to dynamically generate Robots.txt and Sitemap.xml files.
Here's the situation: I host a website for one of my family members, and like everyone else on the Internet, he wanted some better SEO rankings. We discussed a few things that he could do to improve his visibility with search engines, and one of the suggestions that I gave him was to keep his Robots.txt and Sitemap.xml files up-to-date. But there was an additional caveat - he uses two separate DNS names for the same website, and that presents a problem for absolute URLs in either of those files. Before anyone points out that it's usually not a good idea to host multiple DNS names on the same content, there are times when this is acceptable; for example, if you are trying to decide which of several DNS names is the best to use, you might want to bind each name to the same IP address and parse your logs to find out which address is getting the most traffic.
In any event, the syntax for both Robots.txt and Sitemap.xml files is pretty easy, so I wrote a couple of simple Classic ASP Robots.asp and Sitemap.asp pages that output the correct syntax and DNS-specific URLs for each domain name, and I wrote some simple URL Rewrite rules that rewrite inbound requests for Robots.txt and Sitemap.xml files to the ASP pages, while blocking direct access to the Classic ASP pages themselves.
All of that being said, there are a couple of quick things that I would like to mention before I get to the code:
That being said, let's move on to the actual code.
There are three files that you will need to create for this example: Robots.asp, Sitemap.asp, and a Web.config file that contains the URL Rewrite rules.
You need to save the following code sample as Robots.asp in the root of your website; this page will be executed whenever someone requests the Robots.txt file for your website. This example is very simple: it checks for the requested hostname and uses that to dynamically create the absolute URL for the website's Sitemap.xml file.
<%
Option Explicit
On Error Resume Next

Dim strUrlRoot
Dim strHttpHost
Dim strUserAgent

Response.Clear
Response.Buffer = True
Response.ContentType = "text/plain"
Response.CacheControl = "public"

Response.Write "# Robots.txt" & vbCrLf
Response.Write "# For more information on this file see:" & vbCrLf
Response.Write "# http://www.robotstxt.org/" & vbCrLf & vbCrLf

strHttpHost = LCase(Request.ServerVariables("HTTP_HOST"))
strUserAgent = LCase(Request.ServerVariables("HTTP_USER_AGENT"))
strUrlRoot = "http://" & strHttpHost

Response.Write "# Define the sitemap path" & vbCrLf
Response.Write "Sitemap: " & strUrlRoot & "/sitemap.xml" & vbCrLf & vbCrLf

Response.Write "# Make changes for all web spiders" & vbCrLf
Response.Write "User-agent: *" & vbCrLf
Response.Write "Allow: /" & vbCrLf
Response.Write "Disallow: " & vbCrLf

Response.End
%>
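For example, assuming that the site is being browsed as www.example.com (a placeholder host name), the output from Robots.asp would look something like the following:

# Robots.txt
# For more information on this file see:
# http://www.robotstxt.org/

# Define the sitemap path
Sitemap: http://www.example.com/sitemap.xml

# Make changes for all web spiders
User-agent: *
Allow: /
Disallow: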
The following example file is also pretty simple, and you would save this code as Sitemap.asp in the root of your website. There is a section in the code where it loops through the file system looking for files with the *.html file extension and only creates URLs for those files. If you want other files included in your results, or you want to change the code from static to dynamic content, this is where you would need to update the file accordingly.
<%
Option Explicit
On Error Resume Next

Response.Clear
Response.Buffer = True
Response.AddHeader "Connection", "Keep-Alive"
Response.CacheControl = "public"

Dim strFolderArray, lngFolderArray
Dim strUrlRoot, strPhysicalRoot, strFormat
Dim strUrlRelative, strExt
Dim objFSO, objFolder, objFile

strPhysicalRoot = Server.MapPath("/")
Set objFSO = Server.CreateObject("Scripting.Filesystemobject")
strUrlRoot = "http://" & Request.ServerVariables("HTTP_HOST")

' Check for XML or TXT format.
If UCase(Trim(Request("format")))="XML" Then
  strFormat = "XML"
  Response.ContentType = "text/xml"
Else
  strFormat = "TXT"
  Response.ContentType = "text/plain"
End If

' Add the UTF-8 Byte Order Mark.
Response.Write Chr(CByte("&hEF"))
Response.Write Chr(CByte("&hBB"))
Response.Write Chr(CByte("&hBF"))

If strFormat = "XML" Then
  Response.Write "<?xml version=""1.0"" encoding=""UTF-8""?>" & vbCrLf
  Response.Write "<urlset xmlns=""http://www.sitemaps.org/schemas/sitemap/0.9"">" & vbCrLf
End If

' Always output the root of the website.
Call WriteUrl(strUrlRoot,Now,"weekly",strFormat)

' --------------------------------------------------
' This following section contains the logic to parse
' the directory tree and return URLs based on the
' static *.html files that it locates. This is where
' you would change the code for dynamic content.
' --------------------------------------------------
strFolderArray = GetFolderTree(strPhysicalRoot)

For lngFolderArray = 1 to UBound(strFolderArray)
  strUrlRelative = Replace(Mid(strFolderArray(lngFolderArray),Len(strPhysicalRoot)+1),"\","/")
  Set objFolder = objFSO.GetFolder(Server.MapPath("." & strUrlRelative))
  For Each objFile in objFolder.Files
    strExt = objFSO.GetExtensionName(objFile.Name)
    If StrComp(strExt,"html",vbTextCompare)=0 Then
      If StrComp(Left(objFile.Name,6),"google",vbTextCompare)<>0 Then
        Call WriteUrl(strUrlRoot & strUrlRelative & "/" & objFile.Name, objFile.DateLastModified, "weekly", strFormat)
      End If
    End If
  Next
Next
' --------------------------------------------------
' End of file system loop.
' --------------------------------------------------

If strFormat = "XML" Then
  Response.Write "</urlset>"
End If

Response.End

' ======================================================================
'
' Outputs a sitemap URL to the client in XML or TXT format.
'
' tmpStrFreq = always|hourly|daily|weekly|monthly|yearly|never
' tmpStrFormat = TXT|XML
'
' ======================================================================
Sub WriteUrl(tmpStrUrl,tmpLastModified,tmpStrFreq,tmpStrFormat)
  On Error Resume Next
  Dim tmpDate : tmpDate = CDate(tmpLastModified)
  ' Check if the request is for XML or TXT and return the appropriate syntax.
  If tmpStrFormat = "XML" Then
    Response.Write " <url>" & vbCrLf
    Response.Write " <loc>" & Server.HtmlEncode(tmpStrUrl) & "</loc>" & vbCrLf
    Response.Write " <lastmod>" & Year(tmpLastModified) & "-" & Right("0" & Month(tmpLastModified),2) & "-" & Right("0" & Day(tmpLastModified),2) & "</lastmod>" & vbCrLf
    Response.Write " <changefreq>" & tmpStrFreq & "</changefreq>" & vbCrLf
    Response.Write " </url>" & vbCrLf
  Else
    Response.Write tmpStrUrl & vbCrLf
  End If
End Sub

' ======================================================================
'
' Returns a string array of folders under a root path
'
' ======================================================================
Function GetFolderTree(strBaseFolder)
  Dim tmpFolderCount,tmpBaseCount
  Dim tmpFolders()
  Dim tmpFSO,tmpFolder,tmpSubFolder
  ' Define the initial values for the folder counters.
  tmpFolderCount = 1
  tmpBaseCount = 0
  ' Dimension an array to hold the folder names.
  ReDim tmpFolders(1)
  ' Store the root folder in the array.
  tmpFolders(tmpFolderCount) = strBaseFolder
  ' Create file system object.
  Set tmpFSO = Server.CreateObject("Scripting.Filesystemobject")
  ' Loop while we still have folders to process.
  While tmpFolderCount <> tmpBaseCount
    ' Set up a folder object to a base folder.
    Set tmpFolder = tmpFSO.GetFolder(tmpFolders(tmpBaseCount+1))
    ' Loop through the collection of subfolders for the base folder.
    For Each tmpSubFolder In tmpFolder.SubFolders
      ' Increment the folder count.
      tmpFolderCount = tmpFolderCount + 1
      ' Increase the array size
      ReDim Preserve tmpFolders(tmpFolderCount)
      ' Store the folder name in the array.
      tmpFolders(tmpFolderCount) = tmpSubFolder.Path
    Next
    ' Increment the base folder counter.
    tmpBaseCount = tmpBaseCount + 1
  Wend
  GetFolderTree = tmpFolders
End Function
%>
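To give you an idea of what Sitemap.asp returns, here is a hypothetical example of the XML output for a site that is browsed as www.example.com and contains a single default.html page in its root folder; the host name, file name, and dates below are just placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
 <url>
  <loc>http://www.example.com</loc>
  <lastmod>2012-12-31</lastmod>
  <changefreq>weekly</changefreq>
 </url>
 <url>
  <loc>http://www.example.com/default.html</loc>
  <lastmod>2012-12-31</lastmod>
  <changefreq>weekly</changefreq>
 </url>
</urlset>

When the same page is requested in TXT format, those two entries are simply returned as a plain-text list of URLs.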
Note: There are two helper routines in the preceding example that I should call out: the WriteUrl() subroutine, which outputs each sitemap URL to the client in either XML or TXT format, and the GetFolderTree() function, which returns a string array of all the folders under the website's root path.
The last step is to add the URL Rewrite rules to the Web.config file in the root of your website. The following example is a complete Web.config file, but you could merge the rules into your existing Web.config file if you have already created one for your website. These rules are pretty simple: they rewrite all inbound requests for Robots.txt to Robots.asp, they rewrite all requests for Sitemap.xml to Sitemap.asp?format=XML, and they rewrite all requests for Sitemap.txt to Sitemap.asp?format=TXT; this allows requests for both the XML-based and text-based sitemaps to work, even though the Robots.txt file only contains the path to the XML file. The last part of the URL Rewrite syntax returns HTTP 404 errors if anyone tries to send direct requests for either the Robots.asp or Sitemap.asp files; this isn't absolutely necessary, but I like to mask what I'm doing from prying eyes. (I'm kind of geeky that way.)
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps>
        <clear />
        <rewriteMap name="Static URL Rewrites">
          <add key="/robots.txt" value="/robots.asp" />
          <add key="/sitemap.xml" value="/sitemap.asp?format=XML" />
          <add key="/sitemap.txt" value="/sitemap.asp?format=TXT" />
        </rewriteMap>
        <rewriteMap name="Static URL Failures">
          <add key="/robots.asp" value="/" />
          <add key="/sitemap.asp" value="/" />
        </rewriteMap>
      </rewriteMaps>
      <rules>
        <clear />
        <rule name="Static URL Rewrites" patternSyntax="ECMAScript" stopProcessing="true">
          <match url=".*" ignoreCase="true" negate="false" />
          <conditions>
            <add input="{Static URL Rewrites:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Rewrite" url="{C:1}" appendQueryString="false" redirectType="Temporary" />
        </rule>
        <rule name="Static URL Failures" patternSyntax="ECMAScript" stopProcessing="true">
          <match url=".*" ignoreCase="true" negate="false" />
          <conditions>
            <add input="{Static URL Failures:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="CustomResponse" statusCode="404" subStatusCode="0" />
        </rule>
        <rule name="Prevent rewriting for static files" patternSyntax="Wildcard" stopProcessing="true">
          <match url="*" />
          <conditions>
            <add input="{REQUEST_FILENAME}" matchType="IsFile" />
          </conditions>
          <action type="None" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
That sums it up for this blog; I hope that you get some good ideas from it.
For more information about the syntax in Robots.txt and Sitemap.xml files, see http://www.robotstxt.org/ and http://www.sitemaps.org/ respectively.
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
28 December 2012 • by bob • Hardware
I'd like to take a brief departure from my normal series of IIS-related blogs and talk about something very near and dear to the hearts of many geeks - ripping a computer apart and upgrading its various hardware components just because it's fun. ;-)
Several years ago I bought a Dell Inspiron Mini 1011 Laptop, which is a smallish netbook computer with a 10-inch screen. (Actually, I bought this as an alternate laptop for my wife to use when travelling, since she doesn't like to travel with her full-sized laptop.) This computer eventually became a "coffee-table laptop" for our house, which houseguests use when they come to visit. Since the netbook computer is so small, our family has affectionately labeled it the "Baby Computer."
Recently my wife and I took a trip to Hawaii, for which I decided to leave my full-size laptop at home, and I brought the Baby Computer instead. Since I had never needed to rely on the Baby Computer to do anything more than surf the web in the past, I hadn't realized how quickly it was starved for resources whenever I tried to edit photos or write code. (Yes - I actually write code while on vacation... writing code makes me happy.) The Baby Computer shipped with an underwhelming 1GB of RAM, which filled up quickly if I tried to do too many things at once, and it came with a 120GB 5400rpm hard drive. There's nothing that I could do about CPU speed, but as I slogged through the rest of my vacation using the Baby Computer, I resolved to research if the other hardware in this laptop could be expanded.
Figure 1 - Performance Before Upgrading
Once we got home from vacation I did some checking, and I discovered that I could expand the Baby Computer's RAM to 2GB, which isn't much, but it obviously doubled what I had been using, and I decided to replace its original hard drive with a 128GB solid-state drive (SSD). With that in mind, I thought that it would be a worthwhile endeavor to document the upgrade process for anyone else who wants to do the same thing with their Dell Inspiron Mini 1011. (Of course, you are undoubtedly voiding your Dell warranty the moment that you open your laptop's case.)
First things first - Dell's support website has some great information about tearing apart your laptop; Dell provides a detailed online Service Manual with all of the requisite instructions for replacing most of the parts in the Dell Mini, and I used it as a guide while I performed my upgrades. That being said, the upgrade process was still a little tricky, and some of the parts were difficult to get to. (Although it seems like Dell may have made upgrades a little easier in later models of my laptop.)
So without further introduction, here are the steps for upgrading the RAM and hard drive in a Dell Inspiron Mini 1011 Laptop.
This step is pretty easy - there are only a handful of screws to remove.
Figure 2 - Removing the Screws
It's pretty easy to pop the keyboard out of the case...
Figure 3 - Removing the Keyboard
...although once you have the keyboard loose, you need to flip it over carefully and remove the flat ribbon cable from underneath.
Figure 4 - Detaching the Keyboard Cable
This step was a little tricky, and it took me a while to accomplish this task because I had to wedge a thin screwdriver in between the case and the palm rest in order to pry it off. Note that there is a flat ribbon cable that attaches the palm rest to the motherboard that you will need to remove.
Figure 5 - Removing the Palm Rest
With the keyboard and palm rest out of the way, you can remove the hard drive - there's a single screw holding the hard drive mount into the case and four screws that hold the hard drive in its mount.
Figure 6 - Removing the Hard Drive
If you were only replacing the hard drive, you could stop here. Since I was upgrading the RAM, I needed to dig deeper.
Once the hard drive is out of the way, you need to remove the motherboard so you can replace the RAM that is located underneath it. There are a handful of screws above and below the computer that hold the palm rest bracket to the case...
Figure 7 - After Removing the Palm Rest Bracket
...once you remove the palm rest bracket, you can flip over the motherboard and replace the RAM.
Figure 8 - Replacing the RAM
Rather than reinstalling the operating system from scratch, I cloned Windows from the original hard drive to the SSD. To do this, I placed both the old hard drive and the new SSD into USB-based SATA drive cradles and I used Symantec Ghost to clone the operating system from drive to drive.
Figure 9 - Both Hard Drives in SATA Cradles
Figure 10 - Cloning the Hard Drive with Ghost
Once the clone was completed, all that was left was to install the new SSD and reassemble the computer.
Figure 11 - Installing the New SSD
Once I had everything completed and reassembled, Windows booted considerably faster when using the SSD; it now boots in a matter of seconds. (I wish that I had timed the boot sequence before and after the upgrades, but I didn't think of that earlier... darn.) Running the Windows 7 performance assessment showed a measurable increase in hard drive speed, with little to no increase in RAM speed. Of course, since there was no speed increase for CPU or graphics, the overall performance score for my laptop remained the same. That being said, with twice the RAM as before, it should be paging to disk less often, so regular usage should seem a little faster; even when it does need to swap memory to disk it will be faster using the SSD than with its old hard drive.
Figure 12 - Performance After Upgrading
That's all for now - have fun. ;-)
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
23 December 2012 • by Bob • Zune
First and foremost - I am not ashamed to admit that I am a card-carrying Zune fanboy. But that being said, as a faithful owner of several Zune devices, I am ashamed of the way that the Zune team at Microsoft so badly botched their product line; the Zune team was so out of touch with their target consumers that it borders on negligence. Here is my totally-biased list of reasons why I personally think the Zune failed.
There was a smattering of MP3 players on the market by the time that Apple's iPod hit the stores. I still have an RCA Lyra device that kicked butt in its day, but my personal favorites were the Creative Zen devices; you plugged a Zen player into your computer and it showed up like an external hard drive. To add music, you simply dragged & dropped music files anywhere you wanted; the Zen devices used your music files' metadata to sort by albums, genres, artists, etc.
When Apple's iPod hit the stores, its main rise to fame was its end-to-end story from iTunes to iPod, all of which belonged to Apple. Their devices were cool, and their advertising was stellar (as always). Even though they were overpriced, the iPod soon became "the product" that everyone wanted. The iTunes/iPod integration was closed to outsiders, which meant that Apple owned the end-to-end experience, and thereby collected all of the profits from it.
When Microsoft eventually realized that Apple was making enough money off their music/devices sales to save their company - which was formerly close to bankruptcy - they decided to create a device and end-to-end experience for themselves. But when Microsoft tried to do so, they mostly opted for feature parity with iTunes. What was Microsoft thinking? Instead of improving on the iTunes model, they were trying to break into an established market with a product that had little to offer that was above and beyond what consumers could already get.
FAIL.
I bought my wife a Zune for Christmas when they first released. Having owned and used several MP3 players in the past, I thought that it would be a similar experience; let me assure you, it was decidedly not a similar experience. I was so frustrated with the first-generation Zune software that I had boxed up the Zune and was ready to take it back to the store within an hour of trying to get it set up for her. I eventually elected not to do so, and I managed to get it working, but it was a crappy experience that made me apologize to my non-technical wife for burdening her with such a mess.
FAIL. FAIL. FAIL.
Customers wanted to use their Zune devices as external storage, but having to use the Zune software to transfer files to the device prevented that. The prevailing argument was that Zune followed the iTunes/iPod model, but who cares if that's the way that iTunes/iPod worked? Zune customers paid good money for their devices, and they wanted to store files on those. USB flash drives were still pretty pricey at the time, so opening the Zune platform to double as external storage would have been a fantastic selling feature, but that concept escaped the Zune team's leadership because they wanted to force users into having to use their @#$% software in the hopes that they would be tempted to buy more music/videos through Microsoft.
DESIGN FAIL.
[On a related note, the Windows Phone 7 team did not learn from the Zune's failure, and their devices still had the same, stupid Zune software requirement. BRAIN-DEAD FAIL.]
Microsoft already made a killer media player application for Windows that worked with all the third-party MP3 player devices, but when Microsoft introduced their own MP3 player it didn't work with their existing Windows Media Player.
EPIC FAIL.
Microsoft spent a bunch of money cozying up to the music industry and MP3 makers with a program that was entitled Plays For Sure, whereby devices could be certified to play all Windows-based music files, whether they had copy protection on them or not. Even though all of these third-party companies went through the certification process, Microsoft's own player didn't have to; the Zune didn't support Plays for Sure.
SCREW YOUR PARTNERS FAIL.
[On a related note, this probably hastened the demise of WMA as a file format. SHOOT YOURSELF IN THE FOOT FAIL.]
For a long time - and I mean a really long time - the software that you needed to use with your Zune was next to worthless. It was slow, buggy, and ugly. By the time that the Zune team finally delivered a version of the Zune software that was actually worth installing, the battle for MP3 player supremacy was over and the iPod ruled uncontested.
SCREW YOUR OWN PLATFORM FAIL.
The Zune pass tried to be the Netflix service of music, and in that sense it was ahead of the curve when it was introduced. Customers paid $14.95 a month, and in exchange they were granted free access to tens of thousands of DRM-based WMA music files - all of which they could download and play on their computers or Zune devices - and customers could play them as long as they kept their Zune pass up-to-date. In addition, customers got to keep 10 free songs per month in DRM-free MP3 format.
In September, 2011, some wunderkind in the Zune group decided to take away the 10 free songs and drop the price of the Zune pass to $9.99 per month. This person - whoever they may be - is an idiot. With the incredible amount of free music that is available on the Internet now, the free downloads on the Zune pass was the only feature of the Zune pass that made having a Zune pass worthwhile.
SCREW YOUR CUSTOMERS FAIL.
The rest of the world works with actual money, but the Zune service required customers to use 'points' to buy music or videos, and points did not map directly to dollars and cents. On Amazon or iTunes, music was typically $0.99 per track, but on Zune it was typically 89 points per track. WTF? What the @#$% was a 'point'?
So let's say that you wanted to buy a music file to download; the Zune software would inform you that you had to buy points first, which would have some weird exchange rate that didn't make sense. For example, if you bought 400 points for $4.99, that would mean that your 89-point music file actually cost $1.11, which was $0.12 more than Amazon or iTunes. When Microsoft combined their crappy points-based purchasing system with their overpriced music, they created an environment that was a truly horrible customer experience.
WTF FAIL.
The iPod was dominating media player sales all over the world, but believe it or not - there are people who simply don't like Apple. I constantly saw people all over Europe who were clamoring for a Zune, and Microsoft didn't deliver.
DRAG YOUR FEET FAIL.
Believe it or not, I saw a lot of Mac users who were asking for Zune software on the Mac. I'm not sure if these people were also iTunes users or not, but I think that the concept of the original "10 free songs" with a Zune pass was appealing to them. Sadly, Microsoft did not deliver - and a whole slew of potential customers were left high-and-dry.
SCREW YOUR POTENTIAL CUSTOMERS FAIL.
Despite all this negativism, the Zune team did deliver some great products - I still own several Zunes from the various series of players, although it now feels a lot like owning a Tucker automobile or a Betamax VCR.
Here are some of the coolness factors that Zune had:
In the end, it was very sad for me to see the Zune fail; the Zune was simply a victim of being a superior device with inferior product management.
22 December 2012 • by Bob • Military, Ponderings
Here's a weird but true story for you: my wife and I went to the movies tonight to see Zero Dark Thirty, (which was a good movie in case you were wondering). Right at the point where the Navy Seals [spoiler alert] pull the trigger on their main person of interest, a man in the theater started yelling, "VIOLENCE ONLY BEGETS VIOLENCE!!! VIOLENCE ONLY BEGETS VIOLENCE!!!", and he ran out of the building while continuing to scream that phrase like a cultish mantra.
This leads me to the following quandary: the subject of this movie was publicly and deliberately advertised ahead of time, so there can be no question that everyone in the auditorium knew before they walked through the theater doors that they were there to watch the CIA and Navy Seals take down the principal terrorist who planned the tragedies of September 11th, 2001. So why would anyone go to this movie expecting anything other than violence?
This movie has an "R" rating because of the violence; and there is a lot of violence in this movie. But oddly enough, the person in question did not run screaming from the theater when [spoiler alert] a lot of European and American lives (both combatants and non-combatants) were premeditatedly and violently killed throughout the two hours of the movie which preceded the brief actions that were the cause of his outburst.
The whole affair was surreal, and I am sure that several people (not just me) were nervously wondering if we were about to see a repeat of the tragic theater shootings that took place at the Batman premiere last summer. I'm beginning to think that I'll just wait for everything to come out on Netflix before I watch it in the future.
30 November 2012 • by Bob • FTP, SSL
For this installment in my series about FTP clients, I want to take a look at Beyond Compare 3 from Scooter Software. At its heart, Beyond Compare is a file/folder comparison tool, so it might seem an unlikely candidate for an FTP client, but it has a lot of great FTP features packed into it.
Fig. 1 - The Help/About dialog in Beyond Compare 3.
Note: For this blog I used Beyond Compare version 3.3.5.
Like many self-proclaimed computer geeks, over the years I have collected a lot of various utilities that perform specific actions that I need to take care of. Sometimes I discover these tools when Binging my way through the Internet, and other times they come highly recommended from other people. In this specific situation, Beyond Compare falls into the latter category - dozens of people had recommended Beyond Compare to me before I tried it out, and after falling in love with it I have recommended it to dozens of my friends. At the time I was using Microsoft WinDiff to compare files, which is still a great application to do simple comparisons, but Beyond Compare does so much more.
Fig. 2 - The Start New Session screen.
Fig. 3 - Comparing the files within two folders.
Fig. 4 - Comparing the HTML content of two files.
I could go on about Beyond Compare as a comparison tool, but that's really outside the scope of this blog since I am supposed to be talking about FTP features. Needless to say, if you're looking for a good comparison tool, you might want to download the trial edition of Beyond Compare 3 and give it a try.
That being said, let's get back to the business at hand. Beyond Compare 3 has a collection of FTP Profiles, which you can think of as analogous to a site manager in more traditional FTP clients.
Fig. 5 - Opening Beyond Compare 3's FTP Profiles.
Inside the FTP Profiles dialog, you can specify a wealth of connection options for remote FTP sites that you would expect to find in any other FTP client.
Fig. 6 - Specifying FTP connection options.
Once you have established an FTP connection through Beyond Compare 3, you can view your local files and the files in your remote FTP site side-by-side, and then you can perform comparisons, updates, merges, etc.
Fig. 7 - Viewing local and remote files.
Beyond Compare 3 has built-in support for Explicit FTP over SSL (FTPS), which you specify when you are creating the FTP profile for a site.
Fig. 8 - Specifying an Explicit FTPS connection.
Once you have established an Explicit FTPS connection through Beyond Compare 3, the user experience is the same as it is for a standard FTP connection.
Fig. 9 - Comparing files over FTPS.
That being said, at first glance Beyond Compare 3 did not appear to support Implicit FTPS. For me this was not a deal-breaker by any stretch of the imagination since Explicit FTPS is preferred. (Even though Implicit FTPS is supported by IIS7 through IIS8, it is really an outdated protocol.)
10 January 2013 Update: I heard from Craig Peterson at Scooter Software that Beyond Compare 3 does support Implicit FTPS, but it does so implicitly. (No pun intended. ;-]) When you connect using FTP over SSL on port 990, it will automatically use implicit FTPS.
Beyond Compare 3 has built-in support for the HOST command, so you can use true FTP host names when using Beyond Compare 3 to connect to FTP7 and FTP8 sites that are configured with host names. This feature is enabled by default, but if you need to disable it for some reason, you can do so on the Connection tab of Beyond Compare 3's FTP Profiles dialog.
Fig. 10 - Specifying support for the FTP HOST command.
Beyond Compare 3's login settings allow you to specify the virtual host name as part of the user credentials by using syntax like "ftp.example.com|username" or "ftp.example.com\username", but since Beyond Compare 3 allows you to use true FTP hosts this is really a moot point. Just the same, there's nothing to stop you from disabling the HOST command for a connection and specifying an FTP virtual host as part of your username, although I'm not sure why you would want to do that.
Fig. 11 - Specifying a virtual FTP host.
This concludes our quick look at some of the FTP features that are available with Beyond Compare 3, and here are the scorecard results:
Client Name | Directory Browsing | Explicit FTPS | Implicit FTPS | Virtual Hosts | True HOSTs | Site Manager | Extensibility
---|---|---|---|---|---|---|---
Beyond Compare 3.3.5 | Rich | Y | Y | Y | Y | Y | N/A 1
As noted earlier, Beyond Compare 3 supports the FTP HOST command, which is enabled by default for new connections.
1 Note: I could not find any way to extend the functionality of Beyond Compare 3, but it does have a scripting interface; see their Automating with Scripts and Scripting Reference pages for more details.
So there you have it - Beyond Compare 3 contains many of the features that would make up a great GUI-based FTP client, with first-class support for all of the features that I have been examining in detail throughout my blog series about FTP clients. And as with all of my preceding posts, I need to include my standard disclaimer: there are a great number of additional features that Beyond Compare 3 provides - but once again I only focused on a few specific topic areas that apply to FTP7 and FTP8. For example, one particular feature that I might want to experiment with in the future is Beyond Compare 3's support for FTP SSL Client Certificates. But I'll leave that for another day. ;-]
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
29 November 2012 • by Bob • PHP, WebMatrix
With the release of WebMatrix 2, I thought that it would be great to take a PHP class and use WebMatrix exclusively for the entire class. Much to my surprise, this proved to be a great experience. Seriously, I did not expect it to go as well as it did. This has nothing to do with WebMatrix, it's just that I've picked up some cynicism over the years where editors are concerned. This pessimistic outlook is largely due to the fact that I've tried a lot of editors based on the recommendations of my fellow geeks, and those have often been bad experiences. Usually they say something like, "Dude, if you're going to write code in <some language> then you have to use the <some editor> application."
Unfortunately, most of these editors fail to live up to their hype, and I am forced to endure trials and tribulations where I loudly exclaim "If I was using <my favorite editor> I would be done by now!" (I periodically accompany those moments with language that is best reserved for a golf course when you've just hit your last Titleist into the water hazard.)
But those experiences never happened with WebMatrix 2 - not even once; WebMatrix did pretty much everything that I needed it to do, and it did everything really well. As a result, my cynical skepticism quickly gave way to an optimistic impression.
I took copious notes about my experiences with WebMatrix throughout the class, and with that in mind, I thought that it would be great to write a blog with my genuinely unbiased thoughts about using WebMatrix exclusively as my PHP authoring platform for the two-month duration of my class. (As a point of trivia, the PHP class that I took was BMIS 410 - Web Enterprise Technologies at Liberty University. Quick shout out to my professor, Michael Hart, who was a great instructor.)
First of all, the intellisense for PHP was quite good - and having the URLs to the PHP.net reference pages in the tooltip help for PHP functions was extremely useful; I spent a lot of time clicking through to the PHP.net website for assistance with one function or another.
Fig. 1 - WebMatrix's Intellisense for PHP.
Using WebMatrix to preview in IE and WP7/iPhone/iPad emulators was great; in my opinion, this experience was much better than the SuperPreview feature of Expression Web.
Fig. 2 - Options for previewing your website.
Fig. 3 - Testing my website in the iPad simulator.
Using the WebMatrix database editor to create tables for my MySQL database was great - in many ways it was much better than using the MySQL Workbench. The biggest drawback in WebMatrix was the inability to create auto-number fields, and I couldn't enter dates in the correct format in the database UI. (That was undoubtedly something that I was doing wrong.) So every once in a while I had to go back to the MySQL Workbench to fix something. That being said, the interface for creating relationships in WebMatrix is great, and much better than using MySQL Workbench.
Fig. 4 - Editing the data in a MySql Database.
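If you run into the same auto-number limitation that I did, the workaround in MySQL Workbench is a one-line statement; the table and column names below are only placeholders for your own schema, and the example assumes that the column is already the table's primary key. (As for the date issue, MySQL expects dates in the YYYY-MM-DD format, such as '2012-11-29'.)

-- Assumes that article_id is already the table's primary key.
ALTER TABLE articles MODIFY article_id INT NOT NULL AUTO_INCREMENT;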
FTP publishing is much better in version 2 of WebMatrix. I used an IIS7 web server, so I was able to use FTP7's virtual hosts to publish to a specific site on a shared server. WebMatrix has no FTPS support, so that is something of a loss. (WebMatrix also lacks full WebDAV support, but I've already talked about that in other blogs.)
Fig. 5 - FTP Publish Settings.
This last point might seem trivial, and I realize that a lot of editors have similar features, but the way that WebMatrix keeps track of opening/closing parentheses, brackets, and curly braces saved me more times than I can count.
Fig. 6 - Helping me keep track of what I'm doing.
Here are the few problems that I encountered with WebMatrix during the course:
My first issue was not a problem that was due to WebMatrix per se, but every once in a while a page would get stuck in the cache and I couldn't see changes that I had made to a page, so I would have to restart IIS Express. I'll have to investigate why that was happening; it could be IIS Express, or it could be the PHP engine - I'm still not sure where the fault lies. Fortunately it is very easy to restart IIS Express from inside WebMatrix, but still - it was a minor frustration.
WebMatrix only wanted to validate against HTML5, but my class required all assignments to use the XHTML 1.0 Transitional DOCTYPE, and that showed up as errors in WebMatrix. Yes - the world is moving to HTML5, but still - that shouldn't cause an error.
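For reference, this is the document type declaration that the class required at the top of every page, and that WebMatrix's HTML5-only validation flagged as an error:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">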
Perhaps the biggest feature that WebMatrix lacks is the really cool local and remote side-by-side publishing view that both Expression Web and its predecessor FrontPage had.
When you have a lot of pages open the WebMatrix tab bar fills up, and it's really difficult to keep track of which pages are open.
It would be nice to tear pages out of the editor like you can do with Internet Explorer and Visual Studio.
I have to mention this last item because it was in my notes, but it's technically not an issue for WebMatrix. One of my personal coding self-annoyances was that I would write the text for a string and then realize that I forgot to put it in quotes; when I would type an opening quote, WebMatrix would try to help me out by adding the closing quote - which would now be outside my string, so I always had to delete one of the quotes. There is an option to turn off that feature; see File->Options->Code in WebMatrix. But that being said, this is a useful feature when I remember to create the quotes before I start typing in a string. So once again, this is really more of a complaint against myself; it's my fault that I sometimes have lousy typing skillz. [sic]
I should start off by saying that I got an "A" in the course, and I can honestly give WebMatrix some of the credit for that. If I had spent a great deal of time fighting with an editor, I would have had less time to focus on writing PHP code. But in the end, WebMatrix actually made it easier for me to write PHP code.
So in closing, WebMatrix rocks, PHP rocks, and using WebMatrix with PHP definitely rocks.
Note: This blog was originally posted at http://blogs.msdn.com/robert_mcmurray/
28 November 2012 • by Bob • Guitar
Many years ago - more years than I will care to admit - I saw Cheap Trick in concert. (Okay, just to give you an idea of how long ago this was - Cheap Trick was touring to promote their Cheap Trick at Budokan album; you can do the math from there.) At this point in my life, I hadn't been playing the guitar for very long, and my main guitar at the time was a cheap 3/4-size nylon-string acoustic that my dad had bought for me from a store on a military base. Military bases aren't known for keeping great guitars in stock, so it needs little explanation that I was fascinated by any cool guitar that came along. This made seeing Cheap Trick even more entertaining, because their lead guitar player, Rick Nielsen, used something like 1,000 different guitars throughout the show.
But one particular guitar caught my eye - an Explorer; something about its futuristic shape made it seem to me like the coolest guitar ever. Rick Nielsen played an Explorer from Hamer Guitars, but I soon learned that Hamer's Explorer was a copy of the Gibson Explorer, and that became the 'Guitar to Have' for me.
Rick Nielsen (left) playing a Hamer Explorer onstage with Cheap Trick. (Note: This image is originally from Wikipedia.)
About this time I was in my first rock band with my good friend Gene Faith. Even though we both actually played the guitar, we liked to create fake instruments for ourselves - I made myself a fake guitar out of scrap wood that looked like an Explorer, even though it was hollow and had strings that were made out of rubber bands. But it was cool - there was no doubt in my mind about that. Once we had some 'instruments' at our disposal, we'd put on a record and pretend to actually play these fake instruments and jump around my dad's living room like we were rock stars. (Hey, don't laugh so hard - I was only 12 or 13 years old.)
My first electric guitar was a cheap copy of a Gibson SG that I purchased at Sears for somewhere around $100. (And believe me - I delivered a lot of newspapers to earn the $100 to buy that guitar.) It was okay as a starter guitar, but I soon found myself wanting a better axe. A year or so later I saved up more of the proceeds from my newspaper route and I bought an Explorer copy from an off-brand company named Seville - it was nowhere near as good as a Gibson, but it was the best that I could do on a paperboy's budget. It had a hideous tobacco sunburst paint job, so I removed the neck and hardware, sanded the body down to the bare wood, stained it with a dark wood color, and then I shellacked the body with a clear finish. When I reassembled the guitar, it looked pretty good. I played that Explorer for a few years, and I eventually sold it to my friend Gene.
Jumping ahead a few decades, another good friend, Harold Perry, was moving from Seattle to San Francisco, so he was parting with a bunch of musical gear. I'm always in the market for seasoned gear that needs a new home, so Harold and I were going through a bunch of his old items while I was deciding what I might want to buy. Harold had bought a 1980 Gibson Explorer II several years earlier as a 'project guitar' - it had been badly treated by a previous owner and needed a lot of repair work. Since Harold was moving, he didn't expect to have time to finish the guitar, and he wanted it to find a good home, so he sold it to me for a great price.
And so my adventure with guitar restoration began as a labor of love.
When I took the guitar home, the first thing that I did was strip all of the remaining hardware off the guitar; thereby leaving nothing but the wood body. I then proceeded to polish every inch of the guitar for a few hours. Whoever had owned the guitar before Harold apparently had some hygiene issues and it seemed like he had never cleaned the guitar despite voluminous amounts of caked sweat that coated much of the surface. What's more, his sweat had corroded all of the stock hardware, so nearly all of the hardware would need to be replaced. With that in mind, I decided that this would be a long-term project and I would take my time with it.
The next thing that I needed to do was to polish the hardware that I intended to keep - which was just the brass nut and frets, all of which looked pretty hideous. I used Mr. Metal to polish the hardware, which seemed a strangely apropos title for a former heavy metal dude.
Badly-tarnished frets and nut.
Dude - it's "Mr. Metal." :-O
The pile of used cotton patches after I finished polishing.
Shiny frets and brass nut!
Over several months I slowly bought new hardware that I needed. I'll spare you most of the details, but suffice it to say that it took a long time for me to locate and purchase all of the right replacement parts that I wanted. I primarily bought the hardware from Stewart McDonald, Musician's Friend, and Guitar Center, and I had the guys at Parson's Guitars create a new truss rod cover to replace the original that had been lost before the guitar had found its way to me. In the end, I replaced the bridge, tailpiece, volume & tone potentiometers, tuning machines, strap locks, toggle switch, and speed knobs. (The folks at Parson's Guitars thought that replacing the stock Gibson parts was a sacrilege, even though I explained that keeping the stock parts left the guitar unplayable.)
Before I started wiring the guitar, I lined the inside of the routing cavities with copper tape - this is supposed to reduce EMI on the guitar. I've never used it before, so it's something of an experiment. In any event - lining the cavities took several hours to complete; time will tell if it was worth it.
The next part of the project was to install the new guitar tuning machines. Oddly enough, Gibson won't sell their inline-6 set of tuners for an Explorer to customers, so I had to buy tuning machines from another company. I eventually decided on tuning machines from Gotoh, which I was able to order through Stewart McDonald. The trouble is, once I mounted them on the headstock, I discovered that the screw holes for the tuning machines were off by a little over a millimeter. (If you look at the image, you can see that the screw holes are angled slightly downward on the right side of the machines, but they needed to be perpendicular to the machine shafts.)
After doing some additional research, I discovered that the only Gotoh tuning machines that Stewart McDonald sells are Gotoh's SG381 tuning machines, and I needed their SG360 tuning machines for my Explorer. After a quick call to Stewart McDonald, I verified that they could not order Gotoh's SG360 tuning machines for me, so I searched the Internet until I found a distributor in Australia who could ship them to me. It took several weeks for the tuners to make the journey to the United States, but when they arrived they were a perfect fit.
Once I had the right tuning machines installed, I started the long process of wiring and soldering the electronics.
Once I completed the wiring, the last hurdles were to re-string the guitar, tune it up, adjust the string height and intonation, and test it out. (Which is the fun part.)
That about sums it up. The guitar looks great and plays great, although I might drop it by the folks at Parson's Guitars and have them give it a quick tune-up for good measure.
Special thanks go to Harold for hooking me up with this guitar; and I also owe a big set of thanks to my wife, Kathleen, for humoring me while I took over one of the rooms in our house for the several weeks that I spent working on this project. ;-)
30 October 2012 • by Bob • General
After two long years of sacrificing my evenings and weekends in order to complete homework assignments, I just received the following in the mail:
This obviously signifies that I have finally earned my Bachelor's Degree. This is traditionally a four-year degree, but I managed to complete my degree in just over 28 years from when I first started college. (So anyone who is currently on a five-year plan for their four-year degree, take my word for it - you could do a lot worse.)
By way of explanation, I had never finished my Bachelor's Degree; I dropped out of college during my freshman year when I got married and we needed the money. Our idea at the time was that I would work full-time while my wife went to school full-time, then we would swap roles when she completed her nursing degree. Unfortunately, our lives didn't work out that way. Shortly after I dropped out of college I joined the US Army, and that put a temporary halt on both of our college aspirations as the military continuously transferred us from one location to another.
After five years in the Army, I was finally at a time and place in my life where I could go to college in the evenings and do my homework during the weekends. Because of this, I received my Associate's Degree around the time that I finished eight years in the military; this meant that I had earned my two-year degree almost 9 years after I first started college.
A few months after I received my Associate's Degree I left the Army, and my plan at the time was to go to school and finish my Bachelor's Degree. But once again, my plans didn't work out that way. Sometime during my first year back in school, Microsoft offered me a job, and that opportunity was simply too good to pass up. This was ultimately a great decision, but it meant that my college goals needed to be put on hold again.
Sometime around my fifteen-year anniversary at Microsoft I decided that I was once again in a time and place in my life where I could go to college in the evenings and weekends, so I enrolled in an online program through Liberty University. (I chose this school because their online programs are very friendly to current and former members of the military.) My declared major was Multidisciplinary Studies, which is a fancy term for a program that allows you to split your major into two or three concentrated subject areas. (I chose Computer Science and Religion.)
Jumping ahead a couple of years, I found myself studying hard to complete all of my upper-division courses while putting three children through college, flying around the world to speak at various technical conferences, surviving the weddings for two of my children, and juggling a work schedule that typically comprised 50 to 60 hours a week.
In the end, I finished all of my courses at Liberty University in just over two years - and I managed to maintain a 4.0 GPA throughout my studies, thereby graduating Summa Cum Laude. (Which is probably Latin for "You really need to get a life.")
So if I do the math correctly, it took me 9 years to get my two-year degree, and it took me an additional 19 years to get my four-year degree. At this pace, I should have my Master's Degree 29 years from now.
;-]