Zune Player – why I’ll never buy another

When the Zune 120 came out, I bought one. After a few months, I began to get the dreaded "error 5, contact support" message. I read every forum post and KB article I could find to resolve the issue: do the yoga dance with your fingers, press this button and that button, reinstall this, reinstall that… yadda yadda yadda.

Finally, I was able to call support. The service.zune.net site never did recognize my serial number, so it wasn't useful even AFTER calling support, until they made several changes behind the scenes. We went through all the standard help-desk stuff. If you have any technical experience, you might as well give up: just play dumb, humor the poor folks, and go through the baby steps even though you've already done all of it.

In the end, it was still under warranty, so I had to send it back. Fine, there were 4-5 months left on it. They called back: they had to replace it, but it couldn't be a new one, it had to be a rebuilt one. Oh joy.

So, the rebuilt one works for about 6 months, and then it's the same old song and dance all over again. Except this time it's out of warranty. Oh goodie! Well, that was wasted money and time.

I love the Zune software. I like it a lot better than Apple's iTunes. I also like, and thoroughly enjoy, the Zune subscription service. It's wonderful, and I'll continue to subscribe. But the Zune player? I'd much rather drive over it with a bulldozer a few hundred times than ever own one again.


PowerShell and The Case of the Poor Man’s NTBACKUP

One of the things I do is volunteer some of my time and IT experience to my church. We haven't had the budget to renew maintenance on Backup Exec, our previous backup solution, for a couple of years, so we've been using NTBACKUP.

I created a simple PowerShell function to perform the backup without all the parameters and switches and without the need for the GUI; I had used NTBACKUP with the Windows scheduler several years ago. We purchased two external hard drives to rotate backups between. The strategy was intended to give us roughly an eight-week window for recovering files.

I decided to combine shadow copies, for immediate recovery of individual files and folders, with NTBACKUP for full recovery. With current economic conditions, the budget only provided a small amount of money for storage.
That was our starting point. I thought about this one for quite a bit and then an idea came together.

I'm a frequent PowerScripting Podcast listener (http://powerscripting.wordpress.com/). I pulled together several ideas I've heard over the course of many shows and designed a backup solution, plus a simple backup job rotation scheme I've seen done with VBScript for years.

A service account was created with the necessary permissions to perform the backups. Since the service account needed a Windows profile before it could use a PowerShell profile, I logged on at the console once to create it.

I'm a firm believer in creating your own PowerShell function libraries. My co-worker with exceptional PowerShell skills sold me on that idea long ago. The plan was to have two functions and three Windows scheduler tasks. The backup functions were placed in the network function library, our central repository of scripts on the server, and the service account's default WindowsPowerShell profile.ps1 dot-sources the library containing them.
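
For reference, the dot-source line in the service account's profile.ps1 looks something like this (the UNC path below is a placeholder for the central script repository):

# In the service account's Documents\WindowsPowerShell\profile.ps1
# (the path below is a placeholder for the central script repository)
. "\\server\scripts$\FunctionLibrary.ps1"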

The first function simply performs the backup. Because the function is called from the Windows scheduler, it needed a parameter for the backup type. The function is "start-ntbackup", and the parameter is either "normal", "differential", or "incremental".

The second function handles backup file rotation. We needed a way to manage the resulting backup files in an organized fashion across the two external disks.

The maintenance function is “start-BackupRotation” with no parameters.
The Windows Scheduled task for normal backups calls the function in this manner:
C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup normal
The Windows Scheduled task for differential backups calls the function in this manner:

C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup differential

Normal backups run each Friday evening and "daily" differential backups run on all the other days. The maintenance task runs at noon on Friday, before the normal backup that evening.
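
For completeness, the three tasks could be registered from the command line with schtasks.exe along these lines. The task names, run times, and service account below are placeholders; only the powershell.exe commands come from the setup described above:

schtasks /create /tn "Backup - Weekly Normal" /sc weekly /d FRI /st 20:00 /ru DOMAIN\svc-backup /tr "C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup normal"

schtasks /create /tn "Backup - Daily Differential" /sc weekly /d SAT,SUN,MON,TUE,WED,THU /st 20:00 /ru DOMAIN\svc-backup /tr "C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup differential"

schtasks /create /tn "Backup - Weekly Rotation" /sc weekly /d FRI /st 12:00 /ru DOMAIN\svc-backup /tr "C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-BackupRotation"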

Below are the functions:

# ---------------------------------------------------------------------------
### Ted Wagner
### 20091001
###
### This function executes NTBackup, using dates to identify the backup files.
### Future improvements - retention aging on files
###
### Usage:
###   start-ntbackup normal
###   start-ntbackup differential
###   start-ntbackup incremental
###
### 20091001 - TWagner - Original script creation
# ---------------------------------------------------------------------------
function start-NTBackup([String]$BackupType){
# Variables - the file name is the backup type plus a date stamp, e.g. normal-<datestamp>.bkf
$BackupFileName = $BackupType + "-" + (get-datestamp) + ".bkf"
$BackupPath = "F:\"
$BksDescription = (get-datestamp)
# Call NTBackup - back up the system state plus the C: and E: drives to the .bkf file
&cmd /c "C:\WINDOWS\system32\ntbackup.exe backup systemstate c: e: /j `"$BackupType`" /n $BksDescription /f $BackupPath$BackupFileName /v:no /r:no /hc:off /fu /m $BackupType /rs:no /l:s"
}
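
The start-NTBackup function calls a get-datestamp helper that lives in the same function library; it isn't reproduced in this post, but a minimal version (name taken from the function above, implementation assumed) would look like this:

# Assumed helper from the function library: returns a date stamp string used in
# file names and descriptions, e.g. 20091001 (the actual format in the library may differ)
function get-datestamp(){
return (get-date).tostring("yyyyMMdd")
}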

# ---------------------------------------------------------------------------
### Ted Wagner
### 20100105
###
### This function moves backup files between weekly folders across two disks,
### effectively performing the backup rotation.
###
### Usage:
###   start-BackupRotation
###
### 20100105 - TWagner - Original script creation
# ---------------------------------------------------------------------------
function start-BackupRotation(){

# Get the current Day - this is the "maintenance day"
$DayNow = (Get-Date).DayOfWeek
# Create date variable for log file name
$datestamp = (get-date).tostring("yyyyMMdd")

# do our cleanup on a Friday before normal backups start.
# move the oldest backups to the delete directory, then progressively move backups forward
# leaving the root directory empty for the current week.

if ($DayNow -eq "Friday"){
Get-item G:\Backups-Week8\* | Move-Item -Destination G:\Backups-Delete -Force >>G:\Logs\Transcript-$datestamp.log
Get-item G:\Backups-Week7\* | Move-Item -Destination G:\Backups-Week8 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item G:\Backups-Week6\* | Move-Item -Destination G:\Backups-Week7 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item G:\Backups-Week5\* | Move-Item -Destination G:\Backups-Week6 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item F:\Backups-Week4\* | Move-Item -Destination G:\Backups-Week5 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item F:\Backups-Week3\* | Move-Item -Destination F:\Backups-Week4 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item F:\Backups-Week2\* | Move-Item -Destination F:\Backups-Week3 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item F:\Backups-Week1\* | Move-Item -Destination F:\Backups-Week2 -Force >>G:\Logs\Transcript-$datestamp.log
Get-Item F:\* -Include *.bkf | Move-Item -Destination F:\Backups-Week1 -Force >>G:\Logs\Transcript-$datestamp.log
# empty the delete directory (remove its contents, not the directory itself)
remove-item G:\Backups-Delete\* -Force >>G:\Logs\Transcript-$datestamp.log
# clean up log files older than 60 days
$LogFiles = Get-ChildItem G:\Logs
foreach($LogFile in $LogFiles){
$LogFileDate = ((Get-Date) - $LogFile.CreationTime).Days
if ($LogFileDate -gt 60){
$LogFile.Delete() >>G:\Logs\Transcript-$datestamp.log
}
}
}

}
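
One thing worth noting: the rotation assumes the weekly folders already exist: Backups-Week1 through Week4 on the F: drive, and Backups-Week5 through Week8, Backups-Delete, and Logs on the G: drive. If you're setting this up from scratch, a one-time bit of setup along these lines (paths taken straight from the function) creates them:

# One-time setup of the folders used by start-BackupRotation
1..4 | foreach { New-Item -Path "F:\Backups-Week$_" -ItemType Directory -Force }
5..8 | foreach { New-Item -Path "G:\Backups-Week$_" -ItemType Directory -Force }
New-Item -Path G:\Backups-Delete -ItemType Directory -Force
New-Item -Path G:\Logs -ItemType Directory -Force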

Using PowerShell with Process Monitor

Testing software for preparing installation/deployment packages or for locking down the local desktop can be quite a chore!

A critical step in collecting data for those changes is understanding what is taking place on the system when that application runs. Are changes made to the registry? Are temporary files written to a directory? Does the user need special permissions to specific folders? These are only a few of the things you need to know. One of the recommended tools that can help provide some of that data is Process Monitor.

While doing some initial testing, I wanted to be able to run through a set of applications all at one time. It was a bit of a pain to do it manually and, frankly, I didn't want to test each application one at a time.

On the Windows Sysinternals forum there are some examples posted of how to use Process Monitor in batch files. Batch files? Uhm, yeah, I'm not going down that road; I'm trying to get away from batch files. So, I set out to write a PowerShell script to help me automate the testing.

One important thing to notice is that the batch script examples run procmon.exe with successive switches. You can't necessarily pipe this into a one-liner. If you use the backingfile option and wish to export your results, the application has to stop and completely close the file, then restart in order to perform the export step. You also need to delete the backing file to suppress errors if you run the script multiple times in a row. My script takes care of that as well.

I also found what appears to be a limitation of the /waitforidle option. I didn't dig deep enough to understand why, but procmon.exe never came to an idle state and the option never worked for me, so I had to work around it.

The weak point of Process Monitor is that it's still a GUI app. It has command-line switches, but in my opinion it isn't 100% command-line friendly. Command-line options for applying filters would be highly beneficial. Another great option would be to write data directly to a log file when the application exits, so you could bypass creating a backing file and then reloading it just to export it.

This script’s workflow is this:

  1. start procmon.exe
  2. wait for it to load with options
  3. load the application to be tested
  4. wait a couple seconds
  5. exit the test application
  6. exit procmon.exe
  7. test for open files (backing file) from procmon.exe
  8. when no open files remain, restart procmon.exe, load the backing file, and export it as CSV
  9. exit procmon.exe
  10. delete the pml backing file

Below is the script:
# Requires the use of Windows Sysinternals handle.exe to check for open files
# http://technet.microsoft.com/en-us/sysinternals/bb896655.aspx
clear-host
$ProcMonTest = Read-Host "Enter app with path - (C:\Windows\system32\notepad.exe)"
$CSVFile = Read-Host "Enter CSV log file and path - (C:\temp\notepad.csv)"
$ProcMon = "\\s3slc08nm02\installs$\Utilities\Process Monitor\Procmon.exe"
$HandleExe = "\\s3slc08nm02\installs$\Utilities\Sysinternals\handle.exe"
$ProcMonBack = "$Env:Temp\ProcMonTest.pml"
$FileLocked = $false

# make sure backing file isn't present in case it wasn't deleted on last run
$FileExists = Test-Path $ProcMonBack
if ($FileExists -eq $true){
Remove-Item $ProcMonBack -force
}

& $ProcMon /Quiet /AcceptEula /Minimized /backingfile $ProcMonBack

do{
Start-Sleep -seconds 5 # procmon.exe /waitforidle doesn't appear to work well when scripted with PowerShell
$ProcMonProcess = Get-Process | where {$_.Path -eq $ProcMon}
}while(
$ProcMonProcess.Id -eq $null
)
& $ProcMonTest
Start-Sleep -seconds 2

$ProcMonTestProcess = Get-Process | where {$_.Path -eq $ProcMonTest}
Stop-Process $ProcMonTestProcess.Id

& $ProcMon /Terminate

# Test for a file lock on the procmon.exe backing file before exporting
do{
Start-Sleep -seconds 1
# handle.exe lists any process holding the file open; a "pid:" line means it is still locked
$FileLocked = $false
$TestFileLock = & $HandleExe $ProcMonBack
foreach ($line in $TestFileLock){
if ($line -match "pid:"){
$FileLocked = $true
}
}
}while(
$FileLocked -eq $true
)

# Read the procmon.exe backing file and export as CSV
& $ProcMon /openlog $ProcMonBack /SaveAs $CSVFile
& $ProcMon /Terminate

# Clean up procmon.exe backing file
$FileExists = Test-Path $ProcMonBack
if ($FileExists -eq $true){
Remove-Item $ProcMonBack -force
}

"No Comment" – The Never Ending Debate on Commenting Scripts/Code

I have been re-writing this topic for a couple of weeks. I will not get into the discussion of the "how's and why's" of the two camps of opinion on this; it's just not worth the time or the energy. What I will say is that this is my personal opinion and personal preference.

Commenting scripts shared with others is important. And, more importantly, when scripts are used in day-to-day administration, it's very important. There are a variety of reasons for this. The primary reason is documentation. But doesn't properly written code "document itself"? Yes; I'll get to that in a moment. The commenting you do is not necessarily for YOU. It is useful when you need to troubleshoot your script, and it is also useful to others who will refer to your script in the future.

The standard, and widely accepted, method is to include comments documenting the purpose of the code/script at the very beginning. Commenting sections or regions of a script is also very important, even more so when sharing scripts or when the person using the script is a beginner. Most administrators are not programmers. Everyone has to start somewhere; if your script is used by someone new to scripting or new to performing a job function, those comments are important to the knowledge-transfer process. The comments become a learning tool for that new scripter.
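
As a purely illustrative example of the kind of header block I mean, here is what it could look like using PowerShell 2.0's comment-based help syntax (my own function headers above use plain # blocks, and the wording here is made up for the example):

<#
.SYNOPSIS
    Rotates weekly backup files across two external disks.
.DESCRIPTION
    Moves each week's .bkf files forward one folder, purges the oldest week,
    and deletes log files older than 60 days.
.NOTES
    Author: Ted Wagner
#>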

But, back to the previous question: doesn't properly written code "document itself"? Yes, to a point. Again, most administrators are not developers. Like all skills, the less frequently you use a skill, the more it fades away. If you are an administrator who writes an occasional script, commenting is all the more important.

Additionally, if YOU are new to scripting and you comment what you do, it becomes that much easier to improve as your scripting skills improve. Many times I've looked at scripts I didn't comment and thought, "what the heck was I thinking anyway?" Writing comments around regions of code or specific lines lets me see how I've progressed and also lets me shore up the weak areas in a script.

So, I won't hide my opinion… I think commenting in PowerShell scripts is essential.

PowerShell and Microsoft Security Essentials

We've been considering implementing Microsoft Security Essentials on the workstations at my church. I have begun a small project to install the Windows Management Framework so I can use WinRM to remotely manage workstations in that environment.
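
Once WinRM is in place, the remote-management piece boils down to a couple of cmdlets. A rough sketch (the computer name below is a placeholder):

# On each workstation (run once, elevated): enable PowerShell remoting
Enable-PSRemoting -Force

# From the management machine: run a command against a remote workstation
Invoke-Command -ComputerName WKSTN01 -ScriptBlock { Get-Service }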

The subject of using PowerShell with MSE came to mind, and after a quick Google search I found Richard Siddaway's blog post from November 10th of last year:

http://richardsiddaway.spaces.live.com/blog/cns!43CFA46A74CF3E96!2603.entry

Amazingly simple. You can easily add your own set of functions if you wish, and it's simple to use through PowerShell.
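
The approach boils down to storing the path to MSE's command-line utility, MpCmdRun.exe, in a variable; something like this (the exact path is my assumption of the default install location):

# Assumed default install path for the MSE command-line utility (MpCmdRun.exe)
$mse = "C:\Program Files\Microsoft Security Essentials\MpCmdRun.exe"

With that in place, the calls are one-liners: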

& $mse -help

& $mse -SignatureUpdate

A good blog post to stumble on and a great idea for remotely managing clients on the cheap. I'm definitely going to use this information from Richard Siddaway.

Indianapolis PowerShell Users Group

For anyone in the Indianapolis area, there's a PowerShell users group forming. The first meeting is April 6th at 6:30pm on the IUPUI campus in Taylor Hall.

Details here:

http://powershellcommunity.org/Forums/tabid/54/aff/18/afv/topic/aft/4781/Default.aspx