A Great Example – PowerShell is a Great Problem Solver

I will probably wander a bit in this post. I’ve still been sick this past weekend, so bear with me. My mind tends to wander, but there is a point here!

When PowerShell first began to be promoted in the IT world, you often heard that it wasn’t “just another scripting language”. That is true; it is so much more than that. As someone who prefers the Linux shell, I was quickly able to see it. It is a shell environment, it is a command line tool, and it CAN be used quite POWERFULLY as a scripting tool. As I continue to learn and expand my knowledge of PowerShell, I see each day what a powerful tool it is. And it is a great problem solver!

I have just begun to scratch the surface with 2.0. I primarily still use cmdlets from 1.0, simply because not everyone around me has fully caught the PowerShell bug yet; all of us still primarily have 1.0 installed.

Let’s face it, anything “new” takes some time to proliferate into an environment. The concept has to be sold, to a certain extent, so you get buy-in from everyone who will be involved with that “new thing”. All of us have probably seen a lot of “new” in IT; some of it has been very successful, while some has not.

The nice thing about PowerShell is its flexibility. It is so much more than a command line environment and scripting tool; it’s ALL of the above, AND it gives you the ability to stick your hands right into the heart of Microsoft. With PowerShell integration now part of the Common Engineering Criteria (CEC) at Microsoft, as time progresses you’ll be able to use PowerShell to solve all kinds of needs. PowerShell is a problem solver.

One great interview with Jeffrey Snover on the PowerScripting Podcast included a fantastic quote from him. I don’t remember the exact wording, but the idea was that the goal is for Microsoft to provide the tools: Microsoft can’t fill in all the gaps, so give away the tool and let third parties and admins solve their own problems.

One classic example of how Microsoft recognizes this is the video content on the Edge section of the TechNet website: http://edge.technet.com/Media/

There are a lot of great videos/screencasts there that will give you some great ideas. One I highly recommend as a starting point for creating your own variation is the User Provisioning series. You can start with part 1 here: http://edge.technet.com/Media/Automate-User-Provisioning-with-Windows-PowerShell-Part-1/

With a bit of experimentation, you could use Admin Script Editor or Sapien’s form builder to create your own GUI front end for creating users specific to your AD environment.

I try to encourage the folks who work with users on a day-to-day basis to get into PowerShell and just use it. Use it instead of cmd.exe. Find things you do, or would like to do, that a customized script could handle, and create your own! Just start using it, and find ways it can solve problems in your environment. You don’t have to be a scripting wiz by any means.

What’s there to think about? POSH 2.0 for Windows XP!

John Cook’s blog had a little post that many people may have missed. Back in November of 2009, he posted a link to the Windows Management Framework, which includes WinRM 2.0 and Windows PowerShell 2.0… for WINDOWS XP!

Here’s John’s blog post: The Endeavour

Here’s the URL to the KB article on the Windows Management Framework (WMF).

Now, something to consider:

Although it does not appear to be mandatory, it’s probably best to uninstall PowerShell 1.0 BEFORE you install PowerShell 2.0 in Windows XP.

So, what’s there to think about? There are lots of benefits to PowerShell 2.0. One big one is modules: variables created inside a module stay in the module’s own scope, so cleaning up variables becomes quite easy. Check out Bruce Payette’s interview/demo on using modules.

See each of the videos: http://edge.technet.com/tags/BP.
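
If you haven’t watched those yet, the variable cleanup benefit is easy to show. Here’s a minimal sketch (Demo.psm1 is a hypothetical module file): variables created inside a module stay in the module’s scope instead of littering your session.

# --- Demo.psm1 (hypothetical module file) ---
$RunCount = 0                         # module-scoped; never leaks into the session
function Get-RunCount { $RunCount }
Export-ModuleMember -Function Get-RunCount

# --- In the console ---
# Import-Module .\Demo.psm1
# Get-RunCount    -> returns 0
# $RunCount       -> (empty) the variable stayed inside the module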

It Still Pays To Be “DOS Old School” in a Windows World

Recently at work, I was made aware of a problem that also brought back an old pet peeve of mine. In 2010, I still cannot believe people actually SELL products that were primarily designed for DOS. Very irritating! It’s even more frustrating to talk to some of these vendors when you have support issues.

Thankfully, some issues can be “bridged” in a way that eliminates the need to call support. One of those is when that prehistoric application needs to exist in a networked environment and has to print.

You’ll typically find that many people in the IT world who are relatively new don’t realize you can sort of “fool” these old applications. Many old DOS applications print to LPT ports, and a simple trick combines that fact with some good ol’ command line know-how. You can accomplish the following steps through any means necessary (batch file, VBScript, even PowerShell), but I’ll illustrate it “the old school way” using a batch file.

REM batch file calling application
DOSAPP.EXE

If we’re on a network and local printers aren’t used, the trick is to “capture” an LPT port to a network printer. This can be done within the batch file that calls the app. The LPT port should be captured prior to calling the exe.

REM batch file calling application
REM First, capture LPT1 to a network printer, then call the application
NET USE LPT1: \\SERVER\PRINTER1 /PERSISTENT:NO
DOSAPP.EXE

NET USE with /PERSISTENT:NO tells Windows NOT to capture the port in a persistent manner. This ensures that we are only capturing the port “as needed” (when the application is run via the batch script). Don’t run your batch file just yet…

We’re only half done! We’ve captured the printer port, but nothing is connected to that port. We’ll need to install a printer and connect it to the captured LPT port in order to print.

Many old DOS apps rely heavily on simple printer drivers. The rule of thumb is to select a “neutral” driver, such as the Microsoft-provided “HP LaserJet III” driver. If you are printing PostScript, select a similarly “neutral” PostScript driver. I have always used the “HP LaserJet III” driver because it doesn’t require vendor media or Windows installation media, it supports the standard PCL language, and it rarely presents conflicts for older DOS applications.

So, from “Add Printer” in Windows, the procedure is to 1) add a local printer, 2) select LPT1 as the local port, and 3) when asked for a driver, select “HP LaserJet III.” Don’t do a test print at this point. After the print driver is installed and associated with LPT1 in Windows, run your batch file.
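
If you’d rather script this step too, PrintUIEntry can perform the same install from a command line (and therefore from PowerShell). A minimal sketch, with a hypothetical printer name; I route it through cmd.exe so %windir% expands:

# Scripted version of the Add Printer steps; "DOS LPT1 Printer" is a hypothetical name
& cmd /c 'rundll32 printui.dll,PrintUIEntry /if /b "DOS LPT1 Printer" /f %windir%\inf\ntprint.inf /r "LPT1:" /m "HP LaserJet III"'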

Voila. You’ve successfully made someone in your company happy and are helping to support a dinosaur application that needed to print on your network.

Now, is it absolutely necessary to install a print driver? Not in every situation. If the application is simply printing ASCII text, probably not; if the application needs the driver layer (for whatever reason), then yes. Some old applications will require either PCL or PS drivers installed, so you’ll just have to test your app both ways. Simplicity is the way to go: if your tests determine you don’t need the drivers, don’t install them.
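
And, as I mentioned above, the capture trick itself works from PowerShell just as well as from a batch file. A minimal sketch, using the same hypothetical server, printer, and application names as the batch example, and releasing the port once the application exits:

# Capture LPT1 to the network printer, run the old app, then release the port
net use LPT1: \\SERVER\PRINTER1 /PERSISTENT:NO
& .\DOSAPP.EXE                  # PowerShell needs the .\ prefix for the current directory
net use LPT1: /DELETE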

PowerShell and The Case of the Poor Man’s NTBACKUP

One of the things I do is volunteer some of my time and IT experience to my church. We haven’t had the budget to renew maintenance on Backup Exec, our previous backup solution, for a couple of years. So we’ve been using NTBACKUP.

I created a simple PowerShell function to perform the backup without all the parameters and switches and without the need for the GUI. I had used NTBACKUP with the Windows scheduler several years ago, so I knew it could be automated. We purchased two external hard drives for rotation of backups. The desired strategy would give us about an 8 week window for recovery of files.

I decided to combine a strategy of shadow copies, for immediate recovery of individual files and folders, with NTBACKUP for full restores. With current economic conditions, the budget only provided a small amount of money for storage.

That was our starting point. I thought about this one for quite a while. I’m a frequent PowerScripting Podcast listener (http://powerscripting.wordpress.com/), and I gathered several ideas I’ve heard over the course of many shows and designed a solution for the backups, along with a simple backup job rotation scheme I’ve seen used with VB scripts for several years.

A service account was created with the necessary permissions to perform the backups. Since the service account first had to have a Windows profile before it could utilize a PowerShell profile, I logged into the console to create the profile.

I’m a firm believer in creating your own PowerShell function libraries. My co-worker with exceptional PowerShell skills sold me on that idea long ago. The plan was to have two functions and three Windows scheduler tasks. The backup functions were placed in the network function library in the central repository of scripts on the server, and the service account’s default WindowsPowerShell profile.ps1 dot-sourced the function library containing the backup functions.
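
For anyone who hasn’t set that up before, the profile ends up being a one-liner. A sketch, with a hypothetical path to the script repository:

# Contents of the service account's WindowsPowerShell\profile.ps1
# (the library path is hypothetical)
. \\SERVER\Scripts\BackupFunctions.ps1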

The first function simply performs the backup. Because the function would be called from Windows Scheduler, I needed to give it a parameter selecting the backup type. The function is “start-ntbackup”; the parameter is either “normal”, “differential”, or “incremental”.

The second function is the backup file rotation function. We needed a way to manage the resulting backup files in an organized fashion across the two external disks. The maintenance function is “start-BackupRotation”, with no parameters.

The Windows Scheduled task for normal backups calls the function in this manner:

C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup normal

The Windows Scheduled task for differential backups calls the function in this manner:

C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup differential

Normal backups run each Friday evening, and “daily” differential backups run on all other days. The maintenance task runs at noon on Friday, prior to the normal backup that evening.
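
For reference, the scheduled tasks themselves can be created from the command line with schtasks instead of clicking through the wizard. A sketch for the Friday normal backup; the task name and service account are hypothetical, and the /st time format varies by Windows version (HH:MM:SS on XP/2003):

# Hypothetical task name and account; schtasks prompts for the account password
schtasks /create /tn "Weekly Normal Backup" /sc weekly /d FRI /st 18:00:00 /ru DOMAIN\BackupSvc /tr "C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup normal"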

Below are the functions:

# ---------------------------------------------------------------------------
### Ted Wagner
### 20091001
###
### This function executes NTBackup using dates for identification of backups.
### Future improvements - retention aging on files
###
### Usage:
###   start-ntbackup normal
###   start-ntbackup differential
###   start-ntbackup incremental
###
### 20091001 - TWagner - Original script creation
# ---------------------------------------------------------------------------
function start-NTBackup([String]$BackupType){
    # Validate the backup type before calling NTBackup
    switch ($BackupType) {
        "normal"       { }
        "differential" { }
        "incremental"  { }
        default        { Write-Error "Unknown backup type: $BackupType"; return }
    }
    # Build the backup file name, e.g. normal-20091001.bkf
    # (get-datestamp is dot-sourced from the shared function library)
    $BackupFileName = $BackupType + "-" + (get-datestamp) + ".bkf"
    $BackupPath = "F:\"
    $BksDescription = (get-datestamp)
    # Call NTBackup: back up the system state plus drives C: and E:
    &cmd /c "C:\WINDOWS\system32\ntbackup.exe backup systemstate c: e: /j `"$BackupType`" /n $BksDescription /f $BackupPath$BackupFileName /v:no /r:no /hc:off /fu /m $BackupType /rs:no /l:s"
}
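
The function leans on a get-datestamp helper from my function library. A minimal sketch of it (it just returns a sortable date string, matching the yyyyMMdd stamp the rotation function builds):

# Minimal sketch of the library's get-datestamp helper
function get-datestamp {
    (Get-Date).ToString("yyyyMMdd")
}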

# ---------------------------------------------------------------------------
### Ted Wagner
### 20100105
###
### This function moves files between weekly folders across two disks,
### effectively doing backup rotation.
###
### Usage:
###   start-BackupRotation
###
### 20100105 - TWagner - Original script creation
# ---------------------------------------------------------------------------
function start-BackupRotation(){

    # Get the current day - this is the "maintenance day"
    $DayNow = (Get-Date).DayOfWeek
    # Create a date variable for the log file name
    $datestamp = (Get-Date).ToString("yyyyMMdd")
    $LogPath = "G:\Logs\Transcript-$datestamp.log"

    # Do our cleanup on a Friday before normal backups start:
    # move the oldest backups to the delete directory, then progressively move
    # backups forward, leaving the root directory empty for the current week.
    # -PassThru makes Move-Item emit the moved items so they appear in the log.

    if ($DayNow -eq "Friday"){
        Get-Item G:\Backups-Week8\* | Move-Item -Destination G:\Backups-Delete -Force -PassThru >> $LogPath
        Get-Item G:\Backups-Week7\* | Move-Item -Destination G:\Backups-Week8 -Force -PassThru >> $LogPath
        Get-Item G:\Backups-Week6\* | Move-Item -Destination G:\Backups-Week7 -Force -PassThru >> $LogPath
        Get-Item G:\Backups-Week5\* | Move-Item -Destination G:\Backups-Week6 -Force -PassThru >> $LogPath
        Get-Item F:\Backups-Week4\* | Move-Item -Destination G:\Backups-Week5 -Force -PassThru >> $LogPath
        Get-Item F:\Backups-Week3\* | Move-Item -Destination F:\Backups-Week4 -Force -PassThru >> $LogPath
        Get-Item F:\Backups-Week2\* | Move-Item -Destination F:\Backups-Week3 -Force -PassThru >> $LogPath
        Get-Item F:\Backups-Week1\* | Move-Item -Destination F:\Backups-Week2 -Force -PassThru >> $LogPath
        Get-Item F:\* -Include *.bkf | Move-Item -Destination F:\Backups-Week1 -Force -PassThru >> $LogPath
        # Empty the delete directory (remove its contents, not the directory itself)
        Remove-Item G:\Backups-Delete\* -Force -Recurse

        # Clean up log files older than 60 days
        $LogFiles = Get-ChildItem G:\Logs
        foreach($LogFile in $LogFiles){
            $LogFileAge = ((Get-Date) - $LogFile.CreationTime).Days
            if ($LogFileAge -gt 60){
                $LogFile.Delete()
            }
        }
    }
}
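
To test everything without waiting on the scheduler, dot-source the library and call the functions by hand (the library path is hypothetical):

# Manual test from a console running as the service account
. \\SERVER\Scripts\BackupFunctions.ps1
start-NTBackup differential
start-BackupRotation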