Microsoft CA – Templates Not Showing up in IIS Web Enrollment

I’ve seen a good number of questions posted about templates not showing up on the IIS web enrollment page, but the answers always seem to miss a critical piece of information when someone has created an Enterprise CA on Windows 2008 R2.

TYPICALLY the problem is one, or a combination, of the following three things:

1) The certificate template’s Subject tab wasn’t switched to “Supply in the request.”

2) The certificate template was configured for a validity period greater than one year (this is actually not 100% true, by the way).

3) The enrollment permissions on the certificate template are incorrect.

ALWAYS make SURE you know what functional level of AD you have BEFORE you create a new template. If you are at 2003, make sure you create the template as 2003 Enterprise. Do not pass go, do not collect $100, and do not create a 2008 Enterprise template when AD is at a 2003 functional level IF you want that template showing up under web enrollment.

You can create the template. There is nothing that prevents you from doing so. However, you would have to manually create a certificate request using the following procedure:
http://technet.microsoft.com/en-us/library/ff625722(v=ws.10).aspx

Again: if your CA is 2008 R2, your AD functional level is 2003, and you want that template to show up under web enrollment, make sure you create the template as 2003 Enterprise and follow the other best practices.
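A quick way to sanity-check what the CA will actually serve up is to list the templates it is configured to issue. This uses certutil, which ships with Windows Server; run it on the CA itself:

```powershell
# List the certificate templates this CA is configured to issue.
# A template that doesn't appear here will never show up in web enrollment.
certutil -CATemplates
```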

As for the bit about a one-year limitation keeping a template from showing up under web enrollment, I haven’t seen that hold up as a valid problem. I’ve gone as high as 10 years in a lab environment and it works just fine.

For the sake of endorsing Microsoft documentation, make sure you read and follow this guidance:

http://technet.microsoft.com/en-us/library/cc770357(v=ws.10).aspx


MDT 2010 and Unattended Application Installs

While doing quite a bit of digging and testing with the Microsoft Deployment Toolkit (MDT) 2010, I put two and two together and came up with a method for installing applications.

This is quite effective when you do a hybrid lite-touch/zero-touch deployment of Windows 7. It’s also useful when you need to install an application after the deployment of the desktop OS. The downside is that this approach requires a few manual steps. I’m sure they could be scripted further, but this post gives you an example of how to isolate the installs in a separate task in MDT 2010.

First, let me set a test lab scenario for you.

1 machine with MDT 2010 installed and properly configured
1 client

Requirements: Whatever applications you choose to install from MDT 2010, make sure they can be deployed via a silent install or a method that won’t require manual interaction.

For my demo lab, I set up Microsoft Forefront and Office 2010.

All you have to do is add a custom task. You’ll note that the default custom task in MDT 2010 should contain only one task sequence step, and it just so happens that step is all we need: “Install Applications”. How easy is that!?!

I labeled my task in MDT “Install Applications” with the comment “Only use this step to manually install applications after an operating system is deployed.”

Now, add your application. For Microsoft Forefront in my lab environment, I used the following quiet-install command: “CLIENTSETUP.EXE /NOMOM”. Simple enough.

For Microsoft Office 2010, it’s a tad more complex. By default, you typically DO NOT need to include double quotes around paths that contain spaces. When those paths are passed as arguments to switches, you DO.

I set up an MSP file using setup.exe /admin and followed amply documented instructions for creating a silent install file.

Then, the quiet install command line is:

setup.exe /config "\\server\share\MS Office 2010\Customizations\Office2010.xml" /adminfile "\\server\share\MS Office 2010\Customizations\Office2010.msp"

To follow the syntax, it’s basically: setup.exe /config "<path>\config.xml" /adminfile "<path>\config.msp"

I won’t get any deeper into this. As I said previously, the procedure for creating and using the MSP is well documented. What is critical here, and what many posts gloss over, is that you need BOTH the /config and /adminfile switches. If you do not include both, the install will fail.

Ok, at this point, we have two apps and a task. Remember to publish and update your deployment share.

Now, how do you execute the installs? Here’s the downside to this procedure: you need to log on to the computer where the installs need to occur and, from the command line, manually execute the litetouch.vbs file in the deployment share’s Scripts folder.

In a domain with no public shares and guest accounts disabled, you MAY have to map a drive manually first.* Otherwise, just use the UNC path. I always have public shares and guest accounts disabled in my lab to mirror a production environment, so I simply mapped a drive:

net use f: \\server\share /persistent:no

I used a service account (the same one I’ll use when entering credentials for performing the installs after the script executes).

Then, simply execute the litetouch.vbs script:

cscript f:\scripts\litetouch.vbs

Voila! Now, just provide credentials, select the Install Applications task, select the application(s) and sit back and wait.

* Note: more on this in a future post. If you do this right, there’s no need to map a drive. But, you still have to run the script manually. Stay tuned for more.

PowerShell and The Case of the Poor Man’s NTBACKUP

One of the things I do is volunteer some of my time and IT experience for my church. We haven’t had the budget to renew maintenance of Backup Exec for a couple of years. This was our previous backup solution. So, we’ve been using NTBACKUP.

I created a simple PowerShell function to perform the backup without all the parameters and switches and without the need for the GUI. I had used NTBACKUP with the Windows scheduler several years ago. We purchased two external hard drives for rotation of backups. The desired strategy would give us about an eight-week window for recovery of files.

I decided to combine a strategy of both shadow copies for immediate recovery of individual files and folders and NTBACKUP for recovery. With current economic conditions, the budget only provided a small amount of money for storage.

That was our starting point. I thought about this one for quite a bit and then an idea came together.

I’m a frequent PowerScripting Podcast listener (http://powerscripting.wordpress.com/). I gathered several ideas I’ve heard over the course of many shows and designed a solution for backups as well as one simple solution for backup job rotation I’ve seen used with VB scripts for several years.

A service account was created with the necessary permissions to perform the backups. Since the service account first had to have a Windows profile before it could utilize a PowerShell profile, I logged into the console to create the profile.

I’m a firm believer in creating your own PowerShell function libraries. My co-worker with exceptional PowerShell skills sold me on that idea long ago. The plan was to have two functions and three Windows scheduler tasks. The backup functions were placed in the network function library in the central repository of scripts on the server. The service account’s default WindowsPowershell profile.ps1 dot sourced the function library containing the backup functions.
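As a sketch of that wiring, the service account’s profile.ps1 could be as small as a single dot-source of the shared library (the UNC path here is hypothetical, not our real one):

```powershell
# profile.ps1 for the backup service account (example path)
# Dot-sourcing loads start-NTBackup and start-BackupRotation into every
# session, including the ones the Windows scheduler spawns.
. \\server\scripts\functions.ps1
```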

The first function simply performs the backup. Because the function would be called from the Windows scheduler, I needed to create it with a parameter that selects the backup type. The ntbackup function is “start-ntbackup”. The parameter is either “normal”, “differential”, or “incremental”.

The second function would be the backup file rotation function. We needed a way to manage the resulting backup files in an organized fashion across the two external disks.

The maintenance function is “start-BackupRotation” with no parameters.

The Windows Scheduled task for normal backups calls the function in this manner:

C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup normal

The Windows Scheduled task for differential backups calls the function in this manner:

C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup differential

Normal backups run in the evenings each Friday and “daily” differential backups run on all other days. The maintenance task runs at noon on Friday prior to the normal backup running on Friday evening.
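If you prefer to create those scheduled tasks from the command line instead of the Task Scheduler GUI, something along these lines should work (the task names and times are just examples, not what I used):

```powershell
# Weekly normal backup, Fridays at 7:00 PM (example values)
schtasks /create /tn "Backup-Normal" /sc weekly /d FRI /st 19:00 `
    /tr "C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup normal"

# Differential backup every day at 7:00 PM; to skip Friday, use
# /sc weekly /d SAT,SUN,MON,TUE,WED,THU instead of /sc daily
schtasks /create /tn "Backup-Differential" /sc daily /st 19:00 `
    /tr "C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe start-ntbackup differential"
```

Run the tasks as the service account so the dot-sourced profile (and thus the functions) loads.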

Below are the functions:

# ---------------------------------------------------------------------------
###
### Ted Wagner
### 20091001
###
### This function executes NTBackup using dates for identification of backups.
### Future improvements - retention aging on files
###
###
###
###
### start-ntbackup normal
### start-ntbackup differential
### start-ntbackup incremental
###
###
###
###
###
### 20091001 - TWagner - Original script creation:
###
###
###
# ---------------------------------------------------------------------------
function start-NTBackup([String]$BackupType){
# Variables
$FullFileName = $BackupType + "-" + (get-datestamp) + ".bkf"
$DailyFileName = $BackupType + "-" + (get-datestamp) + ".bkf"
$BackupPath = "F:\"
# Need switch statement - if $BackupType is normal or differential
$BackupFileName = $FullFileName
$BksDescription = (get-datestamp)
# Call NTBackup
&cmd /c "C:\WINDOWS\system32\ntbackup.exe backup systemstate c: e: /j `"$BackupType`" /n $BksDescription /f $BackupPath$BackupFileName /v:no /r:no /hc:off /fu /m $BackupType /rs:no /l:s"
}

# ---------------------------------------------------------------------------
###
### Ted Wagner
### 20100105
###
### This function will move files between weekly folders across two disks effectively doing backup rotations.
###
###
###
###
### start-BackupRotation
###
###
###
###
###
### 20100105 - TWagner - Original script creation:
###
###
###
# ---------------------------------------------------------------------------
function start-BackupRotation(){

# Get the current Day - this is the "maintenance day"
$DayNow = (Get-Date).DayOfWeek
# Create date variable for log file name
$datestamp = (get-date).tostring("yyyyMMdd")

# do our cleanup on a Friday before normal backups start.
# move the oldest backups to the delete directory, then progressively move backups forward
# leaving the root directory empty for the current week.

if ($DayNow -eq "Friday"){
Get-item G:\Backups-Week8\* | Move-Item -Destination G:\Backups-Delete -Force >>G:\Logs\Transcript-$datestamp.log
Get-item G:\Backups-Week7\* | Move-Item -Destination G:\Backups-Week8 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item G:\Backups-Week6\* | Move-Item -Destination G:\Backups-Week7 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item G:\Backups-Week5\* | Move-Item -Destination G:\Backups-Week6 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item F:\Backups-Week4\* | Move-Item -Destination G:\Backups-Week5 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item F:\Backups-Week3\* | Move-Item -Destination F:\Backups-Week4 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item F:\Backups-Week2\* | Move-Item -Destination F:\Backups-Week3 -Force >>G:\Logs\Transcript-$datestamp.log
Get-item F:\Backups-Week1\* | Move-Item -Destination F:\Backups-Week2 -Force >>G:\Logs\Transcript-$datestamp.log
Get-Item F:\* -Include *.bkf | Move-Item -Destination F:\Backups-Week1 -Force >>G:\Logs\Transcript-$datestamp.log
remove-item G:\Backups-Delete\* -Force >>G:\Logs\Transcript-$datestamp.log
# cleanup logs + 60 days
$LogFiles = Get-ChildItem G:\Logs
foreach($LogFile in $LogFiles){
$LogFileDate = ((Get-Date) - $LogFile.CreationTime).Days
if ($LogFileDate -gt 60){
$LogFile.Delete() >>G:\Logs\Transcript-$datestamp.log
}
}
}

}

PowerShell and Microsoft Security Essentials

We’ve been considering implementing Microsoft Security Essentials on the workstations at my church. I have begun a small project to install the Windows Management Framework so I can use WinRM to remotely manage workstations in that environment.

The subject of using PowerShell with MSE came to mind and after a quick Google search, I found Richard Siddaway’s blog post from November 10th of last year:

http://richardsiddaway.spaces.live.com/blog/cns!43CFA46A74CF3E96!2603.entry

Amazingly simple. And you can easily add your own set of functions if you wish. Either way, it’s simple to use through PowerShell:

& $mse -help

& $mse -SignatureUpdate
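For context, $mse in Richard’s post is just a variable holding the path to MpCmdRun.exe, the command-line utility that ships with Microsoft Security Essentials. A minimal sketch (the install folder is an assumption; verify it on your machines):

```powershell
# MpCmdRun.exe is MSE's command-line scanner/updater.
# The folder name below is an assumption - check where MSE landed on your build.
$mse = Join-Path $env:ProgramFiles "Microsoft Security Client\MpCmdRun.exe"

& $mse -SignatureUpdate   # pull down the latest definitions
```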

A good blog post to stumble on and a great idea for remotely managing clients on the cheap. So, I’m definitely going to use this information from Richard Siddaway.

Populating A Test Lab with AD Objects Using Windows PowerShell

This script name is: Create-ADTestLabContent

The commenting is pretty self-explanatory. I ran across Dmitry Sotnikov’s blog post on creating demo Active Directory environments last fall. You can view that blog post here: http://dmitrysotnikov.wordpress.com/2007/12/14/setting-demo-ad-environments/

I liked his script and used it often, but found I needed something a bit more substantial to save me time while doing tests. So, I used his script as a “base” and have heavily modified it since last fall. In January I began to make much more significant changes to it with the goal of adding it to poshcode.org. You can view and copy the whole script here: http://poshcode.org/1666

In a local VM in VMware Player, I’m able to create all the empty containers, 50 users, accompanying groups and computers in about 2-3 minutes.

Many thanks to Dmitry on his encouragement on the improvement of this script.

Below is the full script minus the comments:

# ---------------------------------------------------------------------------
### Derived From: Dmitry Sotnikov - http://dmitrysotnikov.wordpress.com/2007/12/14/setting-demo-ad-environments/
### This script design uses the original script (base script) written by Dmitry Sotnikov. The script's
### original comments are included below. I am referring to Dmitry's script as "version 1.0"
###
### My goal is to standardize variables, functions and libraries such that the script is portable.
### This is so that I can place all files for PowerShell on an ISO file and re-use the content
### with as little modification as possible from test scenario to test scenario.
###
### My scripts folder is a directory copied from the ISO file. When I build a virtual environment,
### I bring up a completely configured and empty AD domain. I then attach the ISO to the VM and
### copy the "scripts" folder to the root of C:. I then drop in a default profile.ps1 into the
### WindowsPowerShell directory (the default All Users profile) and run this script.
###
### There is more work, yet to do; I want to "pare down" the functions so that the functions could be added to
### a functions.ps1 "library" file.
###
### The labs I set up for testing use an OU structure similar to the following:
###
### OU=DeptName -|
### |- Computers
### |- Groups
### |- Users
###
### The profile.ps1 sets up the PSDrive and then creates a variable to the provider. The profile.ps1
### script is in the root of the scripts directory which is copied from the ISO file.
###
### Contents of the profile.ps1 file:
###
### New-PSDrive -name ScriptLib -psProvider FileSystem -root "C:\Scripts"
### $ScriptLib = 'ScriptLib:'
###
### The Scripts folder contains a subfolder named "LabSetup". The LabSetup folder contains this script,
### titled "Create-ADTestLabContent.ps1" and all of the text files necessary for creating the user
### objects, OU's, etc. You can create your own files and/or edit this script to match your file names.
### I've listed the contents of each file below.
###
### I deviated from the original text files from Dmitry's script.
### My goal was to have a "true" list of random names by utilizing the "select-random" written by
### Joel Bennett. This can be downloaded from poshcode.org. I found that the combination of the
### select-random on the census files and parsing the extra data was extremely time consuming.
### I went to the census.org page for year 2000 and downloaded the top 1000 names spreadsheet.
### Then, I simply stripped off ALL of the extra data (first row and all columns after column A)
### and saved it as an ascii file called "surnames.txt". The link to that page is:
### http://www.census.gov/genealogy/www/data/2000surnames/index.html
###
### Additionally, I did something similar with the first names.
### I downloaded common male and female names from http://infochimps.org/collections/moby-project-word-lists
### Those files are named fgivennames.txt and mgivennames.txt. You can alternately download a text file
### of 21,000+ common given names from the same site instead of using the surnames from census.gov.
### However, for my testing, a sample of 1000 last names was sufficient for my needs.
###
### departments.txt - Name of each Department which will be both an OU, group, and the department
### property on user objects.
### ous.txt - Name of child-containers for each Department OU (Computers, Groups, Users).
### cities.txt - Names of cities I will use on user properties
### dist.all.last.txt - ASCII file of last names downloaded from the Census.gov website
### dist.male.first.txt - ASCII file of male first names downloaded from the Census.gov website
### dist.female.first.txt - ASCII file of female first names downloaded from the Census.gov website
###
### The descriptions of the departments match the OU name. This differentiates them from the default
### containers created when AD is set up from those added by this script. This allows for easily removing
### containers and all child items quickly during testing.
###
### Requires ActiveRoles Management Shell for Active Directory. This script will check
### for the snapin and add the snapin at runtime.
###
### History
### changes 01/08/2010 - version 2.0
### - Change Display name and full name properties to format of Lastname, Firstname
### - Change password to p@ssw0rd
### Changes 01/11/2010 - version 2.1
### - Assume base config of empty domain. Create variable for root domain name
### - make sure no attempt is made to duplicate usernames
### - Create containers
### Changes 02/19/2010 - version 2.2
### - added function to create empty departmental OUs and child containers for users, groups and computers
### Changes 02/22/2010 - version 2.3
### - added computer account creation to occur when the user is added
### - dot source functions.ps1
### - added Joel Bennett's select-random v2.2 script to functions.ps1. functions.ps1 in root of scripts folder
### Changes 02/23/2010
### - Made script more readable by using word-wrap
### - Cleaned up description and commenting
### Changes 02/24/2010 - Version 2.4
### - Using new ascii files for first and given names (see notes)
### - Removed original lines for parsing census.gov files
### Changes 02/25/2010
### - added better description for containers added via script to differentiate them to account for
### manually added containers
### - fixed issue with computer object creation - computer objects weren't always getting created
###
### Original Script name: demoprovision.ps1
##################################################
### Script to provision demo AD labs
### (c) Dmitry Sotnikov, xaegr
### Requires AD cmdlets
##################################################
###
### set folder in which the data files are located
### this folder should contain files from
### http://www.census.gov/genealogy/names/names_files.html
### as well as cities.txt and departments.txt with the
### lists of cities and departments for the lab
# ---------------------------------------------------------------------------

#Load Function Library
. $ScriptLib\functions.ps1

# function to create empty OUs
function create-LabOUs (){
# Create Each Dept OU
$CreateDeptOU = @()
for ($i = 0; $i -le $DeptOUs.Length - 1; $i++){
$OUName = "Test Lab Container - " + $DeptOUs[$i]
$CreateDeptOU += @(new-QADObject -ParentContainer $RootDomain.RootDomainNamingContext `
-type 'organizationalUnit' -NamingProperty 'ou' -name $DeptOUs[$i] -description $OUName )
}

# Create Child OUs for each Dept
foreach ($DeptOU in $CreateDeptOU){
for ($i = 0; $i -le $ChildOUs.Length - 1; $i++){
new-qadObject -ParentContainer $DeptOU.DN -type 'organizationalUnit' -NamingProperty 'ou' `
-name $ChildOUs[$i]
}
}
}

function New-RandomADUser (){
# set up random number generator
$rnd = New-Object System.Random

# pick a male or a female first name
if($rnd.next(2) -eq 1) {
$fn = $firstm[$rnd.next($firstm.length)]
} else {
$fn = $firstf[$rnd.next($firstf.length)]
}
# random last name
$ln = $last[$rnd.next($last.length)]

# Set proper caps
$ln = $ln[0] + $ln.substring(1, $ln.length - 1).ToLower()
$fn = $fn[0] + $fn.substring(1, $fn.length - 1).ToLower()

# random city and department
$city = $cities[$rnd.next($cities.length)]
$dept = $depts[$rnd.next($depts.length)]

$SName = ($fn.substring(0,1) + $ln)

# set user OU variable
$UserOU = Get-QADObject -SearchRoot ($DeptContainers | where { $_.Name -eq $dept }).DN | `
where { $_.DN -match "Users" -and $_.Type -ne "user" }

# Check for account, if not exist, create account
if ((get-qaduser $SName) -eq $null){
# Create and enable a user
New-QADUser -Name "$ln`, $fn" -SamAccountName $SName -ParentContainer $UserOU -City $city `
-Department $dept -UserPassword "p@ssw0rd" -FirstName $fn -LastName $ln -DisplayName "$ln`, $fn" `
-Description "$city $dept" -Office $city | Enable-QADUser
}

# set group OU variable
$GroupOU = Get-QADObject -SearchRoot ($DeptContainers | where { $_.Name -eq $dept }).DN | `
where { $_.DN -match "Groups" -and $_.Type -ne "group" }

# Create groups for each department, create group if it doesn't exist
if ((get-QADGroup $dept) -eq $null){
New-QADGroup -Name $dept -SamAccountName $dept -ParentContainer $GroupOU -Description "$dept Users"
}

# Add user to the group based on their department
Get-QADUser $SName -SearchRoot $UserOU | Add-QADGroupMember -Identity { $_.Department }

# set computer OU variable
$ComputerOU = Get-QADObject -SearchRoot ($DeptContainers | where { $_.Name -eq $dept }).DN | `
where { $_.DN -match "Computers" -and $_.Type -ne "computer" }

# Create a computer account for the user
if ((get-qadcomputer "$SName-Computer") -eq $null){
New-QADComputer -Name "$SName-Computer" -SamAccountName "$SName-Computer" -ParentContainer `
$ComputerOU -Location "$city $dept"
}
}
Start-Transcript c:\ADTestLabContent.txt
$TestQADSnapin = get-pssnapin | where { $_.Name -eq "Quest.ActiveRoles.ADManagement"}
if($TestQADSnapin -eq $null){
add-pssnapin -Name Quest.ActiveRoles.ADManagement -ErrorAction SilentlyContinue
}

# number of accounts to generate - edit
$num = 50

# Read root domain text
$RootDomain = Get-QADRootDSE

# Read all text data
# OU's to create
$DeptOUs = @(Get-Content "$ScriptLib\LabSetup\Departments.txt")
$ChildOUs = @(Get-Content "$ScriptLib\labsetup\ous.txt")
# read department and city info
$cities = Get-Content C:\scripts\LabSetup\Cities.txt
$depts = Get-Content C:\scripts\LabSetup\Departments.txt

# read name files
# randomly select names from census files
# Use Joel Bennett's select-random v2.2; saved in functions.ps1
1..$num | ForEach-Object {
$last += @(Get-Content C:\scripts\LabSetup\surnames.txt | select-random)
$firstm += @(Get-Content C:\scripts\LabSetup\mgivennames.txt | select-random)
$firstf += @(Get-Content C:\scripts\LabSetup\fgivennames.txt | select-random)
}

# Let's do the work

# Create OUs first - call function
create-LabOUs

# Retrieve all newly created OU DN's for use in next function
$DeptContainers = @(Get-QADObject -Type "organizationalUnit" | where {$_.Name -ne "Computers" -and $_.Name `
-ne "Groups" -and $_.Name -ne "Users" -and $_.Description -match "Test Lab Container"})

foreach ($item in $DeptContainers){
$item.description
}
# Create users, create dept groups
1..$num | ForEach-Object { New-RandomADUser }

Stop-Transcript
trap{
Write-Host "ERROR: script execution was terminated.`n" $_.Exception.Message
break
}
