Tag Archives: Export-Csv

A Basic(ish) Active Directory Look-Up Script

It was just Saturday—it’s been a few weeks now, actually—that I wrote a post proclaiming to be back. Back to writing about PowerShell, that is. Why not take the first script I wrote in my new position and share it? It’s possible it has some concepts that might be helpful for readers.

The first thing I want to mention is that I hate writing scripts. Huh!? What I mean by that is that I prefer to write functions. That didn’t happen my first time out in this position, as you’ll see, and I’m going to be okay with it. A one-off script may have a purpose. I stayed away from a function (and therefore a module [maybe], working with profile scripts [potentially], etc.). I’ll live with it.

Let’s break the script up into sections and explain what’s happening in each. Keep in mind that this isn’t perfect and there are places where I would make changes after having looked over this a couple of times. I’ll be sure to mention those toward the bottom of the post. Okay, let’s do this! I’ll put the script back together into a single code block further below.

The beginning of the script creates two parameters: Properties and Output. Properties can accept multiple string values (notice the [] in [string[]]), and we’ll see that in the upcoming examples. Output can accept one string value of three predetermined values. Those are Console, GridView, and CsvFile; Console is the default parameter value.

Param (
    [Parameter()]
    [string[]]$Properties,
    [Parameter()]
    [ValidateSet('Console','GridView','CsvFile')]
    [string]$Output = 'Console'
)

Next, we create a path to a flat file called userlist.txt. This file will contain Active Directory Display Names (DisplayName). By using the $PSScriptRoot variable, all we have to do is keep our script and the text file in the same location/folder in order for it to work correctly.

Once the $Path variable is set, we attempt to run a Get-Content command against the values in the file, storing these in the $Userlist variable. If for some reason the file isn’t in place, the script will make use of the catch block of our try-catch to indicate to the user that the file can’t be located.

$Path = "$PSScriptRoot\userlist.txt"
try {
    $Userlist = Get-Content -Path $Path -ErrorAction Stop
} catch {
    Write-Error -Message "Unable to locate $Path."
}

Following that, we set our $TotalProperties variable to three Active Directory user properties we know we want. Then, if any values have been passed in using the Properties parameter, we combine those with the three properties already in the $TotalProperties variable.

$TotalProperties = @('DisplayName','EmployeeNumber','SamAccountName')
if ($Properties) {
    $TotalProperties = $TotalProperties + $Properties
}

Moving forward, we set up a foreach language construct in order to loop through each user in the $Userlist variable. Remember, this variable was populated by our Get-Content command earlier. For each loop iteration, we are adding a PSCustomObject to our $Users (plural) variable. Normally, I wouldn’t store each value in a variable and instead just pump it out right there, but the script includes some output options that we’ll see next.

foreach ($User in $Userlist) {
    $Users += [PSCustomObject]@(
    Get-ADUser -Filter "DisplayName -eq '$User'" -Properties $TotalProperties |
        Select-Object -Property $TotalProperties
    )
}

Finally, we consider the output option, whether the user chose one or accepted the default. If they included the Output parameter with GridView, we pipe our $Users variable to Out-GridView. If they included the Output parameter with CsvFile, we pipe our $Users variable to Export-Csv, saving the results in a CSV file, and then open the saved CSV file. If they didn’t include the Output parameter, or they did with the Console value, then we display the results directly in the console. That’s it.

Switch ($Output) {
    GridView {$Users | Out-GridView -Title Users}
    CsvFile {
        $Users.GetEnumerator() |
        Export-Csv -NoTypeInformation -Path "$PSScriptRoot\$(Get-Date -Format FileDateTime -OutVariable NewFileOutput).csv"
        Invoke-Item -Path "$PSScriptRoot\$NewFileOutput.csv"
    }
    Default {$Users | Format-Table -AutoSize}
}

Although I took the base code from someone’s previously written script, this really is still very much a 1.0.0 version. Knowing that, there are some changes I might make; it’s not perfect, but it’ll get the job done.

  • While it’s not vital, I kind of wish I used a different variable name for $Path
    • It’s a path, sure, but to a specific file
    • Perhaps $FilePath, $UserFile, or $UserFilePath
      • It could’ve been more specific
  • Ensure properties passed in via the Properties parameter are valid for an Active Directory user
    • If someone sends in properties that don’t exist for an Active Directory user, it’s going to cause problems
      • Maybe check a known user, gather all the possible properties, and compare
      • (Or) Maybe wrap some error checking without having to do any property compare operation
  • Don’t use Default in the Switch language construct
    • It’s not necessary, as the Output parameter will only accept three possible values
    • Default could’ve been replaced with Console

Here are a few examples followed by the full PowerShell code in a single code block.

I’ve redacted information from each of these images. There’s something that’s vital to know about each, however. In front of the full path (all the images but the last one) is the & operator. This is called the invocation or call operator. It informs PowerShell that everything after it should be treated as a command and that it’s not just a long string value. The invocations themselves are sketched out below the example descriptions.

This example invokes the script without any parameters, pulling in two users from the userlist.txt file.

This example invokes the script and includes two additional Active Directory properties, which are then also included in the output.

This example does the same as the first one; however, it opens the results using the Out-GridView cmdlet.

This one opens the results using whatever program—Excel in my case—is associated with CSV files. This option is saving the file to disk, so keep that in mind, as it has no cleanup features.

This final example is included to show that it works when you’re inside the same directory as the script and text file. It also shows multiple parameters being used at the same time. You might know you can do it, but my at-work audience may not have—I’m not sure. As we’re in the directory with the script, you can actually see the inclusion of the invocation operator.
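Since those images are redacted, here is roughly what the five invocations looked like. The script name and path are hypothetical stand-ins; only the parameter names and values come from the script itself.

# Hypothetical script location; substitute your own.
& 'C:\Scripts\AD Lookup\ADLookup.ps1'
& 'C:\Scripts\AD Lookup\ADLookup.ps1' -Properties Title,Department
& 'C:\Scripts\AD Lookup\ADLookup.ps1' -Output GridView
& 'C:\Scripts\AD Lookup\ADLookup.ps1' -Output CsvFile
# From inside the script's directory, with multiple parameters at once.
& .\ADLookup.ps1 -Properties Title,Department -Output GridView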

And finally, all the code in a single code block.

Param (
    [Parameter()]
    [string[]]$Properties,
    [Parameter()]
    [ValidateSet('Console','GridView','CsvFile')]
    [string]$Output = 'Console'
)

$Path = "$PSScriptRoot\userlist.txt"
try {
    $Userlist = Get-Content -Path $Path -ErrorAction Stop
} catch {
    Write-Error -Message "Unable to locate $Path."
}

$TotalProperties = @('DisplayName','EmployeeNumber','SamAccountName')
if ($Properties) {
    $TotalProperties = $TotalProperties + $Properties
}

foreach ($User in $Userlist) {
    $Users += [PSCustomObject]@(
        Get-ADUser -Filter "DisplayName -eq '$User'" -Properties $TotalProperties |
            Select-Object -Property $TotalProperties
    )
}

Switch ($Output) {
    GridView {$Users | Out-GridView -Title Users}
    CsvFile {
        $Users.GetEnumerator() |
            Export-Csv -NoTypeInformation -Path "$PSScriptRoot\$(Get-Date -Format FileDateTime -OutVariable NewFileOutput).csv"
        Invoke-Item -Path "$PSScriptRoot\$NewFileOutput.csv"
    }
    Default {$Users | Format-Table -AutoSize}
}

Use PowerShell to Edit a CSV, Revisited

1. Back in January 2019, I started writing on another website. In December of that same year, I modified how I was doing things. Instead of just writing a full post there, I would start a new post there and then finish the post here, on my site. I did that through June 2020. All the posts from January 2019 to November 2019 are gone and all the posts from December 2019 to June 2020 are partially gone (as only a portion was written away from tommymaynard.com).

2. I have recently mentioned that if I hit 416 posts by the end of June 2022, I will have an average of one post per week over an eight-year timeframe. Bringing those posts back to life and publishing them on my site would get me to my goal much quicker. Beyond that, I have long wanted to recapture that work anyway, and realistically, those posts could have just as easily — maybe even more easily — been published here, to begin with.

As it is time to fix this, it would not make sense to do things manually unless it was an absolute requirement. Automation is kind of why we are all here. I intend to recover as many of my posts as I can and publish them here with help from archive.org — the Internet Archive — and their API.

I have been down the CSV road before. While not as popular as my Hash Table to CSV post, with greater than 21,000 all-time views, my Use PowerShell to Edit a CSV post has greater than 15,000 all-time views. Hash Table to CSV has a part II, or revised, post, and now Use PowerShell to Edit a CSV does, too.

At first, I started to manually collect data — the data in the below image. I did not want to do that any more than I had already done. That decision was all I needed to save some time that was better spent preparing for this post. In this first image, you can see the CSV from which I began.

Do notice a few things: One, we’re missing some dates on the left, two, we’re missing all the URLs from the WB_URL column, and three, all the Notes are empty too. Our goal is to add dates on the left where we can, add URLs into the WB_URL column, and Notes when that’s necessary. There’s nothing to add to the TM_URL column. Those empty lines indicate posts that were never written and published on my site.
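That image is redacted, so here is a loudly hypothetical sketch of the file’s shape. The column names are the ones referenced throughout this post; their order, and the placeholder values, are assumptions on my part.

Date,PS_URL,TM_URL,WB_URL,Notes
20191203,https://powershell.org/some-post/,https://tommymaynard.com/some-post/,,
,https://powershell.org/another-post/,,,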

The first thing to do is ensure I can import the above CSV file. We will use the $CsvBasePath variable to hold the path to the file we will import with the Import-Csv command. The values returned from this command will be stored in the $CsvFileContents variable. Then they will be output to a formatted table, so they are easier to read. This is just a quick check to ensure the CSV file can be imported. As you can see below, the file’s contents could be imported. This is the identical data we saw in the previous image.

$CsvBasePath = 'C:\users\tommymaynard\Desktop\tommymaynard.com'
$CsvFileContents = Import-Csv -Path "$CsvBasePath/Corrections.csv"
$CsvFileContents | Format-Table -AutoSize

If you want to read about it, and it is short, here is information on the API that we will be using: https://archive.org/help/wayback_api.php. It is too important not to show you. The below image is what a response looks like when it is returned from using the API. We will use this API with Invoke-RestMethod. It may not mean much now, but you may end up referring to it as you progress further into this post. As you will see shortly, I will use …status -eq 200, I will use and edit the timestamp, and I will collect the URL. Having seen this image may make understanding the PowerShell much more straightforward.
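If you would like to poke at a response yourself before scripting against it, a one-off request is enough. The output shape below is representative; I am only showing the properties this post actually reads, plus the available property the API also returns.

$Response = Invoke-RestMethod -Uri 'http://archive.org/wayback/available?url=tommymaynard.com'
$Response.archived_snapshots.closest

# Representative response properties:
# status    : 200
# available : True
# url       : http://web.archive.org/web/20201022005417/https://tommymaynard.com/
# timestamp : 20201022005417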

I now know the CSV file can be imported and I understand the structure of the API response.

In the below PowerShell, I removed outputting the contents of the $CsvFileContents variable to the host, and instead, I set up a ForEach-Object looping construct. The first thing done inside this loop is to put the current PS_URL value into the $UriRemainder variable in line 5. In line 6, we concatenate the $UriBase variable and the $UriRemainder variable and use them as the URI supplied to the Invoke-RestMethod command. We will continue working through this code below.

$CsvBasePath = 'C:\users\tommymaynard\Desktop\tommymaynard.com'
$UriBase = 'http://archive.org/wayback/available?url='
$CsvFileContents = Import-Csv -Path "$CsvBasePath\Corrections.csv"
$CsvFileContents | ForEach-Object {
	$UriRemainder = $_.PS_URL
	$WBInfoFull = Invoke-RestMethod -Uri "$UriBase$UriRemainder"
	If ($WBInfoFull.archived_snapshots.closest.status -eq 200) {
		$WBInfoDate = $WBInfoFull.archived_snapshots.closest.timestamp
		$WBInfoDate = -join $WBInfoDate[0..7]
		$WBInfoUrl = $WBInfoFull.archived_snapshots.closest.url
		$WBInfoFull
		$WBInfoDate
		$WBInfoUrl
		<#
		$_.Date = $WBInfoDate
		$_.WB_URL = $WBInfoUrl
		#>
		# Remove-Variable -Name WBInfoFull,WBInfoDate,WBInfoUrl
	} Else {
		'[[[[[NOPE]]]]]'
	} # End If-Else.
	$_
} # | Export-Csv -Path "$CsvBasePath\Corrections-temp.csv" -NoTypeInformation

The Invoke-RestMethod command reaches out to archive.org and stores the result — remember the response object — in the $WBInfoFull variable. For each iteration through the loop, this variable is repeatedly filled with data from each lookup against the Wayback Machine — another name for archive.org if I did not say that already. If the status is 200, we know our Invoke-RestMethod command was a success and so we progress further into the If portion of our nested If-Else construct.

We will then set $WBInfoDate by returning the timestamp property such as 20201022005417 and then joining the first eight “digits” [lines 8 and 9]. We then set, or assign, the $WBInfoURL variable. In the remainder of this code, we just dump our values to the screen, clear the variables, and then move on to the next line in the CSV file. We have yet to actually write to a CSV.
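If the joining of those first eight digits is hard to picture, here is the same trick in isolation: indexing into a string returns individual characters, and the -join operator fuses a range of them back into a single string.

$Timestamp = '20201022005417'
-join $Timestamp[0..7]   # Returns 20201022 -- the yyyyMMdd portion.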

The below image shows a portion of the output generated by the above commands. Again, I’m not writing to the CSV yet; I’m only making sure the values in my variables are accurate.

It is here where the working PowerShell will be modified in such a way that we can begin writing to the CSV file. I should make something clear. I did not edit an existing CSV file as much as I created a new one. I suppose I could have written back to the same file…or maybe I could not. That is probably worth finding out someday, though the original Use PowerShell to Edit a CSV post below suggests that writing back to a file that is still being read does not end well. In the first iteration of this post, I created a temporary file and then did a remove/rename, so it appeared I actually edited a file. But this whole time, I have not really been editing anything. Such a fraud, I know.

Moving along, though. The changes below include removing the code that outputs the variables’ values to the screen. Instead, these values are being written to the Date and WB_URL columns. When the status isn’t 200, instead of writing [[[[[NOPE]]]]] to the screen as I did above, something more pleasant and professional is written to the Notes column. In order to write to a new CSV, I uncommented the Export-Csv command, as well. In the post’s final image, you can view the “updated” CSV file. By my count, there are over 40 lines in this CSV that I didn’t have to type or paste in after manually doing the search myself. I’ll take it!

Always search for an API to use. Always.

$CsvBasePath = 'C:\users\tommymaynard\Desktop\tommymaynard.com'
$UriBase = 'http://archive.org/wayback/available?url='
$CsvFileContents = Import-Csv -Path "$CsvBasePath/Corrections.csv"
$CsvFileContents | ForEach-Object {
	$UriRemainder = $_.PS_URL
	$WBInfoFull = Invoke-RestMethod -Uri "$UriBase$UriRemainder"
	If ($WBInfoFull.archived_snapshots.closest.status -eq 200) {
		$WBInfoDate = $WBInfoFull.archived_snapshots.closest.timestamp
		$WBInfoDate = -join $WBInfoDate[0..7]
		$WBInfoUrl = $WBInfoFull.archived_snapshots.closest.url
		$_.Date = $WBInfoDate
		$_.WB_URL = $WBInfoUrl
		Remove-Variable -Name WBInfoFull,WBInfoDate,WBInfoUrl
	} Else { 
		$_.Notes = 'Unable to locate an archived webpage for that URL.'
	} # End If-Else.
	$_
} | Export-Csv -Path "$CsvBasePath\Corrections-temp.csv" -NoTypeInformation

While the above image includes the updated data, there have been some unexpected changes from the CSV image earlier in this post. This was worrisome for me at first — why were the dates changing!? My PowerShell worked before, so why not now? It turns out that it still is working. The date changes are because newer snapshots of the pages have been taken by the Wayback Machine since I began this post back in October — yeah, it has been a long time coming. Therefore, no worries. What clued me in was the above code and the response. Take a look at this for a brief moment.

$WBInfoFull.archived_snapshots.closest.timestamp

The keyword is “closest,” as in the snapshot closest to a requested timestamp; when no timestamp is supplied, that is the most recent one. My most recent snapshots changed between the creation of my original CSV file, and the updated one.
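Per the API help page linked earlier, you may optionally include a timestamp in the request. Had I wanted repeatable results, pinning each request to a date would have looked something like this sketch:

# Request the snapshot closest to January 1, 2020, instead of the newest one.
$Uri = 'http://archive.org/wayback/available?url=tommymaynard.com&timestamp=20200101'
(Invoke-RestMethod -Uri $Uri).archived_snapshots.closest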

There may be a way to use PowerShell and an API to gather the old content, but for now, I am collecting it manually. I want to reclaim this content ASAP in order to line up getting these posts — the ones that are still relevant — republished here on tommymaynard.com. It is a lot of work, but I am after 416 posts by the end of June 2022. It is fair to say that I will be back with some new, old content very soon.

Hash Table to CSV, Revisited

I started writing about PowerShell in June 2014 at tommymaynard.com. While I’m not a Microsoft MVP, I’m certainly working to obtain Microsoft’s longevity award. While the distinction doesn’t actually exist, this year will mark six consistent years of reading, researching, spending my time with PowerShell, and giving back to the community in my written word. With the combination of my site and this one, I’ve authored somewhere close to 350 articles, or posts.

What I’m really out to say here is that in my time, I’ve noticed which of my writings have been consistently popular. There’s a particular post on my site that sees the largest amount of consistent visitors: It’s Hash Table to CSV. I’ve been looking for a way to take its personal success and provide more to the reader about the topic. That’s what we’re doing today with this post. I put a link to the original post on Twitter recently, and someone provided me an idea for an update — whether they realized it or not. It’s simple, but whatever. As proven, those are sometimes the most popular posts.

Before we get back into this, let’s remember how this all started. It started here: Hash Table to CSV. It’s always been a popular post, but just in case you didn’t follow this over from PowerShell.org, you need to know this is a forward movement on that 2018 article.

I was sitting around and wondered how to best get a hash table into a CSV file. Maybe it was for configuration; I still don’t recall. Well, my option wasn’t the only option, and so today I’ll pass along something someone on Twitter pointed out that I hadn’t considered myself. We can use [PSCustomObject], introduced in (Windows) PowerShell 3.0.

I’ve long used this to create custom objects for its speed and insertion-order preservation (over New-Object), but it didn’t occur to me to make use of it when getting a hash table into a CSV. It does work, with less work, so let’s try it on. While I won’t show the previous option, revisit the first article on this topic to see the old way; here’s the new way. We’ll start with our previously used hash table.

$HashTable = @{
    Name = 'Tommy'
    StreetName = 'Vista Pl.'
    City = 'Tucson'
}

$HashTable
Name                           Value
----                           ----- 
Name                           Tommy
StreetName                     Vista Pl.
City                           Tucson

Once that’s populated, instead of using the hash table’s GetEnumerator method, we’ll use [PSCustomObject]. This converts each entry in our hash table into an object, and sends the object to the Export-Csv cmdlet. We get what we expect in our CSV file without the additional work scripted into the first hash table to CSV article from 2018.

[PSCustomObject]$HashTable |
    Export-Csv -NoTypeInformation -Path .\Desktop\NewHashToCsv.csv

And, that was it. Quicker, easier, and more likely to be remembered, long term.

Edit: It was pointed out to me that this didn’t produce the exact information in the CSV from 2018. Instead of having Key and Value properties, it included a property/column for each of the hash table’s keys: Name, StreetName, and City. Honestly, I think I like this better, as it reminds me of what any command that returns objects in PowerShell would do for us. If I were to read it back into the host (via Import-Csv), my output from the operation would be more in line with what PowerShell commands produce anyway: objects with properties. Thank you for the observation, Mike!
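To see Mike’s observation in action, read the file back in; each column comes back as a property. A quick sketch; note that because a plain hash table is unordered, the column order in your CSV may differ from what is shown here.

Import-Csv -Path .\Desktop\NewHashToCsv.csv

Name  StreetName City
----  ---------- ----
Tommy Vista Pl.  Tucson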

Hash Table to CSV

Update: There’s a “part two” now. See the link at the bottom of this post.

I was sitting here, and I wondered, how do I get a hash table into a CSV? Have I even ever done that? Now, a few commands in, and I can’t even remember why I wondered that. There was a reason, I just wish I could remember what it was. Whatever it was, I should be more likely to remember how to get a hash table into CSV when I need it after today’s post. Was it on-disk storage for a hash table for some software configuration? Ugh, what was it?

Anyway, let’s start by assigning a hash table to a variable. Maybe it’ll come back to me.

$HashTable = @{
    Name = 'Tommy'
    StreetName = 'Vista Pl.'
    City = 'Tucson'
}

$HashTable
Name                           Value
----                           ----- 
Name                           Tommy
StreetName                     Vista Pl.
City                           Tucson

Now that we’ve assigned our $HashTable variable the value of our hash table, we can try and get it into a CSV. At first, someone might try the below option. Let’s see how that works.

$HashTable |
    Export-Csv -NoTypeInformation -Path .\Desktop\HashToCsv.csv

As you can see, this doesn’t work.
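What lands in the file instead, and treat this as representative, are the properties of the Hashtable object itself, not your keys and values:

"IsReadOnly","IsFixedSize","IsSynchronized","Keys","Values","SyncRoot","Count"
"False","False","False","System.Collections.Hashtable+KeyCollection","System.Collections.Hashtable+ValueCollection","System.Object","3"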

In order to get this to work properly, we need to use the GetEnumerator method. I seem to use this quite often. This allows us to walk through each key-value pair in our hash table.

$HashTable.GetEnumerator() |
    Export-Csv -NoTypeInformation -Path .\Desktop\HashToCsv2.csv
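Opened up, the file holds something along these lines (column and row order can vary, as hash tables do not guarantee enumeration order):

"Name","Key","Value"
"Name","Name","Tommy"
"StreetName","StreetName","Vista Pl."
"City","City","Tucson"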

Now it’s just perfect, minus the whole Name property (column). Huh? I only expected the Keys and Values, like we’d see produced onscreen. With this in mind, let’s instead pipe to the Select-Object cmdlet before Export-Csv and get things properly filtered.

Update: It dawned on me, after I made all these screen captures, that I wasn’t expecting to see the above Name property included. Sure, it holds the same values as Key, but in a host program, we’re actually accustomed to seeing Name and Value, not Key and Value.

$HashTable.GetEnumerator() |
    Select-Object -Property Key,Value |
        Export-Csv -NoTypeInformation -Path .\Desktop\HashToCsv3.csv

My next logical thought was, can we use calculated properties? Can I use a different descriptor than Key and Value? You bet we can — take a look.

$HashTable.GetEnumerator() |
    Select-Object -Property @{N='Property';E={$_.Key}},
    @{N='PropValue';E={$_.Value}} |
        Export-Csv -NoTypeInformation -Path .\Desktop\HashToCsv4.csv

So yeah, there we go. I can use a hash table, saved to disk, for … whatever reason. I still don’t remember why I wondered this originally, but in case it’s helpful, I know where to find this when I remember why I wondered.

Update: There’s a follow-up post now that includes an easier way.

Use PowerShell to Edit a CSV


Notice: There is now a revisited, or second, Edit a CSV post. This time, however, we are obtaining and writing portions of a REST API response to our CSV.


I recently completed a scripting assignment for work. Yeah… it’s a part of what I do. During the process I learned something new. And like it normally does, this leaves me in a position to share it with the Internet, as well as help solidify it in my own mind. And that’s why I’m writing today.

I thought my recent assignment had a need to edit an existing CSV file on the fly. It turns out it wasn’t necessary for the project, but just maybe it will be for another one, or maybe even one that’s sitting in front of you right now. So, how do you do that? How do you edit an existing CSV file?

Let’s begin with a simple, yet worthy CSV file as an example. In the below CSV data, we have four columns, fields, or properties — whatever you want to call them at this point. They are Name, Status, BatchName, and SentNotification. The idea here — or what I thought it was in my recent assignment, at first — was to notify by email the users that were in a Synced state (Status column) and then modify the SentNotification field so that it said True, instead of False. Take a look at our example data.

Name,Status,BatchName,SentNotification
landrews,Other,MigrationService:FirstBatch-to-O365,FALSE
lpope,Synced,MigrationService:FirstBatch-to-O365,FALSE
ljohnson,Other,MigrationService:FirstBatch-to-O365,FALSE
mlindsay,Other,MigrationService:FirstBatch-to-O365,FALSE
rperkins,Synced,MigrationService:FirstBatch-to-O365,FALSE
dstevenson,Other,MigrationService:FirstBatch-to-O365,FALSE
jbradford,Other,MigrationService:FirstBatch-to-O365,FALSE
jsmith,Other,MigrationService:FirstBatch-to-O365,FALSE
mdavidson,Synced,MigrationService:FirstBatch-to-O365,FALSE
bclark,Synced,MigrationService:FirstBatch-to-O365,FALSE

Let’s first begin by using Import-Csv and Format-Table to view our data.

Import-Csv -Path '.\Desktop\UCSVTestFile.csv' | Format-Table -AutoSize

Name       Status BatchName                           SentNotification
----       ------ ---------                           ----------------
landrews   Other  MigrationService:FirstBatch-to-O365 FALSE           
lpope      Synced MigrationService:FirstBatch-to-O365 FALSE           
ljohnson   Other  MigrationService:FirstBatch-to-O365 FALSE           
mlindsay   Other  MigrationService:FirstBatch-to-O365 FALSE           
rperkins   Synced MigrationService:FirstBatch-to-O365 FALSE           
dstevenson Other  MigrationService:FirstBatch-to-O365 FALSE           
jbradford  Other  MigrationService:FirstBatch-to-O365 FALSE           
jsmith     Other  MigrationService:FirstBatch-to-O365 FALSE           
mdavidson  Synced MigrationService:FirstBatch-to-O365 FALSE           
bclark     Synced MigrationService:FirstBatch-to-O365 FALSE

Now what we need to do is modify the content, as it’s being imported. Let’s start first, however, by piping our Import-Csv cmdlet to ForEach-Object and returning each object (line).

Import-Csv -Path '.\Desktop\UCSVTestFile.csv' | ForEach-Object {
    $_
}

Name       Status BatchName                           SentNotification
----       ------ ---------                           ----------------
landrews   Other  MigrationService:FirstBatch-to-O365 FALSE           
lpope      Synced MigrationService:FirstBatch-to-O365 FALSE           
ljohnson   Other  MigrationService:FirstBatch-to-O365 FALSE           
mlindsay   Other  MigrationService:FirstBatch-to-O365 FALSE           
rperkins   Synced MigrationService:FirstBatch-to-O365 FALSE           
dstevenson Other  MigrationService:FirstBatch-to-O365 FALSE           
jbradford  Other  MigrationService:FirstBatch-to-O365 FALSE           
jsmith     Other  MigrationService:FirstBatch-to-O365 FALSE           
mdavidson  Synced MigrationService:FirstBatch-to-O365 FALSE           
bclark     Synced MigrationService:FirstBatch-to-O365 FALSE

Hey, look at that. It’s the same thing. The $_ variable represents the current object, or the current row — if it helps to think about it that way — in the pipeline. Let’s add an If statement inside our ForEach-Object loop, and get the results we’re after. Remember, if a user has a Synced status, we want to change their SentNotification property to $true, and perhaps notify them, had this been more than just an example.

Import-Csv -Path '.\Desktop\UCSVTestFile.csv' | ForEach-Object {
    If ($_.Status -eq 'Synced' -and $_.SentNotification -eq $false) {
        $_.SentNotification = $true
    }
    $_
} | Format-Table -AutoSize

Name       Status BatchName                           SentNotification
----       ------ ---------                           ----------------
landrews   Other  MigrationService:FirstBatch-to-O365 FALSE           
lpope      Synced MigrationService:FirstBatch-to-O365 True            
ljohnson   Other  MigrationService:FirstBatch-to-O365 FALSE           
mlindsay   Other  MigrationService:FirstBatch-to-O365 FALSE           
rperkins   Synced MigrationService:FirstBatch-to-O365 True            
dstevenson Other  MigrationService:FirstBatch-to-O365 FALSE           
jbradford  Other  MigrationService:FirstBatch-to-O365 FALSE           
jsmith     Other  MigrationService:FirstBatch-to-O365 FALSE           
mdavidson  Synced MigrationService:FirstBatch-to-O365 True            
bclark     Synced MigrationService:FirstBatch-to-O365 True

In the above example, we use an If statement to check the values of two properties. If Status is Synced and SentNotification is $false, we’ll change SentNotification to $true. You can see that this worked. But what now? You see, the file from which we did our import is still the same. In order to update that file, we have a bit more work to do.

I wish I could say to pipe directly back to the file; however, that doesn’t work. The file ends up being blank. It makes sense it doesn’t work though, as we’re literally reading each object — each row — and then trying to write back to the file in the same pipeline. Something is bound to go wrong, and it does. So, don’t do what’s in the below example, unless your goal is to fail at this assignment and wipe out your data. If that’s what you’re after, then by all means, have at it.

Import-Csv -Path '.\Desktop\UCSVTestFile.csv' | ForEach-Object {
    If ($_.Status -eq 'Synced' -and $_.SentNotification -eq $false) {
        $_.SentNotification = $true
    }
    $_
} | Export-Csv -Path '.\Desktop\UCSVTestFile.csv' -NoTypeInformation

What we need to do instead is export to a file with a different name, so that when we’re done, both files exist at the same time. Then, we remove the original file and rename the new one with the old one’s name. Here’s the entire example; take a look. And then after that, enjoy the weekend. Oh wait, tomorrow is only Friday. I keep thinking it’s the weekend, because I’m home tomorrow to deal with 1,000 square feet of sod. If only PowerShell could lay the sod for me.

Import-Csv -Path '.\Desktop\UCSVTestFile.csv' | ForEach-Object {
    If ($_.Status -eq 'Synced' -and $_.SentNotification -eq $false) {
        $_.SentNotification = $true
    }
    $_
} | Export-Csv -Path '.\Desktop\UCSVTestFile-temp.csv' -NoTypeInformation
Remove-Item -Path '.\Desktop\UCSVTestFile.csv'
Rename-Item -Path '.\Desktop\UCSVTestFile-temp.csv' -NewName 'UCSVTestFile.csv'

Remove and Add Users to an Active Directory Group

The more people that recognize that I’ve made an investment in Windows PowerShell, the more these posts write themselves. I was recently asked by a previous colleague to help them make some modifications in Active Directory (AD). They needed to strip out all the users from a single AD group, and then add a bunch of them back that were, conveniently, stored in a text file. While you can compare the current users in the group with the ones you want to add, this post will assume it’s suitable to momentarily remove all the users from a specific AD group.

When I completed this, out of the kindness of my heart, and because I want to promote and teach PowerShell whenever possible, I sent the previous colleague an email with the five steps I had completed. They might be useful for others as well, so here we are. I think I’m getting a lunch out of this, too. While it might not be Chipotle, I like food, so I’m sure it’ll be suitable.

1. Export Users in the AD Group: This command will return the current members of the VPN-Users group and store them in a CSV file. While this will allow us to compare our final results, its original purpose was as my safety net in case I made a mistake and wanted to make the previous members, members again.

PS> $GroupName = 'VPN-Users'
PS> Get-ADGroupMember -Identity $GroupName | Export-Csv -Path C:\VPN-Users01Pre.csv -NoTypeInformation

2. Remove Users from Group: This next command removes all the members of the group (which I had checked were only users). It did this by piping each member of the group to a ForEach-Object loop that ran Remove-ADGroupMember against the current user. Had there been other object types, you’d have to use a Where-Object command in between these two commands (command 1 | Where-Object… | command 2), filtering on the ObjectClass. The Remove-ADGroupMember cmdlet is using the -Confirm parameter with the $false value, so there’s no need for manual confirmation of each removal.

PS> Get-ADGroupMember -Identity $GroupName | Foreach-Object {Remove-ADGroupMember -Identity $GroupName -Members $_ -Confirm:$false}
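Had the group contained more than just user objects, the Where-Object filter mentioned above could be dropped in between the two commands; a sketch:

PS> Get-ADGroupMember -Identity $GroupName | Where-Object {$_.objectClass -eq 'user'} | Foreach-Object {Remove-ADGroupMember -Identity $GroupName -Members $_ -Confirm:$false}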

3. Add Users to Group: After I allowed a few moments for replication, I went ahead and added all the users in the text file I had been supplied by using the Get-Content command and piping each entry to the ForEach-Object and Add-ADGroupMember cmdlets. It threw two errors when it wasn’t able to find a couple of users in AD. While I didn’t, we could’ve written precautions into this command to avoid this error, or programmatically fixed the file prior to this command.

PS> Get-Content -Path C:\UserNames.txt | Foreach-Object {Add-ADGroupMember -Identity $GroupName -Members $_}

4. Export Users in the AD Group: Now that we have all the old group members removed, and the new ones added, we can create a new export of the group members for comparison. This isn’t necessary, but helps to see who was removed and added, if we desire that kind of information and comparison.

PS> Get-ADGroupMember -Identity $GroupName | Export-Csv -Path C:\VPN-Users02Post.csv -NoTypeInformation

5. Compare Before and After Group Memberships: The next command will reach into the SamAccountName column inside both CSV files and compare them. Results that indicate <= mean they are only in the file on the left (the file passed as the value to the -ReferenceObject parameter), and results that indicate => mean they are only in the file on the right (the -DifferenceObject parameter). Don’t forget, you can always use the -IncludeEqual parameter if you want to see which users were there before and after.

Compare-Object -ReferenceObject ((Import-Csv -Path C:\VPN-Users01Pre.csv).SamAccountName) -DifferenceObject ((Import-Csv -Path C:\VPN-Users02Post.csv).SamAccountName)
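With made-up usernames, the comparison results read something like this; jdoe was only in the before file, and asmith only in the after file:

InputObject SideIndicator
----------- -------------
jdoe        <=
asmith      =>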

There ya go. The previous colleague said they had a problem finding some good examples of doing this online, so here’s to hoping that this will help someone when they need it. Just maybe, I’ll be back with the right way to do this, where the users that are going to be added back never get removed in the first place.

Proving PowerShell’s Usefulness to Newbies, Part III

Here we go! It’s the third time doing what we can to try and prove the usefulness and power in Windows PowerShell. With a little work, those new to PowerShell get a glimpse into not only what they can do with PowerShell, but also why they’d want to do it with PowerShell. In the end, most of this is about time savings, and tonight’s post is no exception.

Part I can be found here. In that post, we learned that we could use PowerShell to create 10,000 folders in about 10 seconds — that’s 1,000 folders per second, or 1 folder each millisecond.

Part II is here. This post was about reading in a file’s contents, sorting the contents alphabetically, and then writing the sorted results back to the same file. PowerShell took milliseconds, while my manual attempt took minutes.

Part III
As part of my imaginary Windows PowerShell presentation I thought I might give one day, I always pictured a common theme — saving time. That’s a big part of why many of us have taken so fondly to PowerShell. I pictured telling a story about a few different approaches to the same problem, all to be completed by a different admin, each with their own best way of handling the request. Here’s my hypothetical, yet quite possible situation, beginning with the characters.

Admin One: This admin is fairly new to the industry. While they know their way around a desktop version of Windows, they’re still often learning new things about it. While they’re motivated, they haven’t yet learned enough to know better.

Admin Two: This admin has a year or two on Admin One, and is quickly picking up better ways to speed things up by utilizing certain tools.

Admin Three: This admin is getting there. It’s almost daily that they’re figuring out faster, and more remote, ways to complete tasks. While they aren’t scripting and running commands (yet), they do know some tricks to get their results quicker.

Admin Four: This admin has been studying all things PowerShell whenever there’s a moment for it. Every task assigned is another challenge to learn more PowerShell, and to continue to add to their skills. They’re freeing up time, all the time.

One day the boss comes in with a request of the team. It’s not a difficult task. In fact, any one of the team members can get the task done. They need to check 200 of their computers to determine, and then document, which of the computers are running the print spooler service. Maybe it’s just busy work handed down by the boss, or maybe this is a valid concern somehow.

Admin One gets his clipboard and heads down to one of the computer labs. Once the three labs are checked, he’ll head to each of the offices to check those computers manually, too. Here are the steps this admin takes: (1) Log on to the computer, (2) Open Computer Management, (3) Click on Services, (4) Find the Print Spooler service, noting the status, and (5) Write down the computer name and service’s status.

Let’s assume Admin One takes 5 minutes per computer, which includes the time spent walking from computer to computer and office to office. At 200 computers, that’s 1,000 minutes, or approximately 16.6 hours. That’s longer than two full 8-hour days at the office. That kind of time loss equates to zero time spent learning anything new for two days in the week. You can’t stay in the field using this procedure.

Admin Two’s first thought is Remote Desktop. While this admin still has to log on, it can be done without physically visiting each computer. In addition, this admin can work with more than one remote computer at a time. This allows Admin Two the ability to complete one computer every 2 1/2 minutes, cutting Admin One’s time in half. That’s only a little over one work day — 8.3 hours — lost to this project. While it’s shorter, it’s still too much time.

Admin Three knows enough to know that they can connect to a remote computer from inside Computer Management, which means they can also avoid leaving their desk. This removes the step of interactively logging on to a computer, and loading a desktop environment, too. We’ll assume this admin can do one computer every minute. That’s 200 total minutes, or 3.3 hours. That admin can actually finish before lunch if they start soon enough in the day. This might be reasonable by some boss’ standards, but PowerShell has been designed to do this quicker.

The final admin, Admin Four, knows PowerShell, and they know it well enough to pull back all those results without the need for any GUIs. There’s no wandering around the building, writing things down to a clipboard, or even entering data into a file, as Admin Two and Three do from their desks, as they use RDP and Computer Management, respectively.

Admin Four has a couple of approaches and plenty of time to try them both out. While there are many ways to read in computer names, we’ll assume they have a list of the 200 computers in a text file with each computer name on its own line. They may have had to build this file, but if they’re pulling from Active Directory, then it was probably automated, as well.
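Building that text file from Active Directory might itself be a one-liner; a sketch, assuming the ActiveDirectory module and that every computer object should be included:

PS> Get-ADComputer -Filter * | Select-Object -ExpandProperty Name | Set-Content -Path .\computers.txt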

The first option is to use the Get-Service cmdlet, as seen below, in the first part of the example. The second part, which requires PowerShell Remoting, uses Invoke-Command. I’ve only included 20 computers in the computers.txt file, and to save space, only included 5 results per command in the example.

PS> # Uses the Get-Service cmdlet with -ComputerName parameter.
PS> # Have to add the MachineName property to see which result goes with which computer.
PS> Get-Service -ComputerName (Get-Content -Path .\computers.txt) -Name spooler | Select-Object MachineName,Status,Name,DisplayName

MachineName                                          Status Name                          DisplayName
-----------                                          ------ ----                          -----------
DC01                                                 Running spooler                       Print Spooler
DC02                                                 Running spooler                       Print Spooler
DC03                                                 Running spooler                       Print Spooler
WEB01                                                Running spooler                       Print Spooler
WEB02                                                Running spooler                       Print Spooler
...

PS> # Uses Invoke-Command which automatically adds the name of the remote computer by default.
PS> Invoke-Command -ComputerName (Get-Content -Path .\computers.txt) -ScriptBlock {Get-Service -Name spooler}

Status   Name               DisplayName                            PSComputerName
------   ----               -----------                            --------------
Running  spooler            Print Spooler                          SQL01
Running  spooler            Print Spooler                          DC02
Running  spooler            Print Spooler                          DC02
Running  spooler            Print Spooler                          WEB02
Running  spooler            Print Spooler                          DC03
...

While I didn’t include the results, each of these two options — Get-Service, and Get-Service inside Invoke-Command — took about the same amount of time when they were measured. Even so, we’ll round up and say it takes 10 seconds for each set of 20 computers. At 200 computers, that’s 100 seconds for all of them. At 100 seconds, we’re talking about a little over 1 1/2 minutes to check all two hundred computers.
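If you want to measure a run yourself, Measure-Command will do it; wrap either option and compare the TotalSeconds values. A sketch:

PS> (Measure-Command -Expression {Invoke-Command -ComputerName (Get-Content -Path .\computers.txt) -ScriptBlock {Get-Service -Name spooler}}).TotalSeconds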

If I was in front of a crowd right now, and had just talked through this post, I’d take a moment to ask the crowd which admin they thought would be around after budget cuts. The admin that showed up with the results written down, after two days, or the one who sent the boss the results via email after two minutes? Keep in mind that either of the cmdlets above can be used in conjunction with Export-Csv — a cmdlet that produces a text-based file that can be opened and edited with Microsoft Excel — something the boss probably has on their computer. In addition to automatically writing the results to file, by using the Send-MailMessage cmdlet, Admin Four can also automate sending the results by email.
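Rounding out Admin Four’s story, exporting and emailing the results might look something like this sketch; the file name, addresses, and SMTP server are hypothetical:

PS> Invoke-Command -ComputerName (Get-Content -Path .\computers.txt) -ScriptBlock {Get-Service -Name spooler} | Export-Csv -Path .\SpoolerStatus.csv -NoTypeInformation
PS> Send-MailMessage -From 'admin4@example.com' -To 'boss@example.com' -Subject 'Print Spooler Status' -Attachments .\SpoolerStatus.csv -SmtpServer 'smtp.example.com'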

Find DNS Servers Being Used by DHCP Scopes

Download the Get-TMDhcpDNS function here: https://gist.github.com/tommymaynard/afdb78038e8639d5d23baaaaf897cac1

There was a Microsoft TechNet forum post last week regarding trying to obtain the DNS servers being used by different DHCP Scopes (Here’s the link: https://social.technet.microsoft.com/Forums/en-US/58723d31-7586-40c3-acd2-183f20b49daf/how-to-dump-the-dns-options-for-each-dhcp-scope?forum=ITCG).

I originally thought the person wanted to get the DNS Servers listed in the Server Options (what the scopes use by default), until he (or she) better clued me in to wanting the DNS Servers listed in the Scope Options, when there was one. At times there won’t be any DNS Servers listed in the Scope Options, and it’ll use the Server Options instead.

Since that forum post, I wrapped up a few commands to create a full-service advanced function. The Get-TMDhcpDNS function will collect the Scope Name, Scope ID, DNS Servers, and whether the DNS servers are assigned in the Scope Options or the Server Options.

If you think this might be helpful for you or someone else, then please download it, test it, and rate it. Thanks, and here’s an example of the function in action:

PS C:\> Get-TMDhcpDNS -ComputerName 'dhcpsrv1.mydomain.com' | Format-Table -AutoSize

Name                  ScopeName  ScopeID     DNS                                    ScopeOrServerDNS
----                  ---------  -------     ---                                    ----------------
dhcpsrv1.mydomain.com Building01 10.10.10.0  10.10.10.20,10.10.10.21,10.10.10.22    Scope
dhcpsrv1.mydomain.com Building02 172.16.16.0 172.16.16.5,172.16.16.15               Server

Use the link above to download the function.

Update: There was an issue with the DNS property returning as System.Object[] when piping the function to Export-Csv. That’s been corrected in 1.0.2. Here’s a post I had to reference (again): http://learn-powershell.net/2014/01/24/avoiding-system-object-or-similar-output-when-using-export-csv/.
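If you run into the same System.Object[] result with your own functions, the usual fix, and the one that post describes, is flattening the collection with a calculated property before exporting; a sketch:

PS C:\> Get-TMDhcpDNS -ComputerName 'dhcpsrv1.mydomain.com' | Select-Object -Property Name,ScopeName,ScopeID,@{Name='DNS';Expression={$_.DNS -join ','}},ScopeOrServerDNS | Export-Csv -Path C:\DhcpDNS.csv -NoTypeInformation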

Get Some Help for Later Reading

I’m just starting to get my feet wet with Microsoft Lync. As I often do, I use the Windows PowerShell cmdlets to help learn more about a product; I did this same thing with Hyper-V. The unfortunate thing about the GUI (think, an MMC snap-in for instance), is that the menu options don’t always tell you exactly what that option is going to do, or its exact purpose. While the GUI can be unclear, PowerShell tells you exactly what a cmdlet does. With that knowledge, I’ve often been able to relate a cmdlet, and its purpose, to its respective menu option in the GUI.

I wanted to read through the Lync cmdlets’ help files, but only the cmdlet name and its synopsis. Here’s the command I ran to extract this information for later reading.

PS C:\> Get-Command -Module Lync | Get-Help | Select-Object Name,Synopsis | Export-Csv -Path C:\LyncCmdlets.csv -NoTypeInformation

The command above works this way: it returns all the cmdlets included in the Lync Module and sends (pipes) those to the Get-Help cmdlet. The Get-Help cmdlet pipes its results to Select-Object which filters the returned properties to just the Name and Synopsis from each cmdlet’s help file. At the end, those filtered results are sent to Export-Csv which creates a file I can read at my leisure.

Although there are over 500 cmdlets, it’s safe to say that many of the nouns (the part after the dash [-]) will be the same across some of the cmdlets. That means that many of the Get-* cmdlets will have a partnering Set-* cmdlet. Get reads information and Set changes it. These nouns may also have a matching New-*, Remove-*, and possibly even a Test-* cmdlet. Now off to do some reading…
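That noun overlap is easy to verify, too; group the cmdlets by noun and keep the nouns that appear more than once. A sketch:

PS C:\> Get-Command -Module Lync | Group-Object -Property Noun | Where-Object {$_.Count -gt 1} | Sort-Object -Property Count -Descending | Select-Object -Property Count,Name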