Tag Archives: Get-Content

A Basic(ish) Active Directory Look-Up Script

It was just Saturday—it’s been a few weeks now, actually—that I wrote a post proclaiming to be back. Back to writing about PowerShell, that is. Why not take the first script I wrote in my new position and share it? It’s possible it has some concepts that might be helpful for readers.

The first thing I want to mention is that I hate writing scripts. Huh!? What I mean is that I prefer to write functions. That didn’t happen my first time out in this position, as you’ll see, and I’m going to be okay with it. A one-off script may have a purpose. I stayed away from a function (and therefore a module [maybe], working with profile scripts [potentially], etc.). I’ll live with it.

Let’s break the script up into sections and explain what’s happening in each. Keep in mind that this isn’t perfect, and there are places where I would make changes after having looked it over a couple of times. I’ll be sure to mention those toward the bottom of the post. Okay, let’s do this! I’ll put the script back together into a single code block further below.

The beginning of the script creates two parameters: Properties and Output. Properties can accept multiple string values (notice the []), and we’ll see that in the upcoming examples. Output can accept one string value from three predetermined values: Console, GridView, and CsvFile; Console is the default parameter value.

Param (
    [Parameter()]
    [string[]]$Properties,
    [Parameter()]
    [ValidateSet('Console','GridView','CsvFile')]
    [string]$Output = 'Console'
)

Next, we create a path to a flat file called userlist.txt. This file will contain Active Directory Display Names (DisplayName). By using the $PSScriptRoot variable, all we have to do is keep our script and the text file in the same location/folder in order for it to work correctly.

Once the $Path variable is set, we attempt to run a Get-Content command against the file, storing its values in the $Userlist variable. If for some reason the file isn’t in place, the script will make use of the catch block of our try-catch to indicate to the user that the file can’t be located.

$Path = "$PSScriptRoot\userlist.txt"
try {
    $Userlist = Get-Content -Path $Path -ErrorAction Stop
} catch {
    Write-Error -Message "Unable to locate $Path."
}

Following that, we set our $TotalProperties variable to three Active Directory user properties we know we want. Then, if any values have been passed in using the Properties parameter, we combine those with the three properties already in the $TotalProperties variable.

$TotalProperties = @('DisplayName','EmployeeNumber','SamAccountName')
if ($Properties) {
    $TotalProperties = $TotalProperties + $Properties
}

Moving forward, we set up a foreach language construct in order to loop through each user in the $Userlist variable. Remember, this variable was populated by our Get-Content command earlier. For each loop iteration, we add a PSCustomObject to our $Users—plural—variable. Normally, I wouldn’t store each value in a variable and would instead just pump it out right there, but the script includes some output options that we’ll see next.

foreach ($User in $Userlist) {
    $Users += [PSCustomObject]@(
        Get-ADUser -Filter "DisplayName -eq '$User'" -Properties $TotalProperties |
            Select-Object -Property $TotalProperties
    )
}

Finally, we consider the output option the user chose, or defaulted to. If they included the Output parameter with GridView, we pipe our $Users variable to Out-GridView. If they included the Output parameter with CsvFile, we pipe our $Users variable to Export-Csv, saving the results to a CSV file, and then open that saved CSV file. If they didn’t include the Output parameter, or they did with the Console value, we display the results directly in the console. That’s it.

Switch ($Output) {
    GridView {$Users | Out-GridView -Title Users}
    CsvFile {
        $Users.GetEnumerator() |
        Export-Csv -NoTypeInformation -Path "$PSScriptRoot\$(Get-Date -Format FileDateTime -OutVariable NewFileOutput).csv"
        Invoke-Item -Path "$PSScriptRoot\$NewFileOutput.csv"
    }
    Default {$Users | Format-Table -AutoSize}
}
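
One part of the CsvFile case deserves a quick callout: the -OutVariable trick. Isolated below is a minimal sketch of what it does; the sample value is approximate.

Get-Date -Format FileDateTime -OutVariable NewFileOutput
# Returns something like 20220419T1906551234 and, at the same time, stores the
# value in $NewFileOutput, so the same timestamp can be reused by Invoke-Item.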

Although I took the base code from someone’s previously written script, this is still very much a 1.0.0 version. Knowing that, there are some changes I might make; it’s not perfect, but it’ll get the job done.

  • While it’s not vital, I kind of wish I’d used a different variable name for $Path
    • It’s a path, sure, but to a specific file
    • Perhaps $FilePath, $UserFile, or $UserFilePath
      • It could’ve been more specific
  • Ensure properties passed in via the Properties parameter are valid for an Active Directory user
    • If someone sends in properties that don’t exist for an Active Directory user, it’s going to cause problems
      • Maybe check a known user, gather all the possible properties, and compare (see the sketch after this list)
      • (Or) Maybe wrap the command in some error checking without having to do any property-compare operation
  • Don’t use Default in the Switch language construct
    • It’s not necessary, as the Output parameter will only accept three possible values
    • Default could’ve been replaced with Console
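
Regarding that property-validation bullet point, here is a minimal, untested sketch of the compare idea. The identity is hypothetical; any account you can already read would do.

$ValidProperties = (Get-ADUser -Identity SomeKnownUser -Properties *).PSObject.Properties.Name
foreach ($Property in $Properties) {
    if ($Property -notin $ValidProperties) {
        Write-Warning -Message "$Property is not a valid Active Directory user property."
    }
}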

Here are a few examples followed by the full PowerShell code in a single, code block.

I’ve redacted information from each of these images. There’s something vital to know about each, however. In front of the full path (in all the images but the last one) is the & operator. This is called the invocation, or call, operator. It informs PowerShell that everything after it should be treated as a command, and that it’s not just a long string value.

This example invokes the script without any parameters, pulling in two users from the userlist.txt file.

This example invokes the script and includes two additional Active Directory properties, which are then also included in the output.

This example does the same as the first one, however, it opens the results using the Out-GridView cmdlet.

This one opens the results using whatever program—Excel in my case—is associated with CSV files. This option is saving the file to disk, so keep that in mind, as it has no cleanup features.

This final example is included to show that it works when you’re inside the same directory as the script and text file. It also shows multiple parameters being included at the same time. You might know you can do that, but my at-work audience may not have—I’m not sure. Even though we’re in the directory with the script, you can still see the inclusion of the invocation operator.
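
Because the images are redacted, here is approximately what those invocations looked like. The path and the extra property names are made up; substitute your own.

& 'C:\Scripts\AD Lookup\script.ps1'
& 'C:\Scripts\AD Lookup\script.ps1' -Properties Title,Office
& 'C:\Scripts\AD Lookup\script.ps1' -Output GridView
& 'C:\Scripts\AD Lookup\script.ps1' -Output CsvFile
& .\script.ps1 -Properties Title,Office -Output GridView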

And finally, all the code in a single code block.

Param (
    [Parameter()]
    [string[]]$Properties,
    [Parameter()]
    [ValidateSet('Console','GridView','CsvFile')]
    [string]$Output = 'Console'
)

$Path = "$PSScriptRoot\userlist.txt"
try {
    $Userlist = Get-Content -Path $Path -ErrorAction Stop
} catch {
    Write-Error -Message "Unable to locate $Path."
}

$TotalProperties = @('DisplayName','EmployeeNumber','SamAccountName')
if ($Properties) {
    $TotalProperties = $TotalProperties + $Properties
}

foreach ($User in $Userlist) {
    $Users += [PSCustomObject]@(
        Get-ADUser -Filter "DisplayName -eq '$User'" -Properties $TotalProperties |
            Select-Object -Property $TotalProperties
    )
}

Switch ($Output) {
    GridView {$Users | Out-GridView -Title Users}
    CsvFile {
        $Users.GetEnumerator() |
            Export-Csv -NoTypeInformation -Path "$PSScriptRoot\$(Get-Date -Format FileDateTime -OutVariable NewFileOutput).csv"
        Invoke-Item -Path "$PSScriptRoot\$NewFileOutput.csv"
    }
    Default {$Users | Format-Table -AutoSize}
}

UX Headache – Joining Lines in a Text File

A part of me seriously wants to be involved in UX. I constantly find problems with just about every UI I interact with. This one is beautiful, but it is lacking. That one is ugly, but works. Maybe it’s why I love PowerShell; it’s always the same no matter what I’m working with. It’s probably also why I wish every website on the planet was written with APIs first. What an amazing world it would be, if I could do everything using PowerShell: check the bank, register children for school, order Chipotle, and make appointments at the doctor, the dentist, the eye doctor, the auto shop, etc. The list is endless.

Anyway, back to the topic here. I’ve often considered buying a new domain and pointing out awful, real-world experiences of my own until someone comes along, realizes I get it, and employs me to stop all the awful interfaces… at least for that company. We humans, living in this digital world, are constantly subjected to awful-looking, unhelpful, and inconsistent interfaces that have become a requirement in our lives. If I request that your site show me 100 rows at a time, it’s not likely I’m going to change my mind when I click into one record and then go back out to the row view again. And if I do want to change it, guess what, that’s on me. I could go on for days.

The company responsible for those two paragraphs is the reason I’m writing today. I was sent a list of 100 or 200 CIDR ranges. No problem, I’ll just copy and paste them into that one box on the website set to accept both single IPs and CIDR ranges. Nope. That caused an error. It was unable to parse them, and so now it was my job to enter them one at a time!? Well, it would’ve been had I not known PowerShell. Seriously, someone somewhere might be doing just that: copy and paste, or select and drag, or whatever other option was left. Whichever method, it would be much slower than what I did. So, today’s post is both me venting a little of my pent-up UX frustration and providing a quick resolution for anyone in this same situation who didn’t automatically think of PowerShell. You see, the interface would take a comma-separated list; it just couldn’t handle a line-delimited list–if that’s even what it’s called. Maybe newline-delimited; I don’t know for sure.

Here’s what I had (after removing the public IP addresses):

10.138.80.0/22
10.138.87.160/27
10.138.91.160/27
10.138.129.0/29
10.139.33.96/28
10.139.38.0/27
10.224.1.32/27
10.224.41.128/25
10.224.43.0/24
10.224.73.0/25
10.228.21.192/27
10.120.1.0/27
10.128.1.32/27
10.128.11.64/26
10.128.29.0/24
10.128.205.128/26
10.130.66.0/25
10.152.7.160/27
10.152.12.0/24
10.152.13.0/24
10.152.14.0/24
10.152.15.0/27
10.156.20.0/28
10.156.20.16/28
10.156.20.32/27
10.156.20.96/27
10.156.24.0/22
10.156.28.128/26
10.156.29.0/24
10.156.30.0/24
10.156.31.0/24
10.156.32.0/24
10.156.33.0/24
10.156.34.0/24
10.156.35.0/24
10.156.36.0/24
10.156.42.0/24
10.156.43.0/24
10.160.20.0/25
10.161.43.0/25
10.161.43.128/25
10.161.44.0/22
10.161.48.0/22
10.162.2.0/27
10.166.9.0/24
10.192.237.0/26
10.192.238.0/24
10.192.255.0/25
10.193.120.0/25
10.193.120.128/25
10.193.121.0/25
10.193.121.128/25
10.193.122.0/25
10.193.122.128/25
10.193.123.0/25
10.193.123.128/25
10.193.124.0/25
10.193.124.128/25
10.193.125.0/25
10.208.17.0/24
10.208.21.0/24
10.224.21.0/25
10.224.40.0/24
10.224.61.192/26
10.224.71.32/27
10.224.71.160/27
10.224.72.192/27
10.224.74.0/23
10.224.78.0/24
10.224.81.0/25
10.224.81.128/25
10.224.82.64/26
10.224.83.0/24
10.224.100.0/22
10.224.104.0/22
10.224.108.0/22
10.224.112.0/22
10.224.116.0/22
10.224.120.0/22
10.224.124.0/22
10.224.128.0/23
10.224.130.0/23
10.224.132.0/23
10.224.134.0/23
10.224.136.0/23
10.224.138.0/23
10.224.140.0/23
10.224.142.0/23
10.224.148.0/22
10.226.3.0/26
10.229.16.0/23
10.230.12.128/25
10.140.76.0/24
10.140.78.0/28
10.140.102.0/24
10.140.103.0/24
10.140.113.0/24
10.140.138.0/24
10.140.139.0/26
10.120.1.32/28
10.128.167.32/27
10.192.178.64/26
10.194.3.128/25
10.224.5.128/26
10.224.42.0/25
10.224.76.0/24
10.224.77.0/24
10.224.79.0/24
10.224.96.0/22
10.140.100.0/24
10.140.101.0/24
10.140.104.0/24
10.140.105.0/24
10.140.106.0/24
10.193.120.0/21
10.130.169.0/24
10.224.9.0/24

Let’s save this file to my computer as C:\Users\tommymaynard\Documents\CIDR.txt. Now, let’s see how many entries we’re working with. What kind of time might I save?

$Path = 'C:\Users\tommymaynard\Documents\CIDR.txt'
(Get-Content -Path $Path).Count
117

We’re working with 117 entries, or rather, 116 commas. Yeah, I’m not moving those over one by one; I don’t have that kind of time during my day. Enter PowerShell. To begin testing, I chose a smaller subset of the CIDR ranges. When I was happy with that, which was practically immediately, I added the -join operator.

Get-Content -Path $Path | Select-Object -First 5
10.138.80.0/22
10.138.87.160/27
10.138.91.160/27
10.138.129.0/29
10.139.33.96/28
(Get-Content -Path $Path | Select-Object -First 5) -join ','
10.138.80.0/22,10.138.87.160/27,10.138.91.160/27,10.138.129.0/29,10.139.33.96/28

Once I had this, I only had two things left to do. One, test whether the company’s UI accepted comma-separated entries like this, and two, if it did, join all 117 addresses with a comma between each and carry on with my day. That’s three things. Well, four if you count writing up this post after work. The UI did accept things that way, so I ran the below command, pasted the result into the box, saved everything, and reported back to my customer that it was set and done, as requested. Next.

(Get-Content -Path $Path) -join ',' | Set-Clipboard
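
If you want some assurance before pasting, a quick count of the joined string’s parts should still land on 117. This sanity check is my own addition, not part of the original workflow; CIDR ranges contain no commas, so the split is safe.

((Get-Content -Path $Path) -join ',').Split(',').Count
117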

Let’s Learn the Get-FileHash Command

Someone, somewhere, sent me down a path. At the end of it, while it was not where I needed to be, I learned — or relearned, rather — about the Get-FileHash cmdlet. Whether you know about it or not, we will quickly cover it and walk through some examples, as well. Get-FileHash, and I quote, “Computes the hash value for a file by using a specified hash algorithm.” This is what it does, but it is not the why. Here is its reference page, however: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/get-filehash, and the why is definitely in there; you should read it.

The idea, for those that are not going to read it, is that we can obtain a file’s hash and then check the hash to ensure the file has not been changed. At this point in your career, you have likely seen file hashes near, or alongside, a file download. Checking the file hash against the file, after it is downloaded, allows you to be certain that the file is the right one and that it was not altered by the download process, or anything else. It is what you were expecting.
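
To make that concrete, here is a minimal sketch of checking a downloaded file against a published hash. The file name and the expected hash value are both hypothetical.

$ExpectedHash = '3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356'
if ((Get-FileHash -Path .\download.zip).Hash -eq $ExpectedHash) {
    'The hashes match; the file is the one we expected.'
} else {
    'The hashes do not match; do not trust this file.'
}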

If you are going to run any of my below commands, first be certain you know your working directory and that it is a location where you have permissions to write. We will start by creating a new file, adding a sentence to it, and then returning that content to ensure it is properly in place.

New-Item -Name hashfile.txt -ItemType File

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a---           4/19/2022  7:06 PM              0 hashfile.txt
Add-Content -Path .\hashfile.txt -Value 'This is our file at the beginning.'
Get-Content -Path .\hashfile.txt
This is our file at the beginning.

That all works, and so now we have a file with which we can experiment. If you are wondering how you learn PowerShell, this is how you do it. Follow along, as there is a goodie further down below. In this example, we invoke Get-FileHash against our file, only returning the algorithm used and the hash. We are using Format-List in order to better display this content.

Get-FileHash -Path .\hashfile.txt | Tee-Object -Variable SaveMeForLater | Format-List -Property Algorithm,Hash
Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356

In this example, we do the same as we did above; however, now we are going to try out the other values that the Algorithm parameter will accept. By default, it uses SHA256, but it will accept SHA1, SHA384, SHA512, and MD5, too.

Get-FileHash -Algorithm SHA1 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : SHA1
Hash      : BD002AAE71BEEBB69503871F2AD3793BA5764097


Get-FileHash -Algorithm SHA256 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356


Get-FileHash -Algorithm SHA384 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : SHA384
Hash      : E6BC50D6465FE3ECD7C7870D8A510DC8071C7D1E1C0BB069132ED712857082E34801B20F462E4386A6108192C076168A


Get-FileHash -Algorithm SHA512 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : SHA512
Hash      : C0124A846506B57CE858529968B04D2562F724672D8B9E2286494DB3BBB098978D3DA0A9A1F9F7FF0D3B862F6BD1EB86D301D025B80C0FC97D5B9619A1BD7D86


Get-FileHash -Algorithm MD5 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : MD5
Hash      : 30091603F57FE5C35A12CB43BB32B5F5

For fun, let’s loop through these values and pump out all the hashes, one right after another. Notice that we are using hard-coded values for the Algorithm parameter. Obnoxious. We will get to another way, which is/was the goodie I mentioned above. The more I think about it though — as I have been in the last 10 minutes — the more I think it might need its own post. Anyway, more on that soon.

'SHA1','SHA256','SHA384','SHA512','MD5' |
    ForEach-Object {
        Get-FileHash -Algorithm $_ -Path '.\hashfile.txt' |
            Select-Object -Property Algorithm,Hash
    }

Algorithm Hash
--------- ----
SHA1      BD002AAE71BEEBB69503871F2AD3793BA5764097
SHA256    3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
SHA384    E6BC50D6465FE3ECD7C7870D8A510DC8071C7D1E1C0BB069132ED712857082E34801B20F462E4386A6108192C076168A
SHA512    C0124A846506B57CE858529968B04D2562F724672D8B9E2286494DB3BBB098978D3DA0A9A1F9F7FF0D3B862F6BD1EB86D301D025B80C0FC97D5B9619A1BD7D86
MD5       30091603F57FE5C35A12CB43BB32B5F5

And there they are again: the various hashes for our file. Now, let’s add some new files. All we are going to do is copy and paste our hashfile.txt to the same directory two times. Rename the copies so that in addition to hashfile.txt, you have hashfilecopy.txt and hashfile.copy. Watch those names and file extensions, although really, how important do you have to be? Think about it…

When checking the hash of a file, we verify the file contents have not changed. And they have not been changed! Only the file name and file extension have. You are starting to see how this can be a useful tool, and guess what? It is built in.

Get-ChildItem | Get-FileHash | Format-List

Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
Path      : C:\Users\tommymaynard\Documents\PowerShell_Get-FileHash\hashfile.copy

Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
Path      : C:\Users\tommymaynard\Documents\PowerShell_Get-FileHash\hashfile.txt

Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
Path      : C:\Users\tommymaynard\Documents\PowerShell_Get-FileHash\hashfilecopy.txt

Now real quick, let’s make another change. I am going to copy and paste hashfile.txt one last time. This copy I have renamed to hashfilechanged.txt. I opened it up and added a second sentence to it. Beneath the first line, I wrote, “This is our file at the end.”

Get-Content -Path .\hashfilechanged.txt
This is our file at the beginning.
This is our file at the end.
Get-FileHash -Path .\hashfilechanged.txt | Tee-Object -Variable SaveMeForNow | Format-List -Property Algorithm,Hash
Algorithm : SHA256
Hash      : 6998575555A0B7086E43376597BBB52582A4B9352AD4D3D642F38C6E612FDA76

I used Tee-Object a couple of times in this post to capture the original hash and this one, after adding a second sentence. As you can see, the file contents are indeed different now, even though the files could have had the same name, were they in different directories.

$SaveMeForLater.Hash
$SaveMeForNow.Hash

3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
6998575555A0B7086E43376597BBB52582A4B9352AD4D3D642F38C6E612FDA76
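
And since both hashes are sitting in variables, we can let PowerShell do the comparison for us, too.

$SaveMeForLater.Hash -eq $SaveMeForNow.Hash
False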

And, the goodie I mentioned. It is official; it will get its own post. Why not? I make the rules. I’ll link it from here once it is up and published!

Apartment Hunting with PowerShell

Note: Expect a part two on this post.

I know a guy, and that guy is looking for an apartment. It turns out that apartments are going really fast and inventory is low — maybe you knew this, but it was news to me. Just about as soon as they become available, they are gone. I suggested that I might be able to lend a hand… maybe, who knows. This is not because I know someone in apartments, but rather because if there is a way to use PowerShell here, there is a good chance I can help. He got lucky, and I learned something new.

I started by going to the website of the apartment complex he was interested in and found a page that listed each apartment model and whether or not it had any availability. It was a floor plan page. I was not expecting much, honestly, but I used the built-in web developer tools in my browser to view the page source, and found some exciting news (for me and PowerShell, at least). It was enough good news that I am able to write about this whole experience.

While empty here, this data structure caught my eye. The output I had hoped to gather was in JSON format; that was huge! Best I can tell, it is generated by a JavaScript file, which then embeds the JSON in the HTML that makes up the webpage. That is not overly important, however; just look at this structure. It is magnificent.

floorplans: [
  {...
  },
  {...
  },
  {...
  },
  {...
  },
  {...
  },
  {...
  }
],
propertyID: 60484,

Inside the floorplans JSON array ([]) are six objects, each in its own set of curly braces. Inside each of those is a plethora of information regarding each floor plan. The properties include things like Model, Sq.Ft., Beds, Baths, etc. Let’s start by taking a look at the Watch-Apartment PowerShell function I wrote. Just a note: in order to make this work for yourself, you will need to edit the path in the $ContentPath variable.

function Watch-Apartment {
    $Uri = 'https://theplaceatcreekside.securecafe.com/onlineleasing/the-place-at-creekside/floorplans'
    $WebRequestContent = (Invoke-WebRequest -Uri $Uri).Content
    $ContentPath = 'C:\users\tommymaynard\Documents\tommymaynard.com\Apartment Hunting\WebpageContents.txt'
    Set-Content -Path $ContentPath -Value $WebRequestContent

    $File = Get-Content -Path $ContentPath
    $Pattern = "floorplans:(.*?)propertyID:"
    $ParsedPage = [regex]::Match($File,$Pattern).Groups[1].Value
    $ParsedPage = $ParsedPage.Trim(); $ParsedPage = $ParsedPage.TrimEnd(',')

    $JsonDocument = ConvertFrom-Json -InputObject $ParsedPage
    $JsonDocument |
        Select-Object -Property @{Name='Available';Expression={if ($_.isFullyOccupied -eq 0) {"Yes ($($_.availableCount))"} else {'No'}}},
        @{Name='Model';Expression={$_.name}},
        @{Name='Sq.Ft.';Expression={$_.sqft}},
        @{Name='Beds';Expression={$_.beds}},
        @{Name='Baths';Expression={$_.baths}} |
    Format-Table -AutoSize
}
Watch-Apartment

We will discuss the above function using its line numbers:

Line 1: Declares/creates the Watch-Apartment function.
Line 2: Stores the site’s URI inside the $Uri variable.
Line 3: Invokes an Invoke-WebRequest command using the URI and stores the Content (as in the Content property) inside the $WebRequestContent variable.
Line 4: Creates the $ContentPath variable to hold a path to a text file that will be created in the next line/command.
Line 5: Takes the content from the webpage and writes it to a text file.

Writing to a file was not a requirement, however, it was my first choice for whatever reason and so I went with it, and then stayed with it.

Line 7: Read in the contents from the file and store them in the $File variable.
Line 8: Create a Regex pattern to allow us to collect all the content between the word “floorplans:” and “propertyID:”.
Line 9: Parse out the data we want and store it in the $ParsedPage variable.
Line 10: Trim off the white space from the beginning and end of the JSON string, and then trim off the trailing comma at the end of the JSON string.

Line 12: Assign the $JsonDocument variable the value assigned to the $ParsedPage variable after it has been converted from JSON by ConvertFrom-Json.
Lines 13 – 19: Use Select-Object to select and modify our desired properties.
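
If lines 8 through 10 feel abstract, here is the same parsing approach run against a tiny, made-up sample string.

$Sample = 'floorplans: [{"name":"A1"}], propertyID: 60484,'
$Pattern = "floorplans:(.*?)propertyID:"
[regex]::Match($Sample,$Pattern).Groups[1].Value.Trim().TrimEnd(',')
[{"name":"A1"}]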

In the final command, we determine which floor plans are available, how many apartments each has, which model each is, how many square feet that model has, and how many bedrooms and bathrooms it has. Each property is a calculated property, often included just to modify the case of the text.

I edited my friend’s PowerShell profile script and added this function. The profile script not only adds the function to the PowerShell session, it invokes it, too. Open PowerShell, and just about instantly know whether anything is available or not.

These were the results the first time it ran back on my machine.

It was a good thing that the Available property included both the isFullyOccupied (“Yes” versus “No”) and the availableCount (# of apartments) information. Take a look at the next image to see why.

In the above image, it still says, “Yes,” but the count is zero. Apparently, my decision to include both values was the right choice, as this information is not all updated at the same time.

Later that same day, it cleared up.

Now, he waits, as my work is done.

Note: As stated at the top of this post, expect a part two. There is more than one apartment complex now.

Technical Fact at PowerShell Launch

When I read books, or websites, and find worthy facts, I aim to keep them. If I read a book, my bookmark is often a few pieces of paper stapled together with info and page numbers from the book. Well, I’m scrapping that technique, which went hand-in-hand with folding down page corners. I’m also ditching the small pieces of paper that litter my desk with random bits of information: “PowerShell objects come from classes,” and “LastLogonDate is the converted version of LastLogonTimeStamp.” Now, it’s all slated to go into a single file called InfoLines.txt in Dropbox, here: C:\Users\tommymaynard\Dropbox\PowerShell\Profile.

If you’re wondering why I use a Dropbox folder, it’s because I want this file and its worthy facts to be available regardless of whether I’m on my work or home computer. You can read more in a post I wrote that uses Dropbox to sync my profile script between work and home, here: http://tommymaynard.com/sync-profile-script-from-work-to-home-2017. It may help make what I’m doing here make more sense.

For now, because I just started this today, I only have a few lines of worthy information. Here are the contents of my InfoLines.txt file so far. If you can’t tell, I’m finishing up Amazon Web Services in Action. I only have 70 more pages to go!

The auto-scaling group is responsible for connecting a newly launched EC2 instance with the load balancer (ELB). (AWS in Action, p.315)
DevOps is an approach driven by software development to bring development and operations closer together. (AWS in Action, p.93)
Auto-scaling is a part of the EC2 service and helps you to ensure that a specified number of virtual servers are running. (AWS in Action, p.294)

Each time I open the ConsoleHost, the ISE, or Visual Studio Code, I want a random line from the file to be shared with me. The below code will return a single, random line with asterisks both above and below it. This is to help separate it from my prompt and from the (totally unnecessary, unwelcome, and shouldn’t-even-be-there) message that tells me how long my “personal and system profiles” took to load. They need to put that message in a variable and not on my screen without permission. Don’t tell us what you think we need to know, guys.

'**************'
Get-Content -Path "$env:USERPROFILE\Dropbox\PowerShell\Profile\InfoLines.txt" | Get-Random
'**************'

That’s it. Now, whenever I open one of these PowerShell hosts, I’ll get a quick reminder about something I found important, and want to keep fresh in my mind.

Update: I decided I wanted the asterisks above and below my technical fact to go from one end of the PowerShell host to other. Here’s how I did that.

'*' * ($Host.UI.RawUI.BufferSize.Width - 1)
Get-Content -Path "$env:USERPROFILE\Dropbox\PowerShell\Profile\InfoLines.txt" | Get-Random
'*' * ($Host.UI.RawUI.BufferSize.Width - 1)

Years Too Late: My First ISE Snippet

Every time I start to write a new PowerShell function, I manually write the same block of text. No idea how many times I’ve done it, but I’ve finally decided to stop. Today, I wrote it for the last time.

$Text = @'
Function ___-_________ {
    [CmdletBinding()]
    Param (
    )

    Begin {
    } # End Begin.

    Process {
    } # End Process.

    End {
    } # End End.
} # End Function: ___-_________.
'@

I’ve known about ISE snippets for some time, but hadn’t taken a minute to get my advanced function included. Well, that finally ended today. With the $Text variable assigned above, I ran the following command.

New-IseSnippet -Title BasicFunction -Description 'Basic Advanced Function.' -Text $Text -Author 'Tommy Maynard'

Well, what did this just do? In the most basic reply to that question, it added a new snippet — a reusable chunk of text — that I can insert in the ISE anytime I want. All I have to do is press Ctrl + J and select BasicFunction from the available options.

That may be all you need, but I was curious what this really did. To find out, I ran Get-IseSnippet and it returned a path — helpful.

Get-IseSnippet

    Directory: C:\Users\tommymaynard\Documents\WindowsPowerShell\Snippets

Mode                LastWriteTime         Length Name
----                -------------         ------ ----
-a----        9/27/2016   0:32 PM            867 BasicFunction.snippets.ps1xml

Then I ran Get-Content on the file to see what it was storing.

Get-Content -Path (Get-IseSnippet).FullName

<?xml version='1.0' encoding='utf-8' ?>
<Snippets xmlns='http://schemas.microsoft.com/PowerShell/Snippets'>
    <Snippet Version='1.0.0'>
        <Header>
            <Title>BasicFunction</Title>
            <Description>Basic Advanced Function.</Description>
            <Author>Tommy Maynard</Author>
            <SnippetTypes>
                <SnippetType>Expansion</SnippetType>
            </SnippetTypes>
        </Header>
        <Code>
            <Script Language='PowerShell' CaretOffset='0'>
                <![CDATA[Function ___-_________ {
    [CmdletBinding()]
    Param (
    )

    Begin {
    } # End Begin.

    Process {
    } # End Process.

    End {
    } # End End.
} # End Function: ___-_________.]]>
            </Script>
        </Code>
    </Snippet>
</Snippets>

So, there it is. Not only do I have my basic function snippet for the next time I need it, I know that New-IseSnippet wrote an XML (.ps1xml) file to the Snippets directory inside Documents\WindowsPowerShell in my local profile. The dates on my snippet and the directory indicate they were both created when I ran this command. I told you I hadn’t used snippets before.

Me being me, I ran Get-Command against New-IseSnippet and guess what? It’s a function; it’s not compiled code. Let’s take a look at it; let’s find where it decides whether to create the directory, or not.

Get-Command -Name New-IseSnippet

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Function        New-IseSnippet                                     1.0.0.0    ISE

(Get-Command -Name New-IseSnippet).ScriptBlock

While I didn’t include the entire results of the last command, I’ll include what’s important to how it determines whether or not to create the Snippets directory. In this first part, the function creates a $snippetPath variable. Split-Path returns the current user’s WindowsPowerShell directory, and Join-Path appends “Snippets” — the child path — to the end. That means that in the end, the $snippetPath variable contains C:\Users\tommymaynard\Documents\WindowsPowerShell\Snippets.

$snippetPath = Join-Path (Split-Path $profile.CurrentUserCurrentHost) "Snippets"

In this section of the function, it runs Test-Path against $snippetPath to determine whether the path exists; the cmdlet returns $true or $false accordingly.

if (-not (Test-Path $snippetPath))
{
    $null = mkdir $snippetPath
}

If the path doesn’t exist, thanks to the -not, it executes the mkdir function against the path, and the directory is created. The next time the New-IseSnippet function is run, the directory will already exist, and this part of the function won’t be run.

Well, that’s it. I’m already looking forward to pressing Ctrl + J the next time I need to start a new advanced function.

“Get Files” to a Remote Computer

I was a little quiet last week, but vacation can do that. I spent a week in Minneapolis, Minnesota in order to visit my wife’s family. Although away from my computer nearly all that time, PowerShell was still on my mind.

Not long before I left for vacation, I replied to a TechNet forum post. In that post, a user was PS Remoting to a computer, and from that computer trying to connect to another computer to pull down some files. Maybe you’ve been down that path before, or know someone that has.

Typically, the user will use Invoke-Command to run commands on a remote computer. One of the commands will attempt to reach out to a network location and download some files. It sounds straightforward enough, but the part they forget, or aren’t aware of, is the inability to delegate credentials (the user name and password) from the remote computer to the location where the files reside. This inability to delegate credentials onward from an existing remote connection is called the second hop, or double hop, problem. It’s by design: if this kind of repeated delegation were allowed, credentials could be passed from machine to machine and potentially used maliciously.
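
Sketched out, the failure typically looks something like this; the computer and share names are made up.

Invoke-Command -ComputerName Server01 -ScriptBlock {
    # This fails with access denied: our credentials made the first hop to Server01,
    # but they cannot be delegated a second time, from Server01 to FileServer.
    Copy-Item -Path \\FileServer\Share\Config.txt -Destination C:\
}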

When faced with this problem, there are three thoughts that I typically have:

The first thought is to copy out the file(s) to the remote computers before making the PowerShell Remoting connection. Then, when you need the files on the remote computer, they are already in place. I’ve recommended this several times over the years. Think of this file copy as a prestage for the work your remote command, or commands, will complete using those files.

The second thought is CredSSP, and it should be avoided whenever possible. Once set up, it allows you to approve the credential delegation from your remote computer to the remote location where you want to get the files. It sounds wonderful, but as of this writing, it still includes some security concerns.

The third thought is to use an endpoint that runs under a different account. It requires that the endpoint you connect to — that’s the thing that accepts your incoming PowerShell Remoting connection — run as a user other than the connecting user. When set up, or edited, to do this, the endpoint isn’t running as you when you connect, and it can therefore delegate its own credentials to the location where your files live. This eliminates the second hop, as the credentials you used to connect to the remote computer aren’t used to get to the location of the file(s) you want to copy down.
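
For the curious, setting up such an endpoint can be as simple as the sketch below. The endpoint name is made up, and in practice you would want to scope and secure it carefully.

# Run once on the remote computer, supplying the account the endpoint should run as.
Register-PSSessionConfiguration -Name FileCopyEndpoint -RunAsCredential (Get-Credential)

# Then connect to that endpoint by name.
Invoke-Command -ComputerName Server01 -ConfigurationName FileCopyEndpoint -ScriptBlock {
    Copy-Item -Path \\FileServer\Share\Config.txt -Destination C:\
}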

Before I left for vacation, I came up with another idea; a fourth thought. When we use Invoke-Command, we have the option to take variables with us into the remote session. In a previous post, I showed three examples of how to do this. Read it; it’s short. Now, knowing I can fill a variable with the contents of a file… and knowing I can take a variable into a remote session… did a light just go on?

As I mentioned in the forum post, my fourth thought was mildly unconventional. It didn’t require someone to prestage (copy the files out before the remote connection), it wasn’t insecure (CredSSP), and it didn’t require any knowledge about creating, or editing, endpoints to use a run-as account. All it required was reading some files into some variables and taking those variables into the remote PowerShell session. Here’s the example I gave them on TechNet (and here’s the forum thread). I’ll discuss the example below the example itself, so you can understand the entire, logical flow.

$ConfigFile01 = Get-Content -Path C:\Config01.txt
$ConfigFile02 = Get-Content -Path C:\Config02.txt

Invoke-Command -ComputerName Server01 -ScriptBlock {
    Add-Content -Path C:\file01.txt -Value $Using:ConfigFile01
    Add-Content -Path C:\file02.txt -Value $Using:ConfigFile02

    $1 = Get-Content -Path C:\file01.txt
    $2 = Get-Content -Path C:\file02.txt

    $1
    $2

    Start-Sleep -Seconds 5

    Remove-Item -Path C:\file01.txt,C:\file02.txt
}

Beginning in the first two lines (1 and 2), I set two variables on my local computer with the content of two different files. The variable $ConfigFile01 stores the content from the file C:\Config01.txt, and $ConfigFile02 stores the content from the file C:\Config02.txt. With these variables set, we run the Invoke-Command command on Server01. Remember, the -ComputerName parameter can accept a comma-separated list of multiple computers.

Inside this script block, we do several things. Keep in mind, as you look this over, that this example isn’t doing much with this code except proving that there’s another way to “get a file to a computer.” The first two lines in the script block (lines 5 and 6), create new files on the remote computer. They create the files C:\file01.txt and C:\file02.txt. The value added to file01.txt comes from the $ConfigFile01 variable, and the value added to C:\file02.txt comes from the $ConfigFile02 variable. At this point, we have our files on the remote computer.

The rest of the script block isn’t necessary, but it helps show some of what we can do. Lines 8 and 9 put the contents of the newly created files on Server01 into variables. Lines 11 and 12 echo the contents of the two variables. On line 14, I pause the script block for five seconds. The reason I did this was so that I could test that the files were created and their contents written before the last line. In that line, line 16, we remove the files we created on the remote computer.

You can test with this example by creating C:\Config01.txt and C:\Config02.txt on your local computer with some text in each. Next, change Server01 in the Invoke-Command to one of your computers. Then, open File Explorer to C$ on the computer to which you’re going to create a PS Remoting session. In my case, I would open it to \\Server01\C$. Having this open will allow you to see the creation, and removal, of file01.txt and file02.txt.

Proving PowerShell’s Usefulness to Newbies, Part II

Back again with another example to share with the Windows PowerShell newbies. They’re out there: people that don’t know the practical power in PowerShell. The idea is to use real-world examples of things we simply wouldn’t want to do manually, in hopes that our lost friends will find the light. Maybe, that’s you.

Part I can be found here. In that post, we learned that we could use PowerShell to create 10,000 folders in about 10 seconds — that’s 1,000 folders per second, or 1 folder each millisecond.

Part II
What we’ll do today is read in the content of a file, sort the items we’ve read in, and then write our sorted results back out to the same file — it’s instant alphabetizing. Let’s start by reading in our file and see what we’re starting with.

PS> Get-Content -Path .\file.txt
web02.mydomain.com
web09.mydomain.com
dc03.mydomain.com
dc04.mydomain.com
sql08.mydomain.com
sql06.mydomain.com
dc08.mydomain.com
sql04.mydomain.com
dc10.mydomain.com
web01.mydomain.com
web03.mydomain.com
dc09.mydomain.com
web05.mydomain.com
web06.mydomain.com
dc02.mydomain.com
web07.mydomain.com
dc05.mydomain.com
web04.mydomain.com
web10.mydomain.com
sql01.mydomain.com
dc01.mydomain.com
sql03.mydomain.com
sql10.mydomain.com
web08.mydomain.com
sql05.mydomain.com
sql07.mydomain.com
dc07.mydomain.com
sql02.mydomain.com
sql09.mydomain.com
dc06.mydomain.com

Our file contains 30 fully qualified server names that are in no significant order. Sorting all 30 of these server names is probably going to be difficult, or at least time consuming. Wrong. With one quick addition to the previous command — see the example below — our list is sorted, without any noticeable difference in the amount of time it took to complete.

PS> Get-Content -Path .\file.txt | Sort-Object
dc01.mydomain.com
dc02.mydomain.com
dc03.mydomain.com
dc04.mydomain.com
dc05.mydomain.com
dc06.mydomain.com
dc07.mydomain.com
dc08.mydomain.com
dc09.mydomain.com
dc10.mydomain.com
sql01.mydomain.com
sql02.mydomain.com
sql03.mydomain.com
sql04.mydomain.com
sql05.mydomain.com
sql06.mydomain.com
sql07.mydomain.com
sql08.mydomain.com
sql09.mydomain.com
sql10.mydomain.com
web01.mydomain.com
web02.mydomain.com
web03.mydomain.com
web04.mydomain.com
web05.mydomain.com
web06.mydomain.com
web07.mydomain.com
web08.mydomain.com
web09.mydomain.com
web10.mydomain.com

Yeah, that was tough. I ran this same command 10 times, measuring it with the Measure-Command cmdlet, so I could see the average amount of time it took to sort the list of computers. It completed this “challenge” in anywhere between 6 and 10 milliseconds. I’m sorry, but it would’ve taken me at least a few minutes to do this by hand.
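
In case you want to reproduce the measurement, this is roughly how I collected those timings.

1..10 | ForEach-Object {
    (Measure-Command -Expression {
        Get-Content -Path .\file.txt | Sort-Object
    }).TotalMilliseconds
}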

Now that we know we can sort it, let’s write it back to disk. In this example, below, we’ll put our sorted results back into the file they came from, overwriting the original contents. If you don’t want to do that, then be sure to alter the file name value for Set-Content’s -Path parameter.

PS> Get-Content -Path .\file.txt | Sort-Object | Set-Content -Path .\file.txt
PS>

Let’s verify that our file now contains the sorted contents by rerunning our first Get-Content command from the beginning of this post.

PS> Get-Content -Path .\file.txt
dc01.mydomain.com
dc02.mydomain.com
dc03.mydomain.com
dc04.mydomain.com
dc05.mydomain.com
dc06.mydomain.com
dc07.mydomain.com
dc08.mydomain.com
dc09.mydomain.com
dc10.mydomain.com
sql01.mydomain.com
sql02.mydomain.com
sql03.mydomain.com
sql04.mydomain.com
sql05.mydomain.com
sql06.mydomain.com
sql07.mydomain.com
sql08.mydomain.com
sql09.mydomain.com
sql10.mydomain.com
web01.mydomain.com
web02.mydomain.com
web03.mydomain.com
web04.mydomain.com
web05.mydomain.com
web06.mydomain.com
web07.mydomain.com
web08.mydomain.com
web09.mydomain.com
web10.mydomain.com

That’s it. We ran a one-liner command to read in some data, sort it, and push it right back to the same file. So we could compare, I went ahead and sorted the same 30-line file manually. It took me 3 minutes and 30 seconds (proof below). While this wasn’t the most complex, or lengthy, document one might sort, the time difference is still quite drastic. PowerShell has plenty to offer, and spending some time to learn it is going to pay off. This line of work requires all the time you have in order to continue to learn and stay relevant. PowerShell can give you some of that time back.

PS C:\> $StartTime = Get-Date
PS C:\> $EndTime = Get-Date
PS C:\> New-TimeSpan -Start $StartTime -End $EndTime


Days              : 0
Hours             : 0
Minutes           : 3
Seconds           : 30
Milliseconds      : 860
Ticks             : 2108600828
TotalDays         : 0.00244051021759259
TotalHours        : 0.0585722452222222
TotalMinutes      : 3.51433471333333
TotalSeconds      : 210.8600828
TotalMilliseconds : 210860.0828

Copy Outlook Signature to Clipboard

As far as I am aware, the in-house-built front end for our help desk ticketing system doesn’t have a way to include a signature. This means that as I update and close tickets in the office, I often find myself opening a new, blank email, copying my signature, and pasting it into the notes field on the ticket. I know, I know — I’m embarrassed.

No more am I going to consider this acceptable, especially for someone that uses PowerShell for as many things as I do: it’s. always. open. Today was the day I fixed this forever, and it took a whole two minutes.

I first needed to determine where Outlook (2013 on Windows 8.1) looks for my signature. I traced it down to C:\Users\tommymaynard\AppData\Roaming\Microsoft\Signatures. In that path there are three files named Standard — the same name used for my signature in Outlook when I open the Signatures and Stationery dialog. There is a .htm version, a .rtf version, and a .txt version of the signature. Simple decision: I decided I would make use of the text file.

Since I was going to use this in my profile, I didn’t include anything inside the function but the simple command I wanted to run. Based on the function below, all I need to do is enter Get-Signature, or its alias, sig, and the function will copy the contents of Standard.txt to my clipboard. From there, it’s a quick paste into the help desk ticketing system, and done.

Set-Alias -Name sig -Value Get-Signature
Function Get-Signature {
    $SigPath = 'C:\Users\tommymaynard\AppData\Roaming\Microsoft\Signatures\Standard.txt'
    Get-Content -Path $SigPath | Select-Object -First 4 | clip
}
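
With that in the profile, usage is a single, short command.

PS> sig
# The first four lines of Standard.txt are now on the clipboard, ready to paste.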

If you’ve taken a look at the function, you’ll see that I only choose the first 4 lines of the signature file. This is because there was a blank line beneath the last line in my signature that I wasn’t interested in copying (or manually removing from the file itself).

Randomly Selecting a Name from a CSV File

I have to admit, I’m kind of excited to be doing another Twitter Reply post. This one is the result of a post by Michael Bender. I’ve never met him, but I’ve followed his career a bit in the last few years, since I was introduced to his name via TechEd. He made a recent Tweet where he asked for a PowerShell genius to help build him a script. The goal was to randomly select a name from a CSV file, with an option to choose how many names it should randomly select.

Here’s the Tweet: https://twitter.com/MichaelBender/status/593168496571322370

I wouldn’t consider myself a genius, but if you consider what he’s after — a name from a CSV file (not a line from a text file) — then there’s a better option than the one he accepted: “get-content file.csv | get-random.” This will work, but there are some reasons why it isn’t such a great idea. We can use the Get-Content cmdlet with CSV files, but we generally don’t. CSV files are created and formatted in such a way that using Get-Content, instead of Import-Csv, isn’t the best way to do things.

Let’s consider that we’re using the CSV file in the image below. It has three headers, or column names: Name, Age, and HairColor.

[Image: Randomly Selecting Name from a CSV File01 — the CSV file, with Name, Age, and HairColor columns]

First of all, if we use Get-Content, it’s going to consider the first line — the header line, or the column headings — a selectable line within the file. Here’s that example:

PS C:\> Get-Content .\file.csv | Get-Random
Name,Age,HairColor

Uh, not helpful. While this isn’t the end of the world, as we can simply rerun the command (and probably get something different), we’d be better off writing this so that this result isn’t a possibility — something we’ll see in a moment, when we use Import-Csv.

Second of all, unless the only column in the CSV file is Name, we’re not returning just a name, but everything in a row of the file. This means that if I randomly choose a line in the file, it will include information that Michael didn’t request — the age and hair color.

PS C:\> Get-Content .\file.csv | Get-Random
Jeff,19,Brown

I suppose we could start parsing the returned value to get just the name ((Get-Content .\file.csv | Get-Random).Split(',')[0]), but seriously, let’s skip that and use one of the cmdlets that was built to work with CSVs directly. We’ll assume we’re still using the same CSV file from the image above.

PS C:\> Import-Csv .\file.csv | Get-Random | Select-Object Name

Name
----
Lance

Um, that was easy. If we wanted to increase the number of names returned from our CSV file, then we would use Get-Random’s -Count parameter and an integer value, such as in this example:

PS C:\> Import-Csv .\file.csv | Get-Random -Count 3 | Select-Object Name

Name
----
Bob
Stephen
Sue

I think we’re best off when we use the *-Csv cmdlets with CSV files, and the Get-Content cmdlet with most other file types we can read in PowerShell. There is at least one exception, in reverse: using Import-Csv with a text file that has a delimiter might help you better parse the information in your file. Consider this text file:

[Image: Randomly Selecting Name from a CSV File02 — a pipe-delimited text file]

The next few examples will progressively show you how to use Import-Csv with a delimited text file until all of the “columns” are broken apart. When we’re finished, we can even export our data into a properly formatted CSV file, and then import it, piping those results to other cmdlets — take a look.

PS C:\> Import-Csv .\file.txt

001|Sunday|Car7
---------------
002|Monday|Car2
003|Tuesday|Car3
004|Wednesday|Car3
005|Thursday|Car7
006|Friday|Car2
007|Saturday|Car3

PS C:\> Import-Csv .\file.txt -Header Id,Day,Car

Id                                      Day                                     Car
--                                      ---                                     ---
001|Sunday|Car7
002|Monday|Car2
003|Tuesday|Car3
004|Wednesday|Car3
005|Thursday|Car7
006|Friday|Car2
007|Saturday|Car3

PS C:\> Import-Csv .\file.txt -Header Id,Day,Car -Delimiter '|'

Id                                      Day                                     Car
--                                      ---                                     ---
001                                     Sunday                                  Car7
002                                     Monday                                  Car2
003                                     Tuesday                                 Car3
004                                     Wednesday                               Car3
005                                     Thursday                                Car7
006                                     Friday                                  Car2
007                                     Saturday                                Car3

PS C:\> Import-Csv .\file.txt -Header Id,Day,Car -Delimiter '|' | Export-Csv -Path C:\file-cars.csv -NoTypeInformation
PS C:\> Import-Csv .\file-cars.csv

Id                                      Day                                     Car
--                                      ---                                     ---
001                                     Sunday                                  Car7
002                                     Monday                                  Car2
003                                     Tuesday                                 Car3
004                                     Wednesday                               Car3
005                                     Thursday                                Car7
006                                     Friday                                  Car2
007                                     Saturday                                Car3

PS C:\> Import-Csv .\file-cars.csv | Where-Object Car -eq 'Car3'

Id                                      Day                                     Car
--                                      ---                                     ---
003                                     Tuesday                                 Car3
004                                     Wednesday                               Car3
007                                     Saturday                                Car3

PS C:\> Import-Csv .\file-cars.csv | Where-Object Car -eq 'Car3' | Select-Object Day

Day
---
Tuesday
Wednesday
Saturday

So, had I seen the Tweet first, I would have recommended Import-Csv, piped to Get-Random, and then piped to Select-Object Name. Thanks for reading the post.