Category Archives: AWS

Documents and resources involving AWS Tools for PowerShell.

Those AWS Region Commands

More and more, Amazon Web Services has become a significant part of my day. Luckily for me, PowerShell and AWS work well together. There’s long been the AWSPowerShell module, which, much like the AWS CLI, allows us to issue API calls to AWS from a command line.

As a part of continuing my journey into AWS, and maintaining my mild obsession with PowerShell, I’ve decided to better learn a few cmdlets from the AWSPowerShell module over at least a couple of posts. As of version 3.3.365, the module contains only a few thousand cmdlets. It seems AWS has gone ahead and made a real investment in PowerShell here, especially when it wasn’t terribly long ago that there were only 2,000.

(Get-Command -Module AWSPowershell | Measure-Object).Count
4499

Oh yeah, that reminds me, Lambda supports PowerShell (Core) now, too. As I read somewhere recently, “It’s not Bash, it’s not Ruby. It’s PowerShell.”

In a few previous, AWS-specific posts, I was able to point out some things I thought should be changed. And somehow, AWS paid close enough attention that some changes were actually made. It’s still hard to believe; it’s a bit surreal.

Hashtag AWS Tweet Prompts Fix to AWSPowerShell Module
AWS Stop-EC2Instance Needs Complimentary Cmdlet
More AWS PowerShell Changes Due to Twitter (and Me)

I’m mostly writing this evening to help solidify a few commands for my own education, and for anyone else reading along. But… as I experimented with some of these AWS PowerShell cmdlets, I couldn’t help but feel that some changes were in order. So, with that knowledge, let’s review a few Region-specific cmdlets and touch on some potential changes, as they make the most sense to me.

Get-AWSRegion: “Returns the set of available AWS regions.”

When the Get-AWSRegion cmdlet is invoked, it returns the Region (the abbreviated system name), the Name (the full Region name), and whether or not the Region is the default in the shell. In this first example, you can see that the default output returns all the Regions.

Get-AWSRegion

Region         Name                      IsShellDefault
------         ----                      --------------
ap-northeast-1 Asia Pacific (Tokyo)      False
ap-northeast-2 Asia Pacific (Seoul)      False
ap-south-1     Asia Pacific (Mumbai)     False
ap-southeast-1 Asia Pacific (Singapore)  False
ap-southeast-2 Asia Pacific (Sydney)     False
ca-central-1   Canada (Central)          False
eu-central-1   EU Central (Frankfurt)    False
eu-west-1      EU West (Ireland)         False
eu-west-2      EU West (London)          False
eu-west-3      EU West (Paris)           False
sa-east-1      South America (Sao Paulo) False
us-east-1      US East (Virginia)        False
us-east-2      US East (Ohio)            False
us-west-1      US West (N. California)   False
us-west-2      US West (Oregon)          False

Wrong. Be sure to take a look at the examples further below. As you’ll see, there are a couple of parameters—IncludeChina and IncludeGovCloud—that add Regions that aren’t there by default. I’m not suggesting a change here, mostly, but Get-AWSRegion should return all the Regions, right?

Based on the cmdlet name alone, I suspected that all Regions were going to be listed in the output that’s returned. Good thing I looked into this cmdlet’s parameters, instead of assuming that the Regions were actually all included. And why China? I get why we might make an exception for GovCloud—we shouldn’t—but what was the thought in regard to China? You’ll see what I mean in the following examples.

(Get-AWSRegion | Measure-Object).Count
15
(Get-AWSRegion -IncludeChina | Measure-Object).Count
17
(Get-AWSRegion -IncludeChina -IncludeGovCloud | Measure-Object).Count
18

Now, let’s take a look at the SystemName parameter included with Get-AWSRegion. This is where it quickly becomes evident to me that we can definitely do better. Why is the Region property populated by a parameter called SystemName? At minimum, I think that parameter needs a Region alias.

Get-AWSRegion -SystemName us-west-1

Region    Name                    IsShellDefault
------    ----                    --------------
us-west-1 US West (N. California) False

Get-AWSRegion -SystemName us-west-2

Region    Name             IsShellDefault
------    ----             --------------
us-west-2 US West (Oregon) False

Get-AWSRegion -SystemName us-west-*

Region    Name    IsShellDefault
------    ----    --------------
us-west-* Unknown False

I didn’t spend any time reading the help for Get-AWSRegion, but as you can see directly above, the wildcard character isn’t supported by the SystemName parameter. That’s too bad; it would’ve been a great addition to this cmdlet (and one that can still be added). To use a wildcard against this value, you’re required to pipe your output to the Where-Object cmdlet and filter it further down the pipeline. Yes, that means all results are piped to Where-Object, whereas wildcard support built into Get-AWSRegion would filter immediately and avoid the pipeline altogether. The pipeline is crucial to the language, but when it’s not needed, we’re better off without it.

Get-AWSRegion | Where-Object Region -like 'us-west-*'

Region    Name                    IsShellDefault
------    ----                    --------------
us-west-1 US West (N. California) False
us-west-2 US West (Oregon)        False

And if you’re wondering, Get-AWSRegion doesn’t include a Name parameter, so there’s no checking whether that one supports wildcards, either.
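If AWS never adds wildcard support, a thin wrapper gets you most of the way there. Here’s a sketch; the function name and its Pattern parameter are my own invention, not anything that ships with the module.

Function Get-AWSRegionByPattern {
    Param (
        [string]$Pattern = '*'
    )
    # Defer to Get-AWSRegion and filter on the Region (system name) property.
    Get-AWSRegion | Where-Object Region -like $Pattern
}

Get-AWSRegionByPattern -Pattern 'us-west-*'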

Get-DefaultAWSRegion: “Returns the current default AWS region for this shell, if any, as held in the shell variable $StoredAWSRegion.”

This command, as its description indicates, returns the Region that the AWSPowerShell module’s commands will use by default. If the command returns nothing, then a default hasn’t been set. If it does, then someone has likely already used the command we’ll discuss after Get-DefaultAWSRegion: Set-DefaultAWSRegion.

Get-DefaultAWSRegion
# Nothing here yet...
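Since the default is held in the $StoredAWSRegion shell variable (per the quoted help), you can also peek at the variable directly. A quick check, no cmdlet required:

# No default set yet, so the shell variable is empty, too.
$StoredAWSRegion
$null -eq $StoredAWSRegion
True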

Before we move on: the commands that include the string “Default” before “AWS” should have instead maintained the same noun prefix as Get-AWSRegion. That’s right, each of these three cmdlets should’ve included the verb, the dash, and then the string AWS before the remaining portion of the noun. Why in the world would we stuff “AWS” into the middle of some command names and at the beginning of others? Amazon should have maintained a consistent prefix. Every Microsoft Active Directory command uses an AD prefix, right after the dash; you’ll never find it anywhere else. Even in the office, the prefix we use on our self-written functions is right where we’ve been trained to expect it:

<ApprovedVerb>-<Dept><SingularNoun(s)InCamelCase>

In my experience, AWS isn’t afraid of using command aliases—so one command can resolve to another—retiring parameter names at times, and changing cmdlet names altogether. Therefore, I suspect someone needs to revisit these three. It’s not as though Get-AWSRegionDefault, Set-AWSRegionDefault, and Clear-AWSRegionDefault don’t make sense, and none of them is already in use. The current commands should become aliases to these three, keeping the prefix in the proper place.

Get-DefaultAWSRegion -> Get-AWSRegionDefault
Set-DefaultAWSRegion -> Set-AWSRegionDefault
Clear-DefaultAWSRegion -> Clear-AWSRegionDefault

While we’re here, we need to stop using the plural form of any nouns. That said, I do recognize this move is happening, such that Get-AWSCredentials is an alias that resolves to Get-AWSCredential. Oh look at that, the AWS prefix is in the right place on those two!
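Until (unless?) AWS renames them, nothing stops us from wiring up the suggested names ourselves in a profile. This is a sketch only; these aliases don’t ship with the module.

# Map the suggested names to the current cmdlets.
New-Alias -Name Get-AWSRegionDefault -Value Get-DefaultAWSRegion
New-Alias -Name Set-AWSRegionDefault -Value Set-DefaultAWSRegion
New-Alias -Name Clear-AWSRegionDefault -Value Clear-DefaultAWSRegion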

Set-DefaultAWSRegion: “Sets a default AWS region system name (e.g. us-west-2, eu-west-1 etc) into the shell variable $StoredAWSRegion. AWS cmdlets will use the value of this variable to satisfy their -Region parameter if the parameter is not specified.”

The first thing to notice here is that we have a Region parameter. That’s what Get-AWSRegion should’ve included (in addition to SystemName, so as not to create a breaking change). Better yet, make Region the actual parameter and SystemName an alias of it. That sounds like the best way to phase out the old parameter.
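If I were sketching that change, the parameter block might look something like the following. To be clear, this is my suggestion, not how Get-AWSRegion is actually written.

Param (
    # Region becomes the real parameter; SystemName survives as an alias,
    # so existing scripts keep working.
    [Parameter(Position = 0)]
    [Alias('SystemName')]
    [string]$Region
)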

Set-DefaultAWSRegion -Region ca-central-1
Get-DefaultAWSRegion

Region       Name             IsShellDefault
------       ----             --------------
ca-central-1 Canada (Central) True 

Clear-DefaultAWSRegion: “Clears any default AWS region set in the shell variable $StoredAWSRegion.”

Get-DefaultAWSRegion
Region       Name             IsShellDefault
------       ----             --------------
ca-central-1 Canada (Central) True
Clear-DefaultAWSRegion
Get-DefaultAWSRegion
# Nothing here again.

This evening we covered some Region cmdlets from the AWSPowerShell module. They all do what they should in the end, but in my mind, there’s room for a few changes for consistency’s sake and overall improvement. Perhaps we’ll do this again… there are some credential-related AWS cmdlets I’m going to need to learn once. and. for. all.

Change Prompt on Module Import

To me, my PowerShell prompt is quite important; I’ve written about it nine times already. While I’m not going to write about it again, exactly, I am going to focus on an updated prompt I’ve created for a recent project. I couldn’t help but carry a few things from my own prompt into this project, which is why it’s been mentioned.

I recently received a screen capture from someone running into a problem using one of the tools I’ve written. Sure, it needs some error checking, I won’t deny that, but this was a very obscure and unforeseen problem. You know, the way we often find out about error conditions.

This tool, or function, can be run on an Amazon Web Services EC2 instance within a project to determine the status of its partner EC2 instance. When it works, it returns the status of the other instance: running, stopping, stopped, etc. There are also a couple of other companion functions that can start and stop the partner instance. The second of the two machines has very high specifications, so we ask that our users shut down those secondary instances when they’re not running experiments. They’re pricey.

The problem is that when I look at this error, the prompt doesn’t tell me enough. I can only tell it’s PowerShell (the PS in the prompt) and the current path, some of which has been hidden in this first image. I want more information without having to ask the user, so I’ve added that in. Here’s the default prompt up close.

The new prompt includes the username (to the left of the @), the computer name (to the right of the @), the project name (while it’s not in this example, it’s normally a part of the computer name), the path, and whether the user is an admin (#) or not ($). Now, when I receive a PowerShell screen capture, I already have a few of my first questions answered.

All of the functions, such as the one that generated this error, are part of the same PowerShell module. There are somewhere near 20 of them so far, and I keep finding reasons to add new ones. If you don’t know, creating a PowerShell script module that includes a module manifest file (a .psd1) allows you to include scripts that execute just before the module is imported. This happens whether the module is imported manually using Import-Module, or implicitly by invoking one of its functions when the module hasn’t yet been imported.
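For reference, the relevant piece of the manifest might look like the excerpt below; the module and file names here are placeholders, not the project’s real ones.

# MyProjectTools.psd1 (excerpt)
@{
    RootModule       = 'MyProjectTools.psm1'
    ModuleVersion    = '1.0.0'
    # Runs in the caller's session state before the module itself is imported.
    ScriptsToProcess = @('SetPrompt.ps1')
}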

Let’s take a look at my SetPrompt.ps1 script file that executes when my PowerShell module is imported.

Function prompt {
    # Determine Admin; set Symbol variable.
    If ([bool](([System.Security.Principal.WindowsIdentity]::GetCurrent()).Groups -match 'S-1-5-32-544')) {
        $Symbol = '#'
    } Else {
        $Symbol = '$'
    }

    # Create prompt.
    "[$($env:USERNAME.ToLower())@$($env:COMPUTERNAME.ToLower()) $($executionContext.SessionState.Path.CurrentLocation)]$Symbol "
} # End Function: prompt.

With this script in place, and a ScriptsToProcess entry in the module’s manifest file pointing to it, I can be sure that the user’s prompt will change the moment my module is imported. From here on out, I can rest assured that if a user (of these machines, at least) sends us a screen capture, it’ll include the relevant pieces of information I would otherwise have had to ask for.

There’s a final thought here. When the user is done with this PowerShell module, they’re still going to have this prompt. In my case, that’s perfectly suitable because my users won’t be in PowerShell unless they’re issuing commands from the module. At least, I can’t imagine they will be. The only other thoughts I’ve had about this “problem” would be to (1) teach users to remove the module, and have code monitor whether the module is loaded and revert the prompt when it’s removed (a sketch of that follows), or (2) revert the prompt if a command outside of my module is invoked.
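If I ever implement option (1), a module can register an OnRemove script block that fires when someone runs Remove-Module. A minimal sketch, placed in the module’s .psm1 and restoring PowerShell’s stock prompt, might look like this; I haven’t wired it into the project.

# In the .psm1: put the default prompt back when the module is removed.
$ExecutionContext.SessionState.Module.OnRemove = {
    Set-Item -Path Function:\prompt -Value {
        "PS $($executionContext.SessionState.Path.CurrentLocation)$('>' * ($nestedPromptLevel + 1)) "
    }
}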

That’s it for today. I don’t have it shown here, but it’s neat to see the prompt alter itself when the module is loaded. Something to keep in mind, if you find yourself in a similar situation.

Update: I was annoyed that $HOME, or C:\Users\tommymaynard, was being displayed as the full path, so I made some additional modifications to the prompt that’s being used for this project. It’ll now look like this, when at C:\Users\tommymaynard (or another user’s home directory).

Here’s the new prompt function, to include a better layout.

Function prompt {
    # Determine Admin; set Symbol variable.
    If ([bool](([System.Security.Principal.WindowsIdentity]::GetCurrent()).Groups -match 'S-1-5-32-544')) {
        $Symbol = '#'
    } Else {
        $Symbol = '$'
    }

    If ((Get-Location).Path -eq $env:USERPROFILE) {
        $Path = '~'
    } Else {
        $Path = (Get-Location).Path
    }

    # Create prompt.
    "[$($env:USERNAME.ToLower())@$($env:COMPUTERNAME.ToLower()) $Path]$Symbol "
} # End Function: prompt.

AWS Write-S3Object Folder Creation Problem

If you were wondering what I occasionally thought about over the holiday break, while I was scrubbing my pool tiles—ugh, it was AWS and the Get-S3Object and Write-S3Object cmdlets. Why wasn’t I able to get them to do what I wanted?! Let me first explain my problem, and then my solution.

When you create a folder (and yes, I realize it’s not really a folder), in the AWS Management Console, you do so by clicking the “+ Create folder” button. This guy.

Simple stuff. In my case, you choose the AES-256 encryption setting and then give the folder a name. This folder, because I created it in this manner, is returned by the Get-S3Object cmdlet. So, let’s say we walked through this manual procedure and created a folder called S3ConsoleFolder. Here’s what Get-S3Object would return, provided the S3 bucket (which we’re calling tst-bucket) had been empty from the start.

(Get-S3Object -BucketName 'tst-bucket' -Region 'us-gov-west-1').Key
S3ConsoleFolder/

Because it could prove helpful, let’s say we also manually created a nested folder inside of our S3ConsoleFolder called S3CFA, as in the A folder of the S3ConsoleFolder. Consider it done. Here are the new results of the Get-S3Object command. As you’ll see, it returns both the top-level folder and the newly created nested folder.

(Get-S3Object -BucketName 'tst-bucket' -Region 'us-gov-west-1').Key
S3ConsoleFolder/
S3ConsoleFolder/S3CFA/

Now, let’s get one step closer to using Write-S3Object. Its purpose, according to its maker, Amazon, is that it “uploads one or more files from the local file system to an S3 bucket.” That, it does. Below I’ve indicated the top-level folder, the nested folders, and the files that we’ll upload. The complete path to this folder is C:\Users\tommymaynard\Desktop\TestFolder.

TestFolder
|__ FolderA
|	|__ TestFile2.txt
|	|__ TestFile3.txt
|__ FolderB
|	|__ TestFile4.txt
|	|__ TestFile5.txt
|	|__ TestFile6.txt
|	|__ FolderC
|		|__ TestFile7.txt
|__ TestFile1.txt

The below Write-S3Object command’s purpose is to get everything in the above file structure uploaded to AWS S3. Additionally, it creates the folders we need: TestFolder, FolderA, FolderB, and FolderC. I’m using a parameter hash table below to decrease the length of my command, so it’s easier to read. It’s in no way a requirement.

$Params = @{
    BucketName = 'tst-bucket'
    Folder = 'C:\Users\tommymaynard\Desktop\TestFolder'
    KeyPrefix = (Split-Path -Path 'C:\Users\tommymaynard\Desktop\TestFolder' -Leaf).TrimEnd('\')
    Recurse = $true
    Region = 'us-gov-west-1'
    ServerSideEncryption = 'AES256'
}

With the parameter hash table created, we’ll splat it on the Write-S3Object cmdlet. When completed, we’ll run the Get version of the cmdlet again, to see what was done.

Write-S3Object @Params
(Get-S3Object -BucketName 'tst-bucket' -Region 'us-gov-west-1').Key
S3ConsoleFolder/
S3ConsoleFolder/S3CFA/
TestFolder/FolderA/TestFile2.txt
TestFolder/FolderA/TestFile3.txt
TestFolder/FolderB/FolderC/TestFile7.txt
TestFolder/FolderB/TestFile4.txt
TestFolder/FolderB/TestFile5.txt
TestFolder/FolderB/TestFile6.txt
TestFolder/TestFile1.txt

Now, look at the above results and tell me what’s wrong. I’ll wait…

… Can you see it? What don’t we have included in those results, that we did when we created our folders in the AWS Management Console?

Maybe I was asking for too much, but I expected to have my folders returned on their own lines, just like S3ConsoleFolder/ and S3ConsoleFolder/S3CFA/ above. Remember, I’m lying down on my pool deck, scrubbing the pool tiles (it’s winter, yes, but I live in southern Arizona), and I cannot for the life of me wrap my head around why I’m not seeing those folders on. their. own. lines. I expected to see these lines within my results:

TestFolder/
TestFolder/FolderA/
TestFolder/FolderB/
TestFolder/FolderB/FolderC/

Remember, I can create folders in the AWS Management Console and it works perfectly, but not with Write-S3Object. Well, not with Write-S3Object the way I was using it. I finally had an idea worth trying: I needed to use Write-S3Object to create the folders first, and then upload the files into the folders. It’s obnoxious. While that required more calls to Write-S3Object, I was okay with it if it could get me the results I wanted, and ultimately get my users the results I wanted them to have.

So let’s dump my TestFolder from S3 and start over. We’re here again.

(Get-S3Object -BucketName 'tst-bucket' -Region 'us-gov-west-1').Key
S3ConsoleFolder/
S3ConsoleFolder/S3CFA/

We’ll start by creating our parameter hash table. After that, we’ll begin to use the $Params variable and its properties (its keys) to supply parameter values to parameter names; I don’t actually splat the entire hash table in this code section. Although the next three code sections go together, I’ve broken them up so I can better explain them. This first section creates a $Path variable, the aforementioned parameter hash table (partially based on the $Path variable), and an If statement. The If statement works this way: if my S3 bucket doesn’t already include a folder called TestFolder/, create it.

$Path = 'C:\Users\tommymaynard\Desktop\TestFolder'
$Params = @{
    BucketName = 'tst-bucket'
    Folder = (Split-Path -Path $Path -Leaf).TrimEnd('\')
    KeyPrefix = (Split-Path -Path $Path -Leaf).TrimEnd('\')
    Recurse = $true
    Region = 'us-gov-west-1'
    ServerSideEncryption = 'AES256'
}

# Create top-level folder (if necessary).
If ((Get-S3Object -BucketName $Params.BucketName -Region $Params.Region).Key -notcontains "$($Params.Folder)/") {
    Write-S3Object -BucketName $Params.BucketName -Region $Params.Region -Key "$($Params.Folder)/" -Content $Params.Folder -ServerSideEncryption AES256
}

Now, Get-S3Object returns my newly created, top-level folder. It was working so far.

(Get-S3Object -BucketName 'tst-bucket' -Region 'us-gov-west-1').Key
S3ConsoleFolder/
S3ConsoleFolder/S3CFA/
TestFolder/

This second section gets all of the directory names beneath my path. If there are duplicates (and there were), they’re removed by Select-Object‘s Unique parameter. Once I know these, I can start creating my nested folders after cleaning each path up a little: splitting it on the top-level folder name, replacing backslashes with forward slashes, and removing any forward slash from the beginning. For each one, we’ll make sure the cleaned-up key doesn’t include two consecutive forward slashes (that would indicate it’s the top-level folder again [as TestFolder//]), and that it doesn’t already exist.
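To make the string surgery concrete, here’s what happens to one of the nested paths from the example structure above. That cleaned-up value becomes the TestFolder/FolderB/FolderC/ key you’ll see created below.

$Path = 'C:\Users\tommymaynard\Desktop\TestFolder'
$NestedPath = 'C:\Users\tommymaynard\Desktop\TestFolder\FolderB\FolderC'

($NestedPath -split "$(Split-Path -Path $Path -Leaf)")[-1]
\FolderB\FolderC

($NestedPath -split "$(Split-Path -Path $Path -Leaf)")[-1].Replace('\','/').TrimStart('/')
FolderB/FolderC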

# Create nested level folder(s) (if necessary).
$NestedPaths = (Get-ChildItem -Path $Path -Recurse).DirectoryName | Select-Object -Unique
Foreach ($NestedPath in $NestedPaths) {

    $CleanNestedPath = "$(($NestedPath -split "$(Split-Path -Path $Path -Leaf)")[-1].Replace('\','/').TrimStart('/'))"

    If (("$($Params.Folder)/$CleanNestedPath/" -notmatch '//') -and ((Get-S3Object -BucketName $Params.BucketName -Region $Params.Region).Key -notcontains "$($Params.Folder)/$CleanNestedPath/")) {
        Write-S3Object -BucketName $Params.BucketName -Region $Params.Region -Key "$($Params.Folder)/$CleanNestedPath/" -Content $CleanNestedPath -ServerSideEncryption AES256
    }
}

And, now Get-S3Object returns my nested folders, too. Still working.

(Get-S3Object -BucketName 'tst-bucket' -Region 'us-gov-west-1').Key
S3ConsoleFolder/
S3ConsoleFolder/S3CFA/
TestFolder/
TestFolder/FolderA/
TestFolder/FolderB/
TestFolder/FolderB/FolderC/

This last section only serves to upload the files from the EC2 instance to the S3 bucket, into the folders we’ve created. Unlike the folder checks in the last two sections, this doesn’t check whether files already exist; it’ll happily write right over them without warning. I didn’t need that protection, so I didn’t include it.

Write-S3Object -BucketName $Params.BucketName -Region $Params.Region -Folder $Path -KeyPrefix "$($Params.Folder)/" -ServerSideEncryption AES256 -Recurse

Now that my folders are created and the files are uploaded, I get the results I expect. I can see all the folders on their own lines, as well as that of the files.

(Get-S3Object -BucketName 'tst-bucket' -Region 'us-gov-west-1').Key
S3ConsoleFolder/
S3ConsoleFolder/S3CFA/
TestFolder/
TestFolder/FolderA/
TestFolder/FolderA/TestFile2.txt
TestFolder/FolderA/TestFile3.txt
TestFolder/FolderB/
TestFolder/FolderB/FolderC/
TestFolder/FolderB/FolderC/TestFile7.txt
TestFolder/FolderB/TestFile4.txt
TestFolder/FolderB/TestFile5.txt
TestFolder/FolderB/TestFile6.txt
TestFolder/TestFile1.txt

I do hope I didn’t overlook an easier way to do this, but as history has proved, it’s quite possible. Here’s to hoping this can help someone else. I felt pretty lost and confused until I figured it out. It seems to me that AWS needs to iron this one out for us. No matter how it’s used, Write-S3Object should create “folders” in such a way that they are consistently returned by Get-S3Object. That’s whether they’re created before files are uploaded (my fix), or simply created as files are uploaded.

And, cue the person to tell me the easier way.

Determine if AWS EC2 Instance is in Test or Prod Account

It recently became apparent that I need a way to determine whether I’m on an AWS TST (test) EC2 instance or a PRD (production) EC2 instance. The reason this is necessary is so that a function which uploads a file or folder to an S3 bucket can build the portion of the bucket name that includes the environment, and build it correctly. Therefore, I needed the function to be able to determine where it was running: TST or PRD. Had I known I would need this information earlier, I would’ve had the CloudFormation template write it to the Windows Registry or a flat file, so I could pick it up when needed. Had I considered an option such as this, I wouldn’t be racking my brain now trying to gather information I never left anywhere.
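For what it’s worth, had I left myself that breadcrumb, the UserData side might have looked something like the sketch below. The DEPT subkey mirrors the one I use elsewhere, and the value would come from a CloudFormation parameter; it’s not what’s actually in my template.

# Hypothetical: stamp the environment onto the instance at build time.
If (-Not(Test-Path -Path 'HKLM:\SOFTWARE\DEPT')) {
    [System.Void](New-Item -Path 'HKLM:\SOFTWARE\' -Name 'DEPT')
}
[System.Void](New-ItemProperty -Path 'HKLM:\SOFTWARE\DEPT' -Name 'Environment' -Value 'tst')

# Later, on the instance, reading it back is a one-liner.
$Env = (Get-ItemProperty -Path 'HKLM:\SOFTWARE\DEPT').Environment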

The computer names are the same in both environments (in both AWS accounts), so any comparison there doesn’t work. The TST servers and the PRD servers have the same name. After some thought, I came up with my three options:

1. Return the ARN from the metadata, and use that to determine whether I am on a TST or PRD EC2 instance, by extracting the AWS account number from the ARN and comparing it to two known values. This would allow me to determine which account I’m in—TST or PRD.

2. Look at the folders in C:\support\Logs. My functions, which are invoked inside the UserData section of a CloudFormation template, include logging that creates folders in this location named “TST” on a test EC2 instance and “PRD” on a production EC2 instance.

3. Read in the UserScript.ps1 file (the UserData section in the CloudFormation template), and compare the number of the ‘tst’ and ‘prd’ strings in the file’s contents. Based on my UserData section, if there are more ‘tst’ strings, I’m on a TST EC2 instance, and if there are more ‘prd’ strings, I’m on a PRD EC2 instance.

Seriously Tommy, leave yourself some information on the system somewhere already. You never know when you might need it.

I decided I would use two of the three above options, providing myself a fallback if it were ever necessary. Who knows, I may not retire from my place of employment, and I’d like my code to continue to work as long as possible, even if I’m not around to fix it. I was worried about hard-coding those AWS account numbers, but less so once I added a check against my UserScript.ps1 file (again, the UserData section of a CloudFormation document) for the count of ‘tst’ vs. ‘prd’ strings. Maybe it’ll never be used, but if needed, it’ll be there for me.

Here’s my first check to determine whether my code is running on a TST or PRD EC2 instance. Feel free to look past my Write-Verbose statements; I used these for logging. For a bit of context, I’m creating the name of an S3 bucket, such as <projectname>-<environment>. Right now, we’re after the <environment> portion, which I’m out to store in $Env. The <projectname> portion comes from parsing the EC2 instance’s assigned hostname.

#region Determine S3 Bucket using ARN and Account Number.
Write-Verbose -Message "$BlockLocation Determing the S3 Bucket Name using the ARN and account number."
$WebRequestResults = (Invoke-WebRequest -Uri 'http://169.254.169.254/latest/meta-data/iam/info' -Verbose:$false).Content
$InstanceProfileArn = (ConvertFrom-Json -InputObject $WebRequestResults).InstanceProfileArn
$AccountNumber = ($InstanceProfileArn.Split(':'))[4]
Write-Verbose -Message "$BlockLocation ARN: $InstanceProfileArn."
Write-Verbose -Message "$BlockLocation Account Number: $AccountNumber."`

If ($AccountNumber -eq '615613892375') {
    $Env = 'tst'
} ElseIf ($AccountNumber -eq '368125857028') {
    $Env = 'prd'
} Else {
    $Env = 'unknown'
    Write-Verbose -Message "$BlockLocation Unable to determine S3 Bucket Name using the ARN and account number."
} # End If.
#endregion

The following region’s code is wrapped in an If statement that will only fire if the $Env variable is equal to the string “unknown”. Notice that in the above code, $Env will be set to this value if the account number matches neither of my hard-coded values. No, those aren’t real AWS account numbers—well, not mine at least, and yes, they could’ve come in via function parameters.

#region If necessary, determine using 'tst' vs. 'prd' in UserScript.ps1.
If ($Env -eq 'unknown') {
    Write-Verbose -Message "$BlockLocation Determing S3 Bucket Name by comparing specific strings in the UserScript.ps1 file."
    $FileContent = Get-Content -Path "$env:ProgramFiles\Amazon\EC2ConfigService\Scripts\UserScript.ps1"
    $MatchesTst = Select-String -InputObject $FileContent -Pattern 'tst' -AllMatches
    $MatchesPrd = Select-String -InputObject $FileContent -Pattern 'prd' -AllMatches
    Write-Verbose -Message "$BlockLocation Comparing ."

    If ($MatchesTst.Matches.Count -gt $MatchesPrd.Matches.Count) {
        $Env = 'tst'
    } ElseIf ($MatchesTst.Matches.Count -lt $MatchesPrd.Matches.Count)  {
        $Env = 'prd'
    } Else {
        Write-Warning -Message "Unable to determine account number by comparing specific strings in the UserScript.ps1 file."
        Write-Verbose -Message "$BlockLocation Unable to determine account number by comparing specific strings in the UserScript.ps1 file."
    }
} # End If.
#endregion

If the above code fires, it will read in the contents of my UserScript.ps1 file. Again, this script file contains exactly what’s in the UserData section of my instance’s CloudFormation template. Once we have the file’s contents in a variable, we’ll scan it two times. On the first check, we’ll record the matches for the string ‘tst’. On the second check, we’ll record the matches for the string ‘prd’. The way I’ve written my UserData section, if there are more ‘tst’ strings, I’m on a TST EC2 instance, and if there are more ‘prd’ strings, I’m on a PRD EC2 instance. There’s a nested If statement that does this comparison and tells me which it is by assigning the proper value to that $Env variable.
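For completeness, here’s roughly how $Env feeds the bucket name once it’s known. The hostname parsing below is a simplified, hypothetical version of what I described earlier; my real code differs.

# Hypothetical hostname format: <projectname><number>, e.g. "widget01".
$ProjectName = ($env:COMPUTERNAME -replace '\d+$').ToLower()
$BucketName = "$ProjectName-$Env"
$BucketName
widget-tst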

So, after all that, I think I’ll just try to predetermine what information might be needed in the future and drop it into a flat file or the registry… just in case it’s ever needed. This task was obnoxious but worthy of sharing.

AWS UserData Multiple Run Framework Part III

Update: There’s a fourth version of this article series. See the link at the bottom of this post.

At the bottom of the below post (Part II) was a function that created code for what I called the AWS UserData Multiple Run Framework. The code produced by this function can be added to the AWS CloudFormation UserData section, allowing for Windows EC2 instance configuration between a controlled number of restarts. You know… configure, restart, configure, restart, configure, restart, etc. Without this in place, or some other configuration tool, you only get to use the UserData section once, and that’s when an EC2 instance launches for the first time.

That version, however, required the use of text files at the root of the C:\ drive. These were used so that the UserData code would know which section of the code to run after each restart. I always worried that someone wouldn’t know the importance of the text files at the root of the C:\ drive and remove them. Therefore, I wrote a newer version. I believe I previously mentioned that as a possibility, in one of the two other related posts.

AWS UserData Multiple Run Framework Part II

This version—the newest version—uses the Windows Registry to determine which code to run next. There are no more scattered files at the root of the C:\ drive; instead, everything is better protected, and hidden, in the Windows Registry. Let’s briefly discuss each section first, and then use the function at the bottom of this post. That function creates the code you would add to the AWS CloudFormation UserData section; it isn’t itself the code you’d add there. It’s still PowerShell creating PowerShell. For reference, think of the Microsoft function New-IseSnippet of old, or the New-ModuleManifest cmdlet: both are PowerShell that creates PowerShell.

Function Set-SystemForNextRun {
    Param (
        [string]$Pass,
        [switch]$UserData,
        [switch]$Restart
    )
    If ($Pass) {
        [System.Void](New-ItemProperty -Path 'HKLM:\SOFTWARE\DEPT' -Name "Pass$Pass" -Value 'Complete')
    }
    If ($UserData) {
        $Path = "$env:ProgramFiles\Amazon\Ec2ConfigService\Settings\config.xml"
        [xml]$ConfigXml = Get-Content -Path $Path
        ($ConfigXml.Ec2ConfigurationSettings.Plugins.Plugin |
            Where-Object -Property Name -eq 'Ec2HandleUserData').State = 'Enabled'
        $ConfigXml.Save($Path)
    }
    If ($Restart) {
        Restart-Computer -Force
    }
}

The Set-SystemForNextRun function—the first part of the code that’s created—has some minor changes. First, the PassFile parameter is now just Pass. This was changed because, well, we’re not using files anymore. Second, the code inside the If ($Pass) section no longer creates text files; instead, it creates registry values. Remember, because this code runs inside the AWS CloudFormation UserData section, this function is only written to memory before it’s ever used. The two upcoming sections are what actually run first, or rather, what do things we can see.

# Check if Registry Subkey does not exist.
If (-Not(Get-Item -Path 'HKLM:\SOFTWARE\DEPT' -ErrorAction SilentlyContinue)) {
    # Create Registry Subkey.
    [System.Void](New-Item -Path 'HKLM:\SOFTWARE\' -Name 'DEPT')
}

The above code is the first to run in UserData since, again, our Set-SystemForNextRun function has only been written to memory and has yet to be invoked. Its purpose is to create a Windows Registry subkey named DEPT if it doesn’t already exist. You can change DEPT to whatever makes the most sense for your use. This is the location where we’ll store the values that indicate which section of the upcoming If-ElseIf (ElseIf, ElseIf, etc.) statement to run.

If (-Not((Get-ItemProperty -Path 'HKLM:\SOFTWARE\DEPT').Pass1 -eq 'Complete')) {

    # Place code here (1).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -Pass '1' -UserData -Restart

} ElseIf (-Not((Get-ItemProperty -Path 'HKLM:\SOFTWARE\DEPT').Pass2 -eq 'Complete')) {

    # Place code here (2).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -Pass '2'

}

This is the meat of what’s produced by the upcoming function: the If-ElseIf statement. On the first run of the UserData, the If statement will fire. This is because we won’t yet have a value called Pass1 set to the string “Complete” in the HKLM:\SOFTWARE\DEPT subkey. After its code is done, it’ll invoke the Set-SystemForNextRun function, which will add the Pass1 value to the Registry, set UserData to enabled, and restart the instance. On the next run, it’ll execute the code in the first (and only) ElseIf section, because the value Pass1 will exist but the value Pass2 will not. Pass2 gets created after the code in that section is complete.
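If you ever need to see where an instance is in the sequence, the registry values tell you. After the first pass completes, for instance, the subkey would look roughly like this (a sketch; the DEPT name matches the examples above).

Get-ItemProperty -Path 'HKLM:\SOFTWARE\DEPT' | Select-Object -Property Pass*

Pass1
-----
Complete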

That’s it. Hopefully, this can be helpful for more than just me and those around me at work.

Function New-AWSMultiRunTemplate {
    [CmdletBinding()]
    Param (
        [Parameter()]
        [ValidateRange(1,10)]
        [int]$CodeSectionCount = 2,

        [Parameter()]
        [ValidateSet('All','AllButLast')]
        [string]$EnableUserData = 'AllButLast',

        [Parameter()]
        [ValidateSet('All','AllButLast')]
        [string]$EnableRestart = 'AllButLast'
    )

    DynamicParam {
        # Create dynamic, Log parameter.
        If ($PSBoundParameters['Verbose']) {
            $SingleAttribute = New-Object System.Management.Automation.ParameterAttribute
            $SingleAttribute.Position = 1

            $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
            $AttributeCollection.Add($SingleAttribute)

            $LogParameter = New-Object System.Management.Automation.RuntimeDefinedParameter('Log',[switch],$AttributeCollection)
            $LogParameter.Value = $true
 
            $ParamDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
            $ParamDictionary.Add('Log',$LogParameter)
            return $ParamDictionary
        } # End If.
    } # End DynamicParam.

    Begin {
        #region Create logs directory and Write-Verbose function.
        If ($PSBoundParameters['Verbose'] -and $PSBoundParameters['Log']) {
            $LogDirectory = "$($MyInvocation.MyCommand.Name)"
            $LogPath = "$env:SystemDrive\support\Logs\$LogDirectory"
            If (-Not(Test-Path -Path $LogPath)) {
                [System.Void](New-Item -Path $LogPath -ItemType Directory)
            }
            $LogFilePath = "$LogPath\$(Get-Date -Format 'DyyyyMMddThhmmsstt').txt"

            Function Write-Verbose {
                Param ($Message)
                Microsoft.PowerShell.Utility\Write-Verbose -Message $Message
                Microsoft.PowerShell.Utility\Write-Verbose -Message "[$(Get-Date -Format G)]: $Message" 4>> $LogFilePath
            }
        }

        # Set Write-Verbose block location.
        $BlockLocation = '[BEGIN  ]'
        Write-Verbose -Message "$BlockLocation Entering the Begin block [Function: $($MyInvocation.MyCommand.Name)]."
        #endregion

        Write-Verbose -Message "$BlockLocation Storing the template's function to memory."
        $TemplateFunction = @"
Function Set-SystemForNextRun {
    Param (
        [string]`$Pass,
        [switch]`$UserData,
        [switch]`$Restart
    )
    If (`$Pass) {
        [System.Void](New-ItemProperty -Path 'HKLM:\SOFTWARE\DEPT' -Name "Pass`$Pass" -Value 'Complete')
    }
    If (`$UserData) {
        `$Path = "`$env:ProgramFiles\Amazon\Ec2ConfigService\Settings\config.xml"
        [xml]`$ConfigXml = Get-Content -Path `$Path
        (`$ConfigXml.Ec2ConfigurationSettings.Plugins.Plugin |
            Where-Object -Property Name -eq 'Ec2HandleUserData').State = 'Enabled'
        `$ConfigXml.Save(`$Path)
    }
    If (`$Restart) {
        Restart-Computer -Force
    }
}

# Check if Registry Subkey does not exist.
If (-Not(Get-Item -Path 'HKLM:\SOFTWARE\DEPT' -ErrorAction SilentlyContinue)) {
    # Create Registry Subkey.
    [System.Void](New-Item -Path 'HKLM:\SOFTWARE\' -Name 'DEPT')
}


"@
    } # End Begin.

    Process {
        #region Set Write-Verbose block location.
        $BlockLocation = '[PROCESS]'
        Write-Verbose -Message "$BlockLocation Entering the Process block [Function: $($MyInvocation.MyCommand.Name)]."
        #endregion

        Write-Verbose -Message "$BlockLocation Beginning to create the If-ElseIf code for the template."
        1..$CodeSectionCount | ForEach-Object {
            If ($_ -eq 1) {
                $Start = 'If'
            } Else {
                $Start = 'ElseIf'
            }

            If ($EnableUserData -eq 'All') {
                $UserData = '-UserData '
            } ElseIf ($_ -eq $CodeSectionCount) {
                $UserData = $null
            } Else {
                $UserData = '-UserData '
            }

            If ($EnableRestart -eq 'All') {
                $Restart = '-Restart'
            } ElseIf ($_ -eq $CodeSectionCount) {
                $Restart = $null
            } Else {
                $Restart = '-Restart'
            }

            $TemplateIfElseIf += @"
$Start (-Not((Get-ItemProperty -Path 'HKLM:\SOFTWARE\DEPT').Pass$_ -eq 'Complete')) {
    
    # Place code here ($_).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -Pass '$_' $UserData$Restart

} $End
"@
        } # End ForEach-Object.
    } # End Process.

    End {
        #region Set Write-Verbose block location.
        $BlockLocation = '[END    ]'
        Write-Verbose -Message "$BlockLocation Entering the End block [Function: $($MyInvocation.MyCommand.Name)]."
        #endregion

        Write-Verbose -Message "$BlockLocation Displaying the DEPT AWS mulitple run code with $CodeSectionCount code section(s)."
        "$TemplateFunction$TemplateIfElseIf"
    } # End End.
} # End Function: New-AWSMultiRunTemplate.

Read IV

AWS EC2 Instance CSV File: Let’s Have it, Amazon

A while back I found something that AWS began offering that made the below paragraph meaningless. I’ll be back with more soon. I know I said that before, but I’ll try extra hard this time…

I’ve wanted it for some time now… an always up-to-date CSV file that I can programmatically download—using PowerShell, duh—directly from AWS. But not just any CSV. I need a continually updated CSV that includes all the possible information on every AWS EC2 instance type. It can even include the previous generation instances, provided there’s a column that indicates whether an instance type is current or not. It would probably contain information I’ll never need, or want, and I’d still want the file, as I can easily filter against the contents of the document. I’d love to write the post that shows how to do that, if we can get this file to fruition.
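If the file existed, the PowerShell side would be short work; something like the below, where the URL and the column names are entirely made up for the sake of the example.

# Hypothetical URL and columns; this is the post I'd love to be able to write.
Invoke-WebRequest -Uri 'https://example.amazonaws.com/ec2-instance-types.csv' -OutFile "$env:TEMP\ec2.csv"
Import-Csv -Path "$env:TEMP\ec2.csv" |
    Where-Object {$_.CurrentGeneration -eq 'True' -and [int]$_.vCPU -ge 8} |
    Select-Object -Property InstanceType,vCPU,MemoryGiB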

 

AWS UserData Multiple Run Framework Part II

My first post on the topic of using AWS UserData (via CloudFormation and PowerShell) was posted recently (http://tommymaynard.com/aws-userdata-multiple-run-framework-2017). While it didn’t light up Twitter and Facebook, where it was shared, there was still some value in it for me, and for at least one other person. Therefore, I’ve decided to do part two. This is one of those using-PowerShell-to-create-PowerShell posts. I just love these.

A quick recap of the first post: I use the below PowerShell function in the UserData section of all my CloudFormation documents (when building Windows systems). Additionally, I include an If-ElseIf language construct to do specific actions against an EC2 instance as it’s coming online. The best part here is that I can do instance restarts between configuration steps. While I’ll include the full, assembled example from Part I below, it might be best to read that first post (link above).

Function Set-SystemForNextRun {
    Param (
        [string]$PassFile,
        [switch]$UserData,
        [switch]$Restart
    )
    If ($PassFile) {
        [System.Void](New-Item -Path "$env:SystemDrive\passfile$PassFile.txt" -ItemType File)
    }
    If ($UserData) {
        $Path = "$env:ProgramFiles\Amazon\Ec2ConfigService\Settings\config.xml"
        [xml]$ConfigXml = Get-Content -Path $Path
        ($ConfigXml.Ec2ConfigurationSettings.Plugins.Plugin |
            Where-Object -Property Name -eq 'Ec2HandleUserData').State = 'Enabled'
        $ConfigXml.Save($Path)
    }
    If ($Restart) {
        Restart-Computer -Force
    }
}

If (-Not(Test-Path -Path "$env:SystemDrive\passfile1.txt")) {
 
    # Place code here (1).
 
    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '1' -UserData -Restart
 
} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile2.txt")) {
 
    # Place code here (2).
 
    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '2' -UserData -Restart
 
} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile3.txt")) {
 
    # Place code here (3).
 
    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '3' -UserData -Restart
 
} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile4.txt")) {
 
    # Place code here (4).
 
    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '4'
 
}

Here’s the thing: the above example is going to restart my computer three times, once after the If portion, once after the first ElseIf, and once after the second ElseIf portion. These three restarts effectively give me four areas in which to configure an EC2 instance in sequence, with restarts in between.

There’s no way for me to predict how many code runs you or I are going to need with each new project, whether or not we’ll want to enable UserData after the final code run, or even if we’ll need a final, final restart. Therefore, I’ve written a PowerShell function to help with this process. Why manually edit this If-ElseIf statement if you don’t have to, right? With this newer function, you can decide how many code runs you need and whether you need to enable UserData or issue a restart after your final code section executes. Here are a few examples, with the full function, New-AWSMultiRunTemplate, at the end of today’s post.

In this first example, we’ll just run the function with its default settings. This will create the Set-SystemForNextRun function and an If-ElseIf statement (with two places to configure the instance and a single restart in between them). If this is what you needed, you’d take the output of the New-AWSMultiRunTemplate invocation and paste it into your AWS CloudFormation template’s UserData section, between the opening and closing PowerShell tags: <powershell> and </powershell>.

New-AWSMultiRunTemplate
Function Set-SystemForNextRun {
    Param (
        [string]$PassFile,
        [switch]$UserData,
        [switch]$Restart
    )
    If ($PassFile) {
        [System.Void](New-Item -Path "$env:SystemDrive\passfile$PassFile.txt" -ItemType File)
    }
    If ($UserData) {
        $Path = "$env:ProgramFiles\Amazon\Ec2ConfigService\Settings\config.xml"
        [xml]$ConfigXml = Get-Content -Path $Path
        ($ConfigXml.Ec2ConfigurationSettings.Plugins.Plugin |
            Where-Object -Property Name -eq 'Ec2HandleUserData').State = 'Enabled'
        $ConfigXml.Save($Path)
    }
    If ($Restart) {
        Restart-Computer -Force
    }
}

If (-Not(Test-Path -Path "$env:SystemDrive\passfile1.txt")) {

    # Place code here (1).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '1' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile2.txt")) {

    # Place code here (2).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '2'

}

This next run will create a single If statement. I suppose if you’re only going to do a single configuration pass, you don’t need this function at all. Still, as I said, I use it for all my CloudFormation templates. Say you edit your CloudFormation document later and add more configuration and restarts; you’ll already have most of what you need included in the document.

New-AWSMultiRunTemplate -CodeSectionCount 1
Function Set-SystemForNextRun {
    Param (
        [string]$PassFile,
        [switch]$UserData,
        [switch]$Restart
    )
    If ($PassFile) {
        [System.Void](New-Item -Path "$env:SystemDrive\passfile$PassFile.txt" -ItemType File)
    }
    If ($UserData) {
        $Path = "$env:ProgramFiles\Amazon\Ec2ConfigService\Settings\config.xml"
        [xml]$ConfigXml = Get-Content -Path $Path
        ($ConfigXml.Ec2ConfigurationSettings.Plugins.Plugin |
            Where-Object -Property Name -eq 'Ec2HandleUserData').State = 'Enabled'
        $ConfigXml.Save($Path)
    }
    If ($Restart) {
        Restart-Computer -Force
    }
}

If (-Not(Test-Path -Path "$env:SystemDrive\passfile1.txt")) {

    # Place code here (1).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '1'

}

Now, let’s get crazy! We’re going to add a bunch of ElseIf statements in this example. Additionally, we’re going to highlight a couple of other features. There are parameters to use if you wish to include a final UserData reset and a final restart. Take a look at the below parameters used with the New-AWSMultiRunTemplate.

New-AWSMultiRunTemplate -CodeSectionCount 6 -EnableUserData All -EnableRestart All
Function Set-SystemForNextRun {
    Param (
        [string]$PassFile,
        [switch]$UserData,
        [switch]$Restart
    )
    If ($PassFile) {
        [System.Void](New-Item -Path "$env:SystemDrive\passfile$PassFile.txt" -ItemType File)
    }
    If ($UserData) {
        $Path = "$env:ProgramFiles\Amazon\Ec2ConfigService\Settings\config.xml"
        [xml]$ConfigXml = Get-Content -Path $Path
        ($ConfigXml.Ec2ConfigurationSettings.Plugins.Plugin |
            Where-Object -Property Name -eq 'Ec2HandleUserData').State = 'Enabled'
        $ConfigXml.Save($Path)
    }
    If ($Restart) {
        Restart-Computer -Force
    }
}

If (-Not(Test-Path -Path "$env:SystemDrive\passfile1.txt")) {

    # Place code here (1).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '1' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile2.txt")) {

    # Place code here (2).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '2' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile3.txt")) {

    # Place code here (3).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '3' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile4.txt")) {

    # Place code here (4).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '4' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile5.txt")) {

    # Place code here (5).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '5' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile6.txt")) {

    # Place code here (6).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '6' -UserData -Restart

}

Notice that in the final ElseIf above, we have the UserData and Restart parameters. These were added by including the EnableUserData and EnableRestart parameters with All as their value. By default, they’re both set to AllButLast. And that’s it! Here’s the New-AWSMultiRunTemplate function, in case it’s something you might find helpful! Enjoy!!

Function New-AWSMultiRunTemplate {
    [CmdletBinding()]
    Param (
        [Parameter()]
        [ValidateRange(1,10)]
        [int]$CodeSectionCount = 2,

        [Parameter()]
        [ValidateSet('All','AllButLast')]
        [string]$EnableUserData = 'AllButLast',

        [Parameter()]
        [ValidateSet('All','AllButLast')]
        [string]$EnableRestart = 'AllButLast'
    )

    DynamicParam {
        # Create dynamic, Log parameter.
        If ($PSBoundParameters['Verbose']) {
            $SingleAttribute = New-Object System.Management.Automation.ParameterAttribute
            $SingleAttribute.Position = 1

            $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
            $AttributeCollection.Add($SingleAttribute)

            $LogParameter = New-Object System.Management.Automation.RuntimeDefinedParameter('Log',[switch],$AttributeCollection)
            $LogParameter.Value = $true
 
            $ParamDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
            $ParamDictionary.Add('Log',$LogParameter)
            return $ParamDictionary
        } # End If.
    } # End DynamicParam.

    Begin {
        #region Create logs directory and Write-Verbose function.
        If ($PSBoundParameters['Verbose'] -and $PSBoundParameters['Log']) {
            $LogDirectory = "$($MyInvocation.MyCommand.Name)"
            $LogPath = "$env:SystemDrive\support\Logs\$LogDirectory"
            If (-Not(Test-Path -Path $LogPath)) {
                [System.Void](New-Item -Path $LogPath -ItemType Directory)
            }
            $LogFilePath = "$LogPath\$(Get-Date -Format 'DyyyyMMddThhmmsstt').txt"

            Function Write-Verbose {
                Param ($Message)
                Microsoft.PowerShell.Utility\Write-Verbose -Message $Message
                Microsoft.PowerShell.Utility\Write-Verbose -Message "[$(Get-Date -Format G)]: $Message" 4&amp;gt;&amp;gt; 
$LogFilePath
            }
        }

        # Set Write-Verbose block location.
        $BlockLocation = '[BEGIN  ]'
        Write-Verbose -Message "$BlockLocation Entering the Begin block [Function: $($MyInvocation.MyCommand.Name)]."
        #endregion

        Write-Verbose -Message "$BlockLocation Storing the template's function to memory."
        $TemplateFunction = @"
Function Set-SystemForNextRun {
    Param (
        [string]`$PassFile,
        [switch]`$UserData,
        [switch]`$Restart
    )
    If (`$PassFile) {
        [System.Void](New-Item -Path "`$env:SystemDrive\passfile`$PassFile.txt" -ItemType File)
    }
    If (`$UserData) {
        `$Path = "`$env:ProgramFiles\Amazon\Ec2ConfigService\Settings\config.xml"
        [xml]`$ConfigXml = Get-Content -Path `$Path
        (`$ConfigXml.Ec2ConfigurationSettings.Plugins.Plugin |
            Where-Object -Property Name -eq 'Ec2HandleUserData').State = 'Enabled'
        `$ConfigXml.Save(`$Path)
    }
    If (`$Restart) {
        Restart-Computer -Force
    }
}


"@
    } # End Begin.

    Process {
        #region Set Write-Verbose block location.
        $BlockLocation = '[PROCESS]'
        Write-Verbose -Message "$BlockLocation Entering the Process block [Function: $($MyInvocation.MyCommand.Name)]."
        #endregion

        Write-Verbose -Message "$BlockLocation Beginning to create the If-ElseIf code for the template."
        1..$CodeSectionCount | ForEach-Object {
            If ($_ -eq 1) {
                $Start = 'If'
            } Else {
                $Start = 'ElseIf'
            }

            If ($EnableUserData -eq 'All') {
                $UserData = '-UserData '
            } ElseIf ($_ -eq $CodeSectionCount) {
                $UserData = $null
            } Else {
                $UserData = '-UserData '
            }

            If ($EnableRestart -eq 'All') {
                $Restart = '-Restart'
            } ElseIf ($_ -eq $CodeSectionCount) {
                $Restart = $null
            } Else {
                $Restart = '-Restart'
            }

            $TemplateIfElseIf += @"
$Start (-Not(Test-Path -Path "`$env:SystemDrive\passfile$_.txt")) {
    
    # Place code here ($_).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '$_' $UserData$Restart

} $End
"@
        } # End ForEach-Object.
    } # End Process.

    End {
        #region Set Write-Verbose block location.
        $BlockLocation = '[END    ]'
        Write-Verbose -Message "$BlockLocation Entering the End block [Function: $($MyInvocation.MyCommand.Name)]."
        #endregion

        Write-Verbose -Message "$BlockLocation Displaying the AWS mulitple run code with $CodeSectionCount code 
section(s)."
        "$TemplateFunction$TemplateIfElseIf"
    } # End End.
} # End New-AWSMultiRunTemplate function.

Read III

AWS UserData Multiple Run Framework

In AWS, we can utilize the UserData section in EC2 to run PowerShell against our EC2 instances at launch. I’ve said it before; I love this option. As someone who speaks PowerShell with what likely amounts to first-language fluency, there’s so much I do to automate my machine builds with CloudFormation, UserData, and PowerShell.

I’ve begun to have a need to do various pieces of automation at different times. That is to say, I need multiple instance restarts, as an instance is coming online, in order to separate different pieces of configuration and installation. You’ll figure out when you need that, too. And, when you do, you can use what I’ve dubbed the “multiple run framework for AWS.” But really, call it what you want. That hardly matters.

We have to remember that, by default, UserData only runs once: when the EC2 instance launches for the first time. In the below example, we’re going to do three restarts and four separate code runs.

Our UserData section first needs to add a function to memory. I’ve called it Set-SystemForNextRun, and its purpose is to (1) create what I call a “passfile” to help indicate where we are in the automation process, (2) enable UserData to run the next time the service is restarted (this happens at instance restart, obviously), and (3) restart the EC2 instance. Let’s have a look. It’s three parameters and three If statements; simple stuff.

Function Set-SystemForNextRun {
    Param (
        [string]$PassFile,
        [switch]$UserData,
        [switch]$Restart
    )
    If ($PassFile) {
        [System.Void](New-Item -Path "$env:SystemDrive\passfile$PassFile.txt" -ItemType File)
    }
    If ($UserData) {
        $Path = "$env:ProgramFiles\Amazon\Ec2ConfigService\Settings\config.xml"
        [xml]$ConfigXml = Get-Content -Path $Path
        ($ConfigXml.Ec2ConfigurationSettings.Plugins.Plugin |
            Where-Object -Property Name -eq 'Ec2HandleUserData').State = 'Enabled'
        $ConfigXml.Save($Path)
    }
    If ($Restart) {
        Restart-Computer -Force
    }
}

The above function accepts three parameters: PassFile, UserData, and Restart. PassFile accepts a string value. You’ll see how this works in the upcoming If-ElseIf example. UserData and Restart are switch parameters. If they’re included when the function is invoked, they’re True ($true), and if they’re not included, they’re False ($false).

Each of the three parameters has its own If statement within the Set-SystemForNextRun function. If PassFile is included, it creates a text file called C:\passfile<ValuePassedIn>.txt. If UserData is included, it resets UserData to enabled (effectively, it checks the check box in the Ec2Config GUI). If Restart is included, it restarts the instance right then and there.

Now let’s take a look at the If-ElseIf statement that completes four code runs and three restarts. We’ll discuss it further below, but before we do, a little reminder: our CloudFormation UserData PowerShell is going to contain the above Set-SystemForNextRun function, plus something like what you’ll see below, after you’ve edited it for your needs.

If (-Not(Test-Path -Path "$env:SystemDrive\passfile1.txt")) {

    # Place code here (1).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '1' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile2.txt")) {

    # Place code here (2).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '2' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile3.txt")) {

    # Place code here (3).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '3' -UserData -Restart

} ElseIf (-Not(Test-Path -Path "$env:SystemDrive\passfile4.txt")) {

    # Place code here (4).

    # Invoke Set-SystemForNextRun function.
    Set-SystemForNextRun -PassFile '4'

}

In line 1, we test whether or not the file C:\passfile1.txt exists. If it doesn’t exist, we run the code in the If portion. This will run whatever PowerShell we add to that section. Then it’ll pass 1 to the Set-SystemForNextRun function to have C:\passfile1.txt created. Additionally, because the UserData and Restart parameters are included, it’ll reset UserData to enabled, and restart the EC2 instance. Because the C:\passfile1.txt file now exists, the next time the UserData runs, it’ll skip the If portion and evaluate the first ElseIf statement.

This ElseIf statement determines whether or not the C:\passfile2.txt file exists. If it doesn’t, and it won’t after the first restart, then the code in this ElseIf will run. When it’s done, it’ll create the passfile2.txt file, reset UserData, and restart the instance. It’ll do this for the second ElseIf (third code run), and the final ElseIf (fourth code run), as well. Notice that the final invocation of the Set-SystemForNextRun function doesn’t enable UserData or restart the instance. Be sure to add those switches if you need either completed after the final ElseIf completes.

And that’s it. At this point in time, I always use my Set-SystemForNextRun function and a properly written If-ElseIf statement to separate the configuration and installation around the necessary number of instance restarts. In closing, keep in mind that deleting those pass files from the root of the C:\ drive is not something you’ll likely want to do. In time, I may do a rewrite that stores entries in the registry instead, so there’s less chance that one of these files is removed by someone.
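
If I ever do, a minimal sketch of that registry idea might look something like the below; the key path and value name are hypothetical placeholders, not anything I’ve settled on.

# A minimal sketch only: track completed runs in the registry instead of pass files.
# HKLM:\SOFTWARE\MultiRunFramework and the "Pass<N>" value names are hypothetical.
Function Set-PassMarker {
    Param (
        [string]$PassMarker
    )
    $Path = 'HKLM:\SOFTWARE\MultiRunFramework'
    If (-Not(Test-Path -Path $Path)) {
        [System.Void](New-Item -Path $Path -Force)
    }
    [System.Void](New-ItemProperty -Path $Path -Name "Pass$PassMarker" -Value 1 -PropertyType DWord -Force)
}

# The Test-Path checks in the If-ElseIf would then become something like this:
# If (-Not((Get-ItemProperty -Path 'HKLM:\SOFTWARE\MultiRunFramework' -ErrorAction SilentlyContinue).Pass1)) {...}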

Either way, I hope this is helpful for someone! If you’re in this space—AWS, CloudFormation, UserData, and PowerShell—then chances are good that at some point you’re going to want to restart an instance, and then continue to configure it.

Read II

Silent Install from an ISO

In the last several weeks, I’ve been having a great time writing PowerShell functions and modules for new projects moving to Amazon Web Services (AWS). I’m thrilled with the inclusion of UserData as a part of provisioning an EC2 instance. Having developed my PowerShell skills, I’ve been able to leverage them in conjunction with UserData to do all sorts of things to my instances. I’m reaching into S3 for installers, expanding archive files, creating folders, bringing down custom-written modules in UserData and invoking their functions there, too. I’m even setting the timezone. It seems so straightforward, sure, but getting automation and logging wrapped around that need is rewarding.
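
To give a quick, hedged example of the kind of UserData task I mean, pulling an installer out of S3 and expanding it might look something like this; the bucket name, key, and local paths are made up for illustration.

# A sketch only; the bucket name, key, and local paths are hypothetical.
# Read-S3Object is from the AWSPowerShell module; Expand-Archive ships with PowerShell 5.0 and later.
[System.Void](New-Item -Path "$env:SystemDrive\Temp" -ItemType Directory -Force)
Read-S3Object -BucketName 'my-installer-bucket' -Key 'installers/app.zip' -File "$env:SystemDrive\Temp\app.zip"
Expand-Archive -Path "$env:SystemDrive\Temp\app.zip" -DestinationPath "$env:SystemDrive\Temp\app" -Force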

As a part of an automated SQL installation — yes, the vendor told me they don’t support AWS RDS — I had a new challenge. It wasn’t overly involved by any means, but it’s worthy of sharing, especially if someone hits this post in a time of need, and gets a problem solved. I’ve said it a million times: I often write so I have a place to put the things I may forget, but truly, it’s about anyone else I can help, as well. I’ve been at that for almost three years now.

Back to Microsoft SQL: It’s on an ISO. I’ve been pulling down Zip files for weeks, in various projects, with CloudFormation, and expanding them, but this was a new one. I needed to get at the files in that ISO to silently run an installation. Enter the Mount-DiskImage function from Microsoft’s Storage module. Its help synopsis says this: “Mounts a previously created disk image (virtual hard disk or ISO), making it appear as a normal disk.” The command to pull just that help information is listed below.

PS > (Get-Help -Name Mount-DiskImage).Synopsis

As I typically do, I started working with the function in order to learn how to use it. It works as described. Here’s the command I used to mount my ISO.

PS > Mount-DiskImage -ImagePath 'C:\Users\tommymaynard\Desktop\SQL2014.ISO'

The above example doesn’t produce any output by default, and I rather like it that way. After a dismount — it’s the same above command with Dismount-DiskImage instead of Mount-DiskImage — I tried it with the -PassThru parameter. This parameter returns an object with some relevant information.

PS > Mount-DiskImage -ImagePath 'C:\Users\tommymaynard\Desktop\SQL2014.ISO' -PassThru

Attached          : False
BlockSize         : 0
DevicePath        :
FileSize          : 2606895104
ImagePath         : C:\Users\tommymaynard\Desktop\SQL2014.ISO
LogicalSectorSize : 2048
Number            :
Size              : 2606895104
StorageType       : 1
PSComputerName    :

The first thing I noticed about this output is that it didn’t provide the drive letter used to mount the ISO. I was going to need that drive letter in PowerShell, in order to move to that location and run the installer. Even if I didn’t move to that location, I needed the drive letter to create a full path. The drive letter was vital, and this is why we’re here today.

Update: See the below post replies where Get-Volume is used to discover the drive letter.
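
In case you don’t read that far, a sketch of that approach might look something like the below, piping the disk image object into Get-Volume and reading its DriveLetter property.

# A sketch of the Get-Volume approach mentioned in the update above.
PS > $IsoPath = 'C:\Users\tommymaynard\Desktop\SQL2014.ISO'
PS > Mount-DiskImage -ImagePath $IsoPath
PS > $DriveLetterUsed = (Get-DiskImage -ImagePath $IsoPath | Get-Volume).DriveLetter
PS > Set-Location -Path "$DriveLetterUsed`:\"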

Although the warmup seemed to take a bit, we’re almost done here for today. I’ll drop the code below, and we’ll do a quick, line-by-line walk-through.

# Mount SQL ISO and run setup.exe.
PS > $DrivesBeforeMount = (Get-PSDrive).Name
PS >
PS > Mount-DiskImage -ImagePath 'C:\Users\tommymaynard\Desktop\SQL2014.ISO'
PS >
PS > $DrivesAfterMount = (Get-PSDrive).Name
PS >
PS > $DriveLetterUsed = (Compare-Object -ReferenceObject $DrivesBeforeMount -DifferenceObject $DrivesAfterMount).InputObject
PS >
PS > Set-Location -Path "$DriveLetterUsed`:\"

Line 2: The first command in this series stores the Name property of all the drives in our current PowerShell session in a variable named $DrivesBeforeMount. That name should offer some clues.

Line 4: This line should look familiar; it mounts our SQL 2014 ISO (to a mystery drive letter).

Line 6: Here, we run the same command as in Line 2, however, we store the results in $DrivesAfterMount. Do you see what we’re up to yet?

Line 8: This command compares our two recently created Drive* variables. We want to know which drive is there now that wasn’t there when the first Get-PSDrive command was run.

Line 10: And finally, now that we know the drive letter used for our newly mounted ISO, we can move there in order to access the setup.exe file.
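
From there, kicking off the silent install itself might look something like the below. The switches are just a minimal example of SQL Server’s unattended setup arguments, not my actual configuration.

PS > # A hypothetical, minimal silent setup invocation; your arguments will vary.
PS > Start-Process -FilePath '.\setup.exe' -ArgumentList '/Q','/ACTION=Install','/IACCEPTSQLSERVERLICENSETERMS' -Wait -NoNewWindow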

Okay, that’s it for tonight. Now back to working on a silent SQL install on my EC2 instance.

Add AccessKey and SecretKey Parameters to AWS Function

I’m spending more and more time with AWS. As a PowerShell enthusiast, I’ve therefore kept myself busy during a few of the week and weekend evenings. And sometimes during the day, too, when it makes sense.

As of today, I’ve written four AWS-specific functions for work. They do everything from comparing my version of the AWSPowerShell module with the one in the PowerShell Gallery, to returning the AWS service noun prefixes (EC2, CFN, etc.), to returning the EC2 instance type counts. And then there’s the newest one.

My most recent function can create an RDP (.rdp) file to connect to an EC2 instance (it gets the EC2 instance’s IP address and places that in an RDP file). As I wrote up an email to share the function with a coworker—the one that mentioned Azure having an option to download RDP files—it occurred to me that none of my functions include an option to provide AccessKey and SecretKey parameters. All of them that need it (two of them now) rely on persistent credentials having been saved.

Well, I’ll be going back through these two functions to add a way to accept AccessKey and SecretKey parameters. First, however, I needed to figure out how. I literally typed “PowerShell add AccessKey and SecretKey to function” into Google (because I was being lazy… and didn’t feel the need to write something that must surely already be out there). Well, there wasn’t anything I could use, so I’ve done the work myself. Now, if anyone else types that, or something like it, into Google or Bing, maybe they’ll get this answer.

Before I considered moving this into an existing function, I needed to make it work. I did just that, using parameter sets. I’ve long known about them, but haven’t really had a need for them in more than a few instances. This, of course, led me to view the parameter sets on an AWS cmdlet. Ugh, there aren’t even parameter sets on something like Get-EC2Instance. Try it yourself: Show-Command -Name Get-EC2Instance. I suspect there’s internal logic in the cmdlet itself to handle whether or not an AccessKey and SecretKey have been provided. Don’t have persistent shell credentials: error*. Only provide an AccessKey: error*. Only provide a SecretKey: error*. Well, I went with parameter sets anyway.

* “No credentials specified or obtained from persisted/shell defaults.”
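
For clarity, by persistent credentials I mean keys saved to a credential profile ahead of time, along the lines of the below; the profile name and Region are just examples.

# Save a credential profile once, so AWS cmdlets don't need keys supplied on every call.
PS > Set-AWSCredential -AccessKey 'aacccceesssskkeeyy' -SecretKey 'ssseeecccrrreeetttkkkeeeyyy' -StoreAs 'default'
PS > Initialize-AWSDefaultConfiguration -ProfileName 'default' -Region 'us-west-2'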

Here’s the complete function used to work through this desire. Take a look and continue reading about it further below.

Function Test-AWSParameterSets {
    [CmdletBinding(DefaultParameterSetName='NoAccessKeys')]
    Param (
        [Parameter(ParameterSetName='NoAccessKeys',Mandatory=$true)]
        [Parameter(ParameterSetName='AccessKeys',Mandatory=$true)]
        [string]$InstanceID,

        [Parameter(ParameterSetName='AccessKeys',Mandatory=$true)]
        [string]$AccessKey,

        [Parameter(ParameterSetName='AccessKeys',Mandatory=$true)]
        [string]$SecretKey
    )

    $PSBoundParameters
    '-----------------------------------------'
    "InstanceID: $InstanceID"
    "AccessKey: $AccessKey"
    "SecretKey: $SecretKey"
    "ParameterSet: $($PSCmdlet.ParameterSetName)"
}

In this first example, we’ll run the function without supplying any parameter names or values. Because the InstanceID parameter is mandatory, and the NoAccessKeys parameter set is the default, the PowerShell engine prompts me to enter a value for InstanceID. After I entered i-1234554321, the function returns the $PSBoundParameters hash table. This includes the parameter names and values that were supplied to the function at run time. Additionally, it returns the values for the InstanceID, AccessKey, and SecretKey parameters, and which of the two parameter sets was used: NoAccessKeys or AccessKeys.

Test-AWSParameterSets
cmdlet Test-AWSParameterSets at command pipeline position 1
Supply values for the following parameters:
InstanceID: i-1234554321

Key        Value       
---        -----       
InstanceID i-1234554321
-----------------------------------------
InstanceID: i-1234554321
AccessKey: 
SecretKey: 
ParameterSet: NoAccessKeys

The next example includes an InstanceID parameter when the function is invoked. Therefore, I’m not prompted to enter any additional information. It produces the same output as it did above.

Test-AWSParameterSets -InstanceID i-1234554321
Key        Value       
---        -----       
InstanceID i-1234554321
-----------------------------------------
InstanceID: i-1234554321
AccessKey: 
SecretKey: 
ParameterSet: NoAccessKeys

The below example includes both an InstanceID parameter and an AccessKey parameter when the function is invoked. The moment the AccessKey parameter name was included, we switched to using the AccessKeys parameter set. Therefore, I was prompted to enter a SecretKey parameter value, as it’s a required parameter in that parameter set.

Test-AWSParameterSets -InstanceID i-1234554321 -AccessKey aacccceesssskkeeyy
cmdlet Test-AWSParameterSets at command pipeline position 1
Supply values for the following parameters:
SecretKey: ssseeecccrrreeetttkkkeeeyyy

Key        Value                      
---        -----                      
InstanceID i-1234554321               
AccessKey  aacccceesssskkeeyy         
SecretKey  ssseeecccrrreeetttkkkeeeyyy
-----------------------------------------
InstanceID: i-1234554321
AccessKey: aacccceesssskkeeyy
SecretKey: ssseeecccrrreeetttkkkeeeyyy
ParameterSet: AccessKeys

This is basically the opposite of the above example. I included the SecretKey parameter and it prompted me to enter the AccessKey parameter—we can’t have one without the other; they’re both mandatory in the AccessKeys parameter set.

Test-AWSParameterSets -InstanceID i-1234554321 -SecretKey ssseeecccrrreeetttkkkeeeyyy
cmdlet Test-AWSParameterSets at command pipeline position 1
Supply values for the following parameters:
AccessKey: aacccceesssskkeeyy

Key        Value                      
---        -----                      
InstanceID i-1234554321               
SecretKey  ssseeecccrrreeetttkkkeeeyyy
AccessKey  aacccceesssskkeeyy         
-----------------------------------------
InstanceID: i-1234554321
AccessKey: aacccceesssskkeeyy
SecretKey: ssseeecccrrreeetttkkkeeeyyy
ParameterSet: AccessKeys

The final example includes values for all the parameter names at the time the function is invoked. If someone wasn’t using a persistent set of credentials, then this is how you might expect an AWS function you’ve written to be used.

Test-AWSParameterSets -InstanceID i-1234554321 -AccessKey aacccceesssskkeeyy -SecretKey ssseeecccrrreeetttkkkeeeyyy
Key        Value                      
---        -----                      
InstanceID i-1234554321               
SecretKey  ssseeecccrrreeetttkkkeeeyyy
AccessKey  aacccceesssskkeeyy         
-----------------------------------------
InstanceID: i-1234554321
AccessKey: aacccceesssskkeeyy
SecretKey: ssseeecccrrreeetttkkkeeeyyy
ParameterSet: AccessKeys

Well, that’s the first part of the challenge. Now to incorporate what I’ve done here into the functions that need it. Maybe I’ll be back with that.
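
As a rough idea of where that’s headed, inside a real function the parameter set name could drive a splatted set of credential parameters, something like this hypothetical sketch.

# A hypothetical sketch: pass AccessKey/SecretKey through to an AWS cmdlet only when they were supplied.
If ($PSCmdlet.ParameterSetName -eq 'AccessKeys') {
    $AwsCredentialParams = @{
        AccessKey = $AccessKey
        SecretKey = $SecretKey
    }
} Else {
    $AwsCredentialParams = @{}
}
# Get-EC2Instance accepts -AccessKey and -SecretKey; splatting adds them only if they exist in the hash table.
$Reservation = Get-EC2Instance -InstanceId $InstanceID @AwsCredentialParams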