Tag Archives: Module

Prestaging Modules for PowerShell, Windows PowerShell

I recently wrote this to a colleague: “Finally, if you’re going to peer into the module manifest file (.psd1), you might also check ‘CompatiblePSEditions.’ If it includes both the ‘Core’ and ‘Desktop’ values, then you might consider installing modules in both $env:USERPROFILE\Documents\WindowsPowerShell\Modules and $env:USERPROFILE\Documents\PowerShell\Modules.”

He’s using SCCM, Software Center, and a Windows PowerShell form to install modules. My thought was to install each module for both PowerShell and Windows PowerShell, regardless of which one the user was running, provided the manifest said the module supported both. That way a module would already be in place if the user switched between PowerShell and Windows PowerShell, or moved from one to the other one day. Not a bad idea, I suppose. If the duplicated module is never used, that’s fine; the disk space would likely never be noticed. Maybe one day, PowerShell will include this option itself: you install a module in PowerShell, include some yet-to-be-determined switch parameter, and boom, it’s ready in both locations.
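To make that idea concrete, here’s a rough sketch, not his actual tooling and not an existing PowerShell feature: install a module for the current user, then mirror it into the other edition’s user module path. The module name is a placeholder, and the paths assume the Documents folder isn’t redirected (to OneDrive, for instance).

# Sketch only: 'SomeModule' is a placeholder; adjust the paths for your environment.
$ModuleName = 'SomeModule'
Install-Module -Name $ModuleName -Scope CurrentUser

# Figure out which edition performed the install, then mirror to the other one.
If ($PSVersionTable.PSEdition -eq 'Core') {
    $Source      = "$env:USERPROFILE\Documents\PowerShell\Modules\$ModuleName"
    $Destination = "$env:USERPROFILE\Documents\WindowsPowerShell\Modules"
} Else {
    $Source      = "$env:USERPROFILE\Documents\WindowsPowerShell\Modules\$ModuleName"
    $Destination = "$env:USERPROFILE\Documents\PowerShell\Modules"
}

If (-not (Test-Path -Path $Destination)) {
    New-Item -Path $Destination -ItemType Directory | Out-Null
}
Copy-Item -Path $Source -Destination $Destination -Recurse -Force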

This got me thinking: how would one write the conditional logic here? While it wasn’t my project, I couldn’t help but work out how I might write it. And when it was done (and this has happened before), I wasn’t sure what to do with it. Enter this blog post; its new home, or final resting place.

I wrote this to check four different modules on my computer. They were: (1) AWS.Tools.Common, (2) AWSLambdaPSCore, (3) ISE, and (4) ExchangeOnlineManagement. Each of these four contained a different value in the psd1’s CompatiblePSEditions. In the same order as above, the values were (1) Core and Desktop, (2) only Core, (3) only Desktop, and (4) neither Core nor Desktop.

Here’s what I wrote; we’ll discuss it further below. It was saved into a file named ModuleDeploy.ps1.

# Both Core AND Desktop.
$Path01 = "$env:USERPROFILE\Documents\PowerShell\Modules\AWS.Tools.Common\4.0.5.0\AWS.Tools.Common.psd1"

# ONLY Core.
$Path02 = "$env:USERPROFILE\Documents\PowerShell\Modules\AWSLambdaPSCore\2.0.0.0\AWSLambdaPSCore.psd1"

# ONLY Desktop.
$Path03 = "$env:SystemRoot\system32\WindowsPowerShell\v1.0\Modules\ISE\ISE.psd1"

# Neither Core NOR Desktop.
$Path04 = "$env:USERPROFILE\Documents\PowerShell\Modules\ExchangeOnlineManagement\1.0.1\ExchangeOnlineManagement.psd1" 

$Paths = $Path01,$Path02,$Path03,$Path04
Foreach ($Path in $Paths) {
	"`r`n||| Module: $(Split-Path -Path $Path -Leaf) |||"
	$Psd1File = Import-PowerShellDataFile -Path $Path

	Switch ($Psd1File.CompatiblePSEditions) {
		{$_ -contains 'Desktop'} {
			Write-Output -InputObject 'Copy to WindowsPowerShell directory.'
		} # End Condition.

		{$_ -contains 'Core'} {
			Write-Output -InputObject 'Copy to PowerShell directory.'
		} # End Condition.

		default {
			If ($PSVersionTable.PSEdition -eq 'Desktop' -or $null -eq $PSVersionTable.PSEdition) {
				'Copy to WindowsPowerShell directory (default).'
			} ElseIf ($PSVersionTable.PSEdition -eq 'Core') {
				'Copy to PowerShell directory (default).'
			} # End If-ElseIf.
		} # End Condition.
	} # End Switch.
} # End Foreach.

Let’s quickly cover what’s happening in the above code, and then we’ll view the results of running it in both PowerShell and Windows PowerShell. Showing both versions should clarify the reason for the If statement inside the final, default Switch condition. But we’re jumping ahead.

First, we created four path variables, each pointing at a specific module manifest (.psd1) file. The first is for a module that includes both “Core” and “Desktop” in CompatiblePSEditions. The second only has “Core,” the third only has “Desktop,” and the final one has neither. This was mentioned earlier. These four variables and their assigned values are combined into an array, which we then iterate over inside a Foreach loop.

During each iteration, we run through a Switch statement that indicates where each module would be copied (if we were actually copying the modules). Based on the value(s) in CompatiblePSEditions for each of the files, the first module would be copied to both the PowerShell and WindowsPowerShell Modules directories, the second only to the PowerShell Modules directory, and the third only to the WindowsPowerShell Modules directory. The final copy depends on whether we’re running PowerShell or Windows PowerShell. Take a look at the results.

PowerShell 7.0.1

[PS7.0.1] C:\> . "$env:TEMP\ModuleDeploy.ps1"

||| Module: AWS.Tools.Common.psd1 |||
Copy to PowerShell directory.
Copy to WindowsPowerShell directory.

||| Module: AWSLambdaPSCore.psd1 |||
Copy to PowerShell directory.

||| Module: ISE.psd1 |||
Copy to WindowsPowerShell directory.

||| Module: ExchangeOnlineManagement.psd1 |||
Copy to PowerShell directory (default).

Windows PowerShell 5.1

[PS5.1.18362.752] C:\> . "$env:TEMP\ModuleDeploy.ps1"

||| Module: AWS.Tools.Common.psd1 |||
Copy to PowerShell directory.
Copy to WindowsPowerShell directory.

||| Module: AWSLambdaPSCore.psd1 |||
Copy to PowerShell directory.

||| Module: ISE.psd1 |||
Copy to WindowsPowerShell directory.

||| Module: ExchangeOnlineManagement.psd1 |||
Copy to WindowsPowerShell directory (default).

That’s it. Just some code that needed a home. I’m fully aware that there are better ways this could be done, and that there would be some copy duplication if this were dropped into production the way it is. Additionally, if a destination directory didn’t yet exist, it would have to be created. Still, it needed a home (as is), and I didn’t think about much more than the conditional logic needed to determine where a module would be deployed.
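If it were doing real work, each of the above “Copy to …” messages might become something like the below, which also creates a missing destination directory first. The staging path and module folder are placeholders for the sketch.

# Sketch only: $StagedModule points at wherever the module was staged (by SCCM, for example).
$StagedModule = "$env:TEMP\Staging\AWS.Tools.Common"
$Destinations = @(
    "$env:USERPROFILE\Documents\PowerShell\Modules"         # PowerShell.
    "$env:USERPROFILE\Documents\WindowsPowerShell\Modules"  # Windows PowerShell.
)

Foreach ($Destination in $Destinations) {
    # Create the destination directory if it doesn't exist yet, then copy the module.
    If (-not (Test-Path -Path $Destination)) {
        New-Item -Path $Destination -ItemType Directory | Out-Null
    }
    Copy-Item -Path $StagedModule -Destination $Destination -Recurse -Force
}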

The ScriptsToProcess and RequiredModules Order

Recently, I wrote a Tweet about the order in which the ScriptsToProcess and RequiredModules module manifest entries are processed.

Even more recently, I wrote a usable fix.

Before we get there, however, let’s make sure my Tweet makes sense. First off, if you don’t know already, be aware that there’s an optional file, called a module manifest, that we can include alongside a script module file (a .psm1). It’s a .psd1 file, and its purpose in life is to define additional information, or metadata, about a script module.

In addition to telling us about the module (the author, description, version, etc.), it can do other things for us. This includes requiring a specific PowerShell host program, requiring a specific version of PowerShell, requiring that specific modules be imported when our module is imported, and running PowerShell scripts before our module is imported. You don’t have to use a module manifest file when you create and use a module, but there’s so much to gain from doing so (and it’s super easy [see New-ModuleManifest]).
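If you’ve never generated a manifest, here’s an illustrative New-ModuleManifest call; the module name, paths, and values are invented for the example.

# Example only: every value here is made up.
$Params = @{
    Path              = '.\MyTools\MyTools.psd1'
    RootModule        = 'MyTools.psm1'
    ModuleVersion     = '1.0.0'
    Author            = 'Tommy Maynard'
    Description       = 'Example module manifest.'
    PowerShellVersion = '5.1'
    RequiredModules   = 'ActiveDirectory'
    ScriptsToProcess  = '.\ScriptsToProcess\Prep.ps1'
    FunctionsToExport = @()
}
New-ModuleManifest @Params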

My problem here is that the RequiredModules section, an entry in our module manifest file, is checked before any of the ScriptsToProcess scripts run, the very scripts that help set up the environment before the module finishes loading. This means I was unable to install a PowerShell module via those scripts before RequiredModules inspected the system for the modules on which our module depends. Too bad. To me, and in this instance at minimum, these two module manifest entries run in the wrong order. Had ScriptsToProcess run first, I would have been able to install a PowerShell module before our module’s dependencies were evaluated.

Getting this to work as desired required a workaround. I thought I’d take a minute and share what I’ve done. One, we have a script module (a .psm1 file) and a module manifest (a .psd1 file). We also have a second .psd1 file. This is key.

The first .psd1 file does not require any modules; it does not have a dependency on the system already having specific modules in place. Here’s that entry in our first, or initial, module manifest file. Do notice that RequiredModules is commented out, and therefore not read, when this file is parsed.

# Modules that must be imported into the global environment prior to importing this module
# RequiredModules = @()

The next section of interest in our first .psd1 file is ScriptsToProcess. These are standalone scripts that execute prior to our module importing. Do notice that ScriptsToProcess can accept multiple scripts. This means I can run multiple scripts, one right after another, so I don’t have to put all my code in one big script file. If you’re writing functions and not scripts, you get this. Smaller pieces of code are easier on you.

# Script files (.ps1) that are run in the caller's environment prior to importing this module.
ScriptsToProcess = '.\ScriptsToProcess\InstallADDelegationModule.ps1','.\ScriptsToProcess\ReplacePsd1File.ps1','.\ScriptsToProcess\ReimportModule.ps1'

Again, we have two module manifest files for our one script module. The first script in the above list installs the ADDelegation PowerShell module onto our system. Remember, if our first manifest file required this module, we wouldn’t be able to get it installed with ScriptsToProcess. With an initial, only-used-once .psd1 file, we can. The second script copies our second manifest file over the top of the one that’s currently executing during this first module import. Not to worry; the manifest is only read when the module is imported, so the updates don’t matter just yet. Finally, the last ScriptsToProcess script simply imports our module again, using the Force parameter. This parameter imports a module even if it’s already been imported. In doing this second import of our module, the second module manifest becomes active.
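The contents of those last two scripts aren’t included in this post, but they might look something like this. The module name (ADDeploy) and the paths are placeholders I’ve invented for illustration.

# ReplacePsd1File.ps1 (hypothetical): overwrite the initial manifest with the second one.
$ModuleBase = Split-Path -Path $PSScriptRoot -Parent
Copy-Item -Path "$ModuleBase\SecondManifest\ADDeploy.psd1" -Destination "$ModuleBase\ADDeploy.psd1" -Force

# ReimportModule.ps1 (hypothetical): force a re-import so the replacement manifest takes effect.
Import-Module -Name ADDeploy -Force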

Before we consider that our module has been imported again with an updated manifest, we need to discuss the last section in our first manifest: the FunctionsToExport section. Notice that during the first import of our module, there aren’t any functions being exported. Exported functions are how functions are added to our PowerShell session for use. This hardly matters, however, since the last ScriptsToProcess script, discussed above, imports our module again. Since we forcibly import our module again with the second manifest file in place, it doesn’t matter what we do or don’t export in the first run; it becomes of no importance almost immediately. Even so, I’m keeping this holdover, because there’s no reason to do any extra work on the first import, such as exporting functions that would never be used.

# Functions to export from this module, for best performance, do not use wildcards and do not delete the entry, use an empty array if there are no functions to export.
FunctionsToExport = @()

Hopefully I’ve explained things well enough that you’ve been able to follow along so far. Our second module manifest file, the one we copied over the first manifest in the second ScriptsToProcess script, has some changes, as you can see below. In this manifest file, we do require a couple of modules. Remember, we installed the ADDelegation module while we were using the first manifest file. We also have a dependency on the ActiveDirectory module, but I’m expecting that it’s already in place for my users (for now).

# Modules that must be imported into the global environment prior to importing this module
RequiredModules = 'ActiveDirectory','ADDelegation'

Next, we have only a single ScriptsToProcess script. A ScriptsToProcess script isn’t always a necessity, and all this one does is verify I have all the CSV files I need for the functions in our module. A possible version of that check appears after the manifest entry below.

# Script files (.ps1) that are run in the caller's environment prior to importing this module.
ScriptsToProcess = '.\ScriptsToProcess\TestForCsvFiles.ps1'
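Here’s one possible version of that check; the CSV file names are made up for the example.

# TestForCsvFiles.ps1 (hypothetical): warn if any expected CSV file is missing.
$ModuleBase = Split-Path -Path $PSScriptRoot -Parent
$CsvFiles = 'DeptCodes.csv','RoleDefinitions.csv'
Foreach ($CsvFile in $CsvFiles) {
    If (-not (Test-Path -Path "$ModuleBase\$CsvFile")) {
        Write-Warning -Message "Unable to locate a required file: $CsvFile."
    }
}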

And lastly, we include all the functions we need exported into the PowerShell session for those using our module.

# Functions to export from this module, for best performance, do not use wildcards and do not delete the entry, use an empty array if there are no functions to export.
FunctionsToExport = 'New-DomainDelegationPrep','New-DeptADGroupAndRole','New-OnPremExchRole','New-O365ExchRole'

Hope you enjoyed the weekend, but it’s time to start the week again. You may have never seen it before, but we do have a way, albeit a workaround, to install modules and make them required. It takes a little extra work, but it’s doable. In my case, it’s worth the extra work.

PowerShell Code and AWS CloudFormation UserData

Note: This post was written well over a month ago, but was never posted, due to some issues I was seeing in AWS GovCloud. It works 100% of the time now, in both GovCloud and non-GovCloud AWS. That said, if you’re using Read-S3Object in GovCloud, you’re going to need to include the Region parameter name and value.
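For reference, that might look like the below; the bucket and key prefix are borrowed from the example later in this post, and the region is one possible GovCloud value.

# Assumed values: substitute your own bucket, key prefix, and region.
$Params = @{
    BucketName = 'windows'
    KeyPrefix  = 'WindowsPowerShell/Modules/ProjectVII/'
    Folder     = "$env:ProgramFiles\WindowsPowerShell\Modules"
    Region     = 'us-gov-west-1'
}
Read-S3Object @Params | Out-Null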

As I spend more and more time with AWS, I end up back at PowerShell. If I haven’t said it yet, thank you Amazon Web Services, for writing us a PowerShell module.

In the last month or two, I’ve been getting into the CloudFormation template business. I love the whole UserData option we have for injecting PowerShell code into an EC2 instance during its initialization, and I love that while we can do it in the AWS Management Console, we can do it with CloudFormation (CFN) too. In the last few months, I’ve decided to do things a bit differently. Instead of dropping large amounts of PowerShell code inside the UserData property of my CFN templates, I decided to use Read-S3Object to copy PowerShell modules to EC2 instances, and then just issue calls to their functions in the remainder of the CFN UserData. In one instance, I went from 200+ lines of PowerShell in the CFN template to just a few.

To test, I needed to verify if I could get a module folder and file into the proper place on the instance and be able to use the module’s function(s) immediately, without any need to end one PowerShell session and start a new one. I suspected this would work just fine, but it needed to be seen.

Here’s how the testing went: On my Desktop, I have a folder called MyModule. Inside the folder, I have a file called MyModule.psm1. If you haven’t seen it before, this file extension indicates the file is a PowerShell module file. The contents of the file are as follows:

Function Get-A {
    'A'
}

Function Get-B {
    'B'
}

Function Get-C {
    'C'
}

The file contents indicate that the module contains three functions: Get-A, Get-B, and Get-C. In the next example, we can see that the Desktop folder isn’t a place where a module can live and be automatically loaded into the PowerShell session. PowerShell isn’t aware of this module on its own, as can be seen below.

Get-A
Get-A : The term 'Get-A' is not recognized as the name of a cmdlet, function, script file, or operable program. Check
the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
+ Get-A
+ ~~~~~
    + CategoryInfo          : ObjectNotFound: (Get-A:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
Get-B
Get-B : The term 'Get-B' is not recognized as the name of a cmdlet, function, script file, or operable program. Check
the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
+ Get-B
+ ~~~~~
    + CategoryInfo          : ObjectNotFound: (Get-B:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
Get-C
Get-C : The term 'Get-C' is not recognized as the name of a cmdlet, function, script file, or operable program. Check
the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
+ Get-C
+ ~~~~~
    + CategoryInfo          : ObjectNotFound: (Get-C:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException

While I could tell PowerShell to look on my desktop, what I wanted to do was have my CFN template copy the module folder out of S3 and place it on the instances in a preferred and proper location: C:\Program Files\WindowsPowerShell\Modules. This is a location that PowerShell checks for modules automatically, loading them the moment a contained function, or cmdlet, is requested. My example below uses a different path, but PowerShell checks there automatically as well. As a part of this testing, we’re pretending that the move from my Desktop is close enough to the move from S3 to an EC2 instance. I’ll obviously test this more with AWS.

Move-Item -Path .\Desktop\MyModule\ -Destination C:\Users\tommymaynard\Documents\WindowsPowerShell\Modules\
Get-A
A
Get-B
B
Get-C
C

Without the need to open a new PowerShell session, I absolutely could use the functions in my module the moment the module was moved from the Desktop into a folder PowerShell looks at by default. Speaking of those locations, you can view them by returning the value of the $env:PSModulePath environment variable. Use $env:PSModulePath -split ';' to make it easier to read.
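On a default Windows PowerShell 5.1 installation, that looks something like this (your paths will vary):

$env:PSModulePath -split ';'
C:\Users\tommymaynard\Documents\WindowsPowerShell\Modules
C:\Program Files\WindowsPowerShell\Modules
C:\Windows\system32\WindowsPowerShell\v1.0\Modules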

Well, it looks like I was right. I can simply drop those module folders on the EC2 instance, into C:\Program Files\WindowsPowerShell\Modules, just before they’re used with no need for anything more than the current PowerShell session that’s moving them into place.

Update: After this on-my-own-computer test, I took it to AWS. It works, and now it’s the only way I use my CFN template UserData. I write my function(s), house them in PowerShell module(s), copy them to S3, and finally use my CFN UserData to copy them to the EC2 instance. When that’s complete, I can call the contained function(s) without any hesitation or additional work. It wasn’t necessary, but I added sleep commands between the function invocations. Here’s a quick, modified example of what you might find in the UserData of one of my CloudFormation templates.

      UserData:
        Fn::Base64:
          !Sub |
          <powershell>
            # Download PowerShell Modules from S3.
            $Params = @{
              BucketName = 'windows'
              KeyPrefix = 'WindowsPowerShell/Modules/ProjectVII/'
              Folder = "$env:ProgramFiles\WindowsPowerShell\Modules"
            }
            Read-S3Object @Params | Out-Null

            # Invoke function(s).
            Set-TimeZone -Verbose -Log
            Start-Sleep -Seconds 15

            Add-EncryptionType -Verbose -Log
            Start-Sleep -Seconds 15

            Install-ProjectVII -Verbose -Log
            Start-Sleep -Seconds 15
          </powershell>