The PowerShell Conference Book – My Chapter


Welcome to the 291st post on tommymaynard.com. It’s countdown to 300!


Well, it’s happened. My chapter has gone live in The PowerShell Conference Book. This is one of those moments I’ll never forget. Even so, it may not be entirely for the reason you might think. Sure, I’ve got my name alongside some of the biggest names in the PowerShell community right now, and that’s amazing, but this isn’t so much about me. You see, the money collected from this effort goes toward The DevOps Collective, Inc. OnRamp Scholarship. That’s what’s most rewarding. It’s bigger than me, and as such, I’m glad that those of us who were chosen and accepted the opportunity to write a chapter have been able to help in these fundraising efforts. I think it’s apparent that those already published certainly have.


Before you rush over to Leanpub and get your own copy, let’s quickly discuss my chapter. Sure, there will eventually be 30-some chapters to consider, and just maybe you’ll read them all. If not, then at least give me the chance to see if I can get mine on your must-read list.

What I did was take some old code and, thanks to this project, greatly improve it. The chapter is called “The Profile Script: All Things All Places,” and it has everything to do with syncing a PowerShell profile script between multiple computers. You make a change to your CurrentUserAllHosts profile script on one of the computers in your sync group (a group of computers that all use the same profile script), and it ends up on the other computers in your sync group. It has some cool fixture code that uses PowerShell to create PowerShell. Its purpose is to create your new CurrentUserAllHosts profile.ps1 profile script for all the computers in your sync group. Don’t worry, it’ll back up that profile script if it already exists. That fixture code function might just be one of my favorite parts of all the code written for this project.

When this step is done, you can add a partner, such as Dropbox, in order to start the sync process. Reopen PowerShell, and your profile script is copied from the system to Dropbox, in my example. Head over to another computer and run another function to establish the sync there. Now, everything should be synced between the two computers. It’s pretty new code, so expect that there might be problems. It’s open source, so we can both be sure to add features and fixes over on GitHub. It’s great to know that any new variable, alias, or function added to my profile script on any of my computers will be available on all the others, if I want it on all the others. There’s a way to make that determination inside the SinkProfile’s profile script.

Don’t let the below photo fool you; that’s not all the private functions. You’ll have to read the chapter to find out what’s missing. At some point in the development process, I stopped using the dry erase board, although it did feel mandatory early on. I added two final private functions (the ones in red) to the code, but they didn’t make this list. I see something else wrong, but oh well, this portion of my endeavor is complete.

Thank you for your time, and enjoy the book. There are some extremely talented authors in this publication, and I’m honored to have been included with this group of PowerShell community members. I’m also excited that I was able to potentially add to the OnRamp Scholarship.

The PowerShell Conference Book – Intro


Welcome to the 290th post on tommymaynard.com. It’s countdown to 300!


Last Friday morning, The PowerShell Conference Book was made available for purchase. While it’s not yet complete, the Leanpub method of publishing allows it to be sold and distributed even before it’s at 100%. This first release includes 9 of 33 chapters, or roughly a quarter of the book. As a part of its introduction, the below Tweet went out from Mike F Robbins. Do yourself a favor and follow the link if you haven’t, and read his post. It’s quite descriptive of the overall project, and a great place to start in regard to learning more about this multi-authored book project.

During the course of the last few months, I’ve been having one of those unforgettable PowerShell moments, albeit a longer one than most. You know those moments: your first successful script, your most popular blog article, tweaking a script or function in a way you never thought possible, being followed on Twitter by Don Jones, etc.

It all started when Mike contacted me and asked if I would be interested in writing a chapter in the newest PowerShell book. Of course I would! The whole idea wasn’t to spread some spending money out to a collection of worldwide PowerShell experts and authors. No. Instead, it had everything to do with Mike and his determination to help support The DevOps Collective, Inc. OnRamp Scholarship. The authors receive nothing tangible; they only receive an opportunity to help support this cause, as well.

You can help too. Buy the book, and help support this effort. You’ll get to learn from some of the best PowerShell minds this community has to offer (and me, somehow). I do want to thank Mike of course, but also Jeffery Hicks and Michael T Lombardi. These three went above and beyond what was required of them. Thanks as well to all the other authors that made this endeavor possible. Remember, get your copy, if you haven’t already.

In closing, I also want to thank my wife Jenn and kiddos, Carson and Arianna, for putting up with me. Sorry about being so PowerShell obsessed.

Yeah, okay not really. 😉

The ScriptsToProcess and RequiredModules Order

Recently, I wrote this:

Even more recently, I wrote a usable fix.

Before we get there, however, let’s make sure my Tweet makes sense. First off, if you don’t know already, you need to be aware that there’s an optional file called the module manifest file that we can include alongside a script module file (a .psm1). It’s a .psd1 file and its purpose in life is to help define additional information — metadata — about a script module.

In addition to telling us about the module (the author, description, version, etc.), it can do other things for us. This includes requiring a specific PowerShell host program, requiring a specific version of PowerShell, requiring that specific modules be imported when our module is imported, and also running PowerShell scripts before our module is imported. You don’t have to use a module manifest file when you create and use a module, but there’s so much to gain from doing so (and it’s super easy [see New-ModuleManifest]).

My problem here is that the RequiredModules section — an entry in our module manifest file — is checked before any of the scripts are run that help set up the environment before the module is done loading. This means that I was unable to install a PowerShell module via these scripts (called ScriptsToProcess) before RequiredModules inspected the system for the modules on which our module depends. Too bad. To me, and in this instance at minimum, these two module manifest entries run in the wrong order. Had ScriptsToProcess run first, I would have been able to install a PowerShell module before the required-module dependencies were evaluated.

Getting this to work as desired required a workaround. I thought I’d take a minute and share what I’ve done. One, we have a script module — a .psm1 file — and a module manifest — a .psd1 file. We also have a second .psd1 file. This is key.

The first .psd1 file does not require any modules; it does not have a dependency on the system already having specific modules in place. Here’s that entry in our first, or initial, module manifest file. Do notice that RequiredModules is commented out, and therefore not read, when this file is parsed.

# Modules that must be imported into the global environment prior to importing this module
# RequiredModules = @()

The next section of interest in our first .psd1 file is ScriptsToProcess. These are standalone scripts that execute prior to our module importing. Do notice that ScriptsToProcess can accept multiple scripts. This means I can run multiple scripts, one right after another, so that I don’t have all my code in one big script file. If you’re writing functions and not scripts, you get this. Smaller pieces of code are easier on you.

# Script files (.ps1) that are run in the caller's environment prior to importing this module.
ScriptsToProcess = '.\ScriptsToProcess\InstallADDelegationModule.ps1','.\ScriptsToProcess\ReplacePsd1File.ps1','.\ScriptsToProcess\ReimportModule.ps1'

Again, we have two module manifest files for our one script module. The first script in the above list installs the ADDelegation PowerShell module onto our system. Remember, if our first manifest file required this module, we wouldn’t be able to get it installed to our system with ScriptsToProcess. With an initial, only-used-once .psd1 file, we can. The second script copies our second manifest file over the top of the one that’s currently executing during this first module import. Not to worry, the manifest is only read when the module is initially imported. The updates are not going to matter just yet. Finally, the last ScriptsToProcess script simply imports our module again, using the Force parameter. This parameter imports a module even if it’s already been imported. In doing this second import of our module, the second module manifest becomes active.

Before we consider that our module has been imported again with an updated manifest, we need to discuss the last section in our first manifest. It’s the FunctionsToExport section. Notice that during the first import of our module, there aren’t any functions being exported. Exported functions are how functions are added to our PowerShell session for use. This hardly matters, however, since the last ScriptsToProcess script, discussed above, imports our module again. Since we forcibly import our module again with the second manifest file in place, it doesn’t matter what we do or don’t export in the first run; it becomes of no importance almost immediately. Even so, I’m keeping this holdover, because there’s no reason to do any extra work on the first import, such as exporting functions that would never be used.

# Functions to export from this module, for best performance, do not use wildcards and do not delete the entry, use an empty array if there are no functions to export.
FunctionsToExport = @()

Hopefully I’ve explained things well enough that you’ve been able to follow along so far. Our second module manifest file, which we copied over the first manifest in the second ScriptsToProcess script, has some changes, as you can see below. In this manifest file, we do require a couple of modules. Remember, we installed the ADDelegation module when we were using the first manifest file. We also have a dependency on the ActiveDirectory module, but I’m expecting that it’s already in place for my users (for now).

# Modules that must be imported into the global environment prior to importing this module
RequiredModules = 'ActiveDirectory','ADDelegation'

Next, we have only a single ScriptsToProcess script. While a ScriptsToProcess script isn’t always a necessity, or a requirement, this one earns its place: all it does is verify I have all the CSV files I need for the functions in our module.

# Script files (.ps1) that are run in the caller's environment prior to importing this module.
ScriptsToProcess = '.\ScriptsToProcess\TestForCsvFiles.ps1'

And lastly, we include all the functions we need exported into the PowerShell session for those using our module.

# Functions to export from this module, for best performance, do not use wildcards and do not delete the entry, use an empty array if there are no functions to export.
FunctionsToExport = 'New-DomainDelegationPrep','New-DeptADGroupAndRole','New-OnPremExchRole','New-O365ExchRole'

Hope you enjoyed the weekend, but it’s time to start the week again. You may have never seen it before, but we do have a way, albeit a workaround, to install modules and make them required. It takes a little extra work, but it’s doable. In my case, it’s worth the extra work.

Write-Verbose and -Warning Template

I’m pretty serious when it comes to ensuring the use of Write-Verbose and Write-Warning to their fullest potential in my functions. With that in mind, I’ve come up with a standard way in which I write these types of messages to the screen in warning conditions. In order that I don’t have to track down this code from an actual, already written function the next time I need it, I’ve opted to drop an example, right here. Perhaps it’ll be useful for you, as well.

The idea is that I assign the message I want to convey to a variable, and then use it in the Write-Verbose and Write-Warning commands. That whole use-a-variable concept isn’t what I’m here to discuss; I kind of expect that you put your strings into variables when you plan to use them more than once. The idea is that you can see the full structure I use to inform my users in these types of situations. Have a look at the below example.

...
If (Test-Path -Path $CsvPath) {
    ...
} Else {
    $Warning = "Unable to locate the ""$CsvPath"" path."
    Write-Verbose -Message "$BL Preparing and displaying a warning message."
    Write-Warning -Message $Warning
    Write-Verbose -Message "$BL WARNING: $Warning"
    Write-Verbose -Message "$BL Breaking out of the current function [$CmdType`: $CmdName]."
    break
} # End If.
...

Don’t mind the $BL, $CmdType, and $CmdName variables, as these have everything to do with the advanced function template I’ve written and use for just about everything. They’re not required for this basic code structure.

This simple Warning/Verbose design ensures that Write-Warning messages are captured by Write-Verbose. Once a warning message hits the PowerShell host program, it’s mostly lost. By using the same variable with Write-Verbose, I’m able to do other things with it. As my advanced function template can send Write-Verbose messages to log files, I can therefore ensure warning messages are logged by my functions. This is key.

Anyway, it’s nothing groundbreaking, but I am done looking for an example of this in something I’ve already written, in order to use it in something I’m writing. The two following images are the same example, both in the ISE (Integrated Scripting Environment) and in Visual Studio Code. You can see that when the Verbose parameter is used, we get the warning information included in a verbose message. Therefore, with the use of my advanced function template, I can ensure my warning message makes its way into a log file, if that’s what’s wanted.

And yes, it should’ve included “to,” as in “Unable to locate the…” Ugh, I’m not going to bother recreating these screen captures, even though I’ve updated the above code example.

Of course, all this got me thinking. In some recent projects, I’ve been doing something a bit differently. I’ve been piping my string straight to ForEach-Object, where both Write-Warning and Write-Verbose are executed. Therefore, I suppose the template may end up like it looks below. I’m going to have to think about potentially making this code example my Write-Verbose and Write-Warning template.

...
If (Test-Path -Path $CsvPath) {
    ...
} Else {
    Write-Verbose -Message "$BL Preparing and displaying a warning message."
    "Unable to locate the ""$CsvPath"" path." | ForEach-Object {
        Write-Warning -Message $_; Write-Verbose -Message "$BL WARNING: $_"
    }
    Write-Verbose -Message "$BL Breaking out of the current function [$CmdType`: $CmdName]."
    break
} # End If.
...

Either way, I’ve got this template written down somewhere now, outside of a function.

It’s Year Four Around Here

Update: There was an update to this post on June 6, 2018. See below.

I had kind of hoped my 300th post here at tommymaynard.com would have lined up with June 2018, but that isn’t going to happen. Instead, welcome to post 287. We’re just 13 posts away from reaching post 300.

There’s just too much else going on right now to have gotten these two feats to align. I do want to mention, however, that this month marks the fourth consecutive year of me blogging on my site. That’s what’s important about June. It was this month in 2014 that I set out to learn more about PowerShell through experimentation and sharing and teaching. It’s been a successful run, and I’m still interested in watching PowerShell continue to make our lives easier. If someone told me where PowerShell would be now, back in 2014, I would’ve laughed. On Linux? On Mac? But why!? I get it now, and I’ll forever be grateful that I set out to learn it and know it as well as I do. I don’t pretend I know everything, but I’ve come a long way.

Maybe you don’t know the story. It was early 2007 and I was taking some FMLA (Family and Medical Leave Act) time in order that I could hang out at home with my newborn son. That was near when PowerShell 1.0 had been released, so I installed Windows Management Framework (that’s how we got PowerShell back then), and did some mild experimentation. Sadly, just as quickly as I picked it up, I put it down. I was in love with VBScript (VBS), which is mostly embarrassing to say these days.

For a few years, I continued to write my solutions using VBS with little to no interest in PowerShell. Then one day, I happened across one of those top-10-things-to-do-for-your-career articles. Learn PowerShell was listed at number 10. Another year passed and somehow — because I didn’t go looking for it myself this time either — I ran into another top-10-things-to-do-for-your-career article. PowerShell was number 1.

It was then, that I decided my next scripting project would be in PowerShell and that I’d force myself to put down VBS, and step away. Well I did just that. I needed a way to copy a Sophos file out to 200+ computers. I wasn’t in a self management setup at that time, and so I used PowerShell to copy out the file to each machine, recording data about whether the copy was successful or not. When Sophos went to update the next time — as this file indicated the update servers — it hit the newer servers, upgraded the version of Sophos, and then continued to use those servers for hourly updates. I was hooked.

And, I never looked back.

In closing, here’s the math on 287 posts over 4 consecutive years. Across 48 months, I’m averaging 5.979 posts per month, and I’ve never missed a month. Not bad. Maybe I can break the six-posts-per-month threshold at some point this year, or next.

Update: As of today, I’ve renewed my ownership of tommymaynard.com for an additional three years. I’m not done here.

Create Function from Variable Value I

If you were here Wednesday, then perhaps you already read Get the Total Parameter Count. If not, I’ll quickly tell you about it. I wanted to know how many parameters a function had, whether or not those parameters were used when the function was invoked. I did this by using Get-Help against the function, from within the function. Yeah, I thought it was clever too, but the best part: it gave me the results I was after. If a function had three parameters, it was indicated in the function’s output. Same goes for when it had two parameters, but of course in that instance, it indicated two.

In going along with that post I had an idea. I wanted to create a way to dynamically create a function with a random number of parameters between one and ten. Then, I could prove my code was working. Sure, I could’ve used Pester, but I was after making this work — this whole, create a function with a random number of parameters thing. I needed to figure out how to get the code for a function, stored inside a variable, into an actual PowerShell function. That might’ve been confusing, but the example code in this post will probably prove helpful.

I’ll show you what I started experimenting with for discovery purposes, and then we’ll jump into the code that actually creates my function with a random number of parameters in a soon to be released second part to this post.

First, we’ll start with a here-string variable assignment. Using a here-string allows us to include multiple line breaks within our variables. As you should be able to see, the value of $FunctionCode below is the code that makes up a simple function. It includes the function keyword, a function name, an open curly brace, the function’s actual code — there are three lines of it — and a closing curly brace, as well.

$FunctionCode = @'
Function Show-Info {
    '*****'
    'This is function Show-Info.'
    '-----'
}
'@

As expected, when I echo my variable’s value, I get back exactly what I put into it.

PS > $FunctionCode
Function Show-Info {
    '*****'
    'This is function Show-Info.'
    '-----'
}

Now, for the fun part. I’ll include all my code and then we can discuss it line by line below.

$FunctionCode = $FunctionCode -split '\n'
$FunctionCodeCount = $FunctionCode.Count - 2
$FunctionCode = $FunctionCode[1..$FunctionCodeCount]
$FunctionCode = $FunctionCode | Out-String

The first of the above four lines assigns the variable $FunctionCode the value stored in $FunctionCode after we split it on each new line. The second of the four lines creates a $FunctionCodeCount variable, assigning it the number of lines in $FunctionCode after having subtracted 2 from its value. See if you can figure out why it’s two…

The third line reassigns $FunctionCode again. In this line we only return the contents of the function. This means it doesn’t return the first line of the function, which includes the function keyword, the function name, and the open curly brace. It also will not include the closing curly brace at the bottom. Our final line reassigns the $FunctionCode for a third time, taking its current value and piping that to Out-String. This will help us ensure we add back in our line breaks.

Before we create our Show-Info function, let’s take a look at the $FunctionCode variable value now.

PS > $FunctionCode
    '*****'
    'This is function Show-Info.'
    '-----'

Using Set-Item, we’ll create our function called Show-Info regardless of whether or not it already exists. That’s the difference between New-Item and Set-Item (and most New vs. Set commands). New-Item will only work if Show-Info doesn’t already exist, while Set-Item will work even if the function already exists. If it doesn’t, it’ll act like New-Item and create the function.

Set-Item -Path Function:\Show-Info -Value $FunctionCode

And finally, entering Show-Info invokes the newly created function.

PS > Show-Info
*****
This is function Show-Info.
-----

Okay, with all that out of the way, we’re going to hit the pause button. Be sure you get what’s happening here, because we’ll pick up on this post in a very soon to be released follow-up that includes the rest of what we’re after: creating a function with a random number of parameters and testing we can calculate that number correctly. If you see an area in which I can improve this, please let me know!

Before we sign off though, let me include all the above code in a single, below code block. That may end up being helpful to one of us.

Remove-Variable -Name FunctionCode,FunctionCodeCount -ErrorAction SilentlyContinue

$FunctionCode = @'
Function Show-Info {
    '*****'
    'This is function Show-Info.'
    '-----'
}
'@

$FunctionCode = $FunctionCode -split '\n'
$FunctionCodeCount = $FunctionCode.Count - 2
$FunctionCode = $FunctionCode[1..$FunctionCodeCount]
$FunctionCode = $FunctionCode | Out-String

Set-Item -Path Function:\Show-Info -Value $FunctionCode

Show-Info

Get the Total Parameter Count

Update: There was an update to this post on May 29, 2018. See below.

It’s been a little longer than normal for me to have not written. My entire week last week didn’t include a post; it’s so weird. Well, I’m taking a quick minute — seriously — to discuss something I wanted to do recently.

In a new project I’m on, I wanted to know how many parameters a function has from inside the function itself. I didn’t want to know how many parameters were used when the function was invoked. Had I, I would’ve used $PSBoundParameters. Again, I wanted to know how many parameter(s) the function had, whether they were used or not.

I’ll tell you what I opted to do. On that note, do however, let me know if you come up with a better idea. I didn’t give this a ton of time. That said, it doesn’t even have to be better; I’d still like to hear about it. For me, I opted to execute a Get-Help command against the function, from inside the function. I’m making this a quick post, so let’s jump to some code.

Function Check-ParameterCount {
    [CmdletBinding()]
    Param (
        [Parameter()]
        [string]$Param01,

        [Parameter()]
        [string]$Param02,

        [Parameter()]
        [ValidateSet('Source','Destination')]
        [string]$Param03
    )

    $ParameterCount = (Get-Help -Name Check-ParameterCount).Parameters.Parameter.Count
    "There are $ParameterCount possible parameter(s) in this function not including the common parameters."
}
PS > Check-ParameterCount
There are 3 possible parameter(s) in this function.

The next example is the same example as above, however, we’ve removed the third parameter. Sure enough, the value is now two. As you may have noticed, Get-Help gets its parameter count from the actual parameter(s) themselves. Neither function has any comment-based help. Therefore, we can determine that it doesn’t use any static help we might include in regard to the parameter(s).

Function Check-ParameterCount {
    [CmdletBinding()]
    Param (
        [Parameter()]
        [string]$Param01,

        [Parameter()]
        [string]$Param02
    )

    $ParameterCount = (Get-Help -Name Check-ParameterCount).Parameters.Parameter.Count
    "There are $ParameterCount possible parameter(s) in this function not including the common parameters."
}
PS > Check-ParameterCount
There are 2 possible parameter(s) in this function.

That was it. Again, let me know if there’s a better way.

Update: I had a thought. Because my function uses the CmdletBinding attribute, it’s possible that my function can have more than the number of parameters returned by Get-Help. Therefore, I’ve only slightly modified the above strings my function returns. It was “There are $ParameterCount possible parameter(s) in this function.”, and now it’s “There are $ParameterCount possible parameter(s) in this function not including the common parameters.” Now, it indicates it returns X number of parameters, not including the common parameters.

PS > Check-ParameterCount
There are 2 possible parameter(s) in this function not including the common parameters.

The Other $PROFILEs

If you weren’t aware, I did a great deal of writing and posting recently due to my idea to complete chapter reviews of The Pester Book. I think it was a success, and if anything, it helped force me to ensure I fully understood the content for the things I covered here. I’m not confident with things until I fully understand them, and so even though I finished that book, I’m still after additional content.

Well, after all the writing, I did a couple other posts, but the last one I left off on was about an old HTA I wrote. An HTA is an HTML Application. It’s old, but back when I used and wrote it, it allowed people like me to create GUIs for VBS scripts. Yes, VBS. I hate that it’s the first post on the front page, so I had to write something new in relation to PowerShell.

Today, I wanted to take a moment and remind everyone about the $PROFILE variable. If you enter this variable into your PowerShell host program you’ll see a path to the current user’s, current host’s, PowerShell profile script. A couple things: One, the host here is the PowerShell host, as in the ConsoleHost, the ISE, Visual Studio Code, etc. It has nothing to do with a computer host.

Two, a profile script, once it’s been created and edited, runs when you open the host program, and allows you to set up the environment, or PowerShell session, the way you want it. You can create aliases, variables, and define functions, and ensure they’re available when you open that PowerShell host program.

Here’s how you can check the currently used host program.

PS > $Host.Name
ConsoleHost

When I enter $PROFILE, it returns to me the below path. Do notice that I’m using Windows PowerShell in this example. With PowerShell 6.0 running, it would’ve indicated “PowerShell” in the path, and not “WindowsPowerShell.”

PS > $PROFILE
C:\Users\tommymaynard\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1

Now, it should be noted that there are actually more profile scripts that can execute when you open a PowerShell host program. Take a look here.

PS > $PROFILE | Select-Object -Property *

AllUsersAllHosts       : C:\Windows\System32\WindowsPowerShell\v1.0\profile.ps1
AllUsersCurrentHost    : C:\Windows\System32\WindowsPowerShell\v1.0\Microsoft.PowerShell_profile.ps1
CurrentUserAllHosts    : C:\Users\tommymaynard\Documents\WindowsPowerShell\profile.ps1
CurrentUserCurrentHost : C:\Users\tommymaynard\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1
Length                 : 78

The above Select-Object command returns five properties, and four of them are paths. In addition to the CurrentUserCurrentHost property, which we saw when we didn’t use Select-Object, we also have entries for CurrentUserAllHosts, and two for all users: AllUsersAllHosts and AllUsersCurrentHost. It’s too bad that in Windows PowerShell the desire to use an AllUsers profile script required work in the C:\Windows\System32 directory. That’s changed in PowerShell Core, where AllUsers profile scripts default to C:\Program Files\PowerShell\<version>. Just because we have a path in these properties does not mean there’s a file with these indicated names. The files have to be created with New-Item in order to make use of these profile scripts.

Keep this in mind as you use profile scripts. While I suspect most people don’t use it, CurrentUserAllHosts is better than CurrentUserCurrentHost if you want the same thing in every host. There was a time I used CurrentUserCurrentHost in the Console and then had all my other hosts dot-source that script in their own profile scripts. Dumb. If you need something in all, use CurrentUserAllHosts, but if you need something in one host vs. another, then use CurrentUserCurrentHost.

Now, back to that project I’m working on. It has everything to do with profile scripts, and with this post I can be certain that I’ve provided myself some reminders about the $PROFILE variable. Additionally, I’ve gotten that HTA post moved down one; it’s no longer the first post on my site.

Another Old HTA: WoL Manager

There’s a not-so-recent post that I wrote that discussed two old HTAs (HTML Applications) that I wrote long ago. I decided to link the post on The Scripting Guys’ Facebook page (direct link: https://www.facebook.com/groups/5901799452/permalink/10154015674414453). It seemed to me that this group of people might be interested in these HTAs. It made sense, as I had learned plenty about VBS and HTAs from The Scripting Guy, Ed Wilson himself. It turned out to be a good idea to share those old HTAs, as there were many downloads. Whether anyone really used them, I honestly can’t say.

Someone back when I did this asked if I could, or would, add WoL (Wake on LAN) to the LoggedOnUser HTA. I think the idea was that if the computer didn’t respond, it would leave the option to wake the computer. While I’m not interested in spending time adding that feature (unless there’s cash involved, or a whole lot of Chipotle gift cards), I did mention another HTA I had written back in my VBS/HTA days, called WoL Manager.

This HTA’s sole purpose was to wake up computers in a computer lab. I had opted to try and save some money on electricity costs by allowing the lab computers to drop into hibernation, or some form of deep sleep, when they weren’t being used during the day, each evening, and over the weekends. It was an often-used lab during all hours of the day. Being that I’m very much into Windows PowerShell and automation in general, you can understand that I would rather not walk to the lab every time I needed to wake up a computer. No, I would rather send a WoL magic packet, give it a moment, and then remotely connect to the system, which I did for a few years. The last time this thing was used was… 2013.

We’ll start by looking at the WoL Manager HTA and then discuss it a little more. There are some things you’ll need to know if you opt to try it out. I’ll include a download link at the bottom of this post. Here are three different views of the HTA: (left) when the HTA is first opened, (center) when the HTA is ready to wake a single computer, and (right) when the HTA is ready to wake all the computers.

[Screenshots: the three views of the WoL Manager HTA described above]

Much like the LoggedOnUser HTA, this one also requires a computers.txt file arranged like the example below (name, colon, MAC address). The file needs to remain in the same directory as the HTA. The download includes this file, which can be easily edited.

Server01:053ae714eaca
Server02:0044e924f241
Server03:0034e734f2a2
Server04:0043e944f5a3
Server05:0034e654a533
Server06:0043e664e3d3
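As an aside, if you wanted to work with this same computers.txt format from PowerShell rather than the HTA, a minimal sketch might look like the following. This isn’t code from the HTA (that’s VBS); it only illustrates the name-colon-MAC layout described above.

```powershell
# Hypothetical sketch: read the computers.txt file shown above and
# split each line on the colon into a name and a MAC address.
Get-Content -Path '.\computers.txt' | ForEach-Object {
    $Name, $Mac = $_ -split ':'
    [PSCustomObject]@{
        Name       = $Name
        MacAddress = $Mac
    }
}
```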

The HTA requires a third-party executable to wake the computers, which I’m not distributing. It’s called wol.exe and can be downloaded here: http://www.gammadyne.com/cmdline.htm#wol. Make sure this executable is located in the same directory as the HTA and computers.txt file, too.

In the environment where I (used to) use WoL Manager, I was able to use the wol.exe command line tool, such as:

PS> wol.exe 0043e944f5a2

It did not require that I include an IP address, like the second example does back on this page: http://www.gammadyne.com/cmdline.htm#wol, or like my example below. If your environment requires the IP address of the computer where the HTA is running (not the computer you’re trying to wake), then the HTA will need to be modified (which you’re welcome to do). I think there’s some commented-out HTML at the bottom where I may have been getting ready to add this option. I don’t believe there’s any logic for it in the HTA, however.

I used this same wol.exe executable at home to wake up my desktop computer, although at home it’s used as part of a Windows PowerShell function. To use it there, my laptop’s IP must be included to wake up the home desktop, such as:

PS> wol.exe 0043e944f5a2 10.10.10.33
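For what it’s worth, a PowerShell function wrapping wol.exe, as mentioned above, might look something like the sketch below. The function name and parameter names are my own invention for illustration, not the author’s actual home function, and wol.exe must already be on the path (or in the current directory).

```powershell
# Hypothetical wrapper around the third-party wol.exe tool. Passes the
# MAC address, and the optional IP address only when one is supplied.
Function Send-WolPacket {
    Param (
        [Parameter(Mandatory)]
        [string]$MacAddress,

        [string]$IPAddress
    )
    If ($IPAddress) {
        & wol.exe $MacAddress $IPAddress
    } Else {
        & wol.exe $MacAddress
    }
}
```

Usage might then be as simple as `Send-WolPacket -MacAddress 0043e944f5a2 -IPAddress 10.10.10.33`.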

If you’re interested in this HTA, start by downloading wol.exe and seeing if it works first. Also, you may have to make some modifications to the system(s) you want to wake, but I’m sure Google, or Bing, can help with setting up the NIC(s) to accept WoL magic packets, if they don’t already.

One final note: I cannot remember for the life of me whether the Wake All option actually works. I briefly looked at the subroutine called by clicking that button, and it looks good. Someone might want to let me know. After writing all this, and reading it two years later, maybe I understand why this post has been sitting in my drafts for so long. Enjoy, if you try it and it works!

You can download this HTA here: WoLManager1.1.zip

Functions Finding Variable Values

Every once in a while I forget something I know. I know. I’m not sure why, but for a quick moment I suddenly couldn’t remember which can access what. Can a nested, or child, function access a variable in the containing, or parent, function, or is it the other way around? That is, can a containing, or parent, function access a variable in the nested, or child, function? Even as I write this, I still can’t believe that for a moment I forgot what I’ve known for so long.

Duh. A nested, or child, function can access a variable assigned in the containing, or parent, function. If the child function cannot find a declared variable inside itself, it’ll go upward in scope to its mom or dad and ask if they have the variable assigned; if they do, it can borrow it. Being good parents, they offer it if they have it.

It’s so obnoxious that for a quick moment I wasn’t sure, even though I’ve pictured it in my own head multiple times, as well as relied on it. Anyway, here’s an example that proves a nested function will go upward to find a variable if it hasn’t been assigned inside itself.

Function AAA {
    'In function AAA.'
    Function BBB {
        'In function BBB.'
        $x = 5
    }
    $x
    BBB
}
AAA

Function CCC {
    'In function CCC.'
    $x = 5
    Function DDD {
        'In function DDD.'
        $x
    }
    DDD
}
CCC
In function AAA.
In function BBB.
In function CCC.
In function DDD.
5

This got me wondering: how far up will it go!? I assumed it would go up and up and up, and, well, I was right. Take a look at this example. I guess if I had to forget something so simple, at least I got an opportunity to build this example. Enjoy the weekend!

Function Top {
    $x = 10
    "0. Assigned `$x with $x."
    "1. Inside the $($MyInvocation.MyCommand.Name) function."
 
    Function MidTop {
        "2. Inside the $($MyInvocation.MyCommand.Name) function."
 
        Function MidBottom {
            "3. Inside the $($MyInvocation.MyCommand.Name) function."
 
            Function Bottom {
                "4. Inside the $($MyInvocation.MyCommand.Name) function."
                If (Get-Variable -Name x -Scope Local -ErrorAction SilentlyContinue) {
                    '5. Can find the variable in the local scope.'
                } Else {
                    '5. Cannot find the variable in the local scope.'
                }
                "6. The value of `$x is $x."
            } # End Function: Bottom.
            Bottom
 
        } # End Function: MidBottom.
        MidBottom
 
    } # End Function: MidTop.
    MidTop
 
} # End Function: Top.
Top
0. Assigned $x with 10.
1. Inside the Top function.
2. Inside the MidTop function.
3. Inside the MidBottom function.
4. Inside the Bottom function.
5. Cannot find the variable in the local scope.
6. The value of $x is 10.

Update: I made a few modifications to the above function. Now, it indicates that it cannot find the $x variable in the local scope of the Bottom function. That’s one way to help prove it goes upward through the parent functions looking for a value for $x.
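Related to the Get-Variable check in the example above, the -Scope parameter also accepts numbered scopes, where 0 is the local scope and 1 is the immediate parent. Here’s a small, separate sketch of my own (not part of the example above) that reads a parent’s variable explicitly by scope number instead of relying on the automatic upward lookup:

```powershell
# Read a variable from the parent scope by number: scope 0 is local,
# scope 1 is the caller's (parent's) scope.
Function Parent {
    $y = 'parent value'
    Function Child {
        Get-Variable -Name y -Scope 1 -ValueOnly
    }
    Child
}
Parent
```

Running Parent should return the string assigned in the parent function, since Child asks scope 1 for $y directly.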