Create a Function to Open Internet Explorer (Like in Run)

Even before Windows PowerShell, I strove to do things quickly. One such thing was to open Internet Explorer (IE) from the Run dialog. In the image below, you can see how you can enter iexplore followed by a space and then a URL. When you press OK, it launches IE and navigates to the URL that was passed along. If no URL is provided, it opens the home page.

[Image: the Run dialog, with iexplore followed by a space and a URL entered]

When I transferred this knowledge to PowerShell, I was sad to see that entering the same thing in the console host resulted in an error: “The term ‘iexplore’ is not recognized as the name of a cmdlet, function, script file, or operable program.”  I wasn’t overly concerned about it, moved on, and mostly forgot about it.

After spending last week at the PowerShell Summit North America 2015, I saw several demos from speakers that included ‘start iexplore <URL>,’ where start is an alias for Start-Process. Use Get-Alias to see this yourself: Get-Alias -Name start. I decided I would write up a simple function, to add to my profile, that would allow me to just use iexplore again.

With the function below in place, I can enter iexplore in PowerShell to open Internet Explorer to the home page, or iexplore bing.com to open Internet Explorer to bing.com, for example — pretty straightforward. Here it is: the little function that saves me six keystrokes every time I use it. Don’t worry, it won’t be long before I make up for the lost keystrokes from writing the function itself.

Function iexplore {
	Param ([string]$Url)

	If ($Url) {
		start iexplore $Url
	} Else {
		start iexplore
	}
}

And here are the results.

[Image: PowerShell console output showing the iexplore function opening Internet Explorer]

Before we close for the day, we should probably make a couple of changes to our function. We should change the function name so it uses an approved verb, along with the verb-noun naming convention, and then create an alias (iexplore) to call the function.

Set-Alias -Name iexplore -Value Open-InternetExplorer

Function Open-InternetExplorer {
	Param ([string]$Url)

	If ($Url) {
		start iexplore $Url
	} Else {
		start iexplore
	}
}
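
With the alias in place, nothing changes about how we call it; both of the earlier invocations still work:

PS C:\> iexplore
PS C:\> iexplore bing.com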

Randomly Selecting a Name from a CSV File

I have to admit, I’m kind of excited to be doing another Twitter Reply post. This one is the result of a post by Michael Bender. I’ve never met him, but I’ve followed his career a bit over the last few years, after being introduced to his name via TechEd. He made a recent Tweet in which he asked for a PowerShell genius to help build him a script. The goal was to randomly select a name from a CSV file, with an option to choose how many names it should randomly select.

Here’s the Tweet: https://twitter.com/MichaelBender/status/593168496571322370

I wouldn’t consider myself a genius, but if you consider what he’s after — a name from a CSV file (not a line from a text file) — then there’s a better option than the one he accepted: “get-content file.csv | get-random.” This will work, but there are some reasons why it isn’t such a great idea. We can use the Get-Content cmdlet with CSV files, but we generally don’t. CSV files are created and formatted in such a way that Import-Csv, not Get-Content, is the right tool for reading them.

Let’s consider that we’re using the CSV file in the image below. It has three headers, or column names: Name, Age, and HairColor.

[Image: the example CSV file, with Name, Age, and HairColor columns]

First of all, if we use Get-Content, it’s going to consider the first line — the header line, or the column headings — a selectable line within the file. Here’s that example:

PS C:\> Get-Content .\file.csv | Get-Random
Name,Age,HairColor

Uh, not helpful. While this isn’t the end of the world, as we can simply rerun the command (and probably get something different), we’d be better off writing this so that this outcome isn’t a possibility — something we’ll see in a moment, when we use Import-Csv.

Second of all, unless the only column in the CSV file is Name, we’re not returning just a name but everything in a row of the file. This means that if I randomly choose a line in the file, it will include information that Michael didn’t request — the age and hair color.

PS C:\> Get-Content .\file.csv | Get-Random
Jeff,19,Brown

I suppose we could start parsing the returned value to get just the name ((Get-Content .\file.csv | Get-Random).Split(',')[0]), but seriously, let’s skip that and use one of the cmdlets built to work with CSVs directly. We’ll assume we’re still using the same CSV file in the image above.

PS C:\> Import-Csv .\file.csv | Get-Random | Select-Object Name

Name
----
Lance

Um, that was easy. If we wanted to increase the number of names returned from our CSV file, we would use Get-Random’s -Count parameter with an integer value, as in this example:

PS C:\> Import-Csv .\file.csv | Get-Random -Count 3 | Select-Object Name

Name
----
Bob
Stephen
Sue

I think we’re best off when we use the *-Csv cmdlets with CSV files, and the Get-Content cmdlet with most other file types we can read in PowerShell. There is at least one exception, in reverse: using Import-Csv with a delimited text file might help you better parse the information in your file. Consider this text file:

[Image: the example text file, with pipe-delimited rows such as 001|Sunday|Car7]

The next few examples will progressively show you how to use Import-Csv with a delimited text file until all of the “columns” are broken apart. When we’re finished, we can even export our data into a properly formatted CSV file, import it again, and pipe those results to other cmdlets — take a look.

PS C:\> Import-Csv .\file.txt

001|Sunday|Car7
---------------
002|Monday|Car2
003|Tuesday|Car3
004|Wednesday|Car3
005|Thursday|Car7
006|Friday|Car2
007|Saturday|Car3

PS C:\> Import-Csv .\file.txt -Header Id,Day,Car

Id                                      Day                                     Car
--                                      ---                                     ---
001|Sunday|Car7
002|Monday|Car2
003|Tuesday|Car3
004|Wednesday|Car3
005|Thursday|Car7
006|Friday|Car2
007|Saturday|Car3

PS C:\> Import-Csv .\file.txt -Header Id,Day,Car -Delimiter '|'

Id                                      Day                                     Car
--                                      ---                                     ---
001                                     Sunday                                  Car7
002                                     Monday                                  Car2
003                                     Tuesday                                 Car3
004                                     Wednesday                               Car3
005                                     Thursday                                Car7
006                                     Friday                                  Car2
007                                     Saturday                                Car3

PS C:\> Import-Csv .\file.txt -Header Id,Day,Car -Delimiter '|' | Export-Csv -Path C:\file-cars.csv -NoTypeInformation
PS C:\> Import-Csv .\file-cars.csv

Id                                      Day                                     Car
--                                      ---                                     ---
001                                     Sunday                                  Car7
002                                     Monday                                  Car2
003                                     Tuesday                                 Car3
004                                     Wednesday                               Car3
005                                     Thursday                                Car7
006                                     Friday                                  Car2
007                                     Saturday                                Car3

PS C:\> Import-Csv .\file-cars.csv | Where-Object Car -eq 'Car3'

Id                                      Day                                     Car
--                                      ---                                     ---
003                                     Tuesday                                 Car3
004                                     Wednesday                               Car3
007                                     Saturday                                Car3

PS C:\> Import-Csv .\file-cars.csv | Where-Object Car -eq 'Car3' | Select-Object Day

Day
---
Tuesday
Wednesday
Saturday
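
One last note: Select-Object Name returns objects with a Name property, not bare strings. If we wanted just the names themselves, Select-Object’s -ExpandProperty parameter will unwrap them (your results will vary, of course, since they’re random):

PS C:\> Import-Csv .\file.csv | Get-Random -Count 3 | Select-Object -ExpandProperty Name
Sue
Jeff
Lance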

So, if I had seen the Tweet first, I would have recommended Import-Csv, piped to Get-Random, and then piped to Select-Object Name. Thanks for reading the post.

Find DNS Servers Being Used by DHCP Scopes

Download the Get-TMDhcpDNS function here: https://gist.github.com/tommymaynard/afdb78038e8639d5d23baaaaf897cac1

There was a Microsoft TechNet forum post last week about obtaining the DNS servers being used by different DHCP scopes (here’s the link: https://social.technet.microsoft.com/Forums/en-US/58723d31-7586-40c3-acd2-183f20b49daf/how-to-dump-the-dns-options-for-each-dhcp-scope?forum=ITCG).

I originally thought the person wanted the DNS servers listed in the Server Options (what the scopes use by default), until he (or she) clued me in: they wanted the DNS servers listed in the Scope Options, when there are any. At times there won’t be any DNS servers listed in the Scope Options, and the scope will use the Server Options instead.

Since that forum post, I’ve wrapped up a few commands to create a full-service advanced function. The Get-TMDhcpDNS function collects the Scope Name, the Scope ID, the DNS Servers, and whether those DNS servers are assigned in the Scope Options or the Server Options.

If you think this might be helpful for you or someone else, then please download it, test it, and rate it. Thanks, and here’s an example of the function in action:

PS C:\> Get-TMDhcpDNS -ComputerName 'dhcpsrv1.mydomain.com' | Format-Table -AutoSize

Name                  ScopeName  ScopeID     DNS                                 ScopeOrServerDNS
----                  ---------  -------     ---                                 ----------------
dhcpsrv1.mydomain.com Building01 10.10.10.0  10.10.10.20,10.10.10.21,10.10.10.22 Scope
dhcpsrv1.mydomain.com Building02 172.16.16.0 172.16.16.5,172.16.16.15            Server

Use the link above to download the function.

Update: There was an issue with the DNS property (System.Object[]) when piping the function to Export-Csv. That’s been corrected in 1.0.2. Here’s a post I had to reference (again): http://learn-powershell.net/2014/01/24/avoiding-system-object-or-similar-output-when-using-export-csv/.
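
If you’re curious what that kind of correction looks like, the usual approach is to join the array of DNS servers into a single string before the object ever reaches Export-Csv. Here’s a minimal, hypothetical sketch of the idea (not the function’s actual code):

$DnsServers = '10.10.10.20','10.10.10.21','10.10.10.22'
[pscustomobject]@{
    ScopeName = 'Building01'
    DNS       = $DnsServers -join ','  # A single string now, instead of System.Object[].
} | Export-Csv -Path .\DhcpDns.csv -NoTypeInformation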

Extra – PowerShell Summit North America 2015 [#8]

Read them all here: http://tommymaynard.com/extra-powershell-summit-north-america-2015-0-2015/

I couldn’t believe it when it arrived: the final day of the PowerShell Summit North America 2015. Fifty-some days ago, I wasn’t sure the summit would ever get here, and now it’s over.

My final day consisted of another ride over to the Microsoft campus, another second breakfast — seriously, I ate two each morning — and several more PowerShell sessions. The standout sessions for me on the last day were Jason Helmick’s The Top DSC Gotchas and Best Practices, and June Blender’s PowerShell Help Deep Dive. These two speakers are two of the best when it comes to capturing the audience’s attention. Jason could keep me awake regardless of topic, and June could sell me anything. They are both great presenters, and you should get to watching both of their sessions now. Now, as in when you finish reading this post.

As I sat and listened to June speak, I kept thinking the same thing, and no, it’s not that her voice reminds me of my CPA (something I told her earlier in the week while waiting outside an elevator). Oh, by the way, I should probably put my apology out there for her now. Microsoft had these mints on the front desk as you enter the building. I popped one in my mouth as I made my way to the elevator — bad idea. As much as I’d like to pretend it didn’t happen, during our conversation a small piece of my breath mint decidedly left my mouth and took flight in June’s direction. Thanks for pretending that didn’t happen, June!

What I kept thinking was that June should have spoken way sooner than day three. In future summits, she needs to do a welcome or keynote presentation alongside, before, or after Don. She pulls you in, and I think we all might have benefited from her speaking sooner, and to everyone.

So, as you might be aware, I had been looking forward to day three for a while, because it was time to take the Verified Effective exam. During lunch on the second day, Don spoke to everyone who was going to take the exam and gave us some information we needed to know. One of the things I remembered most was that the average time to complete the exam was 37 minutes.

Sometime on Wednesday morning, I took a look at my return flight home. The time had been changed to two hours earlier than I had planned. I called the airline, and the change had been made back in January. I can take some of the blame here, but for whatever reason the notification sent to Expedia was never passed along to me. That doesn’t mean it was Expedia’s fault, but suddenly I had to scramble — a little extra stress with my test. The airline couldn’t help me, nor could the hotel with a ride; if I wanted a ride from the hotel, I would have to skip my exam. Ah! I asked the person at the hotel about a taxi, and he had a private company that could guarantee the pickup time give me a call (“I have a guy.”). I gave them my info, got a call back, and arranged my car for 3:45 p.m. — the test started at 3 p.m. (ish).

The test wasn’t hard. In fact, I know I would have completed it if I had been given more time. I left at 3:45 p.m. (and turned in what I had completed, since Jason mentioned doing that) and headed to the airport. Originally, I wasn’t going to bother turning in anything. When I stood up at 3:45 p.m., I was only the second person to do so since the exam began. The first person had been sitting next to me and left a few minutes after the exam started. No one in a room of 40-plus people was done at 3:45 p.m. Someone did come up as I was standing at the front of the exam room, having Jason copy over my exam. I’m not sure how people felt about the exam and the amount of time to complete it, but I would definitely be interested to know. It seems like nearly all of us could have used a bit longer, unless of course everyone finished after I left, but before 4 p.m.

The summit was over. I was seated in the back of my SUV transportation, headed to the airport. I was disappointed about my test, and I was still stressed — I had to get my boarding pass, drop off and pay for my luggage, get through security, and find my gate, all before my departure time. I didn’t have as much time as I wanted, but luckily I had enough to make it.

In the end, I would absolutely recommend you join us next year. Hopefully, I’ll be able to do that too, because this was an opportunity like no other I’ve yet had in my career. My prediction was correct. I look forward to continuing to script in PowerShell, to continuing to explore DSC, to continuing to answer questions on PowerShell forums, and to continuing to create tools for the community and my employer. It’s a special bunch of people, and I’m glad I got involved. PowerShell is rewarding, and one of the best things I’ve done, and will continue to do, for my career.

It was great meeting and talking to everyone! I look forward to doing it again.

Extra – PowerShell Summit North America 2015 [#7]

Read them all here: http://tommymaynard.com/extra-powershell-summit-north-america-2015-0-2015/

When the second day was done and over, I simply didn’t have the energy to do much more than get a shower and watch 30 Rock on Netflix. Yes, 30 Rock — I still have a couple seasons to go. But just because I didn’t write didn’t mean I didn’t have another rewarding day. On day two, I attended sessions such as Dave Wyatt’s Keeping Secrets… session, and the two-part combo session by Jeffrey Hicks and Lee Holmes. Each of those was great. Not only do they explain the how in everything, but they explain the why — I think that’s an important distinction, and a requirement for working with Windows PowerShell at the 400 level.

I also got to hear Jim Christopher explain SeeShell and Mike F. Robbins discuss PowerShellGet. They both had a well-thought-out flow to their topics. I recommend these two, without question. I’m looking forward to that future moment when I suddenly remember SeeShell while working on some future project. As well, I can’t wait for WMF 5 to be out of preview and ready for down-level versions of Windows. PowerShellGet, and especially the PowerShell Package Manager — previously OneGet — are going to make a huge impact on desktop administrators. Mark my words.

The final two sessions of the day were quite legendary. First, we heard Don Jones discuss what he knows about — and his predictions for — Nano Server. This was followed by a Q&A session with Jeffrey Snover. Yes, that’s as cool as it sounds — especially on a stage this small.

Speaking of Jeffrey Snover, I didn’t get to sit with him again during lunch, but our table did have the second-best guest. Don Jones decided to sit with us since “all the spots were taken at the cool kids’ table.” Of course, he was kidding — that’s at least what I’m telling myself. During the conversations with Don, he explained some of the differences between using the Microsoft campus for the summit and what they had done the previous year. He indicated that next year’s summit will be back in Washington and, as of now, is tentatively scheduled for the 4th, 5th, and 6th of April, although those dates were actually mentioned on day one. I’m hoping to be there next year, and I look forward to a venue where everything is within walking distance.

Consider how amazing this event is for a moment: in two days’ time, I sat and ate lunch with Jeffrey Snover and Don Jones. The likelihood of that at a big conference is next to zero. Besides all the great PowerShell content, there’s this possibility of actually having conversations with some of the community’s most talented and influential members.

You can watch most, if not all, of the sessions here — seriously, do it; it’s worthy of your time. I’ve been somewhat vague about the session content because my intention is more in line with convincing you to attend — to work on your PowerShell knowledge and skills daily, to be a member of the PowerShell community, and to come out and meet the rest of us. I think I’ve heard Jeffrey Snover say it twice now: it’s his favorite conference. If you’re not going to take my advice, fine, but you’ve got to take his. He’s the inventor of PowerShell; he’s the one who’s been telling us to learn PowerShell — that we’re going to need it (think: Nano Server). We’re system administrators learning DevOps skills; we’re developers thinking operations. We’re blurring the lines, and moving to the front of the line.

One of the most memorable parts of the second day was the evening out. While I could have packed up and easily sat in my hotel room for the evening — I mean, I am a geek; I have a computer — I instead took the offer from Stephen Owen to head out for dinner with a handful of other enthusiasts. Best decision of the summit. I had a great time! I sat at a table at some hip and trendy joint with Warren F, JC, Paul, and Josh Atwell, and had one of the most entertaining conversations with fellow IT people in a long time. Never mind that we didn’t talk that much shop; we laughed our asses off. Josh has to be one of the funniest people I’ve met. If you’re at the summit, do your best to get a seat next to him outside the conference. I also gathered he’s a smart guy, and an author — you go, Josh. Here’s the thing with this conference: everyone is smart. This was an elite group of people. Don’t let that scare you off, though — I was there, too.

Extra – PowerShell Summit North America 2015 [#6]

Read them all here: http://tommymaynard.com/extra-powershell-summit-north-america-2015-0-2015/

Today was the first day of my first PowerShell Summit. What an amazing opportunity. We’ll get to today, but a bit about last night first.

It started when I arrived at the Ri Ra, a downtown Irish pub and grub. I had a ride with Dave Wyatt, who, as I learned sometime between 4 and 5 p.m. today, has code from his Pester project shipping with Windows — well damn, that’s quite the accomplishment, Dave.

We spent an hour or so at the pub, where I was able to chat with Jeffrey Hicks, Richard Siddaway, Teresa Wilson, and several others. At that point in time, many of these people seemed like celebrities. They still do; however, I’ve come to realize that this summit is designed to break down whatever might separate speakers from attendees at a large conference. I’ve shaken hands with Mark Minasi and had a book signed by Mark Russin — hold on while I go figure out how to spell his name — ovich, but this was a different kind of experience. The same Jason Helmick I watched in the DSC videos earlier that day was standing over by the bar. I’ve yet to meet him personally, as well as plenty of others, but still, none of this felt real until today — like mid-morning.

Dave and I left Ri Ra after a quick decision to find a place to eat — perhaps one of the BBQ joints we saw on our walk from the garage where he parked. Sure, it’s only a couple blocks over… through. a. downpour (there had been tornado warnings). The umbrella and sweatshirt I considered bringing were in my hotel room — dry, unused, and probably grateful. When we arrived at the restaurant, I couldn’t see, as my glasses were drenched on both sides, and my clothes were soaked, too.

After an incredible meal at Queen City Q — something that was required to help keep me from thinking about having dinner with someone I just recently met while wearing clothes that felt as though they just came out of the washing machine — we headed out. Although the GPS in his rental car repeatedly lied to us, we finally made it out of downtown Charlotte. It required that we travel north, to go south. I’m glad to see he made it to the summit today, because I had my doubts about that thing.

So, today. It started off with (my second) breakfast and an opening welcome by Don Jones. Following that, I ended up attending the sessions I had originally planned to. This meant I listened to Jason Helmick discuss PowerShell Web Access, permissions, and IIS application pools, all in relation to DSC and DSC resources. I’ve yet to create my own DSC resources, but I’m looking forward to the opportunity. Jeffrey Snover said something later in the day about the impact of the community; he might be on to something.

I followed up Jason’s session by learning about Pester, monitoring, DSC and AD, and then OData. As well, I enjoyed listening to Jeffrey Hicks discuss constrained endpoints. Ever since my SharePoint constrained-endpoint project, I’ve come to really enjoy the capabilities they provide. I’m looking forward to transferring that knowledge to JEA: limiting cmdlet parameters without a proxy function sounds good to me.

Jeffrey Snover closed out the sessions in the early evening with his State of PowerShell discussion. It’s always great hearing Jeffrey speak, whether it’s at TechEd, in an online video, or at a lunch. That’s right, he also spoke at lunch — but not with everyone.

I was sitting amongst a group of other PowerShell enthusiasts. It was a full table, except for the empty chair at my right. Next thing I know, Mr. Snover sits down next to me. For 45 minutes to an hour, he told us stories, introduced us to topics he’d cover in his closing session, and answered questions from anyone at the table who spoke up. It was amazing, as was his ability to eat and chat so well — as if he’s perfected doing both at the same time. When we were rounded up to move to the after-lunch activities, he asked where I worked, and we briefly discussed Tucson, Arizona — my hometown, and a place he’s visited. It was an honor to be a part of the discussions that took place at that table. It was something I won’t soon forget, and something I didn’t see coming. Neither was being able to guess the number of stickers on Ashley McGlone’s laptop (25) — something that scored me a sticker.

To round out the evening, I chatted with Adam Bertram. It was an exceptional day and I’m so fortunate that I was a part of the community today, in person. I’m looking forward to tomorrow — round two.

Extra – PowerShell Summit North America 2015 [#5]

Read them all here: http://tommymaynard.com/extra-powershell-summit-north-america-2015-0-2015/

I said I’d write, and here I am. I’m on the first leg of my travel to Charlotte, North Carolina, which will put me in Atlanta for an hour or so — just enough time to upload this post and eat something. I hope.

Prior to boarding, I noticed that the #PSHSummit Twitter hashtag is already seeing increased usage, as many people begin their travels to the PowerShell Summit North America 2015. I can’t believe it’s already here. I’m on a full Delta flight coming out of Tucson at 6:10 a.m. That required a 4:00 a.m. start time — something I’ll end up paying for at some point today.

The plan, after briefly listening to some music, is to fire up module 6 of the Getting Started with DSC JumpStart videos: I’ve got to make the most of this time…

… time passes …

…Roughly an hour has passed since I started that module. Like all of them so far, it was beneficial. It helped solidify parameterizing configuration scripts, using configuration data, and using credentials. I hadn’t heard the DevOps reference mentioned by Jeffrey Snover before, but I like it. It’s the one that says to treat your servers like cattle… without an emotional tie. I can do that.

I’m about a half hour outside of Atlanta and our smooth ride is about to get bumpy. Time to pack up, and return to some music to assist in drowning out the engines and squeaky luggage above me…

… time passes …

…Well, I made it to Atlanta despite the rain and bumpy ride. I’ve eaten, caught up on Twitter, and now to find my gate. The long part of my trip is over; what’s left is a flight about as long as the last module in the first DSC JumpStart from Microsoft Virtual Academy.

Extra – PowerShell Summit North America 2015 [#4]

Read them all here: http://tommymaynard.com/extra-powershell-summit-north-america-2015-0-2015/

And so it begins: packing. Today is the day I have to collect everything I think I’ll need for my trip to Charlotte. It’s not a question of whether I’ll forget something; it’s what it will be this trip. Clothes, cords, cables, and countless other things all need to fly along with me from Arizona to North Carolina.

I noticed today on Twitter that people have started to use the #PSHSummit hashtag a bit more. I think that’s a great idea. It’ll give everyone the ability to share what they’re doing as we all embark on this journey and experience. As well, it’ll serve as a meeting place for hearing about topics and speakers while the event is underway.

I’ve lined up my ride from my hotel to the meet-n-greet Sunday evening a bit more securely. Dave Wyatt, a PowerShell MVP and speaker at the summit, is going to let me catch a ride with him. Dave and I have yet to meet in person, and yet we’re riding together. I think this emphasizes what this event is all about. It’s not some huge conference; it’s about community and knowledge sharing. It’s an event where a person like me — an intermediate in PowerShell on a good day — can have a speaker’s ear for 15 minutes as we drive downtown. The intentionally low number of attendees increases the learning possibilities and will help ensure that all the questions we have, as participants, can be asked and answered. Like I wrote once before: this has the potential to be the highlight of my career so far. I can’t wait to find out.

Now, to start packing. See everyone soon.

Oh, and by the way, Jeffrey Hicks followed me on Twitter today. It’s a crazy world.

An Improved Measure-Command: Multiple Commands, Multiple Repetitions, Calculated Averages, and Pauses Between Runs

Download the Measure-TMCommand function here: https://gist.github.com/tommymaynard/c97c5248d76aba08f1c8aa01096aa12b

In Windows PowerShell, there are often several ways to complete the same task. With that in mind, it makes sense that we might want to determine how long commands and scripts take to complete. Until now, Measure-Command has been the cmdlet we’ve used.

While Measure-Command has been helpful, I’ve often thought it should include some additional functionality. Therefore, I’ve written an advanced function, Measure-TMCommand, that adds all the benefits listed below:

– Continually measure the execution time of a single command and/or script, up to a user-defined number of repetitions.

– Continually measure the execution time of multiple commands and/or scripts, up to a user-defined number of repetitions.

– Calculate the average time commands and/or scripts take to execute.

– Display limited hardware information about the computer where the command and/or script is being measured.

– Optionally display the output of the command and/or script, as well as the measurement results.
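
To make the core idea concrete, here’s a minimal sketch using only the built-in Measure-Command and Measure-Object cmdlets; Measure-TMCommand wraps this basic pattern with the features listed above:

# Measure one command several times, then average the results.
$Times = 1..5 | ForEach-Object {
    (Measure-Command -Expression {Get-Service -Name 'BITS'}).TotalMilliseconds
}
($Times | Measure-Object -Average).Average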

Updated 4/15/2015 (v1.2.1): Added a -TimeInBetweenSeconds parameter with a parameter alias of -Pause. This pauses the function between executions, allowing you to repeat the measurement at a set interval. For instance, let’s say you want to measure a command every half hour for six hours: 12 repetitions with 30-minute pauses. You would run the command with the -Repetitions parameter set to 12 and -TimeInBetweenSeconds (or -Pause) set to 1800 (as in 1800 seconds, or 30 minutes).
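
That scenario might look something like the following (note: the -Command parameter name is my assumption; check the downloaded function for its actual parameter names):

PS C:\> Measure-TMCommand -Command {Get-Service -Name 'BITS'} -Repetitions 12 -TimeInBetweenSeconds 1800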

Here’s the function in action:

[Image: Measure-TMCommand 1.2.1 output, comparing Get-Service -Name against Get-Service piped to Where-Object]

In the example above, we can easily determine that using the -Name parameter of the Get-Service cmdlet is faster than piping the entire result set to the Where-Object cmdlet and then filtering on the name. Notice that not all properties were returned — only the ones in which I was interested.

With the addition of the -TimeInBetweenSeconds, or -Pause, parameter, I have considered that this function might be better served by also having an -AsJob parameter. I’ll look into it, but no promises. Thanks, and enjoy.

Script Sharing – Determine the Node to GUID Mapping

If you’ve been following my recent posts, you know that the PowerShell Summit North America 2015 is only days away. I’ve used April to learn (as much as I can) about DSC. I wasn’t completely new to it — I’ve been following along for a bit — but there was still plenty I didn’t know, or at least hadn’t experienced hands-on. Anyway, I’m doing whatever I can to absorb as much DSC knowledge as possible before next week.

While I only have a single target node at this point, I stopped and wondered how obnoxious it might be to determine the GUID-to-node mapping when I update a configuration script. I know I can get it from the target node by using Get-DscLocalConfigurationManager, but what’s an easier way to get them all at once? While I could query all the nodes, I figured I could also query my MOFs’ directory, provided we trust that source — and why shouldn’t we?
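
For reference, querying a single node looks something like this; the ConfigurationID property of the LCM settings holds the GUID (the computer name below is just an example):

PS C:\> (Get-DscLocalConfigurationManager -CimSession 'serverX.mydomain.com').ConfigurationID
2766ffba-0c66-4358-8426-1a216c2b9d25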

I wrote out a quick-and-dirty function that I’ve included below. Point this function at the MOFs’ directory on your DSC Pull Server and, ta-da, it’ll create a PSCustomObject with your node names and matching GUIDs.

Disclaimer: I’m still learning DSC and may one day realize this function was a waste of time.

Function Get-TMDSCGuid {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory = $true)]
        [string]$Path
    )

    Begin {
        Write-Verbose -Message 'Collecting MOF files.'
        $NoTargetNode = '' # Initialize, so += in the Process block appends to a string.
        try {
            $Files = Get-ChildItem -Path $Path -Filter '*.mof' -ErrorAction Stop | Select-Object -Property *
        } catch [System.Management.Automation.ItemNotFoundException] {
            Write-Warning -Message "This path does not exist: $Path"
        }
    } # End Begin.

    Process {
        If ($Files) {
            Write-Verbose -Message 'Checking MOF files.'
            Write-Verbose -Message 'Writing ComputerName-GUID Mappings.'
            ForEach ($File in $Files) {
                $ComputerName = $null
                try {
                    $ComputerName = (Get-Content -Path $File.FullName |
                        Where-Object {$PSItem -like '@TargetNode*'}).Split('=')[-1].Trim("'")
                } catch {
                    $NoTargetNode += "$($File.Name);"
                } # End Try-Catch.

                If ($ComputerName) {
                    $Object = [pscustomobject]@{
                        ComputerName = $ComputerName
                        Guid = $File.BaseName
                    }
                    Write-Output $Object
                } # End If.
            } # End ForEach.

            If ($NoTargetNode) {
                Write-Verbose -Message "---MOF Files without @TargetNode section---"
                $NoTargetNode = ($NoTargetNode.Trim(';')).Split(';')
                ForEach ($Node in $NoTargetNode) {
                    Write-Verbose -Message ($Node)
                } # End ForEach.
            } # End If.
        } Else {
            Write-Verbose -Message 'Cannot locate any MOF files.'
        }
    } # End Process.

    End {
        Write-Verbose -Message 'Function is done running.'
    } # End End.
} # End Function.

Below is an example of the output that will be displayed when we invoke the function against the directory that holds our <guid>.mof files for our DSC Pull Server.
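
An invocation might look like this (the path below is the pull server’s default configuration directory; point it wherever your <guid>.mof files actually live):

PS C:\> Get-TMDSCGuid -Path 'C:\Program Files\WindowsPowerShell\DscService\Configuration'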

ComputerName                                                       Guid
------------                                                       ----
serverX.mydomain.com                                               2766ffba-0c66-4358-8426-1a216c2b9d25
serverY.mydomain.com                                               7699bbcd-1a32-1429-9831-0f197d3a9b14
serverZ.mydomain.com                                               3224abda-2b41-4925-2948-4c317a1c0a54