Tag Archives: Invoke-RESTMethod

Coding Novice, APIs, and PowerShell

I read a recent post on the technical writing subreddit, “How proficient in coding do you have to be to write API Documentation?” I jumped in and posted, as technology is my jam, and writing is my passion.

The author wanted to know if they need to know how to program to make use of an API—an Application Programming Interface. I don’t think so. I have well over 10 years of learning and working with PowerShell, and I don’t think anyone needs that to use an API. Maybe there will be a few things to learn, but not all of it. Why would you even want to focus on one language, when so many can make API calls? I think time would be better spent learning and understanding APIs before learning any one language. The languages the thread’s author mentioned were “HTML, CSS, JavaScript, and/or Python.” I could get some hate for this, but when I learned HTML and CSS I never once thought they were programming languages. I find it strange when people think they are. They’re markup languages, like Markdown: do this to that, place that there, make this bold, etc. There’s no looping or conditional logic—I just never thought of either of those two that way.

My thought was to teach a bit about APIs with some help—the video linked at the end of this paragraph is great—and then assume the reader has absolutely no experience with PowerShell. It can easily be installed on any operating system: Windows, macOS, or Linux. Once you have it installed [1], let’s watch the below video. It is quick and painless and uses an easily understood analogy for APIs and their requests and responses. I’m going to assume that you, as the reader, have watched the video when I begin the next paragraph.

[1] If you’re on Windows, you can use the built-in Windows PowerShell if you’d rather, and not download and install PowerShell (yes, they’re different). If you’re on a Mac you can use the Terminal. If you’re on Linux you can use its built-in terminal, as well. Again, this is if you don’t want to use PowerShell for some strange reason.

Regardless of how you move forward, you know enough about APIs for now. An API is a way in which you can send a request to, and get a response from, a web service. I absolutely love when I hear that a company is API-first. For instance, my understanding is that AWS, or Amazon Web Services (Amazon’s cloud computing platform), is API-first. They build the API, then they build a UI or a website that calls the APIs. Then, they can write PowerShell and other CLIs (command-line interfaces) that interact with the exact same APIs. Everything uses the same backend to get its results. If everything is API-driven, then there are numerous ways to do the same thing, to gather the same information. If there’s an API to start a virtual computer in their cloud, then you could do it from Python, PowerShell, Bash, and from the AWS Management Console—the AWS website. There are probably plenty of other languages that can interface with the exact same API, so why focus on any one single language? This is the point. Expose something in such a way that the language or shell program making the request doesn’t matter.

Before we use PowerShell and the Invoke-RestMethod command, or Bash and the cURL command, or whatever else you’ve chosen, let’s install Postman [2]. It can feel a little overwhelming, even for me, but we’re only going to use a small portion of it. You may have to create an account these days, and that’s perfectly fine to do. Once it’s installed, open it, sign in, etc. and let’s get to the main screen. You’ll probably begin in some default workspace. In my installation of Postman, I created a new workspace just for things related to my site and named it tommymaynard.com.

[2] While I’ve never used it, it appears that there’s a web-based version of Postman. I’m sure it’ll require registration, but it won’t require download and installation. My instructions will be for the installed version; however, I suspect they’ll work similarly. Your choice.

Toward the top-left should be a plus (+) that allows you to create a collection. Think of a collection as a folder where we can separate like requests. It’s much like workspaces in that, at minimum, it’s a means of separation, as well.

When given an option to name it, call it “ipify.org.” Once the collection is created, click the greater-than sign (>) to open the collection. There ought to be a message that says, “The collection is empty. Add a request to start working.” Click “Add a request” and rename the request to “IP address.” Now, take the below URL and copy and paste it where it says, “Enter request URL.”

https://api.ipify.org

When that’s in place, click Send on the far right. This will place your public-facing IP address in the lower, response pane. Again, think request (top pane) and response (lower pane). We send the waiter in, the waiter works with whatever is back in the kitchen—we don’t care—and then something comes back to us, at the table.

How did I know about this API, right? There are so many of them. There’s probably at least one for everything. At some point, you, like me, might be irritated you actually have to go to some website to find something out. If I could use Postman, or PowerShell, to securely check my bank balance, I would. Same interface for everything, please. I hate navigating 15 totally different websites; everything’s in a different place. It’s so annoying. APIs mean we don’t have to care where someone put something on their website.

I went off on a small tangent there. Anyway, APIs are most often documented, and the one we just used is no exception. Take a look at the ipify.org website. It’s nothing but a shrine to how this API works. When there, scroll down to the “Code samples” section. There are 29—yes, twenty-nine—different code examples: PowerShell, Python, PHP, Bash, Ruby, NodeJS, Go, C#, and more. Why would anyone learn one language? Postman isn’t even a language; do you even need a language? Postman may be all you need to write API documentation. You don’t have to be a coder to do this.

Let’s move back up on the ipify.org web page and take a look at the API Usage section for IPv4. There’s an option to include a query parameter. See if you can spot what’s different in the below image from what we saw previously. Right-click on our ipify.org collection and choose “Add Request” and see if you can’t rename it to “IP address (JSON).” Once your Postman looks like the below image, hit Send. Your public-facing IP address should appear in the response pane, too.

This request will include a query parameter inside the address. This is notated with the question mark (?), followed by a key-value pair. In our instance, the key-value pair consists of format (the key) and json (the value). When this runs, we get back more than just text; instead, we get a JSON object. Some systems that make this request might prefer that it’s already formatted in JSON. It’s a data structure; it’s how—and this is going to sound crazy—we want the data to be structured. Different systems want certain things a certain way, and so, if the API allows for it, we can get it back (in our response) the way we want it, without any manipulation on our part. Think about your food in that restaurant. Wouldn’t you prefer the spices are added to the entree while it’s cooking versus you adding them at the table? It’s more work for you, and it may not turn out as well.
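
If you can’t view the image, the difference is simply the address: the same URL with the query parameter appended.

https://api.ipify.org?format=json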

My computer has Windows PowerShell, PowerShell, Git Bash, and using Windows Subsystem for Linux (WSL), Ubuntu as well. In the below images, we’ll run through making use of this API in nearly all of them—we’ll skip Windows PowerShell. Before we take a look at that, however, take a look at this.

It’s over at the far right of Postman. This amazing piece of software just gave you all you need to invoke this API in multiple languages, shells, etc. The cURL command is what can be used in the Mac Terminal or in a Linux Bash shell.

So, let’s try it. Here are a few API requests and responses from my machine. WSL is first, followed by Git Bash, followed by my favorite, PowerShell. The first two don’t use the format query parameter, while the last one does. A note about that after the images, however.

The PowerShell Invoke-RestMethod command was written to take returned JSON objects and automatically convert them into PowerShell objects. Therefore, the second command, ConvertTo-Json, is included to convert it back into JSON. So, that’s what’s happening there.
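
Put differently, here’s a sketch of what that last request looks like in PowerShell; the IP address below is a placeholder, as yours will differ.

# Invoke-RestMethod deserializes the JSON response into a PowerShell object...
# ...so ConvertTo-Json is needed to see JSON text again.
Invoke-RestMethod -Uri 'https://api.ipify.org?format=json' | ConvertTo-Json
{
  "ip": "93.184.216.34"
}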

Why you’d learn an entire language to do this, I have no idea. Learn as much as you need now, and then go back and fill in what you want, and what becomes necessary to know, at a later date. And this is from someone who hates only knowing parts of things. I also understand time constraints, and that there’s a better way to get up to speed with APIs than to learn an entire programming language, or two, or a couple of formatting languages. Focus on learning more about APIs for now; there’s plenty more to know than what we covered here.

GitHub Rate Limit REST API, JSON, and Epoch Time

Note: If you read this post and grab some examples from it, make sure you read to the bottom. I wrote my functions using a GitHub-deprecated JSON object up until the final function. You have been warned.

I am working with the GitHub REST API currently, and I learned something new. Why not share it and create a resource for myself for the moment in time when I need a refresher?

Unauthenticated API requests to GitHub are limited to 60 per hour. I did some of my own checks, but there is documentation around this. At home, which is where I work these days, the private/NAT’ed IPs on two different computers, behind my public IP address, were two different devices to GitHub. This was not a huge surprise, but good to know, nonetheless.

Let’s take the GitHub rate limit API for a spin. This API is not rate-limited, which means you can call it as many times as you would like in an hour in an unauthenticated manner. GitHub writes, “Accessing this endpoint does not count against your REST API rate limit.” In the below example, we assign a $Uri and a $Headers variable and use those as a part of our Invoke-RestMethod command, which stores the results in the $Json variable.

$Uri = 'https://api.github.com/rate_limit'
$Headers = @{Accept = 'application/vnd.github.v3+json'}
$Json = Invoke-RestMethod -Uri $Uri -Headers $Headers | ConvertTo-Json
$Json

Once that is done, we output the contents stored in the $Json variable. Let’s chat about some of the data returned in this JSON object. Beneath resources and then core, we have limit, remaining, and used. Limit is how many unauthenticated requests we have (per hour), remaining is how many of 60 we have left in the hour, and used is how many API calls we have already used this hour. Therefore, if we had 59 remaining, we would have 1 used. Toward the bottom, we have rate. The content stored in this object is identical to what is stored in resources.core.

{
  "resources": {
    "core": {
      "limit": 60,        
      "remaining": 60,    
      "reset": 1646979987,
      "used": 0,
      "resource": "core"  
    },
    "graphql": {
      "limit": 0,
      "remaining": 0,     
      "reset": 1646979987,
      "used": 0,
      "resource": "graphql"
    },
    "integration_manifest": {
      "limit": 5000,
      "remaining": 5000,
      "reset": 1646979987,
      "used": 0,
      "resource": "integration_manifest"
    },
    "search": {
      "limit": 10,
      "remaining": 10,
      "reset": 1646976447,
      "used": 0,
      "resource": "search"
    }
  },
  "rate": {
    "limit": 60,
    "remaining": 60,
    "reset": 1646979987,
    "used": 0,
    "resource": "core"
  }
}

In the above Invoke-RestMethod command, we piped the output to ConvertTo-Json in order to see the JSON object. Below, we will not do that, so that we can see the PowerShell object and work with its properties. Remember, this command is quite powerful due to its built-in ability to deserialize returned content into something more usable by individuals working with PowerShell.

Without viewing the nested properties, we are presented with the resources and rate properties. This should seem familiar from the above JSON. We can return the rate property, or the resources.core property, and get to the limit, remaining, and used properties, as well.

$NotJson = Invoke-RestMethod -Uri $Uri -Method Get -Headers $Headers
$NotJson

resources                                          rate
---------                                          ----
@{core=; graphql=; integration_manifest=; search=} @{limit=60; remaining=60; reset=1646692433; used=0; resource=core}

$NotJson.rate

limit     : 60
remaining : 60
reset     : 1646980018
used      : 0
resource  : core
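
If you want to confirm the deserialization for yourself, compare the types of the two variables from above:

$Json.GetType().Name     # String, because of the ConvertTo-Json pipe
$NotJson.GetType().Name  # PSCustomObject, deserialized by Invoke-RestMethod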

There is another property that we have yet to discuss, and that is the reset property. This value is set on the first API call. It is set 60 minutes into the future, so we know when our remaining value will return to 60, and our used value will return to 0.

This property is recorded and stored as Epoch time (or Unix time, or Unix epoch). It is the number of seconds since January 1, 1970, 00:00:00 UTC. The below PowerShell will convert the Epoch time to a human-readable form. We will use this further below, inside a function, to see all of this working together.

((Get-Date 01.01.1970)+([System.TimeSpan]::fromseconds(1646980018))).ToLocalTime()
Thursday, March 10, 2022 11:26:58 PM
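
As an aside, on versions of PowerShell whose underlying .NET includes it, the System.DateTimeOffset type offers a tidier route to the same result:

# Convert epoch seconds to a local DateTime without the manual addition.
[System.DateTimeOffset]::FromUnixTimeSeconds(1646980018).LocalDateTime
Thursday, March 10, 2022 11:26:58 PM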

Like I often do, I created a function, which I will include below. I will not do a line-by-line explanation. I will discuss it, however. It accepts three parameters. One is the GitHub rate limit URI, one is the Headers we send in with each API call, and the final parameter is a URI of one of my projects on GitHub. Following the parameters, when the function is invoked, it will execute a rate limit GitHub API call and return some data, execute a GitHub API call against my project (with Out-Null [so no output]), and then execute a rate limit GitHub API call and return some data again. This allows us to see the before rate limit information and the after rate limit information. Take a look at the example beneath the function, after you have looked it over.

function Watch-GitHubRateLimit {
	[CmdletBinding()]
	Param (
		[Parameter()]
		$GHRateLimitUri = 'https://api.github.com/rate_limit',
		[Parameter()]
		$Headers = @{Accept = 'application/vnd.github.v3+json'},
		[Parameter()]
		$GHGeneralUri = 'https://api.github.com/repos/tommymaynard/TMOutput'
	)

	$Json = Invoke-RestMethod -Uri $GHRateLimitUri -Headers $Headers
	[PSCustomObject]@{
		Invocation = 'Before'
		Limit = $Json.Rate.Limit
		Remaining = $Json.Rate.Remaining
		ResetEpoch = $Json.Rate.Reset
		ResetReadable = ((Get-Date -Date 01-01-1970)+
			([System.TimeSpan]::fromseconds($Json.rate.reset ))).ToLocalTime()
		Used = $Json.Rate.Used
	}
	
	Invoke-RestMethod -Uri $GHGeneralUri -Headers $Headers | Out-Null

	$Json = Invoke-RestMethod -Uri $GHRateLimitUri -Headers $Headers
	[PSCustomObject]@{
		Invocation = 'After'
		Limit = $Json.Rate.Limit
		Remaining = $Json.Rate.Remaining
		ResetEpoch = $Json.Rate.Reset
		ResetReadable = ((Get-Date -Date 01-01-1970)+
			([System.TimeSpan]::fromseconds($Json.Rate.Reset ))).ToLocalTime()
		Used = $Json.Rate.Used
	}
}
Watch-GitHubRateLimit
Invocation    : Before
Limit         : 60
Remaining     : 60
ResetEpoch    : 1646980115
ResetReadable : 3/10/2022 11:28:35 PM
Used          : 0

Invocation    : After
Limit         : 60
Remaining     : 59
ResetEpoch    : 1646980115
ResetReadable : 3/10/2022 11:28:35 PM
Used          : 1

Now stop and go back and look at the above function. What is wrong with that? Forget GitHub and API calls (kind of). How could that function be improved?

Figure it out? Do you see all that duplicated code? Yuck. Looking at something like this should trigger a feeling that something is wrong. What we need, instead of all that duplicated code, is a nested function. I have a rewrite below. Take a look and then view the next two examples.

function Watch-GitHubRateLimit {
	[CmdletBinding()]
	Param (
		[Parameter()]
		$GHRateLimitUri = 'https://api.github.com/rate_limit',
		[Parameter()]
		$Headers = @{Accept = 'application/vnd.github.v3+json'},
		[Parameter()]
		$GHGeneralUri = 'https://api.github.com/repos/tommymaynard/TMOutput'
	)

	function Get-GitHubRateLimitStats {
		param ($Invocation)
		$Json = Invoke-RestMethod -Uri $GHRateLimitUri -Headers $Headers
		[PSCustomObject]@{
			Invocation = $Invocation
			Limit = $Json.Rate.Limit
			Remaining = $Json.Rate.Remaining
			ResetEpoch = $Json.Rate.Reset
			ResetReadable = ((Get-Date -Date 01-01-1970)+
				([System.TimeSpan]::fromseconds($Json.Rate.Reset ))).ToLocalTime()
			Used = $Json.Rate.Used
		}
	}

	Get-GitHubRateLimitStats -Invocation 'Before'
	Invoke-RestMethod -Uri $GHGeneralUri -Headers $Headers | Out-Null
	Get-GitHubRateLimitStats -Invocation 'After'
}
Watch-GitHubRateLimit
Invocation    : Before
Limit         : 60
Remaining     : 59
ResetEpoch    : 1646980116
ResetReadable : 3/10/2022 11:28:36 PM
Used          : 1

Invocation    : After
Limit         : 60
Remaining     : 58
ResetEpoch    : 1646980116
ResetReadable : 3/10/2022 11:28:36 PM
Used          : 2

Much better, but then I read this: “Note: The rate object is deprecated. If you’re writing new API client code or updating existing code, you should use the core object instead of the rate object. The core object contains the same information that is present in the rate object.” This is taken from GitHub’s rate limit page. Now that we know this, let’s update the final function so it uses the correct object.

function Watch-GitHubRateLimit {
	[CmdletBinding()]
	Param (
		[Parameter()]
		$GHRateLimitUri = 'https://api.github.com/rate_limit',
		[Parameter()]
		$Headers = @{Accept = 'application/vnd.github.v3+json'},
		[Parameter()]
		$GHGeneralUri = 'https://api.github.com/repos/tommymaynard/TMOutput'
	)

	function Get-GitHubRateLimitStats {
		param ($Invocation)
		$Json = Invoke-RestMethod -Uri $GHRateLimitUri -Headers $Headers
		[PSCustomObject]@{
			Invocation = $Invocation
			Limit = $Json.Resources.Core.Limit
			Remaining = $Json.Resources.Core.Remaining
			ResetEpoch = $Json.Resources.Core.Reset
			ResetReadable = ((Get-Date -Date 01-01-1970)+
				([System.TimeSpan]::fromseconds($Json.Resources.Core.Reset ))).ToLocalTime()
			Used = $Json.Resources.Core.Used
		}
	}

	Get-GitHubRateLimitStats -Invocation 'Before'
	Invoke-RestMethod -Uri $GHGeneralUri -Headers $Headers | Out-Null
	Get-GitHubRateLimitStats -Invocation 'After'
}
Watch-GitHubRateLimit
Invocation    : Before
Limit         : 60
Remaining     : 58
ResetEpoch    : 1646980116
ResetReadable : 3/10/2022 11:28:36 PM
Used          : 2

Invocation    : After
Limit         : 60
Remaining     : 57
ResetEpoch    : 1646980116
ResetReadable : 3/10/2022 11:28:36 PM
Used          : 3

I know how I feel about a post as I am writing it, and those feelings stay consistent until the end. I like what I have written here. There is a great opportunity to learn about working with APIs and even some function writing. I hope it is helpful for others, obviously (as that is kind of what I do). The takeaway, though, is that if there is an API for something, work with it. It is fascinating to gather information from websites and web services via PowerShell. One day you will be asked to do it, so you might as well get some experience now.

AWS Vendor-Written Generated Code


Notice: The following post was originally published on another website. As the post is no longer accessible, it is being republished here on tommymaynard.com. The post was originally published on September 13, 2019.


Sometimes you read an error message, or in this case, come across some vendor-written code that you can’t find anywhere else on the Internet. It’s been years, but once PowerShell generated an error I had never seen. I couldn’t find a hit for it online anywhere either. I felt that once I had figured out the problem behind that error message, it was my duty to write about it—to help get that error message picked up by search engines, as well as my experience. I feel nearly the same way about the below code to which I was recently introduced, written by AWS, or Amazon Web Services. I’ll share it now. Just a note. This is exactly how it was found. There were no indentations. I’m not too concerned about the lack of indentations—I don’t have to stare at it each day. Perhaps it’s generated by something that may not be preserving the spacing/tabs. That’s just a guess.

try { 
Get-Service Ec2Config 
$EC2SettingsFile='C:\Program Files\Amazon\Ec2ConfigService\Settings\Config.xml' 
$xml = [xml](get-content $EC2SettingsFile) 
$xmlElement = $xml.get_DocumentElement() 
$xmlElementToModify = $xmlElement.Plugins 
foreach ($element in $xmlElementToModify.Plugin){ 
if ($element.name -eq 'Ec2SetPassword') {$element.State='Enabled'} 
elseif ($element.name -eq 'Ec2HandleUserData') {$element.State='Enabled'} 
elseif ($element.name -eq 'Ec2DynamicBootVolumeSize') {$element.State='Enabled'} 
} 
$xml.Save($EC2SettingsFile) 
} 
catch 
{ 
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy Bypass -File 'C:\ProgramData\Amazon\EC2-Windows\Launch\Scripts\InitializeInstance.ps1' -Schedule 
} 
finally 
{ 
New-Item -Path HKLM:\Software\Amazon -Name WarmBoot 
Invoke-Expression -Command:'shutdown.exe /s /t 0' 
} 
while ($true){ 
}

It’s rare that I ever manually launch an AWS EC2 instance (a virtual server). Well, I was doing that recently for some quick testing, and my UserData PowerShell script was not landing in the C:\Program Files\Amazon\Ec2ConfigService\Scripts\UserScript.ps1 file on my Windows Server 2012 R2 instance, as it should have been. I was doing something wrong, and it wasn’t clear to me soon enough what it was. Before we get to that, let’s discuss what we have here. We have a try-catch language construct. I know from my AWS experience that most of what’s going on in the try block was done for Windows Server 2012 R2 and newer. I also know that what’s in the catch block is how we ensure UserData is enabled in Windows Server 2016 and later. AWS couldn’t take my UserData and drop it on this instance. Instead, I got this code in its place. Ugh.

This code also includes the finally block. That code is run regardless of whether the try or catch block fired. The code creates a key in the Windows Registry and then restarts the computer using Invoke-Expression—interesting choice. It’s always fun to see vendor code. It closes with an empty While language construct. While $true is $true, this While loop will run—and do absolutely nothing. It will do that successfully, however.

Again, my UserData PowerShell wasn’t getting into this UserScript file. It was, however, available by “navigating” to 169.254.169.254 on the EC2 instance.

Invoke-RestMethod -Uri http://169.254.169.254/latest/user-data
# >> Add function to memory.
Function Set-SystemForNextRun {
...
    Set-SystemForNextRun -CodeSectionComplete 2
} # End If-ElseIf.

The problem was that my code didn’t have the begin and end PowerShell tags. In order for the UserScript.ps1 file to be populated with my code and not this code from Amazon, I needed to ensure I was including everything required of me. It seems like something in the AWS Management Console (the web front end) could’ve notified me after looking at my code, but before moving to the next step in manually building my instance. Or, less helpfully, it could write something else into the UserScript.ps1 file. They could’ve just started their code with a comment to tell me I didn’t follow the directions. I’ve used UserData in CloudFormation; I know these tags are required.

<#
This doesn't look right, does it?
Did you remember to use script or
powershell start and end tags?
#>

Anyway, once I enclosed my PowerShell in the proper tags, it worked, and I moved on. And by moved on, I mean I found another problem to consume me—as is typical in this industry. It should’ve looked like this from the start. Ugh.

Invoke-RestMethod -Uri http://169.254.169.254/latest/user-data
<powershell>
# >> Add function to memory.
Function Set-SystemForNextRun {
...
    Set-SystemForNextRun -CodeSectionComplete 2
} # End If-ElseIf.
</powershell>

Things changed between Windows Server 2012 R2 and 2016/2019. I’m not exactly sure where the UserData code ends up in the newer OSes, if it even ends up in a file on disk as it did previously.

Invoke-RestMethod with a SOAP API

Did it again. I’m back here this evening to write another post after authoring some PoC code for a project. I’ve got to put it somewhere for later. In doing that, there’s no reason it shouldn’t be content that others can consume.

In the past, I have absolutely written PowerShell to interact with REST (REpresentational State Transfer) APIs. What I haven’t done until now is interact with SOAP (Simple Object Access Protocol) APIs. The below example is a full-featured PowerShell function. Its purpose, with the help of W3Schools’ public SOAP API, is to convert Fahrenheit temperatures to Celsius and vice versa. Typical stuff, except that I’m not doing the calculations. Instead, I’m sending off a temperature to an API to be converted. One of the big differences between REST and SOAP is the content of the payload. With SOAP we’re sending XML, whereas with REST, we’d likely send JSON.

While you can inspect the code yourself, I will at minimum tell you how the function can be invoked. Its name is Convert-Temperature and it includes two parameters: Temperature and TemperatureScale. Enter an integer for the Temperature parameter and either Fahrenheit or Celsius for the TemperatureScale parameter. It’ll send off the information as XML to the API. The returned value will be extracted from the returned XML and rounded to two decimal places. I was going to make the rounding optional but I obviously changed my mind. I didn’t find any value in allowing a user to make this determination.

Here are a couple of invocation examples first, and then, the code. Maybe this will be helpful for you. I suspect it will be for me in time, as I’m going to likely have to make use of a SOAP API.

[PS7.1.0] [C:\] Convert-Temperature -Temperature 212 -TemperatureScale Fahrenheit

TemperatureScale Temperature ConvertedTemperature
---------------- ----------- --------------------
Fahrenheit               212                  100

[PS7.1.0] [C:\] Convert-Temperature -Temperature 100 -TemperatureScale Celsius

TemperatureScale Temperature ConvertedTemperature
---------------- ----------- --------------------
Celsius                  100                  212
Function Convert-Temperature {
	[CmdletBinding()]
	Param (
		[Parameter(Mandatory)]
		[int]$Temperature,

		[Parameter(Mandatory)]
		[ValidateSet('Fahrenheit','Celsius')]
		[string]$TemperatureScale
	) # End Param

	Begin {
		$Headers = @{'Content-Type' = 'text/xml'}
	} # End Begin.

	Process {
		If ($TemperatureScale -eq 'Fahrenheit') {
			$Body = @"
<soap12:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap12="http://www.w3.org/2003/05/soap-envelope">
	<soap12:Body>
		<FahrenheitToCelsius xmlns="https://www.w3schools.com/xml/">
			<Fahrenheit>$Temperature</Fahrenheit>
		</FahrenheitToCelsius>
	</soap12:Body>
</soap12:Envelope>
"@
		} Else {
			$Body = @"
<soap12:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap12="http://www.w3.org/2003/05/soap-envelope">
	<soap12:Body>
		<CelsiusToFahrenheit xmlns="https://www.w3schools.com/xml/">
			<Celsius>$Temperature</Celsius>
		</CelsiusToFahrenheit>
	</soap12:Body>
</soap12:Envelope>
"@
		} # End If-Else.
	} # End Process.

	End {
		$Response = Invoke-RestMethod -Uri 'https://www.w3schools.com/xml/tempconvert.asmx' -Method 'POST' -Headers $Headers -Body $Body
		If ($TemperatureScale -eq 'Fahrenheit') {
			$ConvertedTemperature = [System.Math]::Round($Response.Envelope.Body.FahrenheitToCelsiusResponse.FahrenheitToCelsiusResult,2)
		} Else {
			$ConvertedTemperature = ([System.Math]::Round($Response.Envelope.Body.CelsiusToFahrenheitResponse.CelsiusToFahrenheitResult,2))
		} # End If-Else.

		[PSCustomObject]@{
			TemperatureScale = $TemperatureScale
			Temperature = $Temperature
			ConvertedTemperature = $ConvertedTemperature
		}
	} # End End.
} # End Function: Convert-Temperature.

Part V: Splunk, HEC, Indexer Acknowledgement, and PowerShell

In the last four parts of this series (Part I, Part II, Part III, Part IV), we discussed sending telemetry data to Splunk using HEC (HTTP Event Collector). This requires no software to be installed. We can send data for ingestion to Splunk using REST and a REST endpoint. In the previous four parts, we’ve included indexer acknowledgement. This setup uses a random GUID we create and send to Splunk in our initial connection using Invoke-RestMethod.

Then, in additional requests, we use the GUID to continually poll Splunk to determine whether the data has been more than just received—that it’s actually being processed. There was much too much delay for me to consider its use, and so I disabled indexer acknowledgment. In this post, we’ll take our final code from Part IV, below, and remove all the parts that were in place for indexer acknowledgment. This greatly reduces the amount of code and overall complexity seen in the previous parts of this series. Compare the two code examples below as we wrap up the series. Hopefully, if you were looking for ways to send data to Splunk using PowerShell, you found this series of articles in time for them to be helpful. If you’ve got Splunk available to you, then don’t forget that you have a place where data can be sent and stored after it’s been collected with PowerShell. And it’s not just about data storage. In fact, that’s such a small portion of the benefits of Splunk. If you’re collecting good data, then there’s nothing you can’t find by searching the data.

With Indexer Acknowledgement

#region: Read .env file and create environment variables.
$FilterScript = {$_ -ne '' -and $_ -notmatch '^#'}
$Content = Get-Content -Path 'C:\Users\tommymaynard\Documents\_Repos\code\functiontemplate\random\telemetry\.env' | Where-Object -FilterScript $FilterScript
If ($Content) {
    Foreach ($Line in $Content) {
        $KVP = $Line -split '=',2; $Key = $KVP[0].Trim(); $Value = $KVP[1].Trim()
        Write-Verbose -Message "Adding an environment variable: `$env`:$Key."
        [Environment]::SetEnvironmentVariable($Key,$Value,'Process')
    } # End Foreach.
} Else {
    '$Content variable was not set. Do things without the file/the environment variables.'
}   # End If.
#endregion.
 
#region: Read clixml .xml file and obtain hashtable.
$HashTablePath = 'C:\Users\tommymaynard\Documents\_Repos\code\functiontemplate\random\telemetry\eventhashtable.xml'
$EventHashTable = Import-Clixml -Path $HashTablePath
#endregion.
 
#region: Create Splunk / Invoke-RestMethod variables.
$EventUri = $env:SplunkUrl + '/services/collector/event'
$AckUri = $env:SplunkUrl + '/services/collector/ack'
$ChannelIdentifier = (New-Guid).Guid
$Headers = @{Authorization = "Splunk $env:SplunkHECToken"; 'X-Splunk-Request-Channel' = $ChannelIdentifier}
$Body = ConvertTo-Json -InputObject $EventHashTable
$HttpRequestEventParams = @{URI = $EventUri; Method = 'POST'; Headers = $Headers; Body = $Body}
#endregion.
 
#region: Make requests to Splunk REST web services.
$Ack = Invoke-RestMethod @HttpRequestEventParams -StatusCodeVariable StatusCode -Verbose
$StatusCode
$AckBody = "{`"acks`": [$($Ack.ackId)]}"
$HttpRequestAckParams = @{URI = $AckUri; Method = 'POST'; Headers = $Headers; Body = $AckBody}
 
Measure-Command -Expression {
    Do {
        $AckResponse = Invoke-RestMethod @HttpRequestAckParams -Verbose
        $AckResponse.acks.0
        Start-Sleep -Seconds 30
    } Until ($AckResponse.acks.0 -eq $true)
} # End Measure-Command
#endregion.

Without Indexer Acknowledgement

#region: Read .env file and create environment variables.
$FilterScript = {$_ -ne '' -and $_ -notmatch '^#'}
$Content = Get-Content -Path 'C:\Users\tommymaynard\Documents\_Repos\code\functiontemplate\random\telemetry\.env' | Where-Object -FilterScript $FilterScript
If ($Content) {
    Foreach ($Line in $Content) {
        $KVP = $Line -split '=',2; $Key = $KVP[0].Trim(); $Value = $KVP[1].Trim()
        Write-Verbose -Message "Adding an environment variable: `$env`:$Key."
        [Environment]::SetEnvironmentVariable($Key,$Value,'Process')
    } # End Foreach.
} Else {
    '$Content variable was not set. Do things without the file/the environment variables.'
}   # End If.
#endregion.
 
#region: Read clixml .xml file and obtain hashtable.
$HashTablePath = 'C:\Users\tmaynard\Documents\_Repos\code\functiontemplate\random\telemetry\eventhashtable.xml'
$EventHashTable = Import-Clixml -Path $HashTablePath
#endregion.
 
#region: Create Splunk / Invoke-RestMethod variables.
$EventUri = $env:SplunkUrl + '/services/collector/event'
$Headers = @{Authorization = "Splunk $env:SplunkHECToken"}
$Body = ConvertTo-Json -InputObject $EventHashTable
$HttpRequestEventParams = @{URI = $EventUri; Method = 'POST'; Headers = $Headers; Body = $Body}
#endregion.
 
#region: Make requests to Splunk REST web services.
Invoke-RestMethod @HttpRequestEventParams -StatusCodeVariable StatusCode -Verbose
$StatusCode

Part IV: Splunk, HEC, Indexer Acknowledgement, and PowerShell

And now, Part IV! Like I’ve mentioned previously, please do yourself a favor and read the previous parts of this series of posts: Part I, Part II, and Part III. This won’t make sense without them. It may not make sense with them, and if that turns out to be true, then let me know by leaving a comment, or reaching me on Twitter. I know it’s possible that Splunk feels like a foreign language. What I know, I learned over the course of a few weeks, and it doesn’t feel like much. I probably shouldn’t be surprised that there’s so much to cover; so much was learned. Maybe you’ll have to do it too: I read the same articles over and over and over. I did have some help from a colleague, but if you don’t have that, then you can consider me that help, up to a point.

The next thing we’ll look at is our first POST request to Splunk. This is us sending in our JSON payload. Notice the first Invoke-RestMethod command below. It includes our $HttpRequestEventParams parameter hash table, which contains the Uri, Method, Headers, and Body parameters and their corresponding parameter values. Here’s a reminder of what’s in that parameter hash table.

$HttpRequestEventParams = @{URI = $EventUri; Method = 'POST'; Headers = $Headers; Body = $Body}

As a part of this Invoke-RestMethod command, we also included the StatusCodeVariable parameter. When our command completes, it will have created the $StatusCode variable, which should contain a 200 response if our message was received by Splunk. Additionally, we have this command writing any output (from Splunk) into the $Ack variable.

#region: Make requests to Splunk REST web services.
$Ack = Invoke-RestMethod @HttpRequestEventParams -StatusCodeVariable StatusCode -Verbose 
$StatusCode
$AckBody = "{`"acks`": [$($Ack.ackId)]}"
$HttpRequestAckParams = @{URI = $AckUri; Method = 'POST'; Headers = $Headers; Body = $AckBody}

Measure-Command -Expression {
    Do {
        $AckResponse = Invoke-RestMethod @HttpRequestAckParams -Verbose
        $AckResponse.acks.0
        Start-Sleep -Seconds 30
    } Until ($AckResponse.acks.0 -eq $true)
} # End Measure-Command
#endregion.

Keep in mind that if we weren’t using indexer acknowledgment, we’d probably be done. But now, we’ve got to create a second POST request to Splunk, so we can determine when the data that we know was received (200 response) is headed into the Splunk pipeline for processing. It’s a guarantee of data ingestion (not indigestion). But, since we’re going down the indexer acknowledgment path (for now and only now), let’s walk through what else is happening here.

First, we create $AckBody. It uses the PowerShell object in $Ack and turns it back into JSON. Invoke-RestMethod has this helpful feature of turning JSON into a PowerShell object, so we’ve got to reverse that so we can send it back to Splunk. Once $AckBody is done, we’ll use it as a parameter value in $HttpRequestAckParams. About $AckBody, be sure to read this page (it’s been linked before), under the “Query for indexing status” section. Splunk sends us an ack ID and then we need to send it back with the same headers as in the first Invoke-RestMethod request. Remember, this includes the HEC token (pulled out of our .env file forever ago), and the channel identifier we created as a random GUID.
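
To make that concrete, here’s a sketch of what $Ack and $AckBody might hold after the first request; the ackId value of 0 is hypothetical:

$Ack       # e.g. text=Success, code=0, ackId=0 (Splunk's response to the event POST)
$AckBody = "{`"acks`": [$($Ack.ackId)]}"
$AckBody   # {"acks": [0]}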

Following? Again, reach out to me if you need this information and I’m failing at getting it across to you. I’m not too big to help/rewrite/clarify. I get it, this has the potential to make someone launch their laptop out of a second-story window. Luckily, I work on the first floor of my house. Also, print out the Splunk articles I’ve linked and make them a home on the drink table next to your couch (like I did).

Okay, we’re on the last step. Remember, we’re still working as though indexer acknowledgement is enabled. This whole second POST request is irrelevant if you’re not going to use it. As is the channel identifier. Again, I’ll post modified code just as soon as my indexer acknowledgment is disabled.

Measure-Command -Expression {
    Do {
        $AckResponse = Invoke-RestMethod @HttpRequestAckParams -Verbose
        $AckResponse.acks.0
        Start-Sleep -Seconds 30
    } Until ($AckResponse.acks.0 -eq $true)
} # End Measure-Command

I mentioned that I’m not going to be using indexer acknowledgment because of the time it takes; I simply don’t have that. I’m in the business of automating. I record the duration of every function I invoke. It isn’t going to work for me. Anyway, I have my second Invoke-RestMethod request inside a Do-Until loop. This loop continues to run the command every 30 seconds until I get a $true response from Splunk (that means that Splunk is for sure going to index my data). For fun, that’s inside of a Measure-Command Expression parameter. This is how I determined it took too much time for me. Below I’ll include the entire code as one block from all three posts. In the fifth post, yet to be written, I’ll include the entire code as one block, too. And remember, that post won’t have any requirement for enabled indexer acknowledgment, the second POST request, the channel identifier, etc.

Thank you for hanging in for this one. Hopefully, it proves helpful for someone. Oh, that reminds me. There were two Splunk functions written to do all this when I started looking around. Maybe you found them too. They made little sense to me until I really took the time to learn what’s happening. Now that you’ve read this series, read over those functions; see what you understand about them now that you wouldn’t have been able to before. It might be much more than you thought it would be.

@torggler https://www.powershellgallery.com/packages/Send-SplunkEvent

@halr9000 https://gist.github.com/halr9000/d7bce26533db7bca1746

#region: Read .env file and create environment variables.
$FilterScript = {$_ -ne '' -and $_ -notmatch '^#'}
$Content = Get-Content -Path 'C:\Users\tommymaynard\Documents\_Repos\code\functiontemplate\random\telemetry\.env' | Where-Object -FilterScript $FilterScript
If ($Content) {
	Foreach ($Line in $Content) {
		$KVP = $Line -split '=',2; $Key = $KVP[0].Trim(); $Value = $KVP[1].Trim()
		Write-Verbose -Message "Adding an environment variable: `$env`:$Key."
		[Environment]::SetEnvironmentVariable($Key,$Value,'Process')
	} # End Foreach.
} Else {
	'$Content variable was not set. Do things without the file/the environment variables.'
}	# End If.
#endregion.

#region: Read clixml .xml file and obtain hashtable.
$HashTablePath = 'C:\Users\tommymaynard\Documents\_Repos\code\functiontemplate\random\telemetry\eventhashtable.xml'
$EventHashTable = Import-Clixml -Path $HashTablePath
#endregion.

#region: Create Splunk / Invoke-RestMethod variables.
$EventUri = $env:SplunkUrl + '/services/collector/event'
$AckUri = $env:SplunkUrl + '/services/collector/ack'
$ChannelIdentifier = (New-Guid).Guid
$Headers = @{Authorization = "Splunk $env:SplunkHECToken"; 'X-Splunk-Request-Channel' = $ChannelIdentifier}
$Body = ConvertTo-Json -InputObject $EventHashTable
$HttpRequestEventParams = @{URI = $EventUri; Method = 'POST'; Headers = $Headers; Body = $Body}
#endregion.

#region: Make requests to Splunk REST web services.
$Ack = Invoke-RestMethod @HttpRequestEventParams -StatusCodeVariable StatusCode -Verbose 
$StatusCode
$AckBody = "{`"acks`": [$($Ack.ackId)]}"
$HttpRequestAckParams = @{URI = $AckUri; Method = 'POST'; Headers = $Headers; Body = $AckBody}

Measure-Command -Expression {
	Do {
		$AckResponse = Invoke-RestMethod @HttpRequestAckParams -Verbose
		$AckResponse.acks.0
		Start-Sleep -Seconds 30
	} Until ($AckResponse.acks.0 -eq $true)
} # End Measure-Command
#endregion.

Hash Table in Hash Table to JSON

Edit: It turns out that I did in fact go over nesting a hash table inside a hash table in Part II of my Splunk series. There’s still some likable and solid content in this post though.

It’s how it works. A single topic, or idea, or even a real live project, can lead to additional writing and posting. As many might recognize, I use my blog for at least two things. One, it’s a place for you and others to come and potentially learn something new. Or maybe it’s just to reinforce a concept. I do my best to make things quick and clear. Two, it’s. for. me. Sometimes I share something simply because I need a place to store it for my own reference. Every post I’ve written — and I’m getting close to 350 of them — serves both purposes. This one certainly does, too.

If you’ve been paying attention, you know I’m currently working with my function template (which I write as FunctionTemplate at work), gathering telemetry data, and posting that to Splunk via a REST endpoint. It’s been fascinating. I had wanted an opportunity to work with Splunk and lucky for me a colleague mentioned it, even though I was preparing to work with AWS. I’m grateful they did!

Part of working with Splunk, REST, and HEC requires that a payload be sent in as JSON. Luckily, PowerShell includes a command to convert a hash table to JSON. As a part of this project, I’ve converted numerous strings and even an array to JSON. Take a look.
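
A couple of quick examples of the sort of conversions I mean:

# A single string becomes a JSON string; an array becomes a JSON array.
'A string of text' | ConvertTo-Json
"A string of text"
@('one','two','three') | ConvertTo-Json
[
  "one",
  "two",
  "three"
]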

I got to thinking that I want my telemetry code to acquire the city, state, and country via the public IP address and a geolocation API. Although it started this way, I decided I didn’t want single strings in my JSON for each of the properties.

Therefore, I needed to create a hash table of the data (within the parent hash table), and then convert that to JSON. Yes, you heard that correctly: we’re going to nest a hash table inside of another hash table and convert that to JSON. You may remember something about this from the Splunk series. Well, we cover it all again, and all because I want my Location data inside a hash table and not as three single strings. In the end, the below explanations — most of them, anyway — will get us to this image.

Let’s pretend my public IP address is 8.8.8.8. Much of the hey-sign-up text in the below response won’t be there if you’re using your own public IP address, as opposed to one of Google’s. I’d still recommend you take a look at the ipapi.co website and do your own due diligence regarding the API.

Once I knew how to obtain my geolocation data via ipapi.co, I could use the below code. In the code I create three hash tables:

  • TelemetryHash
  • TelemetryHashStandard
  • TelemetryHashLocation

The TelemetryHashStandard hash table holds two key-value pairs: DateTime and Directory (as in our location within the operating system). These aren’t vital beyond including a couple of standard entries in the parent hash table. The TelemetryHashLocation hash table holds three key-value pairs: City, Region, and Country.

Once the values are obtained and stored in one of the two hash tables, we assign TelemetryHashStandard to TelemetryHash. Then we add our TelemetryHashLocation hash table as a single key-value pair to the TelemetryHash hash table. Now that you’ve gotten through that reading, be sure to review the below code.

Remove-Variable -Name TelemetryHash,TelemetryHashStandard,TelemetryHashLocation -ErrorAction SilentlyContinue
New-Variable -Name TelemetryHash -Value @{}
New-Variable -Name TelemetryHashStandard -Value @{}
New-Variable -Name TelemetryHashLocation -Value @{}

$TelemetryHashStandard.Add('DateTime',"$(Get-Date)")
$TelemetryHashStandard.Add('Directory',"$((Get-Location).Path)")

$Location = Invoke-RestMethod -Uri "https://ipapi.co/8.8.8.8/json"
$TelemetryHashLocation.Add('City',$Location.city)
$TelemetryHashLocation.Add('Region',$Location.region)
$TelemetryHashLocation.Add('Country',$Location.country_name)

$TelemetryHash = $TelemetryHashStandard
$TelemetryHash.Add('Location',$TelemetryHashLocation)
$TelemetryHash | ConvertTo-Json

{
  "Location": {
    "Region": "California",
    "Country": "United States",
    "City": "Mountain View"
  },
  "Directory": "C:\\",
  "DateTime": "01/13/2021 19:33:14"
}

See that, just above? There are the two single strings — DateTime and Directory — as well as the single Location hash table and its nested keys and values. And, just for fun, here’s another example. Here we create two hash tables: one for parents and one for children. Once they’re both populated, we add the children to the parents’ hash table. Like we did above, we could’ve gotten everything into a single hash table that was neither the parents’ nor the children’s hash table. All this to say, there’s no requirement for a third and final hash table prior to the JSON export.

[PS7.1.0] C:\> $Parent = @{}; $Child = @{}
[PS7.1.0] C:\> $Child.Add('Son','CM')
[PS7.1.0] C:\> $Child.Add('Daughter','AM')
[PS7.1.0] C:\> $Child

Name                           Value
----                           -----
Son                            CM
Daughter                       AM

[PS7.1.0] C:\> $Parent.Add('Father','Tommy')
[PS7.1.0] C:\> $Parent.Add('Mother','JM')
[PS7.1.0] C:\> $Parent

Name                           Value
----                           -----
Mother                         JM
Father                         Tommy

[PS7.1.0] C:\> $Parent.Add('Children',$Child)
[PS7.1.0] C:\> $Parent

Name                           Value
----                           -----
Children                       {Son, Daughter}
Father                         Tommy
Mother                         JM

[PS7.1.0] C:\> $Parent.Children

Name                           Value
----                           -----
Son                            CM
Daughter                       AM

[PS7.1.0] C:\> $Parent | ConvertTo-Json
{
  "Children": {
    "Son": "CM",
    "Daughter": "AM"
  },
  "Father": "Tommy",
  "Mother": "JM"
}

By nesting a hash table inside of another hash table, we can convert to JSON and maintain the data’s original structure. Add arrays and hash tables to the hash table you intend to convert/export to JSON, and they’ll be there just as you expect them to be.
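
One caveat worth calling out: ConvertTo-Json only serializes two levels deep by default, so anything nested further down needs the -Depth parameter, or the inner layers get flattened into strings. A quick sketch:

# Without -Depth, the Level3 hash table would serialize as its type name,
# not as nested JSON; -Depth 5 preserves the full structure.
$Deep = @{Level1 = @{Level2 = @{Level3 = 'value'}}}
$Deep | ConvertTo-Json -Depth 5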

Part III: Splunk, HEC, Indexer Acknowledgement, and PowerShell

This is part three of a continuation of posts on Splunk, HEC, and sending data using PowerShell to Splunk for consumption. Make sure you’ve read the first two parts (Part I and Part II), as I’m going to assume you have!

Now that we have our data ready to be sent to Splunk in our $EventHashTable variable, we need to create some URLs using the environment variables we created in Part I. The first two lines create two URIs. The first one, $EventUri, combines the value stored in $env:SplunkUrl with the string ‘/services/collector/event’. The second one, $AckUri, combines the same $env:SplunkUrl with the string ‘/services/collector/ack’. When completed, these two URIs are two different Splunk REST endpoints.

#region: Create Splunk / Invoke-RestMethod variables.
$EventUri = $env:SplunkUrl + '/services/collector/event'
$AckUri = $env:SplunkUrl + '/services/collector/ack'
$ChannelIdentifier = (New-Guid).Guid
$Headers = @{Authorization = "Splunk $env:SplunkHECToken"; 'X-Splunk-Request-Channel' = $ChannelIdentifier}
$Body = ConvertTo-Json -InputObject $EventHashTable
$HttpRequestEventParams = @{URI = $EventUri; Method = 'POST'; Headers = $Headers; Body = $Body}
#endregion.

We’re only going to need the second URI — $AckUri (see the above code) — if our HEC token requires indexer acknowledgment. For now, we’ll assume that it does. Beneath the two lines that create and assign the $EventUri and $AckUri variables is the creation of the $ChannelIdentifier variable. As the HEC has indexer acknowledgment enabled (a checkmark in the GUI), you’re required to make more than one POST request to Splunk. You send your first one, which we’ll see momentarily, and then you send a second one, as well. The first one gets your data to Splunk. The second (or third, or fourth, or fifth, etc.) lets you know if your data is being processed by Splunk. It may take more than just a second POST request to get back an acknowledgment. It’s a nice feature — who wouldn’t want to know!? Not only do you get your 200 status response letting you know Splunk received your data in the first request, but you can also know when Splunk is actually doing something with it!

I loved the idea at first; however, I was quick to discover it was taking anywhere from two to four, and sometimes five, minutes to return “true.” Once it returned true, I could rest assured that the data was being indexed by Splunk. Nice feature, but for me, that’s too much time to spend waiting. If I get a 200 back from my first POST, then I’d rather just carry on without concerning myself with whether or not my data actually hit the Splunk processing pipeline. From Part I through here, we’re going to continue as though you want to use indexer acknowledgment. Toward the end, we’ll see the code changes I implement to remove it.

I should say this while it’s in here, though: the idea behind indexer acknowledgment is not about security. It’s about, and I quote, “…to prevent a fast client from impeding the performance of a slow client.” You see, indexer acknowledgment slows things down. So much so, I couldn’t use it. Lovely idea, but not for my purposes. It also uses channels, as you might’ve gathered from our $ChannelIdentifier variable. To create a channel, you create a random GUID and send that in with your initial POST request. Then you use the same channel identifier to obtain the acknowledgment. As I write and read this, I realize that you really ought to read through the document I linked above. Here it is again: https://docs.splunk.com/Documentation/Splunk/8.1.0/Data/AboutHECIDXAck. It’s going to do this process more justice than I have. That said, I feel confident that what I’ve had to say might actually prove helpful when combined with the Splunk documentation.

After our $ChannelIdentifier variable is created and assigned, we’re going to create a $Headers (hash table) variable. It will contain two headers: one for authorization that contains our HEC token and one for our channel identifier used for indexer acknowledgment. The below code has been copied from the above code example.

$Headers = @{
    Authorization = "Splunk $env:SplunkHECToken"
    'X-Splunk-Request-Channel' = $ChannelIdentifier
}

Following that variable creation, we’re going to create two more. One will be our $Body variable and the other a hash table of parameters and their values that we’ll end up including with Invoke-RestMethod in Part IV. As a part of creating the $Body variable, we’re going to take our Hash table (with its nested hash table) and convert it all to JSON. This is what we want to send to Splunk. The below code has also been copied from the above code example.

$Body = ConvertTo-Json -InputObject $EventHashTable
$HttpRequestEventParams = @{
    URI = $EventUri
    Method = 'POST'
    Headers = $Headers
    Body = $Body
}

I think it’s important to see the JSON representation of our $EventHashTable variable because you’ll end up seeing this in the Splunk documentation. Notice the placement of the source, host, and time event metadata versus the (nested) event data.

$Body = ConvertTo-Json -InputObject $EventHashTable
$Body
{
  "source": "PowerShellFunctionTemplate",
  "host": "mainpc.tommymaynard.com\\TOMMYCOMPUTER",
  "event": {
    "OperatingSystemPlatform": "Win32NT",
    "PSHostProgram": "ConsoleHost",
    "PSVersionActive": "PowerShell 7.0.1"
  },
  "time": 1608148655,
  "sourcetype": "Telemetry"
}

The final line in our code example is the creation of a parameter hash table we’ll use — in a later part — to send our data to Splunk using Invoke-RestMethod.

$HttpRequestEventParams = @{
    URI = $EventUri
    Method = 'POST'
    Headers = $Headers
    Body = $Body
}

Until next time. … Part IV is now available.

Part II: Splunk, HEC, Indexer Acknowledgement, and PowerShell

In the first part of this series, we discussed using a .env file to obtain a Splunk HEC token and our Splunk URL. These two pieces of information are going to need to make their way into our POST request to Splunk. Be sure to read Part I if you haven’t already.

The next thing we need to do is pull in a hash table from the disk. The reason I’ve done it this way is to ensure I have a hash table to use without having to create it each time. It was a quicker way to work with my code, as I was still in the development and testing phase. In production, the hash table would be created not from the disk/stale data, but from real-time data collection.

#region: Read clixml .xml file and obtain hashtable.
$HashTablePath = 'C:\Users\tommymaynard\Documents\_Repos\code\functiontemplate\random\telemetry\eventhashtable.xml'
$EventHashTable = Import-Clixml -Path $HashTablePath
#endregion.

The above code does two things: One, it sets the $HashTablePath variable to an XML file. This file was created previously using Export-Clixml. The second thing it does is import the XML file using Import-Clixml. So, what’s in the file and what are we doing here? As we’ll see later, we’re going to use a hash table—an in-memory data structure to store key-value pairs—and get that converted (to something else) and sent to Splunk.
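
For context, a file like that would have been created earlier with something along these lines (a sketch; $EventHashTable is the hash table covered below):

# Hypothetical: persist the telemetry hash table for later development use.
$EventHashTable | Export-Clixml -Path $HashTablePath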

This isn’t just any hash table, so let’s take a look at what’s stored in the $EventHashTable variable once these two lines of code are complete.

$EventHashTable
Name                           Value
----                           -----
event                          {OperatingSystemPlatform, IPAddressPublic, Command:0, PSVersionOther…}
sourcetype                     Telemetry
host                           mainpc.tommymaynard.com\TOMMYCOMPUTER
time                           1608148655
source                         PowerShellFunctionTemplate

If you’ve worked with PowerShell hash tables before, then you’ll probably recognize that there’s a nested hash table inside this hash table. That, or at minimum, that one of the values doesn’t seem right. This nested hash table is stored as the value for the event key. The four other keys — host, time, source, and sourcetype — are standard key-value pairs that have values that are not nested hash tables. Let’s take a look at the key-value pairs inside the nested hash table. We can use dotted notation to view the values stored in each key.

$EventHashTable.host
mainpc.tommymaynard.com\TOMMYCOMPUTER
$EventHashTable.time; $EventHashTable.source
1608148655
PowerShellFunctionTemplate
$EventHashTable.sourcetype
Telemetry

Now, let’s view the key-value pairs stored inside the nested hash table. Then we’ll run through how we’re able to nest a hash table inside another hash table before it was saved to disk using Export-Clixml.

$EventHashTable.event
Name                           Value
----                           -----
OperatingSystemPlatform        Win32NT
IPAddress                      64.204.192.66
Command:0:CommandName          New-FunctionTemplate
PSVersionAdditional            Windows PowerShell 5.1.19041.610
DateTime                       Wed. 12/16/2020 12:57:35 PM     
Command:0:Parameter:1          PassThru:True
Template                       FunctionTemplate 3.4.7
Command:0:CommandType          Function
IPAddressAdditional            192.168.86.127
Command:0:Parameter:0          Log:ToScreen
Duration                       0:00:00:00.9080064
Domain\Username                mainpc.tommymaynard.com\tommymaynard
CommandName                    New-FunctionTemplate
ModuleName
Domain\Computer                mainpc.tommymaynard.com\TOMMYCOMPUTER
OperatingSystem                Microsoft Windows 10.0.19041
PSVersion                      PowerShell 7.1.0
PSHostProgram                  ConsoleHost

Look at all that data! It’s beautiful (though some values were altered so it could be posted here). When it isn’t pulled from stale data on disk, as in this example, it’s gathered by a function during its invocation and then sent off to Splunk! Because Splunk is involved, it’s important to understand why the hash tables are set up the way they are. The host, time, source, and sourcetype keys in the top-level hash table are what Splunk calls event metadata (scroll down to Event Metadata). Sending this data is optional, but for me, it made sense: I wanted to guarantee I had control over these values rather than relying on whatever the defaults might be. The nested hash table is the event data. It isn’t data about data, as metadata is; it’s the data we collected and want Splunk to retain.
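If it helps to see where this is headed, converting the hash table to JSON produces the metadata-wrapped-around-event shape that Splunk’s HEC endpoint receives. The conversion itself happens later in this series; below is a trimmed, hand-edited sketch of the output, not the real thing.

$EventHashTable | ConvertTo-Json
{
  "host": "mainpc.tommymaynard.com\\TOMMYCOMPUTER",
  "time": 1608148655,
  "source": "PowerShellFunctionTemplate",
  "sourcetype": "Telemetry",
  "event": {
    "OperatingSystemPlatform": "Win32NT",
    "IPAddress": "64.204.192.66",
    ...
  }
}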

We’re going to wrap this part up, but first I want to show how I got a hash table to nest inside another one. We won’t bother working with the complete information above, but I’ll explain what you need to know. Here’s our downsized event hash table.

$Event = @{
    OperatingSystemPlatform = 'Win32NT'
    PSVersion = 'PowerShell 7.0.1'
    PSHostProgram = 'ConsoleHost'
}

And here it is, stored in the $Event variable and used as a value in the parent (or, the term used earlier, top-level) hash table.

$EventHashTable = @{
    event = $Event
    host = 'mainpc.tommymaynard.com\TOMMYCOMPUTER'
    time = [int](Get-Date -Date (Get-Date) -UFormat %s)
    source = 'PowerShellFunctionTemplate'
    sourcetype = 'Telemetry'
}
$EventHashTable
Name                           Value
----                           -----
source                         PowerShellFunctionTemplate
sourcetype                     Telemetry
host                           mainpc.tommymaynard.com\TOMMYCOMPUTER
event                          {OperatingSystemPlatform, PSHostProgram, PSVersion}
time                           1608148655
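And because the nesting is real, chained dot notation reaches the inner values, too.

$EventHashTable.event.PSVersion
PowerShell 7.0.1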

If you opt to include the time event metadata, do recognize that it uses, and expects, epoch time: the number of seconds since the Unix epoch, or January 1, 1970, 00:00:00 UTC. The -UFormat parameter with the %s parameter value will return this value. I believe -UFormat was added in Windows PowerShell 3.0, so if you have a client with an earlier version, first, I’m sorry, and second, there’s another way to obtain this value without the -UFormat parameter: think datetime subtraction.
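In case you’re curious, here’s one way that subtraction might look (a sketch: it subtracts the Unix epoch from the current UTC time and keeps the whole seconds):

# A hypothetical alternative to -UFormat %s for older hosts: the
# TimeSpan between the Unix epoch and now, expressed in seconds.
[int]((Get-Date).ToUniversalTime() - [datetime]'1970-01-01').TotalSeconds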

Okay, check back for more; there’s still plenty left to cover. Part III will be up soon!

Edit: Part III is now available.

Part I: Splunk, HEC, Indexer Acknowledgement, and PowerShell

A few weeks ago, I only knew one of the four words that make up this post’s title: PowerShell. I mean, I had heard of Splunk, but realistically, the other two words, which pertain to Splunk, I didn’t know. They weren’t a part of November’s vocabulary. Hopefully, you ended up here by doing some of the same searches I did, the same ones that didn’t really net me as much as I had hoped they would. I’m still no Splunk expert (I could see that taking some time and consistent interaction with the product), but I have sent data to Splunk using PowerShell via REST. Who would have thought!? Perhaps that’s what you need to do, too.

It was a few weeks ago that I was wrestling with how to handle collecting telemetry data about users and their environments when they used a function written with my function template. In case you wondered, my function template is derived using some of the same techniques I wrote about in The PowerShell Conference Book: Volume 3. The book can be obtained from Leanpub as an eBook; a print copy can be acquired from Amazon.

The initial idea was to use AWS API Gateway with DynamoDB. Lambda wouldn’t have been required, as those two services integrate without it. A colleague mentioned Splunk, though, as it’s available to me. With that, I quickly changed direction, and it’s worked out well. I didn’t get to check off those two AWS services from my I’ve-done-this-in-production list, but I’m nearing it for Splunk. I had wanted a reason to work with Splunk, and up until that moment, I didn’t think I had a project that offered one.

There was at least one upfront requirement: I couldn’t install software. My PowerShell functions could be running anywhere, so a REST endpoint was a must. Lucky for me, I had an option (one I was going to need to learn a bit about). I needed a term with which I could begin my searching, reading, and research, and eventually my coding and integration. That term was HEC. It’s pronounced “heck.” I know because I watched at least one Splunk education video; sadly, that was later in the overall development process.

HEC, or HTTP Event Collector, is a means by which data can be securely sent to Splunk for ingestion. Read over the included link for a 20-second lesson. Seriously, it’s short and worth it! Using HEC allowed me to use a token for authentication and use REST to get the telemetry data my PowerShell functions acquire into Splunk. Let’s get started with some PowerShell already.

First things first: we need to protect our HEC token. You’ll need to get one from your Splunk administrator. There are a few things out of my hands here, and that is one of them; I don’t have administrative permissions to Splunk, and that’s been just fine for me. The HEC token will allow our PowerShell to authenticate to Splunk without the need for any other type of credentials. The problem was that under no circumstances did I want to embed my HEC token in my function. Therefore, I opted to use a .env file (for now?). We’ll start with that concept. This isn’t 100% foolproof, but again, it’s much, much better than dropping my HEC token into my code. Here’s the first section of the code I’m going to share.

#region: Read .env file and create environment variables.
$FilterScript = {$_ -ne '' -and $_ -notmatch '^#'}
$Path = 'C:\Users\tommymaynard\Documents\_Repos\code\functiontemplate\random\telemetry\.env'
$Content = Get-Content -Path $Path | Where-Object -FilterScript $FilterScript
If ($Content) {
    Foreach ($Line in $Content) {
        $KVP = $Line -split '=',2; $Key = $KVP[0].Trim(); $Value = $KVP[1].Trim()
        Write-Verbose -Message "Adding an environment variable: `$env`:$Key."
        [Environment]::SetEnvironmentVariable($Key,$Value,'Process')
    } # End Foreach.
} Else {
    '$Content variable was not set. Do things without the file/the environment variables.'
} # End If.
#endregion.

This has nothing to do with Splunk just yet, but it may help you protect your HEC token. I’d like to get the values in this file stored in Azure Key Vault someday; this is just for now, to see that everything else is working. Let me give you an example of what the .env file looks like, and then we’ll break down what the previously included code is doing.

# FunctionTemplate Environment Variables.

## Splunk Environment Variables.
SplunkUrl=https://splunkhectest.prod.splunk.tommymaynard.com:8088
SplunkHECToken=88a11c44-3a1c-426c-bb04-390be8b32287

The code example farther above creates and stores a script block inside the $FilterScript variable. This script block will exclude lines in our .env file that are empty or begin with a hash (#) character. It’s used with Where-Object, in the Get-Content pipeline, so that only the file’s lines we actually want are read in and assigned to the $Content variable. As you can see, each of those lines includes a string on the left, an equals sign (=) in the middle, and a string on the right. The remainder of the code (the Foreach loop) creates a process-level environment variable from each usable line during each iteration. When it’s completed, we’ll have $env:SplunkUrl and $env:SplunkHECToken, both populated with their respective values from the right side of the equals sign. It’s not perfect, but it’s 10x better than storing these values in the actual function template.
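To quickly verify that both variables made it into the process environment, echo them back. The values below are from the sample .env file above.

$env:SplunkUrl
https://splunkhectest.prod.splunk.tommymaynard.com:8088
$env:SplunkHECToken
88a11c44-3a1c-426c-bb04-390be8b32287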

I’m going to stop here for now, but we’ll continue very soon with Part II. We haven’t gotten into Splunk yet, but we’re now set up to do just that. With these two values in variables, we can start working toward making our POST request to Splunk and passing along the data we want indexed.
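As a teaser, a minimal sketch of where we’re headed is below. This is an assumption about the final shape, not the finished code; the /services/collector endpoint and the "Splunk <token>" Authorization header are standard HEC conventions, and $EventHashTable holds the event data (it’s built in Part II). Indexer acknowledgement will add more to this later in the series.

# A hedged sketch of the eventual HEC POST request.
$Params = @{
    Uri     = "$env:SplunkUrl/services/collector"
    Method  = 'POST'
    Headers = @{Authorization = "Splunk $env:SplunkHECToken"}
    Body    = $EventHashTable | ConvertTo-Json
}
Invoke-RestMethod @Params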

Part II is now available.