Let’s Learn the Get-FileHash Command

Someone, somewhere, sent me down a path. At the end of it, while it was not where I needed to be, I learned, or relearned rather, about the Get-FileHash cmdlet. Whether you know about it or not, we will quickly cover it and walk through some examples as well. Get-FileHash, and I quote, “Computes the hash value for a file by using a specified hash algorithm.” That is the what, but not the why. Here is its reference page: https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/get-filehash, and the why is definitely in there; you should read it.

The idea, for those that are not going to read it, is that we can obtain a file’s hash and then check the hash to ensure the file has not been changed. At this point in your career, you have likely seen file hashes near, or alongside, a file download. Checking the file hash against the file, after it is downloaded, allows you to be certain that the file is the right one and that it was not altered by the download process, or anything else. It is what you were expecting.
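As a quick sketch of that verification step, the comparison looks something like the below. The file name and the expected hash string here are placeholders for illustration, not real values; a site publishing a download would supply the expected hash.

```powershell
# Hypothetical example: verify a downloaded file against a published hash.
# Both the path and the expected hash value are made-up placeholders.
$Expected = '3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356'
$Actual = (Get-FileHash -Path .\download.zip -Algorithm SHA256).Hash

if ($Actual -eq $Expected) {
    'Hashes match; the file is what you expected.'
} else {
    'Hashes do not match; do not trust this file.'
}
```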

If you are going to run any of my below commands, first be certain you know your working directory and that it is a location where you have permissions to write. We will start by creating a new file, adding a sentence to it, and then returning that content to ensure it is properly in place.

New-Item -Name hashfile.txt -ItemType File

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a---           4/19/2022  7:06 PM              0 hashfile.txt
Add-Content -Path .\hashfile.txt -Value 'This is our file at the beginning.'
Get-Content -Path .\hashfile.txt
This is our file at the beginning.

That all works, and so now we have a file with which we can experiment. If you are wondering how you learn PowerShell, this is how you do it. Follow along, as there is a goodie further down below. In this example, we invoke Get-FileHash against our file, returning only the algorithm used and the hash. We are using Format-List in order to better display this content.

Get-FileHash -Path .\hashfile.txt | Tee-Object -Variable SaveMeForLater | Format-List -Property Algorithm,Hash
Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356

In this example, we do the same as we did above, however, now we are going to try out the other parameter values that the Algorithm parameter will accept. By default it uses SHA256, but it will accept SHA1, SHA384, SHA512, and MD5, too.

Get-FileHash -Algorithm SHA1 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : SHA1
Hash      : BD002AAE71BEEBB69503871F2AD3793BA5764097


Get-FileHash -Algorithm SHA256 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356


Get-FileHash -Algorithm SHA384 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : SHA384
Hash      : E6BC50D6465FE3ECD7C7870D8A510DC8071C7D1E1C0BB069132ED712857082E34801B20F462E4386A6108192C076168A


Get-FileHash -Algorithm SHA512 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : SHA512
Hash      : C0124A846506B57CE858529968B04D2562F724672D8B9E2286494DB3BBB098978D3DA0A9A1F9F7FF0D3B862F6BD1EB86D301D025B80C0FC97D5B9619A1BD7D86


Get-FileHash -Algorithm MD5 -Path .\hashfile.txt | Format-List -Property Algorithm,Hash

Algorithm : MD5
Hash      : 30091603F57FE5C35A12CB43BB32B5F5

For fun, let’s loop through these values and pump out all the hashes, one right after another. Notice that we are using hard-coded values for the Algorithm parameter. Obnoxious. We will get to another way, which is/was the goodie I mentioned above. The more I think about it though — as I have been in the last 10 minutes — the more I think it might need its own post. Anyway, more on that soon.

'SHA1','SHA256','SHA384','SHA512','MD5' |
    ForEach-Object {Get-FileHash -Algorithm $_ -Path '.\hashfile.txt' |
        Select-Object -Property Algorithm,Hash}

Algorithm Hash
--------- ----
SHA1      BD002AAE71BEEBB69503871F2AD3793BA5764097
SHA256    3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
SHA384    E6BC50D6465FE3ECD7C7870D8A510DC8071C7D1E1C0BB069132ED712857082E34801B20F462E4386A6108192C076168A
SHA512    C0124A846506B57CE858529968B04D2562F724672D8B9E2286494DB3BBB098978D3DA0A9A1F9F7FF0D3B862F6BD1EB86D301D025B80C0FC97D5B9619A1BD7D86
MD5       30091603F57FE5C35A12CB43BB32B5F5

And, there they are again: the various hashes for our file. Now, let’s add some new files. All we are going to do is copy and paste our hashfile.txt to the same directory two times. Rename the copies so that in addition to hashfile.txt, you have hashfilecopy.txt and hashfile.copy. Watch those names and file extensions closely; the differences matter in a moment.

When checking the hash of a file, we verify that the file contents have not changed. And they have not been changed! Only the file name and file extension have. You are starting to see how this can be a useful tool, and guess what? It is built in.

Get-ChildItem | Get-FileHash | Format-List

Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
Path      : C:\Users\tommymaynard\Documents\PowerShell_Get-FileHash\hashfile.copy

Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
Path      : C:\Users\tommymaynard\Documents\PowerShell_Get-FileHash\hashfile.txt

Algorithm : SHA256
Hash      : 3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
Path      : C:\Users\tommymaynard\Documents\PowerShell_Get-FileHash\hashfilecopy.txt
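Since identical content always produces an identical hash, one handy extension of this idea is spotting duplicate files regardless of their names. This is a sketch of my own, not from the reference page, but every cmdlet in it works as shown:

```powershell
# Group the files in the current directory by their SHA256 hash; any group
# with a count greater than one holds files with identical content.
Get-ChildItem -File |
    Get-FileHash |
    Group-Object -Property Hash |
    Where-Object -Property Count -GT 1 |
    ForEach-Object { $_.Group.Path }
```

Run against the directory above, this would return the paths of all three files, since they share one hash.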

Now real quick, let’s make another change. I am going to copy and paste hashfile.txt one last time. This copy I have renamed to hashfilechanged.txt. I opened it up and added a second sentence to it. Beneath the first line, I wrote, “This is our file at the end.”

Get-Content -Path .\hashfilechanged.txt
This is our file at the beginning.
This is our file at the end.
Get-FileHash -Path .\hashfilechanged.txt | Tee-Object -Variable SaveMeForNow | Format-List -Property Algorithm,Hash
Algorithm : SHA256
Hash      : 6998575555A0B7086E43376597BBB52582A4B9352AD4D3D642F38C6E612FDA76

I used Tee-Object a couple of times in this post to capture the original hash and this one, after adding a second sentence. As you can see, the file contents are indeed different now, even though the files could have had the same name, were they in different directories.

$SaveMeForLater.Hash
$SaveMeForNow.Hash

3C55E3C7D4C2EEF6910CB70FC425549981528CBBC0400A705104DC09A9391356
6998575555A0B7086E43376597BBB52582A4B9352AD4D3D642F38C6E612FDA76
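If you would rather have PowerShell do the comparison for you, a simple equality check does it. String comparison with -eq is case-insensitive by default, which suits hash strings fine.

```powershell
# Compare the two captured hashes; different content means a False result.
$SaveMeForLater.Hash -eq $SaveMeForNow.Hash
```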

And, the goodie I mentioned. It is official; it will get its own post. Why not? I make the rules. I’ll link it from here once it is up and published!

“Get Files” to a Remote Computer

I was a little quiet last week, but vacation can do that. I spent a week in Minneapolis, Minnesota in order to visit my wife’s family. Although away from my computer nearly all that time, PowerShell was still on my mind.

Not long before I left for vacation, I replied to a TechNet forum post. In that post, a user was PS Remoting to a computer, and from that computer trying to connect to another computer to pull down some files. Maybe you’ve been down that path before, or know someone who has.

Typically, the user will use Invoke-Command to run commands on a remote computer. One of the commands will attempt to reach out to a network location and download some files. It sounds straightforward enough, but the part they forget, or aren’t aware of, is the inability to delegate your credentials (the user name and password) from that remote computer to the location where the files reside. This inability to delegate credentials onward, from your already remote connection, is called the second hop, or double hop, problem. It’s by design: if credentials could be freely delegated from machine to machine, they could just as easily be delegated maliciously.

When faced with this problem, there are three thoughts that I typically have:

The first thought is to copy out the file(s) to the remote computers before making the PowerShell Remoting connection. Then, when you need the files on the remote computer, they are already in place. I’ve recommended this several times over the years. Think of this file copy as a prestage for the work your remote command, or commands, will complete using those files.
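A minimal sketch of that prestage approach follows, using Copy-Item with its -ToSession parameter (available in Windows PowerShell 5.0 and later). The computer name and paths are placeholders:

```powershell
# Prestage a file onto the remote computer before doing the remote work.
$Session = New-PSSession -ComputerName Server01
Copy-Item -Path C:\Config01.txt -Destination C:\Config01.txt -ToSession $Session

# The remote commands can now use the file that is already in place.
Invoke-Command -Session $Session -ScriptBlock {
    Get-Content -Path C:\Config01.txt
}
Remove-PSSession -Session $Session
```

Because the copy happens from your local session, using your credentials directly against the remote computer, there is no second hop involved.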

The second thought is CredSSP, and it should be avoided whenever possible. Once set up, it allows you to approve the credential delegation from your remote computer to the remote location where you want to get the files. It sounds wonderful, but as of this writing, it still includes some security concerns.

The third thought is to use an endpoint that runs under a different account. It requires that the endpoint you connect to (that’s the thing that accepts your incoming PowerShell Remoting connection) run as a user other than the connecting user. When set up, or edited, to do this, the endpoint isn’t running as you when you connect, and therefore it can delegate its own credentials to the location where your files live. This eliminates the second hop, as the credentials you used to connect to the remote computer aren’t used to get to the location of the file(s) you want to copy down.
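If you want to explore that third option, the general shape is sketched below. Register-PSSessionConfiguration with its -RunAsCredential parameter is the relevant cmdlet; the endpoint name, computer names, and share path here are placeholders of my own, not from any real environment:

```powershell
# On the remote computer: register an endpoint that runs as a service
# account. 'FileCopyEndpoint' and the prompted credential are placeholders.
Register-PSSessionConfiguration -Name FileCopyEndpoint `
    -RunAsCredential (Get-Credential) -Force

# From your computer: connect to that endpoint. Commands inside run as the
# RunAs account, which can make its own, first hop to the file share.
Invoke-Command -ComputerName Server01 -ConfigurationName FileCopyEndpoint -ScriptBlock {
    Copy-Item -Path \\FileServer\Share\Config01.txt -Destination C:\Config01.txt
}
```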

Before I left for vacation, I came up with another idea: a fourth thought. When we use Invoke-Command, we have the option to take variables with us into the remote session. In a previous post, I showed three examples of how to do this. Read it; it’s short. Now, knowing I can fill a variable with the contents of a file, and knowing I can take a variable into a remote session… did a light just go on?

As I mentioned in the forum post, my fourth thought was mildly unconventional. It didn’t require someone to prestage (copy the files out before the remote connection), it wasn’t insecure (CredSSP), and it didn’t require any knowledge about creating, or editing, endpoints to use a run-as account. All it required was reading some files into some variables and taking those variables into the remote PowerShell session. Here’s the example I gave them on TechNet (and here’s the forum thread). I’ll discuss the example below it, so you can follow the entire, logical flow.

$ConfigFile01 = Get-Content -Path C:\Config01.txt
$ConfigFile02 = Get-Content -Path C:\Config02.txt

Invoke-Command -ComputerName Server01 -ScriptBlock {
    Add-Content -Path C:\file01.txt -Value $Using:ConfigFile01
    Add-Content -Path C:\file02.txt -Value $Using:ConfigFile02

    $1 = Get-Content -Path C:\file01.txt
    $2 = Get-Content -Path C:\file02.txt

    $1
    $2

    Start-Sleep -Seconds 5

    Remove-Item -Path C:\file01.txt,C:\file02.txt
}

Beginning in the first two lines (1 and 2), I set two variables on my local computer with the content of two different files. The variable $ConfigFile01 stores the content from the file C:\Config01.txt, and $ConfigFile02 stores the content from the file C:\Config02.txt. With these variables set, we run the Invoke-Command command on Server01. Remember, the -ComputerName parameter can accept a comma-separated list of multiple computers.

Inside this script block, we do several things. Keep in mind, as you look this over, that this example isn’t doing much with this code except proving that there’s another way to “get a file to a computer.” The first two lines in the script block (lines 5 and 6), create new files on the remote computer. They create the files C:\file01.txt and C:\file02.txt. The value added to file01.txt comes from the $ConfigFile01 variable, and the value added to C:\file02.txt comes from the $ConfigFile02 variable. At this point, we have our files on the remote computer.

The rest of the script block isn’t necessary, but it helps show some of what we can do. Lines 8 and 9 put the contents of the newly created files on Server01 into variables. Lines 11 and 12 echo the contents of the two variables. On line 14, I pause the script block for five seconds. The reason I did this was so that I could test that the files were created and their contents written before the last line. In that line, line 16, we remove the files we created on the remote computer.

You can test with this example by creating C:\Config01.txt and C:\Config02.txt on your local computer with some text in each. Next, change Server01 in the Invoke-Command command to one of your computers. Then, open File Explorer to C$ on the computer to which you’re going to create a PS Remoting session. In my case, I would open it to \\Server01\C$. Having this open will allow you to see the creation, and removal, of file01.txt and file02.txt.