
PowerShell Remoting and Kerberos Double Hop: Old Problem – New Secure Solution


PowerShell and DevOps Global Summit 2017

This week I enjoyed presenting at the PowerShell and DevOps Global Summit 2017. If you have not attended, I highly encourage it. You will get to meet PowerShell team members from Microsoft, MVPs, and the people you follow on Twitter! Follow @PshSummit on Twitter to get the alerts for registration. I even work for Microsoft, and I learn a ton every year from the amazing sessions. It is also great connecting with everyone in the PowerShell community.

See this previous blog post for the benefits of resource-based Kerberos constrained delegation and how it can apply to PowerShell remoting. It is not a complete solution, but it works for the key scenarios described below and a few others. The article also outlines a number of other possible Kerberos double hop solutions. The PowerShell documentation team took that article, tweaked it, and turned it into a documentation page here.

tl;dr

  • This is a follow up to my previous blog post on Kerberos double hop and PowerShell remoting.
  • I have published some helper functions for working with resource-based Kerberos constrained delegation (RB KCD) and PowerShell remoting: Enable-RBKCD, Disable-RBKCD, Get-RBKCD.
  • Get the files and slides on my GitHub here.
  • RB KCD works with a limited set of commands and functions running under the SYSTEM account in the kernel context.
  • RB KCD does not support WinRM / PowerShell remoting, because that runs under the NETWORK SERVICE account.
  • For cases where RB KCD does not work you can nest two Invoke-Command statements to make the double hop. See helper function Invoke-DoubleHop.
  • You can share these RB KCD articles and scripts with this short link: http://aka.ms/pskdh

The Problem

Classic Kerberos Double Hop

I am on ServerA, connected to ServerB where I need to reach ServerC. I have permissions to ServerC, but I still get Access Denied. Default Kerberos rules prevent ServerB from passing credentials to ServerC. The most common example is a user (ServerA) connected to a web server (ServerB/frontend) that needs to use the user’s credentials to access a database server (ServerC/backend). In a previous blog post I described multiple popular (and not-so-popular) work-arounds.

Scenario A: Jump Server

From my workstation I connect to my jump server (tool server, etc. whatever you like to call it) via PowerShell remoting (Enter-PSSession, Invoke-Command). From that server I want to reach out and collect data from multiple other servers for a report. I am in the Administrators group on all of these servers, but I get an Access Denied when attempting to access them from my jump server.

Why not connect directly to the servers? Perhaps I have limited network connectivity or restricted routing. Maybe it is a DMZ or a hosted environment. There are many legitimate scenarios why you may choose this approach.

Scenario B: Remote Software Install

Another popular scenario is installing software remotely. From my workstation (ServerA) I want to fan out to 50 servers (ServerB) and install an application whose source files are hosted on a file share (ServerC). Here again I will get Access Denied at the file share even though I know I have permissions. This is Kerberos double hop.

Scenario X

There are many more scenarios for Kerberos double hop. RB KCD will help with some of them. Invoke-DoubleHop should help with more of them. And some will likely have no other choice but to continue using CredSSP for the time being. You will need to experiment to see which commands are compatible with RB KCD (running as SYSTEM in kernel context).

For example, from your workstation you connect to your SharePoint server with PowerShell remoting. The SharePoint cmdlets need to access a backend SQL server, but they fail. Typically CredSSP is the solution. Some of my peers have not yet been successful getting RB KCD to work with this case. I suspect it would need to be configured on the service accounts, and it may work then. Let me know if you figure this one out.

Two Solutions, One Module

I created a helper module for quickly configuring RB KCD and for cheating with nested Invoke-Command cmdlets.

PS> Import-Module rbkcd.psm1

PS> Get-Command -Module RBKCD

CommandType Name             Version Source
----------- ----             ------- ------
Function    Disable-RBKCD    0.0     RBKCD
Function    Enable-RBKCD     0.0     RBKCD
Function    Get-RBKCD        0.0     RBKCD
Function    Invoke-DoubleHop 0.0     RBKCD

Here are some examples:

# Both ServerB and ServerC in the same domain.
Enable-RBKCD -ServerB sb.proseware.com -ServerC sc.proseware.com -Credential (Get-Credential)

# ServerB and ServerC in different domains.
Enable-RBKCD -ServerB sb.proseware.com -ServerC ms1.alpineskihouse.com -DomainBCred (Get-Credential) -DomainCCred (Get-Credential)

# See which identities are allowed to delegate to ServerC
Get-RBKCD -ServerC sc.proseware.com -Credential (Get-Credential proseware\adminacct)

# Remove all identities allowed to delegate to ServerC
Disable-RBKCD -ServerC sc.proseware.com -Credential (Get-Credential proseware\adminacct)

# For scenarios that do not work with RB KCD
Invoke-DoubleHop -ServerB sb -ServerC sc -DomainBCred $DomainBCred -Scriptblock {
    dir \\sc\c$
}
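
Under the hood, this cheat is essentially two nested Invoke-Command calls with explicit credentials supplied at each hop. Here is a minimal sketch of the pattern without the helper, assuming $cred holds credentials valid on both ServerB and ServerC (the actual Invoke-DoubleHop implementation may differ in its details):

$cred = Get-Credential proseware\adminacct
Invoke-Command -ComputerName sb -Credential $cred -ScriptBlock {
    # $using: carries the credential into the ServerB session so the second
    # hop is a fresh authentication instead of Kerberos delegation.
    Invoke-Command -ComputerName sc -Credential $using:cred -ScriptBlock {
        dir C:\
    }
}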

While these functions were written to help with RB KCD for PowerShell remoting, they could be used for any other RB KCD scenario. Note that these only work with computer accounts. You could expand the code to work with service accounts or user accounts also.

Did this work for you?

This is a bit of a niche topic, but lots of people struggle with it. Hopefully this was helpful. Please use the comments below to help the community understand where this was helpful for you and where it was not. This is an ongoing research project for me, and your feedback is valuable. Thank you. Happy scripting!


Top 10 PowerShell DSC Node Events to Monitor


In a previous blog post I demonstrated how to get a list of all possible PowerShell Desired State Configuration (DSC) events for monitoring. Admittedly, that was an overwhelming list. Today I want to narrow that down to the essentials of DSC monitoring events.

These are the events you’re looking for.

Recently I was working with a customer who wanted specific events for DSC monitoring. I did my testing with a Windows Server 2012 R2 node running WMF 5.1. The pull server was on the same versions. I fired up a node connected to the pull server and labbed a number of common scenarios you would want to monitor.

DSC node events are recorded in the Microsoft-Windows-DSC/Operational log. Here are the main events you want to capture. I have assigned a simple category to each of these.

Category             Event ID     Level        Status
--------             --------     -----        ------
Desired State        4115 / 4343  Information  Consistency scan completed (i.e. in desired state if 4249 is not also present)
Desired State        4249         Warning      Failed consistency scan (i.e. not in desired state); only appears in ApplyAndMonitor mode
Configuration Apply  4097         Error        Configuration failed to apply
Configuration Apply  4332         Information  Listing of resources applied in the configuration
Configuration Apply  4257         Information  LCM settings during the configuration
Node Pull            4252         Error        Node failed to download from pull server; only event 4252 with Error Category 8 in the message
Node Report          4264 / 4266  Information  Node successfully reported to report server
Node Report          4260         Error        Node failed reporting to report server

In some cases there may be other events to indicate similar status. These IDs are the least chatty. Of these ten events I have highlighted the three essential error conditions for monitoring.

Note the following points:

  • Event 4249 only shows up in ApplyAndMonitor configuration mode to indicate configuration drift. In my testing I could not find an event indicating configuration drift when ApplyAndAutocorrect actually makes a correction to the configuration.
  • In the message body of some events you will see PerformRequiredConfigurationChecks. These bit flag values are documented here.
  • Event 4252 appears for all kinds of conditions. Differentiate the events by the message body and the Error Category data inside the event.
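
If you only care about the three essential error conditions, a quick query like this works (a minimal sketch using the IDs from the table above):

Get-WinEvent -FilterHashtable @{
    LogName = 'Microsoft-Windows-DSC/Operational'
    Id      = 4097, 4252, 4260   # apply failure, pull failure, report failure
} -MaxEvents 100 |
    Select-Object TimeCreated, Id, LevelDisplayName, Message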

Scripting to Capture Logs

Here is some quick syntax to remotely query the events. Note that I limit the total number of events returned for performance reasons. Tweak MaxEvents as needed.

Invoke-Command -ComputerName server1,server2,server3 -ScriptBlock {
    Get-WinEvent -LogName 'Microsoft-Windows-DSC/Operational' -MaxEvents 50
} |
    Select-Object PSComputerName,TimeCreated,LevelDisplayName,Id,Message |
    Out-GridView

Here is some quick syntax to export all of the DSC event logs, optional pull server details, and zip them up for analysis off-box. I use this when troubleshooting DSC.

New-Item -ItemType Directory -Path C:\logs -ErrorAction SilentlyContinue
(Get-WinEvent -ListLog *desired*,*dsc*).LogName |
Where-Object {$_ -notlike "*admin*"} |
ForEach-Object {
    wevtutil export-log /overwrite:true $_ "C:\logs\$($env:COMPUTERNAME)_$($_.Replace('/','-')).evtx"
}
'System','Application' | ForEach-Object {
    wevtutil export-log /overwrite:true $_ "C:\logs\$($env:COMPUTERNAME)_$($_).evtx"
}
If ((Get-WindowsFeature DSC-Service).Installed) {
    Get-ChildItem 'C:\Program Files\WindowsPowerShell\DscService' > C:\logs\DscService.txt
    Copy-Item -Path 'C:\inetpub\wwwroot\PSDSCPullServer\web.config' -Destination C:\logs
}
$PSVersionTable > C:\logs\PSVersionTable.txt
Compress-Archive -Path C:\logs\*.evtx,C:\logs\*.config,C:\logs\*.txt `
    -DestinationPath "C:\logs\$($env:COMPUTERNAME)_DSC_Logs.zip" -Update

The xDscDiagnostics module has a function New-xDscDiagnosticsZip which will get most of these things and a few other items. The code above is tailored for my own DSC troubleshooting needs. Note that my version will attempt to collect additional details from a pull server, assuming the default install paths.

Additional Resources

For more info on troubleshooting DSC and logs see the documentation here: https://msdn.microsoft.com/en-us/powershell/dsc/troubleshooting

Don’t forget to check out my previous blog post for more on working with DSC event logs.

Comments

What do you monitor for DSC events? Did I miss any? If so, let me know in the comments area below.

Getting Started with PowerShell Core on Windows, Mac, and Linux

This is deeper than Coke vs. Pepsi or Ford vs. Chevy. We are breaking down the barriers. Cats and dogs living together. Are you ready for this?
This month I posted over on the PowerShell team blog about my recent experiences with PowerShell on Linux and Mac. It is a ton of fun. Check out the post here:
Enjoy!

Slow Code: Top 5 Ways to Make Your PowerShell Scripts Run Faster


Slow code?

Are you frustrated by slow PowerShell scripts? Is it impacting business processes? Need help tuning your scripts? Today's post is for you.

Can you identify with any of these customer slow PowerShell stories?

Case #1

Customer is scanning Active Directory Domain Controllers in multiple domains and forests scattered all over the state on slow links. This key audit data takes 62 hours to collect and impacts the business response to audit requests. After applying these techniques, the customer reports that the script now completes in 30 minutes.

Case #2

Customer needs to update user licensing on Office 365 for daily new user provisioning. This impacts the business due to 10 hour run time of the script. After applying these optimization tips, the script finishes in 14 minutes.

Case #3

Customer is parsing SCCM client log files that rotate every few minutes. But the script takes longer to run than the log refresh interval. After applying these techniques, the script is 10% of its original length and runs 10 times faster.

Scripting Secrets

After several years of teaching and advising PowerShell scripting I have observed some routine practices that lead to poor script performance. Often this happens with people who copy and paste their scripts from internet sources without truly understanding the language. Other times it simply comes from a lack of formal training. Regardless, today I am going to share with you the secrets I have shared with many customers to improve their script run time.

The classic programming trade-off is speed vs. memory. We want to be aware of both as we write the most efficient code.

Problem #0: Not using cmdlet parameter filters

There is an ancient PowerShell pipeline proverb: Filter left, format right. Disobey it, and your script will take a while. It means that you should filter the pipeline objects as far to the left as possible. And formatting cmdlets should always go at the end, never the middle.

Early on a customer reported to me, "Querying event logs over the WAN is taking days!" Study these two code samples below. Which do you think is faster and why?

# Approach #1
Get-WinEvent -LogName System -ComputerName Server1 |
  Where-Object {$_.Id -eq 1500}

# Approach #2
Get-WinEvent -FilterHashtable @{LogName='System';ID=1500} `
  -MaxEvents 50 -ComputerName Server1

The first approach retrieves the ENTIRE event log over the wire and then filters the results in local memory. The second approach uses cmdlet parameters to effectively reduce the dataset coming from the remote system.

This same advice applies to any cmdlet that queries data, whether local or remote. Be sure to explore the help for all the parameters, even if they look complicated at first. It is worth your time to write the code correctly.

Yes, #2 is faster. Much faster.

Problem #1: Expensive operations repeated

Usually I see this manifest as a query to Active Directory, a database, Office 365 accounts, etc. The script needs to process multiple sets of data, so the script author performs a unique query each time. For example, I need to report on the license status of 10 user accounts in Office 365. Which pseudo code would be faster?

For Each User
    Query the account from Office 365
    Output the license data of the user

Or this:

Construct a single query to Office 365 that retrieves all users in scope
Execute the query and store it into a variable
Pipe the variable into the appropriate loop, selection or output cmdlet

Yes, the second is more efficient, because it only performs the expensive operation once. It may be a little more involved to construct the query appropriately. Or you may need to retrieve an even larger data set if you cannot precisely isolate the accounts in question. However, the end result is a single expensive operation instead of multiples.
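
For example, here is a hedged sketch of the single-query pattern using the classic MSOnline module (swap in the Microsoft Graph equivalents if that is what you use): one expensive query up front, then cheap local filtering.

# One expensive call to Office 365, stored in a variable
$AllUsers = Get-MsolUser -All
# Cheap local filtering and output from memory
$AllUsers |
    Where-Object {-not $_.IsLicensed} |
    Select-Object UserPrincipalName, IsLicensed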

Another expensive operation is crawling an array to search for a value:

For ($i=0; $i -lt $array.count; $i++) {
    If ($array[$i] -eq $entry) {
        "We found $entry after $($i+1) iterations."
        $found = $true
        Break
    }
}

Instead, add the items to a hash table which has blazingly fast search performance:

$hashtable.ContainsKey($entry)

See Get-Help about_Hash_Tables for more information on my favorite PowerShell data structure.
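
Here is a minimal sketch of the full pattern: load the array into the hash table once, then every lookup is effectively instant regardless of size.

# Build the hash table once; assigning by key avoids duplicate-key errors
$hashtable = @{}
ForEach ($item in $array) {$hashtable[$item] = $true}
# Lookups are now O(1) instead of a linear crawl
If ($hashtable.ContainsKey($entry)) {"We found $entry."}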

Problem #2 & #3: Appending stuff

Append-icitus is painful, but appending to objects is more painful. This usually comes in one of two forms:

  1. Appending to files
  2. Appending to arrays

Appending to files

I usually see this with script logging output. Cmdlets like Add-Content, Out-File -Append and Export-CSV -Append are convenient to use for small files. However, if you are using these in a loop with hundreds or thousands of iterations, they will slow your script significantly. Each time you use one of these it will:

  • Open the file
  • Scroll to the end
  • Add the content
  • Close the file

That is heavy. Instead use a .NET object like this:

# Open the file once, keep it open for the whole loop, then close it once
$sw = New-Object System.IO.StreamWriter "c:\temp\output.txt"
for ($a=1; $a -le 10000; $a++)
{
    $sw.WriteLine($BigString)   # $BigString represents your line of output
}
$sw.Close()

For CSV output, this may require you to construct your own CSV delimited line of text to add to the file. However, it is still significantly faster.
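
Here is a hedged sketch of that hand-built CSV approach (Property1 and Property2 are hypothetical columns): write the header once, then one delimited line per item.

$sw = New-Object System.IO.StreamWriter 'C:\temp\report.csv'
$sw.WriteLine('Property1,Property2')   # header row, written once
ForEach ($Item in $Items) {
    # Format each item as a delimited line of text
    $sw.WriteLine(('{0},{1}' -f $Item.Property1, $Item.Property2))
}
$sw.Close()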

Appending to arrays

I used to do this one often until someone pointed it out to me.

# Empty array
$MyReport = @()
ForEach ($Item in $Items) {
    # Fancy script processing here
    # Append to the array
    $MyReport += $Item | Select-Object Property1, Property2, Property3
}
# Output the entire array at once
$MyReport | Export-CSV -Path C:\Temp\myreport.csv

Now this is one step better, because we are not appending to a file inside the loop. However, we are appending to an array, which is an expensive memory operation. Behind the scenes .NET duplicates the entire array in memory, adds the new item, and then deletes the old copy in memory.

Here is the more efficient way to do the same thing:

$MyReport = ForEach ($Item in $Items) {
    # Fancy script processing here
    $Item | Select-Object Property1, Property2, Property3
}
# Output the entire array at once
$MyReport | Export-CSV -Path C:\Temp\myreport.csv

You can actually assign the variable one time in memory by capturing all the output of the loop. Just make sure the loop only outputs the raw data you want in the report.

Another option is to use a hash table or .NET array list object. These data structures can dynamically add and remove items without the memory swapping of an array. See Get-Help about_Hash_Tables or System.Collections.ArrayList.
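
For example, here is a minimal ArrayList sketch. Add() grows the list in place without duplicating the array; cast it to [void] because Add() returns the new item's index.

$list = New-Object System.Collections.ArrayList
ForEach ($Item in $Items) {
    # [void] discards the index that Add() returns
    [void]$list.Add(($Item | Select-Object Property1, Property2, Property3))
}
$list | Export-Csv -Path C:\Temp\myreport.csv -NoTypeInformation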

Problem #4: Searching text

The log parsing example I mentioned in Case #3 above gets a lot of people, especially if you started scripting in VBScript where string methods were quite common. Here is a quick chart comparing the three most popular text parsing methods, including links for more info.

Technique                            Friendly  Power
---------                            --------  -----
String methods                       Yes       No
Regular expressions                  No        Yes
Convert-String / ConvertFrom-String  Yes       Yes

Sometimes string methods (ToUpper, IndexOf, Substring, etc.) are all you need. But if the text parsing requires pattern matching of any kind, then you really need one of the other methods, which are much faster as well.

Here is a simple example of using string methods:

$a = 'I love PowerShell!'
# View the string methods
$a | Get-Member -MemberType Methods
# Try the string methods
$a.ToLower()
$a.ToLower().Substring(7,10)
$a.Substring(0,$a.IndexOf('P'))

While string methods are easy to discover and use, their capability gets cumbersome very quickly.

Observe this comparison of three techniques:

$domainuser = 'contoso\alice'

# String methods
$domain = $domainuser.Substring(0,$domainuser.IndexOf('\'))
$user   = $domainuser.Substring($domainuser.IndexOf('\')+1)

# RegEx
$domainuser -match '(?<domain>.*)\\(?<user>.*)'
$Matches

# Convert-String
'contoso\alice' |
    Convert-String -Example 'domain\user=domain,user' |
    ConvertFrom-Csv -Header 'Domain','User'

RegEx is used widely in PowerShell: -split, -replace, Select-String, etc. RegEx excels at parsing string patterns out of text with speed. Take some time to learn it today (Get-Help about_Regular_Expressions).
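
A few quick one-liners showing those RegEx-powered features:

'one1two2three' -split '\d'                 # split on any digit
'Report_2017.csv' -replace '\d{4}','YYYY'   # swap the year for a placeholder
'ERROR: disk full' | Select-String -Pattern '^(?<level>\w+):'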

The new Convert-String and ConvertFrom-String cmdlets were introduced in PowerShell 5. See the links in the chart above for more detailed examples of these powerful text parsing cmdlets. ConvertFrom-String excels at parsing multiple values out of multi-line patterns. And that is exactly what challenged the customer in Case #3 above.

How can I tell how long my script runs?

Use one of these techniques to test different versions of your code for speed.

PowerShell has a cmdlet Measure-Command that takes an -Expression scriptblock parameter. This is the first way most people measure execution time.

Measure-Command -Expression {
    #
    # Insert body of script here
    #
}

Others will do something like this:

$Start = Get-Date
#
# Insert body of script here
#
$End = Get-Date
# Show the result
New-Timespan -Start $Start -End $End

Either method returns a TimeSpan object with properties for any desired unit of time. Just be sure to use the total properties for accuracy.
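
For example, a run of 90 seconds reports Minutes 1 and Seconds 30, while TotalSeconds gives you the full 90:

$ts = Measure-Command -Expression {Start-Sleep -Seconds 2}
$ts.Seconds        # seconds component only
$ts.TotalSeconds   # full elapsed time expressed in seconds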

If you do a slow operation one time, maybe that has little impact. But if you do it 1,000 times, then we are all in trouble. If the data processed in each loop is a rich object with many properties, then it is even worse (i.e. more memory). Review your loops carefully to identify expensive commands and optimize them.

Disclaimer

One of the challenges of sharing code publicly is that I am always learning. If you go back to my posts six years ago, you will find that I used some of these poor practices. I have re-written and re-blogged some of them. Others are still there.

Take-aways:

  • Keep learning
  • Review (and optimize) all code you find online before implementing it
  • Periodically review your most-used scripts in light of your new knowledge

Keep scripting!

More Tips

You can find more tips in The Big Book of PowerShell Gotchas over at PowerShell.org/ebooks.

Use Hash Tables To Go Faster Than PowerShell Compare-Object


Compare-Object gotcha down? Slower than my old 300 baud modem? Have no fear. Today we go faster using hash tables.

Let me state first that I love the cmdlet Compare-Object, and I have used it many times with great results. But at scale my customer had some serious performance issues.

The Problem - “I feel the need. The need for speed.”

So my customer has employed all the tricks from the last blog post on making your scripts go faster. But still the script takes hours to run. Between each command he dropped a timestamp into a log file. The culprit… Compare-Object. That single command was taking hours.

But let’s be fair. He’s comparing about 800,000 email addresses between two lists. It would take me weeks to do that by hand with a pencil and paper. Compare-Object is pretty quick at 13 hours. But let’s get this down to seconds.

The Research

First things first. What exactly is Compare-Object doing? To find out, you can view the source code at the PowerShell open source GitHub repository. So I did that. I am not a .NET developer, but the comments starting on line 120 helped me understand what it does, and it is very similar to my idea.

All I know is that when I want list processing to go faster in PowerShell I use hash tables. I’ll write my own version in native PowerShell and see if it is faster.

The Approach

We have two lists, and we need to know what is different. We want to make the most efficient use of both memory and computation.

If I compare every item in List1 against every item in List2, well, that's going to take a while (n × m comparisons).

Each list comes in as an array. I need to look up all the items in List1 against List2. The fastest way to do lookups is with a hash table.

To find the differences, I will delete the matching entries from both List1 and List2. Arrays are slow at removing a single item, so again I will use hash tables.

After deleting all of the equal values, the only things left in each list are the unique values.

If you want to see what is equal, the matching values go into a third list (hash table) containing only the equal values.

The Code

I have placed the hash table comparison into a function called Compare-Object2.

<#
.SYNOPSIS
Faster version of Compare-Object for large data sets with a single value.
.DESCRIPTION
Uses hash tables to improve comparison performance for large data sets.
.PARAMETER ReferenceObject
Specifies an array of objects used as a reference for comparison.
.PARAMETER DifferenceObject
Specifies the objects that are compared to the reference objects.
.PARAMETER IncludeEqual
Indicates that this cmdlet displays characteristics of compared objects that
are equal. By default, only characteristics that differ between the reference
and difference objects are displayed.
.PARAMETER ExcludeDifferent
Indicates that this cmdlet displays only the characteristics of compared
objects that are equal.
.EXAMPLE
Compare-Object2 -ReferenceObject 'a','b','c' -DifferenceObject 'c','d','e' `
    -IncludeEqual -ExcludeDifferent
.EXAMPLE
Compare-Object2 -ReferenceObject (Get-Content .\file1.txt) `
    -DifferenceObject (Get-Content .\file2.txt)
.EXAMPLE
$p1 = Get-Process
notepad
$p2 = Get-Process
Compare-Object2 -ReferenceObject $p1.Id -DifferenceObject $p2.Id
.NOTES
Does not support objects with properties. Expand the single property you want
to compare before passing it in.
Includes optimization to run even faster when -IncludeEqual is omitted.
#>
function Compare-Object2 {
param(
    [psobject[]]
    $ReferenceObject,
    [psobject[]]
    $DifferenceObject,
    [switch]
    $IncludeEqual,
    [switch]
    $ExcludeDifferent
)

    # Put the difference array into a hash table,
    # then destroy the original array variable for memory efficiency.
    $DifHash = @{}
    $DifferenceObject | ForEach-Object {$DifHash.Add($_,$null)}
    Remove-Variable -Name DifferenceObject

    # Put the reference array into a hash table.
    # Keep the original array for enumeration use.
    $RefHash = @{}
    for ($i=0;$i -lt $ReferenceObject.Count;$i++) {
        $RefHash.Add($ReferenceObject[$i],$null)
    }

    # This code is ugly but faster.
    # Do the IF only once per run instead of every iteration of the ForEach.
    If ($IncludeEqual) {
        $EqualHash = @{}
        # You cannot enumerate with ForEach over a hash table while you remove
        # items from it.
        # Must use the static array of reference to enumerate the items.
        ForEach ($Item in $ReferenceObject) {
            If ($DifHash.ContainsKey($Item)) {
                $DifHash.Remove($Item)
                $RefHash.Remove($Item)
                $EqualHash.Add($Item,$null)
            }
        }
    } Else {
        ForEach ($Item in $ReferenceObject) {
            If ($DifHash.ContainsKey($Item)) {
                $DifHash.Remove($Item)
                $RefHash.Remove($Item)
            }
        }
    }

    If ($IncludeEqual) {
        $EqualHash.Keys | Select-Object @{Name='InputObject';Expression={$_}},`
            @{Name='SideIndicator';Expression={'=='}}
    }

    If (-not $ExcludeDifferent) {
        $RefHash.Keys | Select-Object @{Name='InputObject';Expression={$_}},`
            @{Name='SideIndicator';Expression={'<='}}
        $DifHash.Keys | Select-Object @{Name='InputObject';Expression={$_}},`
            @{Name='SideIndicator';Expression={'=>'}}
    }
}

Note that for my purposes I did not need to compare multiple properties, so this approach does not entirely duplicate functionality of the native Compare-Object. You could probably adapt this code for that purpose. I would drop each list object into a hash table value, while making the key a string representation of the one or more properties to be compared. I’ll leave that bit up to you.
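
Here is a hedged sketch of that multi-property idea (Name and Id are hypothetical properties): build the key as a string from the compared properties, and keep the original object as the value for output later.

$RefHash = @{}
ForEach ($obj in $ReferenceObject) {
    # Key on the combined properties; value holds the original object
    $key = '{0}|{1}' -f $obj.Name, $obj.Id
    $RefHash[$key] = $obj
}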

Also note that, yes, I used ForEach. General consensus is that ForEach is slower than For. Feel free to adjust and see if that makes a difference in execution time for you.

The Results

# Native Compare-Object
Measure-Command -Expression {
    Compare-Object -ReferenceObject (Get-Content .\file1.txt) `
        -DifferenceObject (Get-Content .\file2.txt) -IncludeEqual
} | Select-Object TotalMilliseconds

# Hash table comparison
Measure-Command -Expression {
    Compare-Object2 -ReferenceObject (Get-Content .\file1.txt) `
        -DifferenceObject (Get-Content .\file2.txt) -IncludeEqual
} | Select-Object TotalMilliseconds

When racing the native Compare-Object against my hash table implementation here are the results:

  • For test lists of 1,000 items, Compare-Object finishes in five seconds while the hash table version finishes in <1 second.
  • For test lists of 100,000 items, the hash table finishes in five seconds while Compare-Object had not finished after multiple minutes (so I just killed the task).
  • For the customer’s 800,000 items, the hash table finished in 30 minutes, as opposed to 13 hours for Compare-Object. To be fair, the script does other tasks besides this Compare-Object. Regardless that is a 25x performance improvement!

How is that for efficiency gain?!

The Moral of the Story

Learn hash tables today! They are the single most versatile, powerful, and fun data structure in all of PowerShell. Let me know your results in the comments area below.

“Goose, it’s time to buzz the tower.”

Function to Create Certificate Template in Active Directory Certificate Services for PowerShell DSC and CMS Encryption


Today I’m cleaning out my code closet. I found this script that I have wanted to share for a while now.

Problem

Active Directory Certificate Services does not include a template for Document Encryption. This is required for DSC credential encryption and the CMS encryption cmdlets. Current processes require manual effort to create the template. Or you must figure out how to use the less-than-friendly AD CS API from .NET. We all know I ain’t no .NET developer.

Solution

I reverse-engineered the resulting OID and certificate objects in Active Directory and wrote a function to create this template from code. This provides a fully-automated solution for creating the template in a lab or production environment.

Functionality

  • Take parameters
  • Generate a unique OID for the template
  • Create the template
  • Optionally permission the template with Enroll for specified group(s)
  • Optionally add AutoEnroll permission as well
  • Optionally publish the template to CA(s)
  • Optionally target all operations to a designated DC

Requirements:

  • Enterprise AD CS PKI
  • Tested on Windows Server 2012 R2 & 2016
  • Enterprise Administrator rights
  • ActiveDirectory PowerShell Module

The generated template will have these properties:

  • 2 year lifetime
  • 2003 lowest compatibility level
  • Private key not exportable
  • Not stored in AD
  • Document Encryption
  • No digital signature

Sample Usage

# Create only the template (least valuable approach)
New-ADCSTemplateForPSEncryption -DisplayName PowerShellCMS

# Full template creation, permissioning, and deployment
New-ADCSTemplateForPSEncryption -DisplayName PSEncryption `
    -Server dc1.contoso.com -GroupName G_DSCNodes -AutoEnroll -Publish

# From a client configured via GPO for AD CS autoenrollment:
$Req = @{
    Template          = 'PSEncryption'
    Url               = 'ldap:'
    CertStoreLocation = 'Cert:\LocalMachine\My'
}
Get-Certificate @Req
# Note: If you have the Carbon module installed, it conflicts with Get-Certificate native cmdlet.

$DocEncrCert = (dir Cert:\LocalMachine\My -DocumentEncryptionCert |
 Sort-Object NotBefore)[-1]

Protect-CmsMessage -To $DocEncrCert -Content "Encrypted with my new cert from the new template!"

Get the code

I have posted this code to the PowerShell Gallery here. Enjoy!

New, Improved Group Policy Link Report with PowerShell


A peer asked me to update one of my classic Group Policy reporting scripts this week, so I thought I would share the update with y'all.

Continuous Improvement

Over the years I have released a number of Group Policy scripts. This one shows you all kinds of goodness:

  • GPOs linked to OUs
  • OUs where block-inheritance may be turned on
  • Situations where no-override is used
  • Forensic data about the last time a GPO was linked or updated on an OU

By popular demand the improvements in this release are:

  • Unlinked GPOs are included
  • Columns are added for easy filtering where User or Computer version do not match
  • Output no longer defaults to CSV, so that you can pipe the output wherever you like

Show me some code

Call the script like this:

.\gPLinkReport.ps1 | Out-GridView
.\gPLinkReport.ps1 | Export-Csv -Path .\GPLinkReport.csv -NoTypeInformation

You can download today's script from the PowerShell Gallery here.

New-TimeSpan -Start ‘9/1/2017’


Dear readers of the GoateePFE blog,

It is with mixed emotion that I announce after seven years of real-world script blogging this is the final GoateePFE blog post. I have chosen to take the next step in my career with a company outside of Microsoft. My new role will involve automation, security, and customers.

When speaking at conferences it has been a pleasure and honor to meet people impacted by the GoateePFE blog posts and videos. I have enjoyed your Tweets and post comments as well. Please continue to share this content with others when it is helpful.

Thank You

You may not be aware that you have played a significant role in my career. Each review cycle I would give my manager a count of blog visits, demonstrating value delivered to the global community, to you. Sometimes I would include your tweets and comments, validating that I was connecting with real issues and providing real answers. Thanks for reading and sharing in the journey with me.

What will happen to "GoateePFE"?

My TechNet blog will remain for years to come, although I will no longer be able to update it. I will keep the @GoateePFE Twitter handle to maintain contact with the PowerShell community. My PowerShell video content will remain on Microsoft Virtual Academy and the Microsoft Premier video education subscription. My Facebook and GitHub locations will also remain. My next role will contain significant automation responsibilities, so I plan to continue involvement in the PowerShell community on some level.

Career Advice

In this post a few years back I shared some tips for career success. I hope that the content provided on the blog has helped you make your own mark on the world. I firmly believe that community participation is one of the best things you can do to advance your career and advance the industry. Join the conversation.

00100010 01000001 00100000 01100111 01101100 01101001 01110100 01100011 01101000 00111111 00100000 01001110 01101111 00101100 00100000 01110100 01101000 01100001 01110100 00100111 01110011 00100000 01101110 01101111 01110100 00100000 01110000 01101111 01110011 01110011 01101001 01100010 01101100 01100101 00101110 00100000 01001001 00100000 01110000 01110010 01101111 01100111 01110010 01100001 01101101 01101101 01100101 01100100 00100000 01101001 01110100 00100000 01101101 01111001 01110011 01100101 01101100 01100110 00101110 00100010 00100000 00101101 00100000 01000110 01110010 01100101 01100100 00100000 01010010 01100001 01101110 01100100 01100001 01101100 01101100 00100000 00101101 00100000 01010010 01101111 01100011 01101011 01100101 01110100 01101101 01100001 01101110

Cheers,

Mr. Ashley McGlone

@GoateePFE

