The Perfect PowerShell Script


Crista Perlton


There are a million-and-one books and articles about how to improve your coding practices, but identifying which ones are worth your time would take forever.

This article summarizes the core skills your team must learn to write wonderful scripts.

Every member of your team has different PowerShell skill levels; pros and absolute beginners will likely be working side-by-side. Can your script library meet your organization’s ever-changing demands AND be accessible to your teammates of all skill levels?

There’s no such thing as a truly perfect script, but we can do everything in our power to get as close to perfect as possible.

The perfect script is not about bug-free code or optimization. It’s about writing scripts so that all members of your team can understand, use, and modify them, letting them easily change and evolve alongside your team and organizational needs.

To help your team write and use “perfect” scripts, we will cover who will use your scripts, proper commenting, long-term maintenance, and code quality practices.

Who Will Use Your Scripts?

Your PowerShell scripts will likely be used by your entire team, NOT just experts. There are knowledge gaps you’ll need to accommodate across these different “XP levels.”

Let’s use the “Level-Up” analogy from a previous article:

  • 1 XP – Uses THE PowerShell: anyone who knows how to open the literal PowerShell shell/command line. Everyone uses it, as it’s the only way to run/use cmdlets
  • 10 XP – Runs Scripts: knows what a script is and how to carefully paste in parameters
  • 50 XP – Reads Scripts: can understand what most scripts do by reading them and knows how to research cmdlets
  • 200 XP – Modifies Scripts: makes changes to existing scripts to solve a somewhat different problem but does NOT write wholly new scripts
  • 1,000 XP – Creates Scripts: identifies problems and creates a wholly new/mostly new script to solve that problem
  • 10,000 XP – Develops Modules: creates and packages functions into modules that can be used to help create scripts
  • 50,000 XP – Develops Cmdlets: uses C# to write, test, and build cmdlets that are packaged in modules

Most of your team will probably fall into the range of 10XP to 1,000XP. Ideally, you want your entire team to level up as fast as possible so that they can carry out more complex tasks—because this will let you scale faster and more efficiently as a team and as an organization.

Especially for your low XP members, scripts need to help this leveling-up, not stall it. Less time spent trying to understand a script means more time spent on constructive learning and growth.

Proper Commenting

The key to efficient script usage and maintenance is good comments. “Comments are a waste of my time, Crista.” No way. Remember: just because you (or Larry or whoever) wrote a script doesn’t mean you’ll also maintain it. Most scripts outlive their creators in an organization. High-quality comments help lower-XP teammates level up faster, as all the context they need to learn the functions in a script is right there in the comments. And remember, you can always add to and improve the commenting on existing scripts if you don’t feel it’s up to scratch.

You can’t assume that someone will be able to understand your scripts by just reading them, especially if you use custom aliases and variable names. For example:

Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server' -Name "fDenyTSConnections" -Value 0

Someone who has never modified a registry entry before would be lost. They may have no clue where that -Path points to, and even if they know that it’s modifying a registry entry, there is no indication of what exactly this entry changes. So they’ll waste a bunch of time trying to figure it out instead of being productive.

A simple comment line can make this much more accessible to readers:

# Disable the registry value that blocks connections needed for RDP
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server' -Name "fDenyTSConnections" -Value 0

A super, super important point: Comments are totally useless if they aren’t accurate or clear. Have a colleague dummy-test your comments as one form of testing.

Next-Level Commenting: Comment-Based Help

Comment-Based Help (CBH) is a collection of descriptions and keywords enclosed in a block comment. Unlike normal comments, PowerShell can read Comment-Based Help and display it on request via the “Get-Help” command.

Example for using CBH:

function OpenClosePorts {
<#
.SYNOPSIS
Create, modify, or toggle a firewall rule

.DESCRIPTION
Create, modify, or toggle a firewall rule
Set the parameters for the new rule.
If a rule with the specified name exists > Update the rule to match the new parameters
If a rule with the specified name does not exist > create a new firewall rule with the specified parameters

.PARAMETER DisplayName
Specifies the name of the firewall rule to be modified

.PARAMETER LocalPort
(Optional) Specify the port affected by the firewall rule

.PARAMETER Direction
(Optional) Specify if the connection is Inbound or Outbound. (Default = Inbound)

.PARAMETER Action
(Optional) Select if you want to Allow or Block a connection. (Default = Allow)

.PARAMETER Exists
(Optional) Enable or disable the rule. True/False (Default = True)
#>
    param (
        [string]$DisplayName = "Test",
        $LocalPort = 8626,
        [string]$Direction = "Inbound",
        [string]$Action = "Allow",
        $Exists = $true
    )
    # Suppress the error if no rule with this name exists yet
    $CheckRule = Get-NetFirewallRule -DisplayName $DisplayName 2> $null
    if ($CheckRule) {
        # A matching rule exists, so update it with the new parameters
        Set-NetFirewallRule -DisplayName $DisplayName `
        -LocalPort $LocalPort `
        -Direction $Direction `
        -Protocol TCP `
        -Action $Action `
        -Enabled $Exists.ToString()
    }
    else {
        # No matching rule, so create a new one with the specified parameters
        New-NetFirewallRule -DisplayName $DisplayName `
        -LocalPort $LocalPort `
        -Direction $Direction `
        -Protocol TCP `
        -Action $Action `
        -Enabled $Exists.ToString()
    }
}
By using the “Get-Help” command, PowerShell is able to read the support information specified in the script, contained within “<#” and “#>”, and then output it as help documentation. The output will be a combination of the descriptions provided by the author and information provided by PowerShell itself.

For example, using “Get-Help” in the script above would result in an output similar to the following:


NAME
    OpenClosePorts

SYNOPSIS
    Create, modify, or toggle a firewall rule

DESCRIPTION
    Create, modify, or toggle a firewall rule
    Set the parameters for the new rule.
    If a rule with the specified name exists > Update the rule to match the new parameters
    If a rule with the specified name does not exist > create a new firewall rule with the specified parameters

PARAMETERS
    -DisplayName
        Specifies the name of the firewall rule to be modified

        Required?                    true
        Position?                    0
        Default value
        Accept pipeline input?       false
        Accept wildcard characters?  false

    -LocalPort
        (Optional) Specify the port affected by the firewall rule

        Required?                    false
        Position?                    0
        Default value
        Accept pipeline input?       false
        Accept wildcard characters?  false

CBH comes with a bonus feature for Otter users: Otter can read your CBH and generate a GUI for your PowerShell scripts, making them accessible to execute as jobs.

Long-term Usage and Maintenance of Scripts

Scripts are company assets, so they should be maintained as long as possible. If you are constantly spending time writing new scripts, you may as well not be using automation and still be doing everything by hand.

To ensure your scripts are maintainable in the long term, start by writing a sentence or two that describes the purpose of your script. Then add this as a comment, ideally as the .SYNOPSIS or .DESCRIPTION of your Comment-Based Help.

Then think about what types of systems the script can be run against, such as desktops or domain controllers, and make sure it’s clear in your .DESCRIPTION. If this is for a mission-critical, high-availability system, the script will need extra care to maintain, so note that in your .DESCRIPTION as well. If a script is not fit for the intended purpose, it will also make it more difficult for low-XP members to use the script. They may learn the wrong lessons from it, which turns into a whole new problem of untraining (nobody wants that).

Once you’re confident that a script is written well, confirm that it can be updated efficiently for the long term by having it peer-reviewed by different team members at different XP levels.


Logging

To help troubleshoot scripts, logging is an absolute must. It is not optional for creating perfect scripts, because logging creates a record of everything that happens when a PowerShell script executes.

This makes running scripts less risky, because whoever ran a faulty script can see clearly what went wrong.

PowerShell provides various cmdlets that allow users to create detailed, readable logs, such as:

  • Write-Information: Writes information into PowerShell stream 6 (the information stream). Whether these messages are displayed or ignored is controlled by the $InformationPreference preference variable; the same variable can also pause the script upon writing a message (the Inquire setting).
  • Write-Debug: Writes a debug message for the code following the cmdlet. The message is not usually visible but can be displayed by using the “-Debug” parameter at specified points in the code. For example, you could use it to display a debug message upon failure.
  • Write-Warning: Displays a yellow color-coded warning message in the PowerShell console. Script execution is not stopped.
    • write-warning "We might have a problem"
    • Output: WARNING: We might have a problem
  • Write-Error: Displays a red color-coded error message in the PowerShell console. By default, script execution is not stopped.
    • Write-Error "We have a problem!"
    • Output: C:/Users/kawauso/Untitled-1.ps1 : We have a problem!
         + CategoryInfo         : NotSpecified: (:) [Write-Error], WriteErrorException
         + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Untitled-1.ps1

Write-Verbose displays detailed information about what is happening while a script executes. As this can sometimes result in too much information being displayed, verbose logging can be toggled by running the script with the “-Verbose” parameter.

Combining these logging methods, it’s possible to create a color-coded, easy-to-read log that will make analyzing script executions easier and faster for the entire team, no matter their PowerShell XP level.
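As a quick sketch of what that combination can look like (the config-file path and messages below are illustrative, not from a real script):

```powershell
# Illustrative sketch: combining the logging streams in one script.
# $InformationPreference and $VerbosePreference default to 'SilentlyContinue',
# so we opt in to displaying those streams here.
$InformationPreference = 'Continue'
$VerbosePreference     = 'Continue'

Write-Verbose "Checking for a config file..."
$configPath = Join-Path $env:TEMP 'app.config'   # hypothetical file

if (Test-Path $configPath) {
    Write-Information "Config file found at $configPath"
}
else {
    Write-Warning "Config file missing; falling back to defaults"
}
```

Each line of output lands in its own stream, so a reader (or a tool collecting the streams) can tell verbose chatter, status information, and warnings apart at a glance.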

External infrastructure management tools, like Otter, can read and collect the various output streams from PowerShell to create permanent records, which make analyzing script executions far more efficient than scrolling through hundreds of lines of output in the PowerShell console.

Code Quality through Better Practices

It’s a no-brainer: simple is better, especially for code. Why? Because when things inevitably go sideways, simple is much easier to fix than big, long, complicated things.

Start with a simple assumption: “Everything I write will fail.” I always assume that the script I’m writing won’t work, no matter how simple it is. A script that runs on my machine may fail on another. It could be an error as simple as hard-coding a file path into the script, only to discover that the desired location is different on another server.

To fix these errors easily and (perhaps more importantly) to enable others to easily fix them in the future, there are several best practices to follow:

Consistent Code Style and Formatting

Everyone has their preferred way of writing code. It may be something as simple as deciding whether to put a “{“ on the same line as the function name or on the next line.

It’s fine to write in your preferred way when writing personal scripts that only you will use. But when writing company scripts, all scripts need to be formatted the exact same way. Having consistent formatting throughout your entire script library ensures that whoever is reading the script does not have to waste time making sense of your non-indented mess of a script (but I’m not judging).
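One way to enforce a single style, assuming your team has the PSScriptAnalyzer module available, is its Invoke-Formatter cmdlet; the snippet below is only a sketch of the idea:

```powershell
# Sketch: normalize brace and indentation style with PSScriptAnalyzer.
# Requires the module first: Install-Module PSScriptAnalyzer
$messy = 'if($true){Write-Output "hello"}'

# Invoke-Formatter rewrites the code to match the configured style rules,
# so every script in the library ends up formatted the same way
Invoke-Formatter -ScriptDefinition $messy
```

Running the formatter as part of code review (or a pre-commit hook) takes the style debate off the table entirely.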

Handle Different Paths and Working Directories

Scripts may fail to run against a server because the file path specified in the script does not match the actual file path on the server. Path management is a complex issue, as each system may have a different file structure.

To avoid errors, you could (for example) write a function to open a file dialog window that’ll allow you to manually select a file (perhaps by following these instructions for how to create an open file/folder dialog box with PowerShell). However, that kind of defeats the purpose of automation.

Instead, use environment variables to avoid hard-coding specific paths. For example, instead of assuming “C:\”, use “$env:SystemDrive” to set the location to the system drive. (You can learn how to use environment variables by reading Microsoft’s documentation.) Once you’ve specified your paths, make sure to test them using the “Test-Path” cmdlet.
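Putting those two pieces together, here is a small sketch of path handling ($logDir and the 'Logs' folder name are made up for illustration):

```powershell
# Build the path from an environment variable instead of hard-coding "C:\"
$logDir = Join-Path $env:SystemDrive 'Logs'

# Verify the path before using it, and create it if it's missing
if (-not (Test-Path $logDir)) {
    New-Item -ItemType Directory -Path $logDir | Out-Null
}
```

The same script now works whether the system drive is C:, D:, or anything else, and fails gracefully instead of assuming the folder exists.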

Error Handling

Remember the mantra: “Everything I write will fail.” This will help with error handling.

A failure could be something as simple as not being able to find the location of “cute-otters.jpg” or something far more serious, such as a setting not applying correctly, leading to a spiral of misconfigured servers.

PowerShell supports a simple but powerful “Try/Catch” statement that will try to execute a command and catch any errors that occur.

Once you “catch” an error, you can use “Flow Control” to decide how the script should proceed. Using clear flow control will make your script much more readable by clearly defining what will happen in specified scenarios. For example:

## Source:

try {
    # $divisor is a hypothetical variable; dividing by zero here throws
    $result = 1 / $divisor
}
catch [DivideByZeroException] {
    Write-Information "Divide by zero exception"
}
catch [System.Net.WebException],[System.Exception] {
    Write-Warning "Other exception"
}
finally {
    Write-Information "cleaning up …"
}

Split Your Scripts

It’s a fact that shorter scripts are easier to read and edit, especially for your low-XP engineers. That’s not the only benefit though. Shorter scripts also take less time and fewer resources to execute.

How long your script should be comes down to what we discussed before: the script’s purpose. When writing new scripts or editing existing scripts, ask yourself:

  • What does the script need to do? 
  • If the script does more than one thing, do you really need those functions in the same script?
  • What system will the script run on?
  • Can you afford to run a script that may take hours to complete?

Those may seem like common sense things, but software and script bloat are common issues that can easily sneak into your systems. You can avoid this by splitting scripts into smaller, more manageable parts.
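One common way to split scripts, sketched below with made-up file and function names, is to keep each task in its own small script and dot-source the helpers where they are needed:

```powershell
# Sketch: each helper script defines one single-purpose function.
# Dot-sourcing loads those functions into the current scope.
. "$PSScriptRoot\Get-ServerList.ps1"
. "$PSScriptRoot\Test-ServerConnectivity.ps1"

# The "main" script now reads as a short, high-level workflow
$servers   = Get-ServerList
$reachable = $servers | Where-Object { Test-ServerConnectivity $_ }
$reachable | ForEach-Object { Write-Information "Reachable: $_" }
```

Each helper stays short enough for a low-XP engineer to read and modify on its own, and the main script documents the overall workflow almost by itself.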

Conclusion: The Perfect Script

As I said in the introduction, the perfect script is one that all members of your team can understand, use, and modify. It can easily change and evolve alongside your team and organizational needs.  

Using the practices outlined in this article, your 1,000+ XP engineers should be able to write scripts that your more junior (<1,000 XP) engineers can easily understand, learn from, and, most importantly, feel comfortable editing and updating. This level of comfort is necessary for your team to level up their PowerShell skills faster and tackle complex issues while continuously improving your scripts.
