So...Powershell. I need a deep dive into it.
June 7, 2021 10:09 AM   Subscribe

It's come to my attention that I'm writing Powershell as though I'm a java programmer (which I am). And this is not what the team requires. I need help to learn how to write Powershell like a Powershell scripting expert.

Solution #1 is to carefully read and work off of my colleagues' code. But this isn't enough as we are only just starting our work. And we are under some intense time pressure.

Before you ask - what are you trying to do in Powershell - we are doing A LOT more than I think people normally do. This is not for sysadmin work. We are reading in complex batches of data from files, validating it, and doing other such things before upload to a DB. Why? We are replacing a mainframe.

An example of my oops - I tried to solve a problem using a class in Powershell. It seemed like an excellent solution over creating a System.Object[]. But I've been told the problem with my code is that it doesn't take performance into account: I'm creating new variables all over the place, willy-nilly. True. I'm a 'young generation' programmer who has never had to deal with memory allocation or garbage collection.

The very senior dev (like, a person who can write Fortran and Cobol and really anything) is so patient and kind. As you can imagine, he does things the 'old' way. So I need resources that explain how we do things the 'old' way. Like always piping stuff. Which I find a little confusing.

Unfortunately, Google serves up mostly the popular 'new' ways of doing things. So I'm pretty lost.

Can you help? I know this seems ridiculous, like - you can't write Powershell? What kind of moron are you? But what I'm looking for is how to write in the old-school style.
posted by kitcat to Technology (8 answers total) 8 users marked this as a favorite
 
Best answer: To write PowerShell "the old way", you need to think of it as a series of discrete functions, rather than a programme. These articles might be useful to get your head round that UNIX thought pattern:

How to use PowerShell Objects and Data Piping
Use the Pipeline to Create Robust PowerShell Functions

I'd also spend some time in Microsoft's "deep dive" documentation to properly get a feel for PowerShell and how it's designed to be used.
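A minimal sketch of that "series of discrete functions" idea, in the pipeline style the articles above describe (the function names, record format, and sample data here are made up for illustration, not from the poster's project):

```powershell
# Each function does one small thing and emits objects;
# the pipeline is what glues them together.
function ConvertTo-BatchRecord {
    param([Parameter(ValueFromPipeline)][string]$Line)
    process {
        # assumed record format: "Id;Amount"
        $parts = $Line -split ';'
        [pscustomobject]@{ Id = $parts[0]; Amount = [decimal]$parts[1] }
    }
}

function Test-BatchRecord {
    param([Parameter(ValueFromPipeline)]$Record)
    process {
        # pass through only records that validate; drop the rest
        if ($Record.Amount -ge 0) { $Record }
    }
}

# sample data standing in for a real input file
'A;10', 'B;-5', 'C;3' | ConvertTo-BatchRecord | Test-BatchRecord
```

Each stage handles one object at a time in its `process` block, so nothing buffers the whole batch in memory.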
posted by underclocked at 10:55 AM on June 7, 2021 [5 favorites]


Best answer: very OODD dude here. I'm a *bit* experienced in ps. here's a coupla very generic things i found helpful.
- looking at bash scripting approaches
- you're back in procedural land. change your thinking from noun-verb to verb-noun.
- think of your script flow like a recipe: all top down.
- you can do great stuff in objects and modules with ps, but your scripty friends will not be able to debug or extend that.
- you won't ever get your team to stop using hundreds of temp or throwaway vars. your best bet is to get them to do good naming (clean code, sec 1).
- the pipeline is the big mental leap. understand the guts, and always look at a problem from the 'current object' perspective.
- the type system is based on net framework, but rejiggered so you can do a lot of implicit casts. it's when you *can't* that gets to be a pain.
- your teammates are letting scripts fail on execution, without ever looking at any partial or destructive steps that occurred before failure. exception handling is an afterthought in ps teams.
- that ^^^ is because a scripting culture runs on trial and error, not unit test.
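A sketch of the defensive error handling the list above says scripting culture tends to skip (the file name is hypothetical; the point is making errors terminating so you can stop before the destructive part):

```powershell
# By default many PowerShell errors are non-terminating and the script
# ploughs on. Making them terminating and catching them lets you abort
# (or roll back) before any partial, destructive step.
$ErrorActionPreference = 'Stop'

try {
    # hypothetical input; Get-Content now throws if the file is missing
    $lines = Get-Content 'input-that-may-not-exist.txt'
    # ...validate $lines, then do the destructive DB upload here...
}
catch {
    Write-Warning "Aborting before any destructive step: $_"
}
```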

powershell is, uh, powerful. but i would never choose it for any production level code. i admit my prejudice and limited experience. i hope others can speak with more authority.

cheers!
posted by j_curiouser at 11:03 AM on June 7, 2021 [2 favorites]


As others have said, Powershell shares far more with Bash than with anything Windows.

It's really meant as a sort of "system utilities" thing, or as glue to transfer data from one program to another. It's not meant as a full-blown programming environment. Think of it more as a modern form of DOS batch files.

With that said, if that's what your company decided to use, then I guess that's what you'll have to use. But personally I'd have picked Python, which has tons of libraries for you to manipulate data with.

The best resource on Powershell is, unsurprisingly, from Microsoft.

https://docs.microsoft.com/en-us/learn/modules/introduction-to-powershell/
posted by kschang at 11:40 AM on June 7, 2021 [3 favorites]


Best answer: PowerShell is indeed powerful, and seeing as it is the default means of automating Microsoft Office365 tasks, it has widespread support.

As for the question, "Is PowerShell a scripting facility or a full blown programming environment?" The answer is Yes. It is a floor wax AND a dessert topping.

Things were muddled a bit in the early years because what was accessible to the average person was to use it as an MS-DOS Command Prompt replacement and to do some light scripting. Unfortunately the style of the day was very unix-y/perl-esque, in the sense that people got a kick out of writing absurd one-liners with all the opaque aliases for commands. The only way to do more sophisticated things was to write binary compiled PowerShell add-ins, which were far less accessible to most people and required you to understand the POSH add-in framework.

As time has passed, there are far more people writing high-quality PowerShell scripts and PowerShell script modules (essentially collections of commandlets that are packaged together and can potentially share state). Tooling has improved too: Visual Studio Code's default settings nudge people away from using aliases, and there is support for Try/Catch/Finally, unit testing with object mocking, and first-party classes.
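For instance, here is the same pipeline in the opaque-alias one-liner style of the era, next to the spelled-out form that current tooling encourages (the data is just a stand-in):

```powershell
# alias-heavy one-liner style: ? is Where-Object, % is ForEach-Object
'ok', 'ERROR: boom' | ? { $_ -match 'ERROR' } | % { $_.ToUpper() }

# the same pipeline spelled out, which is far easier to read and maintain
'ok', 'ERROR: boom' |
    Where-Object { $_ -match 'ERROR' } |
    ForEach-Object { $_.ToUpper() }
```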

As for your question about old-school vs. new-school approaches, that seems a bit muddled, as I often see the old-school approaches as the less efficient ones - but that may just be a terminology distinction. In any case, zeroing in on your memory-efficiency reference:


$theFile = Get-Content "myfile.txt"
$output = @()
foreach ($line in $theFile) {
  # do something to $line
  $output += $line
}
$output | Set-Content "updated-file.txt"


is functionally equivalent to:

Get-Content "myfile.txt" |
Foreach-Object {
  $line = $_
  # do something to $line
  $line
} |
Set-Content "updated-file.txt"


The first approach is less optimal for two reasons. First, you are reading the whole contents of the file into memory ($theFile) rather than letting the pipeline pass and process a line at a time. Second, appending to arrays with += is expensive, and gets more expensive as the size of the array increases, because each append copies the entire array.
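If you genuinely need an intermediate collection rather than a pure pipeline, a generic List avoids the array-copy cost of += - this is a standard .NET technique rather than something from the thread, and the file names are just placeholders:

```powershell
# sample input, so the snippet is self-contained
'alpha', 'beta' | Set-Content 'myfile.txt'

# += on an array reallocates and copies the whole array on every append;
# a generic List grows in place with amortised O(1) Add
$output = [System.Collections.Generic.List[string]]::new()
foreach ($line in Get-Content 'myfile.txt') {
    # do something to $line
    $output.Add($line)
}
$output | Set-Content 'updated-file.txt'
```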

As for whether you should be doing this in Python or insert-your-favorite-language-here? It comes down to convenience and your group's comfort with particular technologies. I use both in my work, which involves lots of people data and lots of Office365 data. Microsoft supplies lots of relatively high-quality, easy ways of getting the o365 data, so for small things we have written high-quality, straightforward PowerShell. In other cases where I am doing analysis on these data sets, it makes sense to use something like python+pandas.
posted by mmascolino at 12:37 PM on June 7, 2021 [9 favorites]


ty for bringing expertise, mmascolino.
posted by j_curiouser at 12:59 PM on June 7, 2021 [2 favorites]


It's also worth reading up about LINQ and anonymous functions in C#, which set the expectations for lazy-evaluated, streamed data transformations for PowerShell.

In terms of your data pipeline, having consistent methods in a class you control to act on the data is great, because you can unit test, profile for performance and use a versioned data schema to accept/reject the data you feed the class -- but having allocations all over the place interrupts the actual work while the dotNET CLR or OS finds space for that data. Put 'em up-front, like old-school OO would define data structures as part of the class definition.
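One way to read "put allocations up-front" in PowerShell class terms - the class name, capacity, and validation rule below are purely illustrative:

```powershell
class BatchValidator {
    # the working buffer is allocated once, at construction,
    # instead of growing ad hoc in the middle of per-record work
    [System.Collections.Generic.List[string]] $Errors

    BatchValidator([int]$expectedCount) {
        $this.Errors = [System.Collections.Generic.List[string]]::new($expectedCount)
    }

    [bool] Validate([string]$record) {
        if ([string]::IsNullOrWhiteSpace($record)) {
            $this.Errors.Add('blank record')
            return $false
        }
        return $true
    }
}

# size the buffer up-front for the batch you expect
$v = [BatchValidator]::new(1000)
$v.Validate('some-record')   # valid record
$v.Validate('')              # invalid; error recorded, no new allocation mid-loop
```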
posted by k3ninho at 12:46 AM on June 8, 2021 [1 favorite]


I am not a Powershell expert, I'm a mainframe expert.

My first question is: is this a one-time thing, or will this be running regularly? If this is a one-time data migration, I'd do all the data cleanup on the mainframe before trying to load it into your DB.

My second question: does this process happen in COBOL, in a batch job on the mainframe? You might save time and effort by working with MicroFocus COBOL which claims to be able to migrate your code _and_ the job control language to a Windows platform.
posted by TimHare at 10:33 AM on June 8, 2021


Response by poster: Thank you all, this is extremely helpful! I have no say in our language/approach. Just for interest's sake, this thing will be running regularly. And I think the original is in Natural. Wish us luck. I hope some of you won't mind if I memail. Nice that we have such kind, helpful programmers here.
posted by kitcat at 11:02 AM on June 8, 2021 [2 favorites]

