Making Custom Objects & Integration Packs Using PowerShell Scripts

I don't know how I manage to find myself doing things that are "out of the ordinary" so often. Maybe it's just in my nature to want to tweak things so that they're "just right," or to provide the exact user experience I'm looking for rather than settling for less elegant approaches. Maybe it's my unique combination of A.D.D. and perfectionism that fuels this behavior, but in most cases it leads me to find new ways of doing things, or to do things that aren't documented well (or at all) and figure them out.

Such is the case in the effort that led to this article. As part of the Opalis 6.3 effort, we produced an Integration Pack for Configuration Manager 2007. It's a brand new IP, part of the group of new System Center IPs, and it contains 11 activities:

  • Advertise Task Sequence
  • Add Computer to Collection
  • Create Advertisement
  • Create Collection
  • Delete Collection
  • Deploy Software Update
  • Get Advertisement Status
  • Get Collection Member
  • Get Software Update Compliance
  • Refresh Client
  • Refresh Collection

Each of these activities has a specific purpose, and when combined with all of the other activities available in the base product and other Integration Packs, the potential variations are virtually limitless. It's a flexible set of activities that addresses a specific slice of the tasks ConfigMgr admins face. Those of you familiar with software development know that you have to choose specific scenarios (or "user stories") to address in a release, and that you can't solve every problem. That means there are always gaps in the coverage of features that admins may need. In development, we have to choose, based on customer feedback, what's important enough to get into the schedule, what can be moved to the next release, and what can be dropped.

An example of this scenario is deploying software updates. We created a "Deploy Software Update" activity, but it relies on the existence of an update list and a deployment template. During beta testing, we heard that "it would be nice" if we added the ability to create an update list as part of a workflow, but it never became a high enough priority to make the release. That doesn't mean the functionality isn't useful, just that it didn't make it into the official product release. It's still a useful example as a community sample; it doesn't have to be in the official release. The added benefit is that I can use it as a teaching example (and a learning example for myself).

This is where the flexibility and extensibility of Opalis really shine. I know that using Opalis I can use scripts in several ways:

  • The Run .NET Script activity (using C#, JScript, PowerShell, or VB.NET)
  • The Run Program activity (to run cscript.exe and a script file, or powershell.exe and a script file or command line)
  • Using QIK CLI, I can create a DLL file that encapsulates a command line, and use that with the Invoke .NET activity from the QIK .NET IP
  • I can use the QIK Wizard to package the assembly into a full Integration Pack and include separate script files I can run from a command line

Of course, there are various advantages to choosing one method over another. In my case, I wanted to be able to run a PowerShell script that accepted parameters (so that it could be used as a generic object to create an update list), and make it easy for the user to supply those parameters. I also wanted to output properties of the new update list to published data. Of all the choices, creating an assembly using QIK CLI was the only one that allowed me to specify a list of parameters that would be sent to the script.

Building the Custom Object using QIK

I had already built and tested some PowerShell scripts for doing things in ConfigMgr, and one of those things was creating an update list. Now I just had to figure out how to use the scripts in QIK CLI. Opening QIK CLI, I started with a blank slate. I gave the new assembly a name and told QIK where to place the file:


I then defined the more detailed Assembly Information:


Next, I needed to add the command to run. What I am trying to do is run a PowerShell script that creates an update list in ConfigMgr. In order to create an update list, I need to define which updates to include, so I need update IDs. Of course, from a usability standpoint, requiring update IDs is not as friendly as accepting Article IDs, because Article IDs are visible in the ConfigMgr Admin Console, while getting update IDs requires digging into WMI or SQL. So, as you can see from the description I provided, I want my object (and my script) to accept either update IDs or article IDs.


Adding the Command Line and Parameters

My script is actually two scripts. I have divided my functionality into a base script (called CMIP_Base.ps1) and another script specific to the updates functionality (called CMIP_SoftwareUpdates.ps1). This modularity allows me to continue to add new functionality via new scripts and not have to replicate the same code for logging, connecting to ConfigMgr, and other utilities over and over in each new script.
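To make the layout concrete, here's a minimal sketch of how the two files could relate. The function bodies and the Write-CMIPLog helper name are assumptions for illustration, not the actual shipped code; the Connect-SCCMServer and New-SCCMUpdateList names come from the command line shown later.

```powershell
# CMIP_Base.ps1 -- shared plumbing reused by every feature script (sketch)
function Write-CMIPLog { param($Message, $Level)
    # hypothetical logging helper; the real script's logging details may differ
}
function Connect-SCCMServer { param($SccmServer, $SiteName, $Username, $Password)
    # build credentials, locate the site's WMI namespace, return a connection object
}

# CMIP_SoftwareUpdates.ps1 -- software update functions only (sketch)
. .\CMIP_Base.ps1   # load the shared functions; nothing executes on its own
function New-SCCMUpdateList { param($SccmServer, $UpdateListName, $UpdateListDesc, $UpdateIDs, $ArticleIDs)
    # create the SMS_AuthorizationList instance and emit it to the pipeline
}
```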

In order to connect to a remote ConfigMgr server, I need to establish my credentials and locate the proper WMI namespace path. To do that, I need to supply parameters for:

  • The ConfigMgr server name
  • The ConfigMgr site name (optional, in case I need to specify a particular site)
  • A Username (domain\username format)
  • A Password

Then, in order to create my update list I need some more parameters:

  • A name for the update list (required, must be unique)
  • A description (optional)
  • A list of Update IDs –or- a list of Article IDs

My script also does some logging, so I supply a logging level parameter too. By default, it's set to no logging, but if I have a problem, I can turn it on.
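Put together, those inputs map onto a parameter list roughly like this (a sketch; the exact names and defaults in the real script may differ):

```powershell
param(
    [string]$SccmServer,            # ConfigMgr server name (required)
    [string]$SiteName   = '',       # optional: target a particular site
    [string]$Username,              # domain\username format
    [string]$Password,
    [string]$UpdateListName,        # required, must be unique
    [string]$UpdateListDesc = '',   # optional description
    [string]$UpdateIDs  = '',       # a list of Update IDs...
    [string]$ArticleIDs = '',       # ...or a list of Article IDs
    [int]   $LogLevel   = 0         # 0 = no logging (the default)
)
```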

Lesson Learned #1 – Running a script (not just a PowerShell CmdLet) from the Command Line window

Normally, when you run a PowerShell script, you use a command line like this:
.\MyPowershellScript.ps1


While this works from a PowerShell command line, it doesn't work from a QIK object. The reason is that Opalis spawns a PowerShell environment that is not rooted in the same directory as the assembly (or script). The appropriate working directory is automatically added to the command line, so you have to leave off the ".\"

Part 2 of this lesson was about PowerShell in general: if you want the objects and variables to be usable across multiple scripts and available back in the environment outside the script, you have to "dot source" the command. In other words, the script has to be run like this:

. MyPowershellScript.ps1        (note the space between the period and the script name)

UPDATE! For Orchestrator 2012, this behavior is different! Now, because all of an IP's dependent files are placed into a GUID-named subdirectory with the IP's assemblies, and the working directory is set to that location before running the PowerShell command, the typical ".\" notation works. However, you still have to use the dot source notation to ensure any functions loaded in a PS1 file are exposed to the environment (alternatively, you can put the functions in a module and use the Import-Module command). So, in Orchestrator, your command would look like this:

. .\MyPowershellScript.ps1 (note the space between the two periods)
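To see why the dot matters, compare the two invocations (using the stand-in script name from the example above):

```powershell
.\MyPowershellScript.ps1     # runs in its own scope; functions it defines vanish when it returns
. .\MyPowershellScript.ps1   # dot sourced: runs in the caller's scope, so its functions persist
```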


OK, so I have my scripts and I know what parameters I need; now I need to write the command line to load the script and call the functions that complete the job. The scripts do not actually run anything on their own (CMIP_SoftwareUpdates.ps1 does call CMIP_Base.ps1 to load it, but neither script processes anything when loaded). This lets me be as flexible as I need to be with the scripts and use them only as containers for the functions that actually do the work.

So I need to load my script, then run the Connect-SCCMServer function to get my connection info, then pass the connection object over to the New-SCCMUpdateList function to actually create the update list. Here's what that looks like with my parameters already defined in the QIK UI:

. CMIP_SoftwareUpdates.ps1; $conn = Connect-SCCMServer -sccmserver $(SCCM Server) -sitename $(Site Name) -username $(Username) -password $(Password); New-SCCMUpdateList -sccmserver $conn -updateListName $(Update list name) -updateListDesc $(Description) -updateIDs $(UpdateIDs) -articleIDs $(Article IDs)

This looks good. If I replaced the $(XXX) parameters from Opalis with real strings and ran it from a PowerShell command line, it would work. But I found another issue…

Lesson Learned #2 – There is no "Optional Parameter" logic in QIK CLI command lines

The command line I used above may seem valid, but what happens if I leave a parameter blank? I don't have the ability to modify the actual command line when an Opalis parameter is blank; for example, I can't leave off "-sitename" when the "$(Site Name)" parameter is blank. Leaving "Site Name" blank results in this error:


The easy fix is to add single quotes around all your parameter values and to make sure your script checks for null or empty values in parameters. Here's the updated command line:

. CMIP_SoftwareUpdates.ps1; $conn = Connect-SCCMServer -sccmserver '$(SCCM Server)' -sitename '$(Site Name)' -username '$(Username)' -password '$(Password)'; New-SCCMUpdateList -sccmserver $conn -updateListName '$(Update list name)' -updateListDesc '$(Description)' -updateIDs '$(UpdateIDs)' -articleIDs '$(Article IDs)'

This also handles any instance where a string contains spaces or, as in the last two parameters, commas.
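Inside the script, that means treating an empty string the same as an omitted parameter, along these lines (a sketch, assuming the single-quote convention above):

```powershell
# With the quoting convention, a blank Opalis parameter arrives as ''
if ([string]::IsNullOrEmpty($SiteName)) {
    $SiteName = $null   # let the connection logic fall back to its default site
}
```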

My final command line and parameters look like this:



Adding Published Data

Now that I have set up my script command line and parameters, I need a way to take the results and pass them back into published data so that the information can be used later on in the workflow. For example, if I create an update list, I need the update list name so I can use it in the "Deploy Software Update" activity. It took some trial and error to figure it all out, but once I did, it turned out to be simpler than I expected.

The way Published Data works with a QIK CLI PowerShell script is that you specify the property names of the object(s) that are returned to the PowerShell pipeline from the command. For example, my script adds the Update List WMI object ("SMS_AuthorizationList" object) to the pipeline. Looking at the object from the PowerShell command line, I see this:

__GENUS : 2
__CLASS : SMS_AuthorizationList
__SUPERCLASS : SMS_ConfigurationItemBaseClass
__DYNASTY : SMS_BaseClass
__RELPATH : SMS_AuthorizationList.CI_ID=15620
__DERIVATION : {SMS_ConfigurationItemBaseClass, SMS_BaseClass}
__NAMESPACE : root\sms\site_C07
__PATH : \\SCX-RH-CM07\root\sms\site_C07:SMS_AuthorizationList.CI_ID=15620
ApplicabilityCondition :
CategoryInstance_UniqueIDs : {}
CI_ID : 15620
CI_UniqueID : ScopeId_292DE7D8-BEE1-4D0D-BFF9-1D3E2464C1B7/AuthList_0B5FE237-68C3-4684-BDBC-
CIType_ID : 9
CIVersion : 1
CreatedBy : SCX-RH\Administrator
DateCreated : 20101118000812.000000+000
DateLastModified : 20101118000812.000000+000
EffectiveDate :
EULAAccepted : 2
EULAExists : False
EULASignoffDate : 20101118000813.000000+***
EULASignoffUser :
IsBundle : True
IsDigest : True
IsEnabled : False
IsExpired : False
IsHidden : False
IsQuarantined : False
IsSuperseded : False
IsUserDefined : True
LastModifiedBy : SCX-RH\Administrator
LocalizedCategoryInstanceNames : {}
LocalizedDescription :
LocalizedDisplayName : Blah22
LocalizedInformation : {Blah22}
LocalizedInformativeURL :
LocalizedPropertyLocaleID : 1033
ModelName : ScopeId_292DE7D8-BEE1-4D0D-BFF9-1D3E2464C1B7/AuthList_0B5FE237-68C3-4684-BDBC-
PermittedUses : 0
SDMPackageLocalizedData : {}
SDMPackageVersion : 1
SDMPackageXML : <?xml version="1.0" encoding="utf-16"?><DesiredConfigurationDigest
2006/03/24/DesiredConfiguration"><AuthorizationList AuthoringScopeId=
"ScopeId_292DE7D8-BEE1-4D0D-BFF9-1D3E2464C1B7" LogicalName="AuthList_
0B5FE237-68C3-4684-BDBC-6E3CA4B11D8D" Version="1"><Annotation><DisplayName
Text=""/><Description Text=""/></Annotation><Updates><SoftwareUpdateReference
AuthoringScope="Site_292DE7D8-BEE1-4D0D-BFF9-1D3E2464C1B7" LogicalName=
"SUM_cf1f446d-6a5a-4db9-9119-30881192a19c" Version="1"/><SoftwareUpdate
Reference AuthoringScope="Site_292DE7D8-BEE1-4D0D-BFF9-1D3E2464C1B7"
SourceSite : C07
Updates : {530, 580}

So if my command returns this object to the pipeline when complete, QIK will see that object and I can choose from all those properties to add to published data. In the simplest sense, if all I wanted was the update list name, I'd just add a new property under the Published Data tab and fill in the appropriate information like below:



Best Practice Tip #1: Re-use property names of existing objects

If I am going to output the properties of an object that I just created or updated, I always use the same property names that the object does. This avoids confusion about what the property is supposed to represent (was "Update List Name" the value of "LocalizedDisplayName" or was it something else?). It also makes it easier when putting multiple activities together in a workflow to know that the output from this object is a specific type and has these specific properties.


Best Practice Tip #2: Group your object's properties together by using a prefix

Whenever I look through published data values in the testing console or the Operator Console, it's hard to separate the properties specific to the object from the default published data properties. So what I like to do when I create a custom object is prefix all of the object's published data names with an underscore ("_"). That way, all of the object-specific published data is grouped together and I can quickly see all the values!

That was simple, right? Just provide the property name of the object you're returning to the PowerShell pipeline, and they'll appear in Published Data in Opalis! So what if I also want to get the list of updates that are included in this update list? I specified article IDs in the parameters – I want to verify that the update list got all the right update IDs from that. Well, I could just add the "Updates" property to the published data list and it will pull the list of updates into published data, right? Wrong. Try that and here's what you get:


The problem is that the list of updates is an array of integers, not a string. And since QIK only pulls the string value of each property back from the pipeline, you get the string representation of the array type, not the actual values.

Lesson Learned #3 – QIK needs string properties, not arrays, to display your data

If QIK was displaying "System.UInt32[]" instead of my actual values, how could I get it to display the values I wanted? Here is where the beauty of PowerShell kicks in, through the Extended Type System (ETS). To PowerShell, every object can be extended to make a new object. Knowing this, I can take my existing WMI object and add a new property that contains a string representation of my array of update IDs. The code for this is very simple:

$newUpdateList.Get(); $newUpdateList | Select *, @{Name="UpdateList"; Expression={$_.Updates -join ","}}

What I did here was pipe the update list to the Select-Object cmdlet (alias "Select") and use a calculated property (the hashtable) to add a new property called "UpdateList" whose value is the list of array values joined with commas. I then added a new value to my Published Data list in the QIK assembly:


Now when I run the policy, I get this:


So now I have an actual list of update IDs that I could use in some other activity!
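The same calculated-property trick works on any object. Here's a self-contained illustration using a stand-in object with hypothetical values (not the actual WMI object):

```powershell
# Stand-in for the WMI object: an integer-array property like Updates
$obj = [pscustomobject]@{ LocalizedDisplayName = 'Blah22'; Updates = @(530, 580) }

# Add a string property so a consumer that only reads strings sees real values
$flat = $obj | Select-Object *, @{ Name = 'UpdateList'; Expression = { $_.Updates -join ',' } }
$flat.UpdateList   # -> 530,580
```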

From there, I package the assembly up using the QIK Wizard and specify my two PowerShell script files so they're included in the OIP file.





When the wizard is done and I import the Integration Pack into the client, I see this:


And when I use the activity in a workflow, the activity's properties look like this:


I really like this method of using PowerShell because it hides the complexity of the script (the user doesn't have to look at or modify it), and I can give the parameters friendly names and even specify defaults if I like.

Of course, I have lots more lessons learned and best practices to tell you about, but this article is already long enough, so I'll save those for another post.

I will put the sample OIP and source files (scripts) on the Opalis CodePlex site so you can download and play with them.

Until next time... enjoy!