Tuesday, 19 November 2013
Hey guys,
Ran into an interesting issue today that I probably should have known about but didn't. I was trying to image a brand new Haswell-based HP EliteBook 840 G1, and no matter which driver I added to my boot image in SCCM 2012, it would come up with a "Network adapter not found" error, or something along those lines. I am deploying a Windows 7 image to this machine, so I downloaded the Windows 7 drivers to use during the task sequence and used the exact same network driver for the WinPE portion. As nothing was working, I did a quick search online; some people reported similar problems on other HP hardware and had success with older drivers. I tried older drivers, as well as drivers for Windows 8.1, but still nothing. After more digging, I found out that the WinPE used by SCCM 2012 SP1 CU3 is based on Windows 8. Once I injected the Windows 8 driver, the network came up without issue! So it appears these drivers are very OS specific.
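I added the driver through the SCCM console, but if you ever want to sanity-check a boot image outside the console, an offline DISM pass does the same thing. A rough sketch only, with placeholder paths for a local copy of boot.wim and a folder containing the extracted Windows 8 NIC driver:
REM Mount the boot image (index 1), inject the Windows 8 driver, verify, then commit
dism /Mount-Wim /WimFile:C:\Temp\boot.wim /Index:1 /MountDir:C:\Mount
dism /Image:C:\Mount /Add-Driver /Driver:C:\Drivers\Win8-NIC /Recurse
dism /Image:C:\Mount /Get-Drivers
dism /Unmount-Wim /MountDir:C:\Mount /Commit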
I have always used Windows 7 drivers in WinPE for all my deployments here and they always seemed to work, but to my surprise, neither the Windows 7 nor even the Windows 8.1 drivers worked this time; it had to be the exact Windows 8 driver.
Hope this saves people some frustration!
Hello. My name is Dmitri Bobko and I work in IT out of Calgary, AB, Canada, with certifications in VMware, Cisco, EMC, and Microsoft, among others, and I run into many different challenges every day. I have always thought that a lot of the challenges I deal with are unique, as I can't always find the right answer online, so I have long wanted to create a blog to contribute to the IT community. I hope the information you find here will be useful in solving your problem!
Wednesday, 16 October 2013
Workstation Migration Assistant - Run USMT as Non-Admin user
Hi guys,
Recently I was tasked with creating another task sequence to capture and rebuild a Windows 7 workstation using USMT. Having previously built our XP to Windows 7 migration task sequence, I knew what to do; however, it seemed cumbersome to go through the same process for a simple system rebuild scenario rather than a complete OS migration.
In the past I had created a little batch script that let me run the USMT executables manually, outside of SCCM, but unfortunately I forgot to back it up when we moved from SCCM 2007 to SCCM 2012. The idea is to have an IT person go to the user's computer, back up their profile, and then restore it, in the easiest way possible.
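For anyone wanting to recreate that kind of batch wrapper, the manual USMT calls boil down to a scanstate run and a loadstate run against a network store. This is only a sketch under some assumptions: the USMT binaries are in the working directory, the store path is a placeholder, and the stock MigUser.xml/MigApp.xml rules are used.
REM Back up only the currently logged-on user's profile to a network store
scanstate.exe \\server\USMTStore\%COMPUTERNAME% /i:MigUser.xml /i:MigApp.xml /o /c /v:13 /l:scan.log /ue:*\* /ui:%USERDOMAIN%\%USERNAME%
REM Restore the same store onto the rebuilt machine
loadstate.exe \\server\USMTStore\%COMPUTERNAME% /i:MigUser.xml /i:MigApp.xml /c /v:13 /l:load.log /lac /lae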
Not wanting to invest time in recreating a simple batch script with no user interface, I started looking around for a GUI-based USMT product. There are a few GUIs out there that basically put a pretty face on the USMT commands; however, those that looked good were not free, and those that were free, well, were not very good. I did stumble onto one GUI that looked very professional and minimalistic, yet had a bunch of great features built in that made it "smart": Workstation Migration Assistant by Dan Cunningham seemed to fit the bill.
I downloaded the program and configured the MigAssistant.exe.config file with the settings I wanted, including my designated USMT data store. This data store is where backups are saved and what restores are pulled from, all without having to specify anything within the app. This is great, as it makes it very easy for a desktop administrator to simply back up and restore without having to worry about where the files are placed, etc. Using the config file I was able to make this my default selection within the app:
I tried running the program as my slightly elevated standard account on my desktop and it ran perfectly. I was then able to restore the information on another machine and everything worked great. I thought I had found the perfect tool: I could run it as the user, back up the data, and be on my way. However, when I tried to do the backup as a standard user with no system rights, I was prompted for an admin account. That in itself isn't bad; however, the default configuration of "I am the only person who uses this Workstation" that I wanted to use no longer worked, because the program then tried to back up the profile of the admin account I had entered to let it run in the first place. To top it off, every time I opened the program, even as an administrative user, it would pop up a UAC prompt to allow the program to run.
Because I really liked the program and wanted it to be as simple as possible for our staff, I decided to do some digging to see why the UAC prompt was popping up and how to make the program run as a standard user.
During my research I found that when a program is compiled, the "requestedExecutionLevel" in its manifest is what controls the UAC prompt, i.e. whether you are asked for admin credentials to run the program or not. The source code for Workstation Migration Assistant is available on Dan's website, but I haven't touched any development tools in ages and would be lost trying to recompile it. Remembering a tool I used back in the day, I thought I would give Resource Hacker a shot to see if I could modify the MigAssistant.exe file from Dan's program with it. I loaded up Resource Hacker, opened the MigAssistant.exe file, and lo and behold, under a strange entry of 24 > 1 > 0 I found exactly what I was looking for:
<?xml version="1.0" encoding="utf-8"?>Right there, not only does it show the exact field I was hoping to change, it also shows just above the various possible execution levels that I can set! I changed the "requireAdministrator" field to "asInvoker", pressed the compile button, followed by a save, and ran the executable, and to my surprise, Migration Assistant opened without a single prompt for even the most basic user! Now, I knew the real test would come with being actually able to backup and do the restore, so I was only a third of the way there, at best.
<asmv1:assembly manifestVersion="1.0" xmlns="urn:schemas-microsoft-com:asm.v1" xmlns:asmv1="urn:schemas-microsoft-com:asm.v1" xmlns:asmv2="urn:schemas-microsoft-com:asm.v2" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<assemblyIdentity version="1.0.0.0" name="MyApplication.app" />
<trustInfo xmlns="urn:schemas-microsoft-com:asm.v2">
<security>
<requestedPrivileges xmlns="urn:schemas-microsoft-com:asm.v3">
<!-- UAC Manifest Options
If you want to change the Windows User Account Control level replace the
requestedExecutionLevel node with one of the following.
<requestedExecutionLevel level="asInvoker" />
<requestedExecutionLevel level="highestAvailable" />
If you want to utilize File and Registry Virtualization for backward
compatibility then delete the requestedExecutionLevel node.
-->
<requestedExecutionLevel level="requireAdministrator" />
</requestedPrivileges>
<applicationRequestMinimum>
<defaultAssemblyRequest permissionSetReference="Custom" />
<PermissionSet class="System.Security.PermissionSet" version="1" Unrestricted="true" ID="Custom" SameSite="site" />
</applicationRequestMinimum>
</security>
</trustInfo>
</asmv1:assembly>
Right there, not only does it show the exact field I was hoping to change, it also shows, just above it, the various possible execution levels I can set! I changed "requireAdministrator" to "asInvoker", pressed the compile button, followed by a save, and ran the executable. To my surprise, Migration Assistant opened without a single prompt, even for the most basic user! I knew the real test would be actually being able to do the backup and the restore, so I was only a third of the way there, at best.
As expected, the backup and restore procedures failed, with the logs pointing to permissions: scanstate.exe and loadstate.exe, which this program calls, require admin rights to actually capture and restore the profile. So I did what I thought would be crazy if it worked, and opened both scanstate.exe and loadstate.exe in Resource Hacker; under the same strange grouping of 24 > 1 > ## I found the execution level assignment for both of them:
<?xml version='1.0' encoding='utf-8' standalone='yes'?>
<assembly
xmlns="urn:schemas-microsoft-com:asm.v1"
manifestVersion="1.0"
>
<trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
<security>
<requestedPrivileges>
<requestedExecutionLevel
level="asInvoker"
uiAccess="false"
/>
</requestedPrivileges>
</security>
</trustInfo>
</assembly>
Running a program asInvoker means that it runs in the user's context, and if that user doesn't have the rights to do what is needed, the program will fail. So I decided to try "requireAdministrator" as the execution level within scanstate.exe and loadstate.exe, compiled and saved the executables. I did not expect it to work, but to my surprise, it worked even better than I expected!
With the user logged in, we can now open the program without any prompts and simply press "Start", at which point scanstate.exe is called and a prompt for an administrator password appears. We enter our administrator password and off it goes, backing up the original user who opened Migration Assistant in the first place, and not the administrator who allowed the action. Restore works exactly the same way, asking for the administrator password to continue. The great thing about this GUI is that it even has a drop-down box should you have multiple backups, and lets you select which one you would like to restore. The program automatically looks for all the backups performed by this user, using the _userID naming convention, and gives you the option to restore any of your backups from any other computer.
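As an aside, Resource Hacker isn't the only way to get at that 24 > 1 entry; 24 is the RT_MANIFEST resource type and 1 is the resource ID. If you have the Windows SDK handy, mt.exe can extract and re-embed the same manifest from the command line. A rough sketch only; the SDK path and file names are assumptions:
REM Dump the embedded manifest (RT_MANIFEST, ID 1) to a text file
"C:\Program Files (x86)\Windows Kits\8.0\bin\x86\mt.exe" -inputresource:scanstate.exe;#1 -out:scanstate.manifest
REM Edit requestedExecutionLevel in scanstate.manifest, then write it back into the exe
"C:\Program Files (x86)\Windows Kits\8.0\bin\x86\mt.exe" -manifest scanstate.manifest -outputresource:scanstate.exe;#1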
I have attached the full program below, with the modified MigAssistant.exe, scanstate.exe, and loadstate.exe files. Extract the bundle, make the appropriate changes within the MigAssistant.exe.config file, and it should be ready to use!
Hope you guys find this info, and especially the tool, as useful as I have. Big props to Dan for making this awesome GUI.
DOWNLOAD
Tuesday, 1 October 2013
Windows 7 Password Change Prompt as Scheduled Task
Hey everyone,
It has been a while since I have updated my blog, mainly because I was away in Vegas, followed by Nepal, and also because I haven't had to do anything special for a while, just mundane administrative tasks.
There was a request to see if we could come up with a solution to the Windows 7 password expiry prompt, which many of our users seem to miss, and even when they don't miss it, the fact that it almost never comes back makes people forget about it. Unlike previous versions of Windows, which repeatedly prompted you to change your password as it neared expiration, Windows 7 (and 8, from what I can tell) doesn't seem to do as good a job.
I looked online to see if there was something in Group Policy we could adjust, but it appears there isn't, and based on my research this is a common problem that many admins experience without a really good solution in place. It seemed that at the end of the day a solution needed to be crafted, which led me to this post by Mark-K on TechNet. Mark-K posted a script he created (I believe so, at least) which did the job: it runs against your user account and AD, checks how long is left before the password expires, and if it is within a specified period, prompts the user with instructions on how to change it.
Now, I am not a big fan of scripts and of doing things the OS should do itself, but after much looking around, this did seem like the best solution. I was set on using the script in its entirety when I realized that I really did not like the way it looked when run: the popup window also brought up a taskbar icon for Windows Script Host, which didn't really feel like a finished product to a perfectionist like myself.
This is what the script looked like initially (it normally pops up in the middle of the screen, but I moved it down to capture everything in one shot):
The above screenshot also shows the script running with a 90-day notification period set, so that I could see the script output during testing.
I decided that I wanted the script to run as an executable instead, so I converted it in its entirety to an exe using ScriptCryptor and added an icon, which, in my opinion, makes it look a lot more professional and complete.
It's a very small change, but the file is now an exe with a proper icon, versus a .vbs file with the default Windows Script Host icon.
To make the whole thing run, I created a new user GPO that creates a scheduled task which runs daily at a predefined time. If the user's password is not within the predefined expiration window, the user sees no prompts at all, as the program runs in the background; but as soon as the program evaluates the password expiry condition as true, the prompt pops up, and it will continue to do so on the interval defined in the scheduled task until the user changes their password. I placed the executable in the scripts directory on SYSVOL so that it is accessible by everyone on the domain.
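I deployed the task through the user GPO, but for testing you can create the same daily task by hand with schtasks. A minimal sketch; the domain name, share path, and start time are placeholders:
schtasks /Create /TN "Password Expiry Warning" /TR "\\yourdomain.local\SYSVOL\yourdomain.local\scripts\PasswordWarning.exe" /SC DAILY /ST 09:00 /F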
I have precompiled a bunch of versions of the executable with different warning periods preset, if anyone is interested (download links below); otherwise you can take the script below, adjust it as you see fit, and recompile it with ScriptCryptor (not freeware) to get the same result. In addition, if you want the popup box to always stay on top, change this portion of the script: "Change a password' option.", 0, "Password Expiration Warning" to this: "Change a password' option.", 262144, "Password Expiration Warning" (262144 is &H40000, the flag that makes the message box topmost).
'==========================================
' Check for password expiring notification
'==========================================
' First, get the domain policy.
'==========================================
Dim oDomain
Dim oUser
Dim maxPwdAge
Dim numDays
Dim warningDays

warningDays = 90

Set LoginInfo = CreateObject("ADSystemInfo")
Set objUser = GetObject("LDAP://" & LoginInfo.UserName & "")
strDomainDN = UCase(LoginInfo.DomainDNSName)
strUserDN = LoginInfo.UserName

'========================================
' Check if password is non-expiring.
'========================================
Const ADS_UF_DONT_EXPIRE_PASSWD = &h10000
intUserAccountControl = objUser.Get("userAccountControl")

If intUserAccountControl And ADS_UF_DONT_EXPIRE_PASSWD Then
    'WScript.Echo "The password does not expire."
Else
    Set oDomain = GetObject("LDAP://" & strDomainDN)
    Set maxPwdAge = oDomain.Get("maxPwdAge")

    '========================================
    ' Calculate the number of days that are
    ' held in this value.
    '========================================
    numDays = CCur((maxPwdAge.HighPart * 2 ^ 32) + _
              maxPwdAge.LowPart) / CCur(-864000000000)
    'WScript.Echo "Maximum Password Age: " & numDays

    '========================================
    ' Determine the last time that the user
    ' changed his or her password.
    '========================================
    Set oUser = GetObject("LDAP://" & strUserDN)

    '========================================
    ' Add the number of days to the last time
    ' the password was set.
    '========================================
    whenPasswordExpires = DateAdd("d", numDays, oUser.PasswordLastChanged)
    fromDate = Date
    daysLeft = DateDiff("d", fromDate, whenPasswordExpires)
    'WScript.Echo "Password Last Changed: " & oUser.PasswordLastChanged

    If (daysLeft < warningDays) And (daysLeft > -1) Then
        MsgBox "Your password will expire in " & daysLeft & " day(s)" & " at " & whenPasswordExpires & Chr(13) & Chr(13) & "Press CTRL + ALT + DEL and select the 'Change a password' option.", 0, "Password Expiration Warning"
    End If
End If

'========================================
' Clean up.
'========================================
Set oUser = Nothing
Set maxPwdAge = Nothing
Set oDomain = Nothing
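If you want to test the script before compiling it to an exe, you can simply run it from a command prompt; the file name here is just a placeholder:
REM Run with the console host so any errors show in the prompt; the MsgBox dialog still appears
cscript //nologo PasswordExpiryWarning.vbs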
Download links (select the one that matches the number of days you want before warning the user):
1 2 3 4 5 6 7 14 21 30 90
Thursday, 23 May 2013
SCCM 2012 with SCUP 2011 - Move your database and make it shared
So, as you can see, a lot of SCCM 2012 posts, mainly because I set up SCCM 2012 from beginning to end at my company and experienced plenty of challenges along the way. It is all now running like a well-oiled machine, with Endpoint Protection 2012, MBAM 2.0 integration, Citrix Publisher integration, WSUS, and SCUP 2011 with Shavlik updates; pretty much the whole nine yards.
One of the requirements around here was to get SCUP 2011 working so we could publish Adobe updates at the very least. Since there are only a few live catalogue providers (Dell, HP, and Adobe, last I checked), this wasn't really a requirement per se, but in an effort to make the transition from SCCM 2007 to 2012 as seamless as possible, I took on the challenge of getting it working.
As some of you may know, SCUP 2011 is a single-user application by default. You set things up, everything is great, it works for you, but if anyone else wants to use it, they have to set everything up as well. This includes adding all the update sources, the WSUS and SCCM integration components, etc. I didn't know this myself until I had a co-worker test out SCUP, and to my surprise it was completely empty, with nothing configured at all. I figured there had to be a way to make this work, so I searched Google and found a few very interesting posts which got me 90% of the way there, but the other 10% took me a few more days to figure out.
As posted here, the database file that SCUP 2011 uses is actually stored under the user's profile, and it can be moved to a different, shared location. In my case, the database file was located at:
C:\Users\EBLEND\AppData\Local\Microsoft\System Center Updates Publisher 2011\5.00.1727.0000\scupdb.sdf
I copied the file to the install directory where I installed SCUP 2011, as it is accessible by all administrators, and modified the Scup2011.exe.config file in the same directory to reflect the new location, as described in the post I mentioned earlier:
<applicationSettings>
<Scup.Properties.Settings>
<setting name="SSCEDataFile" serializeAs="String">
<value>D:\Program Files (x86)\System Center Updates Publisher 2011\scupdb.sdf</value>
</setting>
</Scup.Properties.Settings>
</applicationSettings>
Excellent, I thought, this is now ready... but that was not the case. When a co-worker tried to open SCUP, he got the error "An error occurred. Please refer to log file for details". A quick Google search turned up this post, which explained that, for whatever reason, a user or group that is only a member of the local Administrators group does not get access to the database file, even though that group has permissions on the SCUP folder. I looked at the permissions on the SCUP folder: Administrators had full access, and my AD SCCM groups were part of the local Administrators group, but it still wouldn't work. The trick is to assign your AD SCCM administrator groups directly to the folder; after I did this, SCUP opened without issue... or so I thought... more on this in a minute.
So, as you can see, ConfigHelpDesk was granted full permissions on the SCUP install folder directly, even though it was already a member of the local Administrators group.
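If you prefer to grant that permission from the command line rather than the folder's Security tab, an icacls call along these lines does the same thing; the domain name is a placeholder, and the path assumes the install location shown in the config above:
icacls "D:\Program Files (x86)\System Center Updates Publisher 2011" /grant "YOURDOMAIN\ConfigHelpDesk:(OI)(CI)F" /T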
So all this work got me to about 90%: users were now able to log in (one at a time) and see everything configured in this shared database. Great, I thought my work was done. A few weeks later a co-worker was ready to publish something and the options were not there, so I was called in to take a look. I looked at the options, and sure enough, none of the certificates, WSUS integration, or SCCM integration was even configured! So it looks as though the database holds some of the information, but the configuration is stored somewhere else. I started looking around and stumbled upon this nicely named directory in my user profile:
C:\Users\EBLEND\AppData\Local\Microsoft\Scup2011.exe_StrongName_2wzdfznimh1kefuisr0pqsefwkw5k4tp\5.0.1727.0\user.config
I opened up this file and, sure enough, it seems that all of the configuration for the certificates, WSUS, and SCCM integration is stored here, and it looks something like this:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<userSettings>
<Scup.Properties.Settings>
<setting name="EnableWSUSPublish" serializeAs="String">
<value>True</value>
</setting>
<setting name="LastKnownCertificateIssuer" serializeAs="String">
<value>CN=certauthsa1, DC=auc, DC=ab, DC=ca</value>
</setting>
<setting name="LastKnownCertificateExpire" serializeAs="String">
<value>3/21/2015 10:01:07 AM</value>
</setting>
<setting name="IsTimestampEnabled" serializeAs="String">
<value>False</value>
</setting>
<setting name="SigningCertSectionEnabled" serializeAs="String">
<value>True</value>
</setting>
<setting name="EnableCMIntegration" serializeAs="String">
<value>True</value>
</setting>
<setting name="LocalSourcePublishingFolder" serializeAs="String">
<value>I:\AdobeUpdates</value>
</setting>
<setting name="UseCustomLocalSourceFolder" serializeAs="String">
<value>False</value>
</setting>
<setting name="Height" serializeAs="String">
<value>720</value>
</setting>
<setting name="Width" serializeAs="String">
<value>1000</value>
</setting>
<setting name="Left" serializeAs="String">
<value>412</value>
</setting>
<setting name="Top" serializeAs="String">
<value>368</value>
</setting>
<setting name="TrustedPublishers" serializeAs="Xml">
<value>
<ArrayOfString xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<string>MIIFkTCCBHmgAwIVyaXNpZ24uY29tL3Jw...........9/cD+1zUzqr</string>
</ArrayOfString>
</value>
</setting>
</Scup.Properties.Settings>
</userSettings>
</configuration>
I copied everything between <userSettings> and </userSettings> from the config file above and pasted it into the same Scup2011.exe.config file that we previously modified with the new database location, replacing the empty fields with the same labels. Once I did this, every admin user could publish, and all of the certificate, WSUS, and SCCM integration configuration was there, in a truly shared way. Keep in mind that you still have to run SCUP 2011 with "Run as Administrator", otherwise it will not publish.
In addition, for every update category that you publish, that category or vendor name will need to be checked off as a "product" in the SCCM 2012 console under Administration > Site Configuration > Sites > right-click > Configure Site Components > Software Update Point.
You will have to run the "Synchronize Software Updates" function twice from within the SCCM 2012 console: first to get the product to show up in the list above so it can be checked off for synchronization, and then again to actually get the updates to show up in your update list in SCCM.
Somewhere along the line I learned that we actually have an active Shavlik subscription, so I attached the SCUPUpdates catalogue from Shavlik in SCUP and can now easily publish all third-party updates directly into our SCCM 2012 server. Here is a picture of the third-party products that Shavlik can patch via SCUPUpdates with SCUP 2011 and SCCM 2012:
I didn't see any reference to this complete solution, so hopefully this is of use to someone.
Tuesday, 21 May 2013
SCCM 2012 - Endpoint Protection Antimalware Activity Report Subscription with Dynamic Dates
Over the weekend I decided to work on some additional subscriptions for reports that I wanted out of SCCM 2012, and although the "Antimalware Activity Report" looks much like the "Antimalware Overall Status and History" report I discussed in my previous blog post, it is somewhat different in its execution.
I opened the report in SQL Server Report Builder and noticed that it had the same @StartDate and @EndDate parameters, as well as the same DateRange dataset, as the "Antimalware Overall Status and History" report, so I figured this would be easy. I saved the report under an appended name (I usually add "- last 7 days") and tried to figure out how to make it work automatically. I deleted the @StartDate and @EndDate parameters and tried to save, which gave me an error that a subreport was using these parameters somewhere. It seems a report won't save while there are errors, and when there are errors it tells you specifically where they are, so this has become my new method of tinkering.
Subreports? Well, that was news to me. What are these? I looked over at the right-hand side of Report Builder and noticed these gray boxes:
Right-clicking on each individual box gives you an option for "Subreport Properties", so I guess I found the subreports that were referencing my date variables. Go into the Subreport Properties on each gray box, navigate to Parameters, and you should see something like this:
Click the little "fx" icon next to StartDate and replace the expression with this:
=DATEADD("d",-7,Today())
Repeat the same thing for EndDate and replace the expression with this:
=DATEADD("d",-0,Today())Repeat this for every Subreport. Once this is done, you will be able to delete the DateRange Dataset as well. Save your report and create your subscription. The subscription will automatically update the dates everytime it executes and you will get your dynamic report showing Antimalware activity for the last 7 days.
Why Microsoft didn't make this the default behaviour is anyone's guess. On the bright side, the MBAM 2.0 report doesn't require any dates and just shows the state of your BitLocker encryption at the time the report is run.
Sunday, 19 May 2013
SCCM 2012 - Endpoint Protection Reporting with Dynamic Date Range
Recently I set up our SCCM 2012 server with System Center Endpoint Protection 2012 (formerly Forefront Endpoint Protection, or FEP) and was asked to set up a report that would be mailed out weekly to show the overall status of our new antimalware solution, which we were slowly rolling out throughout the enterprise.
This is normally pretty easy to do: find the report you are interested in, create a subscription, and you are on your way. However, I hit a snag. I created the subscription, but noticed that the built-in report (Antimalware Overall Status and History) for SCEP 2012 wanted a static start and end date. I didn't think much of this, as I assumed that, this being a subscription meant to provide a report on a set interval, it would be dynamic, and those dates were just the initial start and end dates that would update every time the subscription ran. I soon found out that this is not the case.
The first week this was not evident, as the report was mailed out the day I set up the subscription; however, the following week the report still had the same charts as the week prior, without dynamically updating to show the last 7 days like I wanted it to.
I did a Google search and found a post on TechNet where someone was having the exact same issue, and to my relief, someone had posted what looked like a solution. I found the report in SCCM, right-clicked it and chose "Edit", which opened Report Builder. I went to the File menu, saved it as a new report, and made the suggested changes. I created another subscription and set it to run daily so that I could see the result quickly. To my disappointment, there was no change: the report was still using the static dates. I left the problem for a while to concentrate on other issues and didn't revisit it until last week. I looked at additional information online about the dates and tried everything I could find, but to no avail; the reports still came out with static dates. If you run the report manually it returns the proper 7-day window in the charts; however, whenever I created a subscription, it was as if those dynamic dates were set in stone and became static.
Last night, out of nowhere, it hit me! The TechNet article I followed mentioned that you have to set the date parameters as hidden. The parameters rely on a query and generate a dynamic date when the report is run manually; however, when a subscription is created, those dynamically generated dates are still parameters that the subscription uses, they are just hidden. So on the day the subscription is created, the then-correct dates are populated into the subscription and off you go, but those dates never change after the fact. The way around the issue, I thought, was to get rid of the date parameters altogether! If the date parameters don't exist in any fashion, the subscription can't use them (even hidden) and the report will have to generate the dates dynamically.
I opened Report Builder again, went to Parameters, and deleted the @StartDate and @EndDate parameters. I tried to save the report at this point, which gave me an error that the parameters were used in a query. Not being very proficient in SQL Server Report Builder, I went to the Datasets area and deleted the whole "DateRange" dataset. I tried to save again and got the same error. At that point I remembered there was a query set up under the EPHistory dataset field called "StatusTime". When you view the properties of the "StatusTime" field and click on Query, you will see something like this:
select * from fn_rbac_EndpointProtectionHealthStatus_History(@UserSIDs)
where CollectionID=@CollID and
DATEADD(day, 0, DATEDIFF(day, 0, statustime)) between @StartDate and @EndDate
As you can see, this is the query that references the @StartDate and @EndDate parameters and drives the charts. Since I had deleted the parameters, I had to find a way to make this query work without them. I dug into my previous attempts at getting all of this working and simply replaced the @ parameter values with actual date expressions, with this result:
select * from fn_rbac_EndpointProtectionHealthStatus_History(@UserSIDs)
where CollectionID=@CollID and
DATEADD(day, 0, DATEDIFF(day, 0, statustime)) between DATEADD("d",-7,GetDate()) and DATEADD("d",-0,GetDate())
I saved the report at this point without error, created the subscription to run the report at 5 minutes past midnight, and stayed up until 12:05 AM waiting for the e-mail. The e-mail arrived and the charts were correct!
This is a portion of the e-mail (using web archive format, as I found it preserves the layout best):
I spent a good chunk of time on this, so hopefully this information is useful to somebody! I now have to set up multiple subscriptions for various other things like MBAM 2.0, so hopefully this solution is more or less what I will have to do for any other report that requires static dates.
If I had more knowledge of Report Builder I am sure I could have figured it out right away, but I had never used it before, so it took me some time.
Hope my first blog post isn't a disappointment. Sorry for the wall of text!