O R G A N I C / F E R T I L I Z E R: 2014

Dec 5, 2014

PowerShell: Retrieve site location of computer object

This is a nice find that I am cataloging from Shay Levy.

You can get the site location of a computer if you run this PS script on the computer itself.
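For my own notes, the tip boils down to a single .NET call (reproduced here from memory -- see Shay's original post for the canonical version):

```powershell
# Returns the name of the AD site the local computer belongs to.
# Only works on a domain-joined machine; throws otherwise.
[System.DirectoryServices.ActiveDirectory.ActiveDirectorySite]::GetComputerSite().Name
```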



It’s effectively the same as using NLTEST.

nltest /dsgetsite


I had intended to use it with Compliance Settings, but the limitations of compliance rules made it impractical. It’s still going to be useful for other stuff. If you want remote options, read more at the original post: #PSTip Get the AD site name of a computer.

Oct 27, 2014

Preparing for the End of Windows Server 2003

It’s a little embarrassing, or maybe I should say lucky, that somehow I never had the need to review the changes to the dynamic port range assignments. I say it that way because the range isn’t something that changed recently. By recent, let’s call it 2012. No, in fact, it goes back to 2008, when Microsoft changed the dynamic port range to comply with IANA recommendations, effectively moving the range:

Old default (Windows Server 2003/XP and earlier): TCP and UDP ports 1025-5000
New default (Windows Vista/Server 2008 and later): TCP and UDP ports 49152-65535

The trouble you’ll find with this kind of change usually won’t present itself until you try to restrict the range somehow. This issue was noticed when domain controllers were upgraded to 2012. The version before that was 2003. :-|

The issues witnessed appeared all over the place, compounded with confusion since they weren’t well captured or documented during troubleshooting. Here’s what was seen, along with the corresponding error messages:

  • Failure to connect to a share
    • Windows cannot access <share>
    • The trust relationship between this workstation and the primary domain failed
  • Failure to test secure channel
    • Access Denied
  • Failure to join a domain
    • The join operation was not successful. This could be because an existing computer account having name <myComputer> was previously created using a different set of credentials. Use a different computer name, or contact your administrator to remove any stale conflicting account. The error was: Access is denied.

Notably, netlogon.log would also show errors suggesting problems during the domain join such as:

10/03/2014 02:00:52:695 SamOpenUser on 564842 failed with 0xc0000022
10/03/2014 02:00:52:695 NetpManageMachineAccountWithSid: status of attempting to set password on <myDomainController> for <myComputer>: 0x5
10/03/2014 02:00:52:695 NetpJoinDomain: status of creating account: 0x5
10/03/2014 02:00:52:695 NetpJoinDomain: initiating a rollback due to earlier errors



Sometimes the quickest way to resolution is what some people assume to be the hardest. It’s important to get trace packets from both hosts at the same time. After that, the other trick is to read it. :o)

In this dump, you’ll see where EPM (endpoint mapper) negotiates to port 50445. After that, all hell breaks loose. Just kidding. In reality, you simply won’t see any of those packets reaching the destination port since the environment was configured to respect the old dynamic port range. (Never mind the IPs. I’m protecting the innocent.)

4846    4:12:03 AM 10/3/2014    62.5715595      svchost.exe   <myDomainController>  EPM     EPM:Request: ept_map: NDR, DRSR(DRSR) {E3514235-4B06-11D1-AB04-00C04FC2DCD2} v4.0, RPC v5, (0x87) [DCE endpoint resolution(135)]    {MSRPC:857, TCP:856, IPv4:130}
4847    4:12:03 AM 10/3/2014    62.5721299      svchost.exe     <myDomainController>   EPM     EPM:Response: ept_map: NDR, DRSR(DRSR) {E3514235-4B06-11D1-AB04-00C04FC2DCD2} v4.0, RPC v5, (0xC50D) [50445]  {MSRPC:857, TCP:856, IPv4:130}
4848    4:12:03 AM 10/3/2014    62.5944302      svchost.exe   <myDomainController>  TCP     TCP:Flags=......S., SrcPort=65207, DstPort=50445, PayloadLen=0, Seq=3334272165, Ack=0, Win=8192 ( Negotiating scale factor 0x8 ) = 8192 {TCP:858, IPv4:130}
5010    4:12:06 AM 10/3/2014    65.5937718      svchost.exe   <myDomainController>  TCP     TCP:[SynReTransmit #4848]Flags=......S., SrcPort=65207, DstPort=50445, PayloadLen=0, Seq=3334272165, Ack=0, Win=8192 ( Negotiating scale factor 0x8 ) = 8192    {TCP:858, IPv4:130}
5380    4:12:12 AM 10/3/2014    71.5937519      svchost.exe   <myDomainController>  TCP     TCP:[SynReTransmit #4848]Flags=......S., SrcPort=65207, DstPort=50445, PayloadLen=0, Seq=3334272165, Ack=0, Win=8192 ( Negotiating scale factor 0x8 ) = 8192    {TCP:858, IPv4:130}

A quick, client-side port query would confirm that in fact, it cannot do anything over that port.
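If you’re on Windows 8/2012 or newer, PowerShell can do that port query natively. The host name and port below are placeholders mirroring the trace above:

```powershell
# TcpTestSucceeded comes back False when the negotiated
# dynamic port is blocked between the two hosts.
Test-NetConnection -ComputerName myDomainController -Port 50445
```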



In short, prepare for your transition away from 2003. I know many of you (myself included) still have things running on 2003. This is one you should look for and remediate wherever possible. Here’s a link to the article describing the default dynamic port range changes:

The default dynamic port range for TCP/IP has changed in Windows Vista and in Windows Server 2008
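If you want to verify what range a host is actually using (or set it back to the new defaults), netsh will do it. The values below are the post-2008 defaults:

```powershell
# Display the current dynamic port range for TCP
netsh int ipv4 show dynamicport tcp

# Set it to the IANA-recommended range (requires elevation)
netsh int ipv4 set dynamicport tcp start=49152 num=16384
```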

By the way, did you know you can run a packet trace from netsh? Oh yes, you can. Here’s a link to my blog post: Using netsh to Capture Packets

Oct 14, 2014

boosting the powershell ise with ise steroids

Ever since the PowerShell ISE was released, I have slowly moved away from some of the other tools I was pretty fond of, like PowerShellPlus and PrimalScript. It’s mostly because the ISE is super convenient.

Along came ISE Steroids. I can’t really speak to 1.0 since I started on 2.0 -- and only very recently, actually. So far, I’m pretty impressed. The best part of using it is that it doesn’t compromise that convenience at all. Installing it is as simple as unzipping the files to your module path ($env:PSModulePath -split ';'). After that, you launch it with Start-Steroids. That gives me the convenience of using the plain ol’ ISE or switching into a hyper-capable ISE.
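In case it helps, here’s the whole install-and-launch flow as I understand it (folder locations will vary by machine):

```powershell
# Show the candidate module folders; unzip ISESteroids into one of them.
$env:PSModulePath -split ';'

# Then, from inside the regular ISE:
Start-Steroids
```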

I’ve only begun scratching the surface of its capabilities though here are some things I’ve been using so far:


Help. I love this feature. Anything I click on in a script, the help add-on will attempt to look up and present relevant information.


Variables. This is another feature I love. Having a variables window makes debugging so much easier.



Is there someone on your team who codes in a manner that only their mother could love? If so, you might benefit from the Refactor process. It’s basically a series of scripts that will comb the hair and wash behind the ears of your PowerShell script. It’s not perfect, but it performs admirably. It’s also configurable if you need to tune things down from the defaults. Here’s an example:


foreach ($item in $smsobjects){
#write-host $item.name
$machinesfromSMS = $machinesfromSMS + $item.name}

foreach ($item in $sms2012objects){
#write-host $item.name
$machinesfromSMS = $machinesfromSMS + $item.name}


foreach ($item in $smsobjects)
{
    #write-host $item.name
    $machinesfromsms = $machinesfromsms + $item.name
}

foreach ($item in $sms2012objects)
{
    #write-host $item.name
    $machinesfromsms = $machinesfromsms + $item.name
}

Which would you rather read and interpret?


I would love to see the context-sensitive help add-on retrieve things from the console, or at least offer a search box to look up information manually. For now, I keep an empty script open where I type in a command just to make the help pane show its information.


ISE Steroids isn’t a new shell, a giant development environment, or anything that fancy. It’s a lot of little things that tune up the default PowerShell ISE into a highly functional shell and scripting environment. It’s extensible with other add-ons and supports launching applications from the ISE. (ILSpy and WinMerge come loaded.)

It’s my new favorite. I’m hooked. If you like the PowerShell ISE environment, you should check it out. There are many more features I haven’t brought up (signing, version control, wizards, etc.)

Oct 8, 2014

using netsh to capture packets

Outages. Aside from the massive pressure of having to restore service, they can be pretty useful for learning new things. One recent discovery that was news to me is that you can use netsh to capture network traces. WHAT?! Yeah.1

It appears that on modern-ish operating systems (Windows 7/Windows Server 2008 R2 and above) you no longer need to install your favorite packet tracing application to capture packets. Who doesn’t like to cuddle up with a nice packet trace, eh? Obviously, if you’re on a desktop OS, you should just load your packet capturing utility of choice (and it had better be Network Monitor if you intend to open the .ETL trace) -- unless you like to read it some other way, in which case your skillz are simply amazing and you’re wasting your time here!



The most basic way to start and stop a trace is by performing the following commands. As you can see, netsh displays the trace configuration as well -- though it’s not the full set of defaults.

netsh trace start capture=yes

Trace configuration:
Status:             Running
Trace File:         C:\Users\me\AppData\Local\Temp\NetTraces\NetTrace.etl
Append:             Off
Circular:           On
Max Size:           250 MB
Report:             Off

netsh trace stop


Pulling the help file (trace start help) will provide the list of defaults if you run the command as indicated above. I have them illustrated here for reference.

capture=no (specifies whether packet capture is enabled in addition to trace events)
report=no (specifies whether a complementing report will be generated along with the trace file)
persistent=no (specifies whether the tracing session continues across reboots, and is on until netsh trace stop is issued)
maxSize=250 MB (specifies the maximum trace file size, 0=no maximum)
overwrite=yes (specifies whether an existing trace output file will be overwritten)
correlation=yes (specifies whether related events will be correlated and grouped together)
traceFile=%LOCALAPPDATA%\Temp\NetTraces\NetTrace.etl (specifies location of the output file)

I have highlighted the defaults which do not show up in the trace file output.
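If you want to override some of those defaults -- say, a known output location, a bigger file size, and a capture filter for a single host -- it might look something like this (filter syntax from memory; netsh trace show capturefilterhelp lists the options):

```powershell
# Hypothetical example: capture only traffic to/from 10.1.2.3
netsh trace start capture=yes tracefile=C:\temp\trace.etl maxsize=512 IPv4.Address=10.1.2.3

# ... reproduce the issue ...

netsh trace stop
```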



Message Analyzer. You can open the file in Message Analyzer quite simply. (It’s also a brilliant tool.) By simply, I mean, just open the file once you have it running. This is my favorite way.


Network Monitor. You can expect the same results in Network Monitor but will need to do one additional step. Even without doing it, you’ll be able to read the frame details. The frame summary won’t make much sense though.

  1. Go to Tools | Options | Parser Profiles.
  2. Choose Windows and select Set As Active.

Wireshark. You are pretty much on your own. There are ways2 to convert .ETL to .CAP or whatever if you really need to stick that much to your guns.


I’m not really sure why I used this photo. Maybe it’s because it looks like a tangled mess of grapevines when in reality, it’s stacked. It’s a matter of perspective, I suppose. Right, LouisG?

1 Probably as strange as just realizing I wrote this blog post with capitalization.





2 http://blogs.technet.com/b/yongrhee/archive/2013/08/16/so-you-want-to-use-wireshark-to-read-the-netsh-trace-output-etl.aspx

Oct 1, 2014

Microsoft Most Valuable Professional (MVP) 2015

Hello everyone. I received the news today that my MVP award has been renewed. I feel privileged to receive such a distinguished honor in company with some of the brightest minds in technology. Congratulations to all of my fellow MVPs who were also renewed today.

It is with great pride we announce that Marcus Oh has been awarded as a Microsoft® Most Valuable Professional (MVP) for 10/1/2014 - 10/1/2015. The Microsoft MVP Award is an annual award that recognizes exceptional technology community leaders worldwide who actively share their high quality, real world expertise with users and Microsoft. All of us at Microsoft recognize and appreciate Marcus’s extraordinary contributions and want to take this opportunity to share our appreciation with you.

With fewer than 4,000 awardees worldwide, Microsoft MVPs represent a highly select group of experts. MVPs share a deep commitment to community and a willingness to help others. They represent the diversity of today’s technical communities. MVPs are present in over 90 countries, spanning more than 30 languages, and over 70 Microsoft technologies. MVPs share a passion for technology, a willingness to help others, and a commitment to community. These are the qualities that make MVPs exceptional community leaders. MVPs’ efforts enhance people’s lives and contribute to our industry’s success in many ways. By sharing their knowledge and experiences, and providing objective feedback, they help people solve problems and discover new capabilities every day. MVPs are technology’s best and brightest, and we are honored to welcome Marcus as one of them.

To recognize the contributions they make, MVPs from around the world have the opportunity to meet Microsoft executives, network with peers, and position themselves as technical community leaders. This is accomplished through speaking engagements, one on one customer event participation and technical content development. MVPs also receive early access to technology through a variety of programs offered by Microsoft, which keeps them on the cutting edge of the software and hardware industry.

As a recipient of this year’s Microsoft MVP award, Marcus joins an exceptional group of individuals from around the world who have demonstrated a willingness to reach out, share their technical expertise with others and help individuals maximize their use of technology.

Rich Kaplan
Corporate Vice President
Customer and Partner Advocacy
Microsoft Corporation


If interested, this is my MVP profile: http://mvp.microsoft.com/en-us/mvp/Marcus%20C.%20Oh-10604

Sep 26, 2014

atlanta systems management user group 10.03.14

I cannot honestly believe it’s already time for our user group meeting. It’s one week from now. It’s kind of crazy how fast time goes by. It’s also a lot more effort to put these together than you would expect.

So for that, I am grateful to all of the folks that help keep this going, all of the sponsors that help keep us eating, our perpetual sponsors that give us lots of great giveaways and benefits, all of the speakers that bring great content, and all of the people, like you, that come share your knowledge.

At our last user group meeting, we took the opportunity to use the space on the MTC side of the Microsoft office. What we discovered was that the interaction was entirely different from the classroom spaces. It provided a better environment for interaction, which is ultimately what we’ve always strived for -- networking, meeting your peers in the industry, and sharing knowledge. That’s the benefit of tying into a user community. You grow your access to knowledge exponentially.

Shavlik is our sponsor this quarter. Their product addresses the big hole of patch management where Windows patching ends -- third party. Here’s a little blurb:

Today, it's not Windows that represents the most vulnerabilities, but instead it's the applications that run on Windows that expose businesses to holes in computing security. The National Vulnerability Database reports that 86% of reported vulnerabilities come from third party applications. With Shavlik Patch for Microsoft System Center, administrators have the ability to automate the patching of third party applications within the System Center Configuration Manager console, providing confidence that third party application vulnerabilities are covered.

Here’s the rest of the schedule:


We have three seats left by last count. The drawback to using the MTC space is we lose some room so we’ll be running a tight ship trying to maximize all the seating available. If you haven’t registered yet, there’s still time! Get all the details here, including the registration link: http://www.atlsmug.org/events/atlanta-systems-management-user-group-100314

See you there!

Sep 25, 2014

powershell: limitation on retrieving members of a group

If you have large group memberships, you might have already run into a limitation with Get-ADGroupMember where the cmdlet will fail with this message:

get-adgroupmember : The size limit for this request was exceeded
At line:1 char:1

(Don’t believe me? Go ahead; try it. I’ll wait.)

It seems the limitation comes up when you query a group with more than 5,000 members. The easiest way around it would be for Microsoft to add a switch that lets you set the size limit. That’s probably also the longest wait. :) Not to worry, there are ways to get around it.


Get-QADGroupMember. Remember this cmdlet? It’s a part of the Quest AD cmdlets. Of course, Quest no longer exists after being gobbled up by Dell so your mileage may vary. It does include a –SizeLimit switch so you can merrily bypass the limitations with it.

Get-ADGroup. If you query the group for its member property and expand it, you can get around the size limit. Here’s how it’s done:

Get-ADGroup myLargeGroup -Properties member | Select-Object -ExpandProperty member


Get-ADObject. This AD cmdlet to retrieve objects generically is useful also to get around this limitation. It’s pretty much the same as above. Just use Select-Object to expand the property.

Get-ADObject -LDAPFilter "(&(objectcategory=group)(name=myLargeGroup))" -Properties Member | Select-Object -ExpandProperty member


DirectorySearcher. I wouldn’t recommend doing this unless you actually feel like you need to or want a challenge. It’ll be more typing (or realistically, more cutting and pasting) than you want to do. It does seem much faster. Haven’t timed it though.

([adsisearcher]"name=myLargeGroup").FindOne() | % { $_.GetDirectoryEntry() | Select-Object -ExpandProperty member }


Active Directory Web Service. The last thing you can do is modify the webservice.exe.config file and change the MaxGroupOrMemberEntries value. This will affect the behavior of other cmdlets as well. I haven’t tried this myself since the other workarounds are fine for me so make sure you read Jason Yoder’s post on this and TEST it out: http://mctexpert.blogspot.com/2013/07/how-to-exceed-maximum-number-of-allowed.html

Sep 24, 2014

powershell: reset user password

UPDATE: screwed up the last one. corrected. :o)


things to remember when resetting account passwords.

prompted (displays old, new password dialog)

Set-ADAccountPassword userid

unprompted (yeah, i don’t know why i’d choose this one, honestly.)

Set-ADAccountPassword userid -OldPassword (ConvertTo-SecureString -AsPlainText "myoldpassword" -Force) -NewPassword (ConvertTo-SecureString -AsPlainText "mynewpassword" -Force)

administrative reset (don’t know the old one, setting it for someone else)

Set-ADAccountPassword userid -Reset -NewPassword (ConvertTo-SecureString -AsPlainText "mynewpassword" -Force)
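if you’d rather not put the password on the command line at all, Read-Host can prompt for it as a SecureString (userid is a placeholder, as above):

```powershell
# prompts interactively; nothing lands in the console history
$new = Read-Host -Prompt "New password" -AsSecureString
Set-ADAccountPassword userid -Reset -NewPassword $new
```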

Sep 15, 2014

powershell: converting int64 datetime to something legible

i find that i’m constantly converting AD datetime fields from something that looks like 130552642641560221 to something that looks like 9/15/2014 10:17:44 AM. i don’t know which you prefer, but to me, the second output is the one that most people won’t complain about when i give it to them.

over on stackoverflow.com i found this post that wraps it up pretty nicely. so, let’s say you want to look at the lastlogontimestamp attribute of a user named marcus. here’s a typical command that would show you the output:

get-aduser marcus -properties lastlogontimestamp | select lastlogontimestamp

bam. you get the int64 value. personally, i get lost counting nanoseconds* after i exhaust what i can count on both hands. if you’re like me, you can convert this handily to a readable datetime format like this:

get-aduser marcus -properties lastlogontimestamp | select @{ n='llts'; e={[datetime]::fromfiletime($_.lastlogontimestamp)} }

we’re just creating an expression in the hash table to format lastlogontimestamp to the way we want to read it -- like humans. now, the quest powershell modules will do this automatically. of course, no one uses those anymore, right? :)
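for quick one-offs, the same conversion works without touching AD at all, and it round-trips:

```powershell
# int64 filetime -> local datetime
[datetime]::FromFileTime(130552642641560221)

# datetime -> int64 filetime (handy for building LDAP filters)
(Get-Date).ToFileTime()
```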


* and if you were curious, this is the definition of the int64 format -- contains a 64-bit value representing the number of 100-nanosecond intervals since january 1, 1601 (utc).

Sep 2, 2014

enabling deduplication on unnamed volumes (and other stuff)

it dawned on me the other day that while i had enabled deduplication on my office computers, i never did enable it at home. back when ssd was very expensive, i had managed to get a very small drive (64gb). well, it proved to be too small to be useful.

i ended up replacing the optical drive with a secondary hdd. it runs out of the optical chassis, so it spins slower. it did its job though -- which was to provide more space for things not accessed often. cool. i ran into a couple of things while toying around.

in case you didn’t know you could, windows 8.1 will support deduplication. you just have to get the binaries on to the os. once you install it and enable the features, you need to get into powershell to turn stuff on.

so, here’s a primer on getting all the deduplication commands:

gcm *dedup* or gcm -module deduplication (both work)

CommandType     Name                            ModuleName  
-----------     ----                            ----------  
Function        Disable-DedupVolume             Deduplication
Function        Enable-DedupVolume              Deduplication
Function        Expand-DedupFile                Deduplication
Function        Get-DedupJob                    Deduplication
Function        Get-DedupMetadata               Deduplication
Function        Get-DedupSchedule               Deduplication
Function        Get-DedupStatus                 Deduplication
Function        Get-DedupVolume                 Deduplication
Function        Measure-DedupFileMetadata       Deduplication
Function        New-DedupSchedule               Deduplication
Function        Remove-DedupSchedule            Deduplication
Function        Set-DedupSchedule               Deduplication
Function        Set-DedupVolume                 Deduplication
Function        Start-DedupJob                  Deduplication
Function        Stop-DedupJob                   Deduplication
Function        Update-DedupStatus              Deduplication


first problem i ran into happened when i went to enable the c: drive and received the following error:

enable-dedupvolume -Volume c:
enable-dedupvolume : MSFT_DedupVolume.Volume='c:' - HRESULT 0x8056530b, The specified volume type is not supported. Deduplication is supported on fixed, write-enabled NTFS data volumes and CSV backed by NTFS data volumes.

unfortunately searching for the error code did not yield any results. however, if we look at the error message, it speaks about the volume type. according to technet, this is what is supported:

  • Must not be a system or boot volume. Deduplication is not supported on operating system volumes.
  • Can be partitioned as a master boot record (MBR) or a GUID Partition Table (GPT), and must be formatted using the NTFS file system.
  • Can reside on shared storage, such as storage that uses a Fibre Channel or an SAS array, or when an iSCSI SAN and Windows Failover Clustering is fully supported.
  • Do not rely on Cluster Shared Volumes (CSVs). You can access data if a deduplication-enabled volume is converted to a CSV, but you cannot continue to process files for deduplication.
  • Do not rely on the Microsoft Resilient File System (ReFS).
  • Can’t be larger than 64 TB in size.
  • Must be exposed to the operating system as non-removable drives. Remotely-mapped drives are not supported.

the requirements fell apart on the first bullet for me. oh well, i still have the secondary hdd i can optimize. ran into a small snag, realizing that i had created a mount point so the secondary hdd isn’t an actual volume i can specify by drive letter.

not too big of a deal as long as i know the path where it’s mounted such as:

enable-dedupvolume -Volume c:\data


if the directory is unknown, you could also use the objectid, which you can get from get-volume. the following command would attempt to enable deduplication on all available volumes. obviously, this is not something you want to try on your desktop:

get-volume | % { enable-dedupvolume -volume $_.ObjectId }
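once a volume is enabled, you can kick off a job yourself rather than wait for the schedule, and then check the savings (c:\data matches the mount point above):

```powershell
# run an optimization pass now
Start-DedupJob -Volume C:\data -Type Optimization

# see how much space you got back
Get-DedupStatus | Select-Object Volume, SavedSpace, OptimizedFilesCount
```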

Aug 18, 2014

dns resolver behavior

i had an occasion to look up windows client behavior when it comes to dns. specifically, i wanted to know how the client behaves when the primary name server is offline. before i had to fire up a packet trace and check for myself, i stumbled on a couple of useful articles that spell it out.

UPDATE: had a conversation with a talented linux dns guy and discovered a few more useful things to note.

dns client resolver behavior
dns client resolution timeouts
dns forwarders and conditional forwarders resolution timeouts

in summary, it works as follows:

  • dns query sent to preferred
  • if no response within 1 second, dns query sent to alternate
  • if no response within 1 second, dns query sent to preferred again
  • if no response within 2 seconds, dns query sent to preferred and alternate
  • if no response within 4 seconds, dns query sent to preferred and alternate again
  • if no response within 7 seconds, process times out


something to note for linux systems, these appear to be default values:

  • timeout:n
    sets the amount of time the resolver will wait for a response from a remote name server before retrying the query via a different name server.  Measured in seconds, the default is RES_TIMEOUT (currently 5, see <resolv.h>).  The value for this option is silently capped to 30.
  • attempts:n
    sets the number of times the resolver will send a query to its name servers before giving up and returning an error to the calling application.  The default is RES_DFLRETRY (currently 2, see <resolv.h>).  The value for this option is silently capped to 5.
  • rotate
    sets RES_ROTATE in _res.options, which causes round-robin selection of name servers from among those listed.  This has the effect of spreading the query load among all listed servers, rather than having all clients try the first listed server first every time.
  • search
    Resolver queries having fewer than ndots dots (default is 1) in them will be attempted using each component of the search path in turn until a match is found.
  • ndots:n
    sets a threshold for the number of dots which must appear in a name given to res_query(3) (see resolver(3)) before an initial absolute query will be made.  The default for n is 1, meaning that if there are any dots in a name, the name will be tried first as an absolute name before any search list elements are appended to it.  The value for this option is silently capped to 15.

in summary, the timeout value indicates how long the client will wait until it tries the next server in the search list. the number of queries attempted per server is dependent on how ndots is configured. once the server list is exhausted, the attempts value indicates if the client should try the list again.
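putting those options together, a purely illustrative /etc/resolv.conf tuned for faster failover might look like this (addresses are made up):

```
# /etc/resolv.conf -- example values only
options timeout:2 attempts:2 rotate
nameserver 10.0.0.53
nameserver 10.0.1.53
```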

additional settings can be found here: http://man7.org/linux/man-pages/man5/resolver.5.html

Aug 12, 2014

troubleshooting wmi…

this exhaustive series on troubleshooting wmi from the ask the performance team blog is too good to pass up. use of wmi is pervasive, guaranteeing that just about all of us have run into wmi issues at some point or another. if you haven’t yet, it’s only a matter of time. might as well do your homework.

here are the topics the series will be covering:

  • WMI: Common Symptoms and Errors
  • WMI: Repository Corruption, or Not?
  • WMI: Missing or Failing WMI Providers or Invalid WMI Class
  • WMI: High Memory Usage by WMI Service or Wmiprvse.exe
  • WMI: How to troubleshoot High CPU Usage by WMI Components
  • WMI: How to Troubleshoot WMI High Handle Count


i’ve blogged a few times about wmi myself:

Jul 24, 2014

misc: flying with cortana

if you’re a windows phone 8.1 user, you’re probably in love with cortana already. she is a fantastic organizer! despite all that, sometimes she fails to understand your flight itinerary, especially on multi-leg flights. she might capture just one leg of the flight. so how do you fix it?

i looked for a way to do this but wasn’t able to find any well-documented procedures, so here’s my shot at it.

  • have cortana search for the flight information. in my test, i’m using aa1947 as an example.
  • click the Show AA 1947 updates link. this will add it to your itinerary.


if the date isn’t right, don’t worry. you can change it.

  • switch over to the interests section.
  • under travel, you should be able to find your flight information. click on it.


  • under the Flight date section, simply choose which date you’re interested in.

now cortana will track that flight for you.

Jul 23, 2014

misc: cool things about onenote

onenote has been my constant companion for many years now. between onenote and outlook, i can’t think of very many things that can’t be effectively managed, tasked, or tracked -- at least from a day-to-day perspective. i found some pretty cool things about onenote recently that i thought i’d share: subpages and onetastic.



for those of you that don’t know, i’m an avid pool player. naturally, since i use onenote, i’m a pool player that likes to keep a lot of notes about billiards, as you can see in the screenshot.

the first thing i want to point out is onenote allows the use of subpages. i went for far too long without knowing that. if you look at #1, you can see how onenote looks when you collapse subpages. #2 is the expanded view.

once you collect your pages as subpages, it makes managing them easier since you can work with them in bulk (move, copy, delete, cut, etc.)

to create a subpage, right-click the page tab and choose make subpage. the shortcut trick to make or promote a subpage is to drag the page tab left or right with your mouse as shown below.

Snagit screen capture



i stumbled on this gem while i was looking for a tool to apply styles like you can in pretty much every other office product. enter onetastic. onetastic for microsoft onenote is an amazing collection of tools that integrate directly with onenote. since most of the tools are macros, it’s as extensible as you want to make it. so far, i’ve found that their collection of macros is more than enough for me.

if you’re a heavy onenote user, it’s definitely worth checking out. i found all kinds of other cool macros like sorting pages, creating a table of contents, search & highlight, etc.




onenote 2013 keyboard shortcuts

Jun 4, 2014

excel: my first use of power query (and i love it)

let’s face it. if you’re a techie and you don’t use excel, you are not peeking out your geeking out. :o) i use excel for a number of different things. it’s a really powerful program which can handle doing much more than figuring out how much i’ve spent on lunch over the last three months.

at teched, i got my first taste of power query during some of the hands-on-labs (available online for free now.) something came up recently that gave me a chance to explore it a bit more to see its value. let’s explore a scenario where your organization is absorbed or is absorbing another organization. after a domain migration, human resources decides they want to start over with new employee IDs.



HR provides you with a file claiming it has all the information you need. (and clearly, you’ve no reason to doubt their claim.) upon examining the file, you notice that the only things in it are a column with the old employee ID and a column with the new employee ID.

drawing from your history, you know that employeeID is not an indexed attribute in active directory, and that it would take quite a long time to query each user by their old employeeID and write their new one. you’re also short on time to get any changes to the AD schema pushed through.

knowing you’ll need it, you create your own export of user ID and employee ID from active directory. game on.



the way i would have approached this in the past (and did to be honest until my colleague encouraged me to use power query) is to create two sheets in excel. one sheet would contain the information from HR, and the other would contain the information from AD. from the HR sheet, you could run a vlookup formula in a new column, pulling the user IDs that match from AD.



what i’m about to demonstrate is magic, my friends. magic known as PFM… the purest kind available. i’m going to take one file that contains just the old id and the new id and merge it with another that contains the old id and the user id, ending up with a new dataset that contains the old id, the new id, and the user id. there’s also a little wrinkle here. the new id value they want starts with a leading zero. excel loves converting those to numbers, which removes the leading zero. no problem.

here’s a sample of how the files look:


  1. open excel (DUH) and start with a blank worksheet.
  2. if you haven’t downloaded power query, you will need to do that first.
  3. switch to the power query tab.
  4. select from file > from csv.
  5. open the first .csv file.

everything looks good except that leading 0 problem. hell, even the first row was automatically promoted to headers. the leading 0 is a non-issue since it can be changed as an applied step!

  1. change the query name to emplIDs.
  2. click the new id column.
  3. right-click the new id column and choose change type > text.
  4. make sure the old id column stays as number. if not, change the type to number.
  5. under load settings, choose load to data model only.
  6. click apply & close.


you can’t see it in the worksheet, but have faith in magic that it’s all there (or peek at it by hovering your mouse over the query.)


now that we have a dataset to combine with, let’s work on getting the merged results from both files into a worksheet.

  1. on the power query tab, choose from file > from csv.
  2. open the second file.
  3. name the query userIDs.
  4. choose merge queries from the ribbon bar.
  5. drop down the query selection and choose emplIDs.
  6. select the old id column in both tables and click OK.

a new column now shows up in the query with a peculiar little symbol and values that state Table. we need to indicate what we want to pull back from the emplIDs data model.

  1. click the symbol of confusion.
  2. select the column new id and click OK.
  3. rename the new column to new id.
  4. click apply & close.

the result should be a beautifully merged set of data!
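for the curious, the same merge can be sketched outside of excel. here’s a minimal python version over sample data (the file contents and column headers are made up for illustration), with the new id kept as a string so the leading zero survives:

```python
import csv
import io

# sample data standing in for the two .csv files (values hypothetical)
hr_csv = "old id,new id\n1001,0123\n1002,0456\n"    # note the leading zeros
ad_csv = "old id,user id\n1001,jdoe\n1002,asmith\n"

# index the HR file by old id; new id stays a string so "0123" survives
new_ids = {row["old id"]: row["new id"]
           for row in csv.DictReader(io.StringIO(hr_csv))}

# merge on old id -- the same join the power query merge step performs
merged = [{"old id": row["old id"],
           "new id": new_ids[row["old id"]],
           "user id": row["user id"]}
          for row in csv.DictReader(io.StringIO(ad_csv))
          if row["old id"] in new_ids]

print(merged[0])  # {'old id': '1001', 'new id': '0123', 'user id': 'jdoe'}
```

same idea as the vlookup approach, just without the two-sheet shuffle.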


May 30, 2014

misc: power savings problem with snagit 12

I have been a fan of snagit for a very long time now. when I saw snagit 12 was released, I had to get my hands on it! as an mvp, one of the many benefits you get is nfr (not for resale) licenses for a lot of different software from a lot of vendors.

I won’t pretend I immediately drew a correlation to the problem I started having after installing snagit. it wasn’t evident at first. my monitors go into low power mode after 10 minutes of inactivity, and I noticed after coming back to my desk several times that it wasn’t happening anymore.

I checked all my power settings to make sure nothing changed. everything looked fine. I recalled at some point that powercfg was a utility I had seen and played with some while back that could be useful in narrowing down where the issue might be.



the first thing I did (other than figuring out how to use the tool) was run an energy report.

powercfg /energy /output "energy.html"


without the /duration switch, the default collection period is 60 seconds. this seemed more than plenty to catch what I needed to see. looking through my report (energy.html) I found these lines:

System Availability Requests:System Required Request
The program has made a request to prevent the system from automatically entering sleep.
Requesting Process \Device\HarddiskVolume3\Program Files (x86)\TechSmith\Snagit 12\Snagit32.exe
System Availability Requests:Display Required Request
The program has made a request to prevent the display from automatically entering a low-power mode.
Requesting Process \Device\HarddiskVolume3\Program Files (x86)\TechSmith\Snagit 12\Snagit32.exe
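the report is html, so scanning it properly would call for an html parser, but the interesting lines are easy enough to pull out of the text. a quick python sketch over sample lines from the report:

```python
# a few lines lifted from the energy report shown above
report = """System Availability Requests:Display Required Request
The program has made a request to prevent the display from automatically entering a low-power mode.
Requesting Process \\Device\\HarddiskVolume3\\Program Files (x86)\\TechSmith\\Snagit 12\\Snagit32.exe"""

# pull out every process that registered an availability request
offenders = [line.split("Requesting Process", 1)[1].strip()
             for line in report.splitlines()
             if line.startswith("Requesting Process")]
print(offenders[0])
```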


I was able to verify what was displayed here by looking at the current requests:

powercfg /requests


as you can see, the snagit32.exe process is clearly registered as a process in two places.

[PROCESS] \Device\HarddiskVolume3\Program Files (x86)\TechSmith\Snagit 12\Snagit32.exe
[PROCESS] \Device\HarddiskVolume3\Program Files (x86)\TechSmith\Snagit 12\Snagit32.exe


I ran a very quick test to make sure I could reproduce the effect (as I had intended to open a bug.) sure enough, when I closed the snagit program and ran powercfg /requests, it no longer appeared. when snagit starts, it still doesn’t register anything; the request appears only after you initiate a screen capture.

it would appear proper that it should request not to go into a low power mode while the capture is occurring. I think where it’s failing is removing the request after it completes the capture. bug filed.



powercfg has another useful switch: requestsoverride. it’s not one of the friendlier switches since it doesn’t provide any positive feedback if you do something right. (it ain’t your mama.) I like to have snagit running all the time. you never know when I’ll need to capture a picture of a cat and create a quick meme. it happens.


since, in my scenario, the snagit32.exe process is registered in two places (display and system), I ran the requestsoverride switch like this:

powercfg /requestsoverride process snagit32.exe display system


and when I hit enter, nothing. validation was only found by running the switch with no parameters:

snagit32.exe DISPLAY SYSTEM

now my monitors switch into low power mode just fine with snagit running. hurray for cats!



I’m sure there must be a hidden switch in powercfg to clear the overrides, but I wasn’t able to find it. I took a guess that the overrides were written to the registry somewhere and thus fired up procmon. tracing powercfg.exe to find it was cake.



jumping over to this section in the registry confirmed that this is where the overrides are:


so… if you need to clear them, just delete the values of interest.



techsmith is a great company to work with. I used their support forum to file a bug indicating that I could easily repro it. the same day, techsmith responded with their acknowledgment. :]

Hi Marcus,

This is a bug we have logged and are hoping to get this fixed for an upcoming release we're working on. I'm really sorry for the trouble.

Please let me know if there is anything else I can do to help.

Kind Regards,
Senior Support Specialist

May 20, 2014

atlanta techstravaganza 06.06.2014

did you save the date? well, it’s not too late!

what is atlanta techstravaganza you ask? it’s a yearly group meeting where atlanta systems management user group, the atlanta powershell user group, and the atlanta windows infrastructure and virtualization user group come together for a gigantic event.

we have three tracks running concurrently, covering topics on system center, powershell, and windows server. we also have a BYOD hands-on lab. along with great content, networking opportunities, and free food, we always end the event with some great giveaways.

we’ve moved locations this year from the microsoft alpharetta campus to the georgia tech research institute. while we love and appreciate what microsoft does for us, their campus size was unfortunately limited to 100 people. at GTRI, we have doubled the capacity!

having twice the space doesn’t mean you should wait. seats will go fast, and as in previous years, we are likely to completely sell out. come get educated with a full belly and meet some of your atlanta peers! look forward to seeing you at the event.

registration link is available here: http://www.atltechstravaganza.com/

May 19, 2014

managing local admin passwords

one of the missing features that gives some windows administrators (and ALL security administrators) heartburn on the windows desktop platform is the lack of built-in controls to manage local passwords. group policy preferences was one of the ways you could get around this problem, but as you probably already know, it was quite insecure, and that hole was recently closed by a security update. okay, so where does that leave us?

recently, tom ausburne wrote this bang up article which goes into quite a few things, like the insecurity of group policy preferences, the jiri method, and pass the hash. it’s definitely worth the read and provides all the steps necessary to set up the jiri method in your environment.

so what’s this jiri method? it basically changes the local admin password to something random and stores the value in AD. the disclaimer is that the password is stored unencrypted in clear text. tom’s article goes a bit into protecting the attribute (a concept called confidential bit.)
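the generate-something-random half of the idea is simple enough to sketch. here’s a minimal python version (the AD write and the confidential bit protection are the parts tom’s article covers, and are omitted here):

```python
import secrets
import string

# character pool for the generated local admin password (choice of pool is mine)
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def random_password(length: int = 20) -> str:
    """build a cryptographically random password of the given length."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(len(random_password()))  # 20
```

the important bit is using a cryptographic source of randomness rather than a plain PRNG, since every machine gets its own password.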


helpful links:

how to automate changing the local administrator password
pass-the-hash (PtH) whitepaper
group policy preferences elevation vulnerability
the jiri solution
confidential attributes (or bits as i came to know it)

Mar 14, 2014

03.28.2014 atlanta systems management user group

it’s been a while. you guys ready to meet up again?

bluestripe, if you recall from system center universe 2014, offers a really compelling way to use opsmgr 2012. it’s magic! they also happen to be sponsoring our event! pretty exciting. topics will include security compliance manager, configuration manager, operations manager, and azure.

got all the details over on www.atlsmug.org, including our schedule, speakers, and registration link. the link is a little bare right now but will be filled out soon. please do register so we know how much food to get.

looking forward to seeing you there, fellow geeks. click HERE for the event post.

compact headers in outlook 2013 sp1

i nearly missed this new feature, confusing it with how the usual ribbon bar folds and unfolds. have you made the switch yet? if not, allow me to illustrate. here’s the old, massive, gigantic header in outlook 2013.



and now, after sp1…



to each their own, i guess… but i really do prefer this new compact view. flip it on or off using the button indicated above. more information about it is available on the office blog.

Feb 13, 2014

configuration manager support center

this tool kind of came out of the blue. it’s pretty cool though! the next time you’re on the phone w/ premier, don’t be surprised if you’re asked to use it for log gathering. in order to get the tool, you will have to join the configmgr open beta community. here’s the link for the file: https://connect.microsoft.com/ConfigurationManagervnext/Downloads/DownloadDetails.aspx?DownloadID=52192

after the installation, you’ll have two programs you can use, the support center utility and the support center viewer. since the viewer is really just designed to open up archive bundles, we’ll skip that. support center has some very cool things you can do (much of it you can find in client center, though) and works locally and remotely.


looking across the ribbon, you will find the following areas:

  • data collection: where you go to do exactly that. you can pick which data elements you want to collect. it grabs a lot of good stuff – logs, policies, certs, configuration data, registry, wmi info, dumps, etc. don’t overlook it.
  • client details: you can find a few, scant client details here – version, site code, mp, etc.
  • client policy: good stuff. load up the client policy (illustrated above), request and evaluate, or listen.
  • content: all of the local content (programs and applications) on the client.
  • troubleshooting: this section appears to go through the logs and validate that your client is healthy. that’s kind of nice!
  • logs: this is basically like trace – except with a few neat features like using a quick filter to find information or (my favorite) opening sets of logs (below).


i recommend you join the beta, spin it up, and give it a try. :)

Jan 14, 2014

system center universe 2014

if you are in the system center space, you know about SCU from the past successful events… and most likely know about the 2014 date. it all happens 1/30/2014. for those of you in the atlanta area, ATLSMUG is hosting a viewing party if you’d like to come watch it with us. get all the details at http://systemcenteruniverse.com. while you’re there, check out the oppressive theme music for sc-uminator.

Atlanta, GA
Microsoft Office
1125 Sanctuary Pkwy
Alpharetta, GA 30009

don’t worry. if you can’t make it in person, you’ll be able to watch it online. if you come out though, we’ll feed you breakfast, lunch, and snacks.

here’s the agenda (CENTRAL TIME) of what you can expect to see:

7:00am - 8:00am

8:00am - 8:10am
Welcome - Your Emcee, Cameron Fuller, MVP

8:10am - 8:55am
Become the Hero of the Day - Master ConfigMgr 2012 R2 with a Limited Budget and Free Community Tools

  • Presented by Kent Agerlund
  • No doubt that Configuration Manager 2012 R2 is a very powerful product that can be used “right out of the box”. However, with the help from the massive list of free community tools, you can take the standard features to a whole new level of automation. In this session, Kent Agerlund will demonstrate how to solve daily challenges like managing clients, software updates, operating system deployment, application management and much more using nothing but Configuration Manager and free tools - Expect a session full of demos.

9:00am - 9:45am
SCSM SSP WAP SMA PS - How a Bunch of Letters Can Help Deliver Solutions Faster, with Fewer Issues, Save Money, Impress the Boss, and Get You a Promotion

  • Presented by Travis Wright
  • In this presentation, Travis Wright will get into the details of the new Service Management Automation (SMA) platform built on Power Shell (PS) that is part of the new Windows Azure Pack (WAP) and show how to integrate with SCSM (System Center Service Manager) using the new SMA Connector and Self-Service Portal (SSP). You'll leave the session with a clear understanding of the importance of PS and WAP/SMA in Microsoft's systems management strategy and how that ties into SCSM for process control and automation. Advanced concepts of SMA will also be covered such as how to create custom modules and programmatically interact with SMA.

9:50am - 10:10am
Applications in Operations Manager - Dynamic Maps and Status with BlueStripe

  • Presented By Nick Burling, VP Product Development, BlueStripe
  • IT organizations have made System Center the foundation of their infrastructure management strategy. The challenge is how to manage the performance of applications and business services that run on that infrastructure.
    Microsoft partnered with BlueStripe Software to deliver complete, multi-tier application management in System Center. Nick Burling, VP of Product Management at BlueStripe, will provide an introduction to the System Center + BlueStripe solution for delivering dynamic application mapping, monitoring, and service-level alerting in System Center. Attendees will learn how System Center + BlueStripe FactFinder delivers:
    • Dynamic application mapping, monitoring, and service-level alerting in Operations Manager
    • Application management across platforms (Windows and non-Windows systems) and architectures (physical, virtual, cloud, and hybrid cloud)
    • Application context for Orchestrator and Service Manager

    With System Center + BlueStripe, IT Operations teams can tie System Center's infrastructure management capabilities to applications, and greatly expand their ability to deliver business services.

10:10am - 10:25am
Morning Break

10:25am - 11:10am
Identity Management for Hybrid IT with Windows Azure and Windows Server 2012 R2

  • Presented by Maarten Goet
  • In the 13+ years since the original Active Directory product launched with Windows 2000, it has grown to become the default identity management and access-control solution for over 95% of organizations around the world. But, as organizations move to the cloud, their identity and access control also need to move to the cloud. As companies rely more and more on SaaS-based applications and the range of cloud-connected devices being used to access corporate assets continue to grow, along with the increased use of more hosted and public cloud capacity, companies must expand their identity solutions to the cloud. With this in mind, Microsoft set out to build a solution. Join seven-year Microsoft MVP Maarten Goet in this hands-on session featuring Windows Azure and Windows Server 2012 R2 and get your identity management for Hybrid IT fast-track started today!

11:15am - 12:15pm
Managing Modern Mobile Devices with System Center 2012 R2 Configuration Manager

  • Presented by Wally Mead
  • Are you struggling with allowing your end users to use their own devices to access company resources while still providing the desired level of control and security? Microsoft's System Center 2012 R2 Configuration Manager, when integrated with Windows Intune, can provide great capabilities for managing mobile devices in addition to its support for your on premise Windows, Mac, and Linux/Unix server support. This session will cover the mobile device support to help you, the IT admin, with providing support for BYOD scenarios.

12:20pm - 12:40pm
Extending Microsoft System Center to Monitor and Manage Virtual Environments

  • Presented By Chris Henley, Veeam
  • Veeam® Management Pack™ extends deep VMware monitoring, management and capacity planning to Microsoft System Center, providing complete visibility of both physical and virtual infrastructures and applications - all from one console. In this session you will learn how to get state-of-the-art monitoring and management of your virtual environments and how to extend that capability into the realm of virtual data protection.

12:40pm - 1:25pm
Break for Lunch

1:25pm - 1:40pm
Service Manager Solution Showcase

  • Presented By Gord Watts, Provance
  • Microsoft System Center provides a rich platform for extensibility and customization. This session gives you a quick "speed dating" style introduction to the inspired third-party extensions that add powerful functionality and enhanced capabilities to Service Manager. From mobility and web access to IT asset management and more, this showcase of partner offerings shows you how you can get even more value from Microsoft System Center 2012. Don't miss this presentation highlighting the most exciting options to supercharge your Service Manager deployment!

1:45pm - 2:30pm
Master Class: Orchestrating Daily Tasks Like a Pro

  • Presented By Pete Zerger and Anders Bengtsson
  • In this class, through live demonstration, you will learn the secrets to more effective process automation with System Center 2012 R2 in the context of two common enterprise use case scenarios. Professors Bengtsson and Zerger will share methods and strategies to answer some of the toughest challenges to enterprise automation, including:
    • How to maximize runbook reusability
    • Making your process automation more reliable
    • Providing hooks for 3rd party ITSM integration
    • Taking steps in Orchestrator today to prepare for Service Management Automation (SMA) in the future
    If you want to raise the level of your game...please arrive to class on time!

2:35pm - 3:20pm
Monitor Your Interstellar Cloud

  • Presented by Dieter Wijckmans
  • What is that strange Interstellar cloud floating through space holding all your servers, services, data, etc.? Make this not a huge unknown in your universe but send out your probes to get the data back to your mother ship and start monitoring it. Use the force of this massive cloud to even monitor your servers at the mother ship. The possibilities are out there... Just grab them, combine the forces and become a true master of your universe.

3:25pm - 3:45pm
Managing Third Party Updates with Microsoft's System Center Configuration Manager

  • Presented By Meaghan McKeown, Secunia
  • Attend this session and learn tips and tricks on how to solve the daily challenges around patching your environment with Microsoft and non-Microsoft updates. We will outline best practices and demonstrate how to effectively patch 3rd party applications in System Center Configuration Manager.

3:45pm - 4:00pm
Afternoon Break

4:00pm - 4:45pm
Building the Perfect Windows 8 Image

  • Presented by Johan Arwidmark
  • In this session you learn how to create the perfect, real world, production ready, master image of Windows 8. Learn how Sysprep and the Unattend.xml really work in Windows 8. Discover how to automate builds of thin, hybrid and thick images, along with what to add and what not to add to an image. MDT 2012 Update 1 and ADK are used as a foundation for creating the master images. Expect lots of live demos in this session.

4:50pm - 5:35pm
Custom Data Collection and Reporting with ConfigMgr

  • Presented by Jason Sandys
  • This session will cover the multiple different vectors for importing, collecting, or using custom data within ConfigMgr. This mostly demo session will showcase actual techniques used at companies to fulfill their non-default data gathering and reporting requirements.

5:45pm - 6:30pm

  • Our panel of distinguished speakers will answer questions from the live and virtual audience members

Jan 3, 2014

sccm: ccmmigratepolicysettingsinit returns code 1603 during pull dp installation

as i sat down to write this, i realized a fellow mvp, alex zhuravlev, wrote about a very similar issue which is corrected by the same fix. it’s posted here if you’d like to read it: http://social.technet.microsoft.com/wiki/contents/articles/4709.configmgr-2012-beta2-site-component-manager-failed-to-install-this-component-because-the-microsoft-installer-file-for-this-component-mp-msi-could-not-install.aspx

recently, we started deploying pull DPs out to our regional locations. one of my engineers indicated that we were having issues where the DPs appeared to be in a waiting for content state. we narrowed the issue down to an error that popped up in the pulldp_install.log file. the end of the installation exits in 1603, which i’m sure you know is a very vague error.

going backward in the log file, we spotted the entry where the originating 1603 was logged. here are the log lines:

   Getting settings from WMI and storing in <Program Files>\SMS_CCM\polmig.mof
   CustomAction CcmMigratePolicySettingsInit returned actual error code 1603 (note this may not be 100% accurate if translation happened inside sandbox)

at this point, it was pretty clear the problem was happening while trying to retrieve the information from WMI and write it out to polmig.mof. apparently our method for installing the ConfigMgr 2007 client was SO GOOD that it installed the client on the recently deployed DP. (we’re in the middle of a migration.) that was the issue. :|

the client and distribution point role were both removed from the affected server. the 2012 client was loaded, and the server was added back as a pull dp. after that, the installation went through successfully.