Sep 26, 2014

atlanta systems management user group 10.03.14

I honestly cannot believe it’s already time for our user group meeting. It’s one week away. It’s kind of crazy how fast time goes by. It’s also a lot more effort to put these together than you would expect.

So for that, I am grateful to all of the folks that help keep this going, all of the sponsors that help keep us eating, our perpetual sponsors that give us lots of great giveaways and benefits, all of the speakers that bring great content, and all of the people, like you, that come share your knowledge.

At our last user group meeting, we took the opportunity to use the space on the MTC side of the Microsoft office. What we discovered was that the interaction was entirely different from the classroom spaces. It provided a better environment for interaction, which is ultimately what we’ve always strived for -- networking, meeting your peers in the industry, and sharing knowledge. That’s the benefit of tying into a user community: you grow your access to knowledge exponentially.

Shavlik is our sponsor this quarter. Their product addresses the big hole in patch management where Windows patching ends: third-party applications. Here’s a little blurb:

Today, it's not Windows that represents the most vulnerabilities, but instead it's the applications that run on Windows that expose businesses to holes in computing security. The National Vulnerability Database reports that 86% of reported vulnerabilities come from third party applications. With Shavlik Patch for Microsoft System Center, administrators have the ability to automate the patching of third party applications within the System Center Configuration Manager console, providing confidence that third party application vulnerabilities are covered.

Here’s the rest of the schedule:

[image: quarterly meeting schedule]

We have three seats left by last count. The drawback to using the MTC space is we lose some room so we’ll be running a tight ship trying to maximize all the seating available. If you haven’t registered yet, there’s still time! Get all the details here, including the registration link: http://www.atlsmug.org/events/atlanta-systems-management-user-group-100314

See you there!

Sep 25, 2014

powershell: limitation on retrieving members of a group

If you have large group memberships, you might have already run into a limitation with Get-ADGroupMember where the cmdlet will fail with this message:

get-adgroupmember : The size limit for this request was exceeded
At line:1 char:1

(Don’t believe me? Go ahead; try it. I’ll wait.)

It seems the limitation comes up when you query a group with more than 5000 members. The easiest way around this would be for Microsoft to add a switch that lets you set the size limit. That’s probably also the longest wait. :) Not to worry, there are ways to get around it.

 

Get-QADGroupMember. Remember this cmdlet? It’s part of the Quest AD cmdlets. Of course, Quest no longer exists after being gobbled up by Dell, so your mileage may vary. It does include a -SizeLimit switch so you can merrily bypass the limitation with it.
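If you still have the Quest snap-in installed, here’s a sketch of what that looks like (I believe a -SizeLimit of 0 lifts the cap, but verify against your version of the snap-in):

```powershell
# load the Quest AD cmdlets (snap-in name may vary by version)
Add-PSSnapin Quest.ActiveRoles.ADManagement

# -SizeLimit 0 should return the full membership, 5000-member default and all
Get-QADGroupMember myLargeGroup -SizeLimit 0
```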

Get-ADGroup. If you query the group for its member property and expand it, you can get around the size limit. Here’s how it’s done:

Get-ADGroup myLargeGroup -Properties member | Select-Object -ExpandProperty member

 

Get-ADObject. This generic object-retrieval cmdlet is also useful for getting around the limitation. It’s pretty much the same as above: just use Select-Object to expand the property.

Get-ADObject -LDAPFilter "(&(objectcategory=group)(name=myLargeGroup))" -Properties Member | Select-Object -ExpandProperty member

 

DirectorySearcher. I wouldn’t recommend this unless you actually need it or want a challenge. It’s more typing (or, realistically, more cutting and pasting) than you want to do, but it does seem much faster. I haven’t timed it, though.

([adsisearcher]"name=myLargeGroup").FindOne() | % { $_.GetDirectoryEntry() | Select-Object -ExpandProperty member }

 

Active Directory Web Service. The last thing you can do is modify the Microsoft.ActiveDirectory.WebServices.exe.config file on your domain controllers and change the MaxGroupOrMemberEntries value. This will affect the behavior of other cmdlets as well. I haven’t tried this myself since the other workarounds are fine for me, so make sure you read Jason Yoder’s post on this and TEST it out: http://mctexpert.blogspot.com/2013/07/how-to-exceed-maximum-number-of-allowed.html
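For reference, here’s roughly what the change looks like in the config file’s appSettings section -- the value below is just an example, and the default cap matches the 5000-member limit above:

```xml
<appSettings>
  <!-- default is 5000; raising it affects anything that talks to ADWS -->
  <add key="MaxGroupOrMemberEntries" value="20000" />
</appSettings>
```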

Sep 24, 2014

powershell: reset user password

things to remember when resetting account passwords.

prompted (prompts for the old and new passwords)

Set-ADAccountPassword userid

unprompted (yeah, i don’t know why i’d choose this one, honestly.)

Set-ADAccountPassword userid -OldPassword (ConvertTo-SecureString -AsPlainText "myoldpassword" -Force) -NewPassword (ConvertTo-SecureString -AsPlainText "mynewpassword" -Force)

administrative reset (don’t know the old one, setting it for someone else)

Set-ADAccountPassword userid -Reset -NewPassword (ConvertTo-SecureString -AsPlainText "mynewpassword" -Force)
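if you’d rather not put the new password on the command line at all (it lands in your console history otherwise), here’s a small variation on the administrative reset:

```powershell
# prompt for the new password as a secure string; nothing ends up in history
$newPassword = Read-Host -AsSecureString -Prompt "New password"
Set-ADAccountPassword userid -Reset -NewPassword $newPassword
```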

Sep 15, 2014

powershell: converting int64 datetime to something legible

i find that i’m constantly converting AD datetime fields from something that looks like 130552642641560221 to something that looks like 9/15/2014 10:17:44 AM. i don’t know which you prefer, but to me, the second output is the one that most people won’t complain about when i give it to them.

over on stackoverflow.com i found this post that wraps it up pretty nicely. so, let’s say you want to look at the lastlogontimestamp attribute of a user named marcus. here’s a typical command that would show you the output:

get-aduser marcus -properties lastlogontimestamp | select lastlogontimestamp

bam. you get the int64 value. personally, i get lost counting nanoseconds* after i exhaust what i can count on both hands. if you’re like me, you can convert this handily to a readable datetime format like this:

get-aduser marcus -properties lastlogontimestamp | select @{ n='llts'; e={[datetime]::fromfiletime($_.lastlogontimestamp)} }

we’re just creating an expression in the hash table to format lastlogontimestamp to the way we want to read it -- like humans. now, the quest powershell modules will do this automatically. of course, no one uses those anymore, right? :)

 

* and if you were curious, this is the definition of the int64 format -- contains a 64-bit value representing the number of 100-nanosecond intervals since january 1, 1601 (utc).
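the static methods go both directions, by the way. a quick sketch using the raw value from earlier:

```powershell
$llts = 130552642641560221            # raw int64, e.g. from lastlogontimestamp
[datetime]::FromFileTime($llts)       # local time -- the human-friendly version
[datetime]::FromFileTimeUtc($llts)    # same instant in utc
(Get-Date).ToFileTime()               # and back to int64, if you ever need it
```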

Sep 2, 2014

enabling deduplication on unnamed volumes (and other stuff)

it dawned on me the other day that while i had enabled deduplication on my office computers, i never did enable it at home. back when ssd was very expensive, i had managed to get a very small drive (64 gb). well, it proved to be too small to be useful.

i ended up replacing the optical drive with a secondary hdd. it runs out of the optical chassis, so it spins slower. it did its job though -- which was to provide more space for things i don’t access often. cool. i ran into a couple of things while toying around.

in case you didn’t know, windows 8.1 will support deduplication. you just have to get the binaries onto the os. once you install them and enable the features, you need to get into powershell to turn stuff on.
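for what it’s worth, the usual (and very unsupported) approach uses the deduplication packages lifted from server 2012 r2 and dism to install them. the file names and paths below are illustrative, not gospel:

```shell
rem add the deduplication binaries (cab names vary by build)
dism /online /add-package /packagepath:C:\dedup\Microsoft-Windows-Dedup-Package.cab

rem then enable the feature itself
dism /online /enable-feature /featurename:Dedup-Core /all
```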

so, here’s a primer on getting all the deduplication commands (either of these works):

gcm *dedup*
gcm -Module Deduplication

CommandType     Name                            ModuleName  
-----------     ----                            ----------  
Function        Disable-DedupVolume             Deduplication
Function        Enable-DedupVolume              Deduplication
Function        Expand-DedupFile                Deduplication
Function        Get-DedupJob                    Deduplication
Function        Get-DedupMetadata               Deduplication
Function        Get-DedupSchedule               Deduplication
Function        Get-DedupStatus                 Deduplication
Function        Get-DedupVolume                 Deduplication
Function        Measure-DedupFileMetadata       Deduplication
Function        New-DedupSchedule               Deduplication
Function        Remove-DedupSchedule            Deduplication
Function        Set-DedupSchedule               Deduplication
Function        Set-DedupVolume                 Deduplication
Function        Start-DedupJob                  Deduplication
Function        Stop-DedupJob                   Deduplication
Function        Update-DedupStatus              Deduplication

 

first problem i ran into happened when i went to enable the c: drive and received the following error:

enable-dedupvolume -Volume c:
enable-dedupvolume : MSFT_DedupVolume.Volume='c:' - HRESULT 0x8056530b, The specified volume type is not supported. Deduplication is supported on fixed, write-enabled NTFS data volumes and CSV backed by NTFS data volumes.

unfortunately searching for the error code did not yield any results. however, if we look at the error message, it speaks about the volume type. according to technet, this is what is supported:

  • Must not be a system or boot volume. Deduplication is not supported on operating system volumes.
  • Can be partitioned as a master boot record (MBR) or a GUID Partition Table (GPT), and must be formatted using the NTFS file system.
  • Can reside on shared storage, such as storage that uses a Fibre Channel or an SAS array, or when an iSCSI SAN and Windows Failover Clustering is fully supported.
  • Do not rely on Cluster Shared Volumes (CSVs). You can access data if a deduplication-enabled volume is converted to a CSV, but you cannot continue to process files for deduplication.
  • Do not rely on the Microsoft Resilient File System (ReFS).
  • Can’t be larger than 64 TB in size.
  • Must be exposed to the operating system as non-removable drives. Remotely-mapped drives are not supported.

the requirements fell apart on the first bullet for me. oh well, i still have the secondary hdd i can optimize. ran into a small snag, realizing that i had created a mount point so the secondary hdd isn’t an actual volume i can specify by drive letter.

not too big of a deal as long as i know the path where it’s mounted such as:

enable-dedupvolume -Volume c:\data

 

if the directory is unknown, you could also use the objectid, which you can get from get-volume. the following command would attempt to enable deduplication on all available volumes. obviously, this is not something you want to try on your desktop:

get-volume | % { enable-dedupvolume -volume $_.ObjectId }
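once it’s on, a quick way to see whether it’s actually buying you anything (assumes at least one optimization job has run; you can also kick one off by hand):

```powershell
# what has deduplication reclaimed so far?
Get-DedupStatus | Select-Object Volume, SavedSpace, OptimizedFilesCount

# start an optimization job now instead of waiting on the schedule
Start-DedupJob -Volume c:\data -Type Optimization
```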