Recently I had a discussion with a friend of mine about checking AD passwords against the “P0wned password list” released by Troy Hunt.

So I went and downloaded the file from here.

That turned out to be an 8.8 GB zip file, which unpacked to almost 19 GB of raw text. (The file consists of two columns of data: the NTHash and the number of occurrences across password leaks.)

I knew that searching through 19 GB of unsorted data would be very slow, but I decided to test out different approaches. Keep in mind that I wanted to integrate this with my previously released AD Password checker tool here, and that the test data I have in AD consists of about 500 users with the last 20 passwords stored in hash history, meaning that I would have to do about 10,000 lookups into the file.

My first attempt was simply to use Select-String and look for an NTHash value; even on a very fast NVMe drive this took more than 2 minutes per query, meaning the whole run would take more than 20,000 minutes (about 14 days).

I then tried grep and sift, which yielded results in the same 2-minute range as Select-String.

Since I already had all the AD data in a SQLite database, I thought I would try to load this data into the DB as well. So I loaded the data and tried to query the DB for a specific hash; this was a little faster than the plain file, but still took about 1.4 minutes. Then I considered adding an index to the NTHash column, but since that table has almost 520 million rows and takes up 99% of the space used, an index would roughly double the size of the DB (to around 40 GB), which I felt would make it less portable.

I really wanted to keep everything in a form that could be run without “installing” anything, keeping it portable, so loading the database into MySQL was not an option either. I started reading the SQLite documentation and realized that it supports clustered indexes, meaning that I could omit the default index and have SQLite use the NTHash data itself as the index. This means that the import time into the DB goes up (it takes about 3½ to 4 hours), because the data has to be sorted in order for it to act as an index. Since this data is very static, the cost of using NTHash as an index is very low. With the data indexed, the query time has gone down to about 10 ms, which is very acceptable.

Import Data

First off we need to create the table and tell SQLite not to use its “regular” index, by telling it not to add a RowID and instead use the primary key as a “clustered index”.
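A minimal sketch of such a table definition, assuming a table named p0wned (matching the import commands below); the Occurrences column name is my own:

```sql
-- Declaring NTHash as PRIMARY KEY on a WITHOUT ROWID table makes SQLite
-- store the rows sorted by NTHash itself (a clustered index), so no
-- separate, space-doubling index is needed.
CREATE TABLE p0wned (
    NTHash      TEXT PRIMARY KEY,  -- the hash data itself acts as the index
    Occurrences INTEGER            -- column name is an assumption
) WITHOUT ROWID;
```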


Then we need to set the right import settings.

.mode csv
.separator :
.import "C:/temp/dbtest/pwds.txt" p0wned

This can of course also be automated: we can create a sql.txt file and put this into it.

.mode csv
.separator :
.import "C:/temp/dbtest/pwds.txt" p0wned

And run it like this (this requires the sqlite3 binary):

Get-Content .\sql.txt | & sqlite3 test1.db
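With the clustered index in place, a lookup can be done the same way; a sketch, reusing the sqlite3 binary and the table/DB names from the import above (the hash is the well-known NT hash of “password”):

```powershell
# The query goes straight to the clustered index, so it returns in ~10 ms
$hash = "8846F7EAEE8FB117AD06BDD830B7586C"  # NT hash of "password"
"SELECT * FROM p0wned WHERE NTHash = '$hash';" | & sqlite3 test1.db
```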

This has given us an ultra-fast way to look up data in the “Have I been p0wned” database, and it can all be run locally.

Next up I will show you how I have used this in my AD Password report script, to tell if users are using passwords that are in the “Have I been P0wned” list, or if you have users who are particularly bad at passwords.

I was working with a customer where all the admin users had at least three accounts: a regular account, a domain admin account and a “server admin” account; some even had a “client admin” account per domain as well.

In a discussion with security, they told me that now everything was good and they were more secure. I told them that in theory they were more secure, but what if their admins just used the same password for all their accounts?

They did not think that was a real problem, and were convinced that none of their admins would reuse their passwords across their different accounts.

I had a sneaking suspicion that this was not the case, so I decided to test it out, which required a little ingenuity.

So I sat down and wrote a list of things that I would like to get from AD. (I was specifically told not to use any “brute force” methods to guess the users' passwords, but I was allowed to compile a list of “improbable” passwords like summer2017, Fall2017 etc. that security was sure no one would use 😉)

• All current users with the same password
• Most used hash/password statistics (current)
• Most used hash historically
• Users with a “weak” password (based on a list of known passwords)
• Users who have never changed their password
• Users with a blank password
• Users who have had a blank password
• Users who have reused a password

If I could get that data out of AD, I would be able to give the security team an idea about the current status of their “users.”

Just to get one thing straight: by running the scripts below you are probably violating every security guideline in your company. Do not play around with this unless you understand what you are doing and have written consent from your company. Also, in the examples below I am not adhering to “good” security practice, since I reveal the passwords of users (in my example it is only test data); again, this was done to prove a point about insecure passwords. Last but NOT least, you have just copied ALL the keys to the kingdom from AD to unprotected files. I cannot stress this enough: you have to protect everything you export from AD!!!

I had previously helped Danish security researcher Jakob H Heidelberg with a PowerShell script he had written, using DSInternals from Michael Grafnetter, to do something similar, so I knew it was possible to get this from AD. But I wanted to create a solution that would scale and could produce a report that was easily searchable.
You can find Jakob's original script here.

As you probably know, by default you cannot get AD to reveal a user's password hash; you need some way to extract it from the AD database. I am not going to go into details about that here. In the example below I use DSInternals functionality to “pretend” to be a domain controller and that way get a real domain controller to sync over all its information. There are other ways to obtain the same information, but that is out of scope for this blog post.

The first thing I did was to compile a short list of very easily guessable passwords and simple permutations thereof; again we are talking Summer2016, Winter2017, CompanyName01 etc.

For the sample data I am using in this blog post, I created an AD with 500 accounts using a modified version of Helge Klein's tool here. By using “real names” the users are more easily told apart than if I had just created random strings.

After creating the users I randomly assigned each of them a password from a list of known passwords: I created the passwords MuchoSecuros1 through MuchoSecuros2000 and assigned them to users at random. I did this to make sure I had users with the same password, and the names were chosen for easy recognition in the results.

Then I needed a way to query the data that I pulled from AD, and here I ended up choosing SQLite. For those of you who do not know it, SQLite is a small database engine that does not require any installation and can be run using just an SQLite DLL file. Here I am using Warren “RamblingCookieMonster” Frame's SQLite PowerShell module, which includes the DLL files and code to manipulate it.

In the database I created three tables:
• HashHistory (containing all users' hash history; in this example AD stores the last 20 password hashes a user has used)
• PWDTranslation (all the “known” passwords and their generated hash values)
• Users (all the user data)

By putting it in a database like this, I made it very easy (and fast) to query, and as an added benefit I can add data to it over time, to store more than the last 20 password hashes.

Let’s get down to business!

First off, the script is not 100% self-contained; it requires some external modules as well. (It is, of course, easy to create a package that contains all the modules and copy them to an offline machine.)

Required PowerShell modules:

• PSSQLite
• EnhancedHTML2
• DSInternals

Currently, the “main” script is split into three files.

• Setup.ps1 (sets up the SQLite database)
• LoadDataIntoSQL.ps1 (loads data into SQLite)
• ReportGenerator.ps1 (takes data from the DB and creates the HTML report)

In all three of the above files there is a reference to the SQLite database file, which has to be changed to reflect where you want to store the DB file.

In LoadDataIntoSQL.ps1 there is also a variable called $Passwords, which gets populated with a list of “known” passwords. In the script this just refers to a file with a single password on each line; each password will get converted into a hash and stored in the DB.

It is also from this file that Get-ADReplAccount is called, which is the cmdlet that syncs with a domain controller; if you do not have permissions to do that, it will fail.
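I have not reproduced the exact call from the script here, but a hedged sketch of what a Get-ADReplAccount sync looks like (the server name and naming context are placeholders for your environment):

```powershell
# DC01 and DC=corp,DC=local are placeholders; replace with your own DC and domain
Get-ADReplAccount -All -Server DC01 -NamingContext "DC=corp,DC=local" |
    Select-Object SamAccountName, NTHash, NTHashHistory
```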

ReportGenerator.ps1 has a variable called $ReportPath, which is where the report will be stored.

One thing to note is that the report loads client-side JavaScript to do the formatting; only Internet Explorer allows you to do that.

The JavaScript files are located here:

To get more detailed data, I also wrote a small snippet of code to change the password for each user the same number of times that is configured in AD; I did this to show how password history is used in the script.

First I generate a list of 2000 passwords and load them into a variable:
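A simple way to generate the MuchoSecuros1 through MuchoSecuros2000 list described earlier might be:

```powershell
# 2000 recognizable test passwords: MuchoSecuros1 .. MuchoSecuros2000
$Passwords = 1..2000 | ForEach-Object { "MuchoSecuros$_" }
```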

Then I use ADSI to get information about the domain:

Then I use this information to figure out what the password history length is on the domain, and change the password for each account that many times.
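A sketch of how that could look, using ADSI to read the domain's pwdHistoryLength attribute and the ActiveDirectory module for the password changes (the filter and helper names are my own, not the script's):

```powershell
# Read the domain's password history length via ADSI
$rootDse       = [ADSI]"LDAP://RootDSE"
$domainDN      = $rootDse.defaultNamingContext
$historyLength = ([ADSI]"LDAP://$domainDN").pwdHistoryLength.Value

# Change each test user's password as many times as AD remembers,
# picking a random password from the list generated above
foreach ($user in Get-ADUser -Filter * -SearchBase "$domainDN") {
    for ($i = 0; $i -lt $historyLength; $i++) {
        $new = ConvertTo-SecureString ($Passwords | Get-Random) -AsPlainText -Force
        Set-ADAccountPassword -Identity $user -NewPassword $new -Reset
    }
}
```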

Here is a video showing the scripts in action. (Be gentle; this is my first attempt at creating a video.)

Link to the files on GitHub

(This is part 1 of 3 in an article series on how we at Unity started using “screenshots” and machine learning in our monitoring system.)

Part 1: Getting screenshots from VMware machines (This Post)
Part 2: Using image analysis to detect machines in a broken state (Not published yet)
Part 3: Using Clarifai “Machine learning” to further narrow down our results (Not Published yet)

In my new job as a “DevOps” engineer (I know, I know, there is no such thing as a DevOps engineer), I am working with Unity's build farm. We are building the Unity engine and lots of artifacts thousands of times a day, so our build farm is relatively big, running multiple different OSs: Windows, Mac and Linux.

We are continuously trying to improve our monitoring of the platform, both to detect failed machines and to gather information about what has gone wrong, so we can use this information to prevent the same issues from arising in the future by giving feedback to the relevant teams.

We had a period where we had some storage-related issues. This caused Mac and Linux machines in particular to crash and hang; we had no trouble detecting that the machines went offline, but since we weren't able to connect to them, we could not tell what “state” they were in. So in order to document what had happened, we had to look at the console of the given machine.
I started thinking about whether there was an automated way to test for this, and it hit me that I had read somewhere that it is possible in VMware to take a “screenshot” of a running machine.

So I wrote a PowerShell script to take a screenshot of all the running VMs in our build farm; then at least I had some documentation of what the “state” of the machines was.
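A stripped-down sketch of the idea, using VMware PowerCLI (the vCenter name is made up; the vSphere CreateScreenshot_Task method writes the PNG to the VM's datastore, from where it can be fetched with e.g. Copy-DatastoreItem):

```powershell
# Connect once and reuse the session object for all calls against vCenter
$server = Connect-VIServer -Server vcenter.example.com

foreach ($vm in Get-VM -Server $server | Where-Object { $_.PowerState -eq 'PoweredOn' }) {
    # CreateScreenshot_Task is the raw vSphere API method, exposed
    # through the VM's ExtensionData view object
    $vm.ExtensionData.CreateScreenshot_Task() | Out-Null
}
```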

In the above example I am creating a “session” to reuse for the calls against vCenter, so we will not see hundreds of connections in vCenter.

In the next part of the article, I will cover some PowerShell functions I wrote to wrap ImageMagick to make some initial comparisons of each screenshot, sorting them into known good, known bad and “what the h**l is going on here” 😉

A friend of mine had to go through a lot of data and substitute some values… but these weren't 1-to-1 replacements: he had to find values which were in hex and then convert them to regular characters.
The files he had to traverse had several million lines each, so certain performance considerations had to be taken.

My initial thinking was to use regular expressions, since they excel at manipulating text, so I started playing around with the -replace operator in PowerShell.

In PowerShell you can do something like this:

This will replace the matching string with numbered capturing group 1 from the regex, in this case the hex value matching ([0-9A-F]+); in regex, parentheses denote a capturing group.
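For example, with a made-up input string (note the single quotes around '$1', so PowerShell does not expand it before the regex engine sees it):

```powershell
# Replaces "0x48" with the contents of capturing group 1 ("48")
'value 0x48 here' -replace '0x([0-9A-F]+)', '$1'
# -> value 48 here
```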

But I needed to change the hex string to a regular “string”, so I tried different methods; unfortunately the PowerShell -replace operator does not support callbacks.

So what I did was to create a callback function (I just named it Callback; it could be called anything).

Instead of using -replace, I use the .NET type accelerator [regex] to define the regex and call its static ::Replace method, which does support callbacks.
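A minimal sketch of that approach (the hex-to-character conversion and sample input are my own; PowerShell converts the scriptblock to the MatchEvaluator delegate that ::Replace expects):

```powershell
# The callback receives each Match object and returns the replacement text
$Callback = {
    param($match)
    # Convert the hex digits in capturing group 1 to the character they encode
    [char][Convert]::ToInt32($match.Groups[1].Value, 16)
}

[regex]::Replace('0x48 0x69', '0x([0-9A-F]+)', $Callback)
# -> H i
```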

This literally cut the running time down from hours to minutes.

I have been playing a lot with Azure Automation, and the Azure DSC pull server in particular. I will not get into DSC or Azure Automation in great detail in this post.

My goal was to set up an Automation account that could be used to handle multiple different servers and setups, which meant that I had to install multiple DSC resources into the Automation account.
Currently there is a limitation in the Azure DSC pull server: it does not support versioned modules.

The traditional PowerShell DSC pull server expects module zips to be placed on the pull server in one naming format, while Azure Automation expects PowerShell modules to be imported as Integration Modules with a different naming scheme. See this blog post for more info on the Integration Module format needed to import the module into Azure Automation.

This means that you have to pack your modules differently depending on whether they are going to be used on an internal DSC pull server, which supports versioning, or in Azure.

So I wrote a small script that will download DSC resources and zip them according to whether they are going to be used in Azure or on a regular pull server. (When downloading for a “regular” pull server, it will also create a checksum file.)

One thing to notice here is that when I create the zip files I am using [IO.Compression.ZipFile]. The reason is that there is a bug in Compress-Archive that sometimes prevents you from unpacking the zip files with 3rd-party tools like 7-Zip, and I have also seen Azure Automation fail to unzip such files (that one caused me to waste a looooot of time figuring it out).
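For reference, a minimal sketch of zipping a module folder with [IO.Compression.ZipFile] (the paths are examples):

```powershell
# The assembly is not loaded by default in Windows PowerShell
Add-Type -AssemblyName System.IO.Compression.FileSystem

[IO.Compression.ZipFile]::CreateFromDirectory(
    'C:\DSC\Modules\xWebAdministration',   # folder to compress
    'C:\DSC\Zips\xWebAdministration.zip')  # resulting zip file
```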

I was playing around with the AzureRM cmdlets a while back and had forgotten to read the manual 🙂 so I was running into some issues getting them to work.

I figured that there had to be some kind of dependency among the modules for them to work. So instead of reading the documentation, I decided to write a small script that would look at each module manifest to check for dependencies.
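The idea can be sketched like this, one level deep (the real script recurses, which is why nested dependencies show up indented further in the output below):

```powershell
# List each AzureRM module and the modules its manifest requires
Get-Module -ListAvailable AzureRM*, Azure.Storage | Sort-Object Name |
    ForEach-Object {
        Write-Verbose $_.Name -Verbose
        foreach ($dep in $_.RequiredModules) {
            Write-Verbose "--$($dep.Name)" -Verbose
        }
    }
```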

A few examples:

Output looks like this:

VERBOSE: Azure.Storage
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.ApiManagement
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Automation
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Backup
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Batch
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Compute
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.DataFactories
VERBOSE: --AzureRM.Profile

VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.HDInsight
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Insights
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.KeyVault
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Network
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.OperationalInsights
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.profile

VERBOSE: AzureRM.RedisCache
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Resources
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.SiteRecovery
VERBOSE: --AzureRM.Profile

VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Storage
VERBOSE: --AzureRM.Profile
VERBOSE: --Azure.Storage
VERBOSE: ----AzureRM.Profile

VERBOSE: AzureRM.StreamAnalytics
VERBOSE: --AzureRM.Profile

VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.TrafficManager
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.UsageAggregates
VERBOSE: --AzureRM.Profile

VERBOSE: AzureRM.Websites
VERBOSE: --AzureRM.Profile

Today I was working at a client, doing a bunch of Azure Automation stuff and moving a lot of the scheduled jobs into Azure Automation, when I noticed this little button.

So I had to dig a little deeper: you can now run your Hybrid Worker jobs as a user that you have stored in your Azure Automation credential store. Before, Hybrid Worker jobs would run as “LocalSystem”, since that is the account the agent runs under. I don't know when this was added, but it has to be recent.

In order to set it up, you have to log on to the Azure portal and go to:

Automation Accounts -> (your account) -> Hybrid Worker Groups -> (your group) -> Settings -> Hybrid worker group settings

One of the new features of PowerShell v5 DSC is that you can now use ConfigurationNames in “clear text” instead of GUIDs, meaning you can have human-readable names for your configurations. Since these are easier to guess, an extra layer of security has been added: pull clients now have to register themselves with the pull server using a pre-shared key. When this happens, the client LCM generates a unique AgentID that is used to tell the different clients apart.

In order to add the RegistrationKey settings, you need to add a line to the web.config file of the DSC pull server; that entry points to a location in the file system where it can find a file called RegistrationKeys.txt. (You can read more about it here: Link)

Instead of manually editing the web.config file, I wrote a little script to add the configuration, to help automate the building of pull servers for my demo lab.

This assumes you have installed the pull server to the “default” location.
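A sketch of such a script, assuming the default site path and the standard RegistrationKeyPath appSettings key (both may differ in your setup):

```powershell
# Assumed default location of the pull server's web.config
$webConfig = 'C:\inetpub\wwwroot\PSDSCPullServer\web.config'

# Add an appSettings entry pointing at the folder holding RegistrationKeys.txt
[xml]$xml = Get-Content $webConfig
$add = $xml.CreateElement('add')
$add.SetAttribute('key',   'RegistrationKeyPath')
$add.SetAttribute('value', 'C:\Program Files\WindowsPowerShell\DscService')
$null = $xml.configuration.appSettings.AppendChild($add)
$xml.Save($webConfig)
```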

Another annoyance that I have come across is that I usually have a Danish keyboard layout, and when I use the xPSDesiredStateConfiguration module and try to set up a pull server, it will complain that it cannot find a file in C:\Windows\System32\WindowsPowerShell\v1.0\Modules\PSDesiredStateConfiguration\PullServer\en. In order to fix this, you can create two localized folders containing the same files as the en and en-US folders. I wrote a little script to do this, based on the locale of the machine.

Since both scripts alter files in protected areas of the file system, both have to be run as Administrator.

I have been playing a lot with DSC, and I have therefore had to use [System.Guid]::NewGuid() a lot, to create GUIDs for the DSC configuration clients.
PS C:\> [System.Guid]::NewGuid()


But in this latest revision there is a new cmdlet that lets you create GUIDs: New-Guid.

PS C:\> New-Guid


If you need to copy and paste the GUID into a configuration, you can do something like this:
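For example (clip.exe ships with Windows):

```powershell
# Copy a fresh GUID straight to the clipboard
(New-Guid).ToString() | clip
```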


I was playing around with the Microsoft ORK (Operational Readiness Kit), which is a frontend for the PowerShell Deployment Toolkit aimed at hosters.

I wanted to do a greenfield install, using a clean Server 2012 R2 machine which had not been joined to any domain, and run everything off that machine. I had downloaded all the prereqs and started the install, but it kept failing.

It entered an infinite loop trying to access the PowerShell AD provider, so my first thought was to simply turn off the autoloading of the AD provider by setting an environment variable.
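The variable in question is ADPS_LoadDefaultDrive, which the ActiveDirectory module checks before auto-creating the AD: drive:

```powershell
# Must be set before the ActiveDirectory module is imported
$env:ADPS_LoadDefaultDrive = 0
```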

But the script still failed, so I looked at the code and found this snippet.

This is what creates the infinite loop: it tests to see whether the AD:\ drive is loaded; if not, it removes the module, installs it again and tries to find the drive again.

I have not had time to dig through the entire script to see whether the “installing” machine actually needs the PowerShell AD module or not. So for now, the machine on which you run the installer needs to be in a domain.