• Category Archives Everyday
  • [Article 746]Updating UPN names in Active Directory

    We are in the process of testing out Office 365 to see if it will be useful for us, so initially we are just going to use DirSync for some specific users instead of setting up the complete ADFS solution. I have been extremely busy lately, so I decided to hire someone to come in, set up DirSync, and change the UPNs for the users who are going to the cloud.

    Everything went fine. I gave him a list of OUs containing the users who needed the change, and he started opening each user and changing the UPN from the GUI. But apparently I have become allergic to doing things in the GUI, so I ended up writing a small script for him, which, after some procrastination, ended up with a GUI of its own.

    It will load a tree view of the current directory (requires the AD cmdlets to be present on the system). You can then select each OU whose contained users should have their UPN changed, and it will also let you recurse through multiple OUs.

    Be aware this is version 1, so there are no "are you sure?" prompts. Proceed with caution.
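
    The core operation the tool performs boils down to something like this (a sketch only: the OU path and UPN suffixes are placeholders, and note that there are no confirmations here either):

    [ps]
    Import-Module ActiveDirectory

    # Placeholder OU and suffixes -- in the tool you pick these from the tree view
    $searchBase = "OU=CloudUsers,DC=corp,DC=local"
    $oldSuffix  = "@corp.local"
    $newSuffix  = "@contoso.com"

    Get-ADUser -SearchBase $searchBase -Filter * | ForEach-Object {
        $newUpn = $_.UserPrincipalName -replace [regex]::Escape($oldSuffix), $newSuffix
        Set-ADUser -Identity $_ -UserPrincipalName $newUpn
    }
    [/ps]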

    I have exported the code from PowerShell Studio with recovery info, so you can load the form and work with it.

    UPNChanger

    The treeview code is based on code from Thepip3r:

    http://thepip3r.blogspot.dk/2011/06/powershell-guis-active-directory.html



  • [Article 739]Using EWS (Exchange Web Services) to read email subjects.

    Today I had a discussion with a vendor. They deliver a solution that reads email from a specific account on Exchange and creates a ticket in our helpdesk system. The email subjects often show up mangled in the helpdesk system, and the vendor blames Exchange EWS for messing up the data. I had a hard time believing that, so I set out to prove that it is not Exchange that is at fault.

    First off I needed the EWS Managed API, which is a separate download from Microsoft.

    So just download and install the MSI file.

    Then it is time to fire up PowerShell. The first thing we need to do is load the EWS assembly.
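
    Something like this should do it (the path is an assumption and depends on the API version you installed; adjust as needed):

    [ps]
    # Path depends on the EWS Managed API version installed (assumption)
    Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\1.2\Microsoft.Exchange.WebServices.dll"
    [/ps]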

    Then we need to specify a username, password, domain, and the mailbox that we want to look at. There are several ways to do this; the simplest is just writing it in clear text.
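
    For example (all values here are placeholders for illustration):

    [ps]
    # Placeholder values -- adjust to your environment
    $Username = "svc_helpdesk"
    $Password = "P@ssw0rd"
    $Domain   = "CONTOSO"
    $Mailbox  = "helpdesk@contoso.com"
    [/ps]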

    Or you could query for it:
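
    For instance, prompting for both (a sketch):

    [ps]
    $Credential = Get-Credential
    $Mailbox = Read-Host "Mailbox to open"
    [/ps]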

    Then we have to create an EWS ExchangeService object.

    Possible options for ExchangeService types are Exchange2007_SP1, Exchange2010, Exchange2010_SP1 or Exchange2010_SP2.
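
    For example, against an Exchange 2010 SP1 server (pick the value that matches your environment):

    [ps]
    $EWS = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010_SP1)
    [/ps]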

    Then we have to pass credentials to the $EWS object. Depending on how we chose to supply the password initially, we again have two options.

    Here is the way if we supplied the information in clear text:
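
    [ps]
    # Uses the placeholder variables defined earlier
    $EWS.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($Username, $Password, $Domain)
    [/ps]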

    Here is the way if we prompted for the username/password with Get-Credential:
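
    [ps]
    # Pulls the clear-text values back out of the credential object we prompted for
    $netCred = $Credential.GetNetworkCredential()
    $EWS.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($netCred.UserName, $netCred.Password, $netCred.Domain)
    [/ps]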

    Then we use the AutodiscoverUrl method to look up the Exchange Server URL endpoint.
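
    [ps]
    $EWS.AutodiscoverUrl($Mailbox)
    [/ps]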

    In this example we iterate through the inbox, list the first 20 items, and output their subjects.
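
    A sketch of that, binding to the well-known Inbox folder and paging 20 items:

    [ps]
    $Inbox = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($EWS, [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)
    $view  = New-Object Microsoft.Exchange.WebServices.Data.ItemView(20)
    $Inbox.FindItems($view) | ForEach-Object { $_.Subject }
    [/ps]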

    Here is the script in its entirety, reassembled from the snippets above (same placeholder values as before):
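
    [ps]
    # Consolidated from the snippets above -- path, account, and mailbox are placeholders
    Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\1.2\Microsoft.Exchange.WebServices.dll"

    $Credential = Get-Credential
    $Mailbox    = "helpdesk@contoso.com"

    $EWS = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010_SP1)
    $netCred = $Credential.GetNetworkCredential()
    $EWS.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($netCred.UserName, $netCred.Password, $netCred.Domain)
    $EWS.AutodiscoverUrl($Mailbox)

    $Inbox = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($EWS, [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)
    $view  = New-Object Microsoft.Exchange.WebServices.Data.ItemView(20)
    $Inbox.FindItems($view) | ForEach-Object { $_.Subject }
    [/ps]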

    After analyzing the output from Exchange many times, I am convinced that the vendor is mistaken when he claims Exchange is screwing up the data.



  • [Article 721]Readable Password generator

    Because of a report from our auditors, I was tasked with writing a password generator for our helpdesk staff: instead of using a generic variation of the same password, they needed something more "secure".

    One of the issues is that this password is often read to the user over the phone, and a randomly generated string of characters would be very hard to convey that way.

    So what I did was take a list of common words and generate a password based on words from a wordlist.
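
    The core idea looks something like this (a minimal sketch; the file names and composition rules are illustrative, the real script is in the package below):

    [ps]
    # Hypothetical wordlist file names -- the package uses one file per word length
    $words = Get-Content .\Words4.txt, .\Words5.txt

    # Pick three random words, capitalize them, and append two digits
    $password = (Get-Random -InputObject $words -Count 3 | ForEach-Object {
        $_.Substring(0,1).ToUpper() + $_.Substring(1).ToLower()
    }) -join ''
    $password += "{0:D2}" -f (Get-Random -Maximum 100)
    $password
    [/ps]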

    The package consists of 1 script file and 6 wordlist files (I have taken 100 words each of 2, 3, 4, 5, 6, and 7 characters; more words can easily be added to the files). The package ships with generic wordlists; in the one I use at work, the words are Danish.

    Download full package here: PWDGenerator



  • [Article 716]Managing Queue-It queues with PowerShell

    We are using Queue-it for some of our external websites, and we needed to start/stop the queues when doing maintenance on our sites. I found this very easy to do with PowerShell's Invoke-WebRequest cmdlet.

    Here are a few examples.
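
    Something along these lines; note that the URI, header, and queue names below are purely illustrative placeholders -- the real endpoints and authentication come from the Queue-it API documentation for your account:

    [ps]
    # Hypothetical endpoint and API key
    $headers = @{ "api-key" = "YOUR-API-KEY" }

    # Start a queue before maintenance
    Invoke-WebRequest -Uri "https://api.example.com/queues/frontpage/start" -Method Put -Headers $headers

    # Stop it again when maintenance is done
    Invoke-WebRequest -Uri "https://api.example.com/queues/frontpage/stop" -Method Put -Headers $headers
    [/ps]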



  • [Article 706]Culture Gotchas

    Not everyone runs PowerShell on machines with an "en-US" locale, and sometimes that causes unexpected problems. If you are running on a non-US system and have been working with Excel, for instance, you have probably run into this error when trying to add something to Excel like this:

    $xl= New-Object -COM Excel.Application

    $xl.Visible = $true

    $wb=$xl.Workbooks.Add()


    I stumbled upon another culture "gotcha" when trying to get some event log info with Get-WinEvent (it wasn't obvious to me at the time that it was a culture issue):

    $daysBack = (Get-Date).Adddays(-2)

    Get-WinEvent -ComputerName Localhost -FilterHashTable @{LogName='Application'; StartTime=$daysBack; ID=8224}


    When I ran it I would get the basic information back, just not the contents of the "Message" property, which in this case was very relevant.

    I tried running both commands on a test server, where they worked without any problems. Since that server's culture was set to "en-US", I decided to dig a little deeper into the culture settings.

    First thing I tried was to set the culture manually:

    [System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"

    Then I checked to see if the culture had changed:

    [System.Threading.Thread]::CurrentThread.CurrentCulture


    To my surprise my culture was still da-DK. My initial thought was that since PowerShell v2 runs in MTA (Multi Threaded Apartment) mode by default, my culture query might have been executed by a different thread. I tried setting the CurrentCulture to "en-US" 100 times, but every time I queried CurrentCulture I would still get "da-DK" back. (PowerShell v3 runs in STA (Single Threaded Apartment) mode by default.)

    Next I tried wrapping it in a script block:

    & {[System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"

    [System.Threading.Thread]::CurrentThread.CurrentCulture }

    This worked, and returned "en-US" as the CurrentCulture.

    I then tried wrapping it in a function:

    Function Test {[System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US"

    [System.Threading.Thread]::CurrentThread.CurrentCulture }

    This also gave me the expected "en-US" culture.

    After conferring with Joel Bennett, this led us to believe that the CurrentCulture is being reset after every pipeline.

    We then tried (on a single line):

    [System.Threading.Thread]::CurrentThread.CurrentCulture = "en-US";[System.Threading.Thread]::CurrentThread.CurrentCulture

    This confirmed that the culture change is only "active" in the current pipeline.

    I found this strange and wrote to the PowerShell team at Microsoft to have them shed some light on why this was happening.

    Lee Holmes replied:

    “PowerShell saves and restores the thread’s current culture before and after invoking a pipeline so that scripts don’t trash your entire session. When you invoke the command to set the culture, it impacts the entire pipeline – but then we restore it when the pipeline completes. When you put the two commands in the same pipeline, our culture restoration code hasn’t kicked in yet and thus you get to see what the pipeline’s culture was changed to”

    So remember: if you are running a script that requires a different culture setting, the culture gets reset after each pipeline completes.
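
    If you need a block of code to run under a specific culture, one workaround (a sketch along the lines of Lee Holmes' well-known Use-Culture function, not the original) is to set and restore the culture inside a single function invocation:

    [ps]
    Function Use-Culture {
        Param (
            [System.Globalization.CultureInfo]$Culture,
            [ScriptBlock]$Script
        )
        $oldCulture = [System.Threading.Thread]::CurrentThread.CurrentCulture
        try {
            [System.Threading.Thread]::CurrentThread.CurrentCulture = $Culture
            & $Script
        }
        finally {
            # Restore the original culture, even if the script block throws
            [System.Threading.Thread]::CurrentThread.CurrentCulture = $oldCulture
        }
    }

    Use-Culture -Culture "en-US" -Script { Get-Date | Out-String }
    [/ps]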

    The last thing we tried was to see if PowerShell was using $PSCulture to reset the culture settings, so I tried changing $PSCulture to "en-US" instead of "da-DK" (it is a read-only variable, so you have to use -Force):

    Set-Variable PSCulture -Value en-US -Force


    But to no avail; it seems as if PowerShell checks the system culture, and not something defined within the PowerShell session.



  • [Article 684]Sending out Password Expiration mails to users in Active Directory

    I was tasked with writing a script that would send out an e-mail to users when there were 14, 7, 3, 2, and 1 days left before their AD passwords expired.

    I use the Quest AD cmdlets to get users from AD.
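
    A minimal sketch of the approach (the PasswordExpires property, addresses, and SMTP server are assumptions to adapt):

    [ps]
    Add-PSSnapin Quest.ActiveRoles.ADManagement
    $reminderDays = 14,7,3,2,1

    Get-QADUser -Enabled -SizeLimit 0 | ForEach-Object {
        if ($_.PasswordExpires) {   # assumes the expiry date is exposed here
            $daysLeft = ($_.PasswordExpires - (Get-Date)).Days
            if ($reminderDays -contains $daysLeft) {
                Send-MailMessage -To $_.Email -From "helpdesk@example.com" `
                    -Subject "Your password expires in $daysLeft day(s)" `
                    -Body "Please remember to change your password before it expires." `
                    -SmtpServer "smtp.example.com"
            }
        }
    }
    [/ps]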



  • [Article 681]Troubleshooting Test-Connection

    I was contacted by a friend who was having some issues with Test-Connection when using the TimeToLive parameter.

    Example:

    Test-Connection -ComputerName www.google.ie -Count 1 -TimeToLive 3

    Test-Connection : Testing connection to computer 'www.google.ie' failed: Problem with some part of the filterspec or providerspecific buffer in general

    At line:1 char:1

    + Test-Connection -ComputerName www.google.ie  -TimeToLive 3

    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

        + CategoryInfo          : ResourceUnavailable: (www.google.ie:String) [Test-Connection], PingException

        + FullyQualifiedErrorId : TestConnectionException,Microsoft.PowerShell.Commands.TestConnectionCommand

    So I fired up Reflector to see what Test-Connection actually does: it uses WMI and the Win32_PingStatus class. I have not been able to reproduce the error by calling the class directly, so I was wondering where the problem could be.
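
    For reference, this is how the class can be called directly, with the same parameters as the failing example:

    [ps]
    Get-WmiObject -Query "SELECT * FROM Win32_PingStatus WHERE Address='www.google.ie' AND TimeToLive=3" |
        Select-Object Address, StatusCode
    [/ps]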

    So I ran Trace-Command to see if that would give me anything.
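
    I used something along these lines (MemberResolution turned out to be the interesting trace source):

    [ps]
    Trace-Command -Name MemberResolution -PSHost -Expression {
        Test-Connection -ComputerName www.google.ie -Count 1 -TimeToLive 3
    }
    [/ps]

    In the return value I noticed a status code of 11013, and shortly thereafter this was written in the trace: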

    MemberResolution Information: 0 :     "writeErrorStream" NOT present in type table.

    MemberResolution Information: 0 :     Adapted member: not found.

    This made me think that there might be an error in converting the error code from its numerical value into text.

    So I looked up the error code in the ICMP_ECHO_REPLY32 structure documentation, which lists an error code 11013:

    http://msdn.microsoft.com/en-us/library/windows/desktop/bb540657(v=vs.85).aspx

    IP_TTL_EXPIRED_TRANSIT
    11013

    The time to live (TTL) expired in transit.

    Which seems fair, because the TTL is relatively low. I then looked a bit deeper to find out what else 11013 could mean.

    Among the Winsock error codes I found:


    WSA_QOS_BAD_OBJECT
    11013
    QoS bad object.

    A problem was encountered with some part of the filterspec or the provider-specific buffer in general.

    Then I did some additional testing.

    If you run Trace-Command on the example with TimeToLive = 11, which gives the error "Error due to lack of resources", you will see that the error code is 11010, which corresponds to:

    WSA_QOS_ADMISSION_FAILURE
    11010
    QoS admission error.

    A QoS error occurred due to lack of resources.

    If you look at the WMI Win32_PingStatus error codes, 11010 corresponds to:

    IP_REQ_TIMED_OUT
    11010

    The request timed out.

    You can confirm this using ping from the command prompt (ping -i 11 www.google.ie).



  • [Article 669]Checking Site sizes in SharePoint 2007 and 2010

    Our SharePoint admin asked me to help him write a script to find out how much space each document library in our SharePoint farm takes up. After some googling I found that I could use the StorageManagementInformation method on the SPSite object, so I came up with this little script:

    [ps]
    #First we load the SharePoint assembly
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    #Then we create a function that returns the SPSite
    function Get-SPSite($url){
    return new-Object Microsoft.SharePoint.SPSite($url)
    }

    $site = Get-SPSite http://URL
    # We use the StorageManagementInformation Method on the $SPSite object, StorageManagementInformation returns a DataTable, and takes 4 input values
    # System.Data.DataTable StorageManagementInformation(Microsoft.SharePoint.SPSite+StorageManagementInformationType ltVar, Microsoft.SharePoint.SPSite+StorageManagementSortOrder sordVar, Microsoft.SharePoint.SPSite+StorageManagementSortedOn soVar, System.UInt32 nMaxResults)
    #
    # ltVar: What kind of storage management information to display
    # List = 1
    # DocumentLibrary = 2
    # Document = 3
    # sordVar: the direction in which the items are to be sorted
    # Increasing = 0x10
    # Decreasing = 0x11
    # soVar: whether the items are sorted by size or by date
    # Size=0
    # Date = 1
    # nMaxResults: the number of results to return

    $DT = $site.StorageManagementInformation(2,0x11,0,$(($site.allwebs).count));
    $DT | Select @{Label="Size"; Expression={[INT]($_.Size/1MB)}},Directory | out-gridview
    $site.Dispose()
    [/ps]

    It seems I forgot the last line, where I dispose of the SPSite object; I have added that now.



  • [Article 654]Updating .htaccess file based on Apache log files

    I am still seeing massive amounts of referral traffic hitting my site, eating up my bandwidth. I did not get time to update my .htaccess file for the last 2 days, and within the last 24 hours I have had more than 6,000 hits, generating almost 24,000 page views and more than 1 GB worth of traffic (at that rate I will reach my 10 GB limit soon).

    Looking through the Apache logs, figuring out which sites I get the most referral traffic from, getting the hostnames, and transforming them into a format usable by the Apache rewrite engine in the .htaccess file has been time consuming. So I decided some PowerShell magic might speed up the process a bit.

    [ps]
    function Select-FileDialog
    {
    param(
    [string]$Title,
    [string]$Directory,
    [string]$Filter="All Files (*.*)|*.*")
    [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
    $objForm = New-Object System.Windows.Forms.OpenFileDialog
    $objForm.InitialDirectory = $Directory
    $objForm.Filter = $Filter
    $objForm.Title = $Title
    $Show = $objForm.ShowDialog()
    If ($Show -eq "OK")
    {
    Return $objForm.FileName
    }
    Else
    {
    Write-Error "Operation cancelled by user."
    }
    }

    #Function to create the http rewrite rules.

    Function Create-Rewrite {
    Param (
    $Hostname
    )

    $HtaRule = "RewriteCond %{HTTP_REFERER} ^http://" + "$($hostname.replace(".","\."))" +" [OR]"
    $script:BlockList += $HtaRule
    }

    Function add-htaccess {
    Param (
    $HtaRules
    )
    (Get-Content $htaccess) | foreach-object {
    $_
    if ($_ -match "RewriteEngine") {
    if (!(Select-String -simplematch "$htarules" -Path $htaccess))
    {
    $HtaRules
    }
    }

    } | set-Content $tempFile
    Copy-Item $tempFile $htaccess
    }

    Function Upload-Ftp {
    Param ([Parameter(Position=0, Mandatory=$true)]
    [ValidateNotNullOrEmpty()]
    [System.String]
    $FTPHost,
    [Parameter(Position=1)]
    [ValidateNotNull()]
    $File
    )
    $webclient = New-Object System.Net.WebClient
    $uri = New-Object System.Uri($ftphost)

    "Uploading $File…"

    $webclient.UploadFile($uri, $File)
    }

    #Variables
    $log = Select-FileDialog -Title "Select an Apache logfile"
    $htaccess = "c:\Temp\.htaccess"
    $tempFile = [IO.Path]::GetTempFileName()
    $URLCount = 15
    $FTPUsername = "Username"
    $FTPPassword = "PassW0rd"

    $BlockList = ""
    #Create list of sites to block
    $script:BlockList = @()

    #Get the list of URLS in the the logfile, capturing each element into different named capturing groups

    $urls = Select-String '^(?<client>\S+)\s+(?<auth>\S+\s+\S+)\s+\[(?<datetime>[^]]+)\]\s+"(?:GET|POST|HEAD) (?<file>[^ ?"]+)\??(?<parameters>[^ ?"]+)? HTTP/[0-9.]+"\s+(?<status>[0-9]+)\s+(?<size>[-0-9]+)\s+"(?<referrer>[^"]*)"\s+"(?<useragent>[^"]*)"$' $log |
    Select -Expand Matches | Foreach { $_.Groups["referrer"].value }

    #Output statistics for the referer hostnames (Only show top 15)
    $urls | group | ForEach -begin { $total = 0 } `
    -process { $total += $_.Count; $_ } |Sort Count | Select Count, Name |
    Add-Member ScriptProperty Percent { "{0,15:0.00}%" -f (100*$this.Count/$Total) } -Passthru | select -Last $URLCount

    #Getting the base hostnames from the complete URLS, and outputs statistics to the screen.

    $hosts = $urls | Select-String '\b[a-z][a-z0-9+\-.]*://([a-z0-9\-._~%!$&()*+,;=]+@)?(?<host>[a-z0-9\-._~%]+|\[[a-z0-9\-._~%!$&()*+,;=:]+\])' |
    Select -Expand Matches | Foreach { $_.Groups["host"].value } | group | sort count | where {($_.name -notlike "*xipher.dk*") -and ($_.Count -gt 100)} |
    ForEach -begin { $total = 0 } `
    -process { $total += $_.Count; $_ } | Sort Count | Select Count, Name |
    Add-Member ScriptProperty Percent { "{0,10:0.00}%" -f (100*$this.Count/$Total) } -Passthru

    Write-Host "List of root hostnames"

    $hosts

    Foreach ($Url in $hosts) {

    Create-Rewrite $url.Name
    }

    Foreach ($Block in $script:BlockList) {
    add-htaccess $Block
    }

    notepad $htaccess

    $script:BlockList

    Upload-Ftp -FTPHost "ftp://$($FTPUsername):$($FTPPassword)@xipher.dk/httpdocs/.htaccess" -File $htaccess
    Upload-Ftp -FTPHost "ftp://$($FTPUsername):$($FTPPassword)@xipher.dk/httpdocs/WordPress/.htaccess" -File $htaccess
    [/ps]

    Unfortunately my current hosting company does not allow me to download the log files via FTP; I have to connect to the Parallels interface and download them manually. (I have not had time to look into automating this part yet, so it is still a manual step.)
    That is why I added a little function that uses a GUI to pick the access_log file.

    [ps]
    function Select-FileDialog
    {
    param(
    [string]$Title,
    [string]$Directory,
    [string]$Filter="All Files (*.*)|*.*")
    [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
    $objForm = New-Object System.Windows.Forms.OpenFileDialog
    $objForm.InitialDirectory = $Directory
    $objForm.Filter = $Filter
    $objForm.Title = $Title
    $Show = $objForm.ShowDialog()
    If ($Show -eq "OK")
    {
    Return $objForm.FileName
    }
    Else
    {
    Write-Error "Operation cancelled by user."
    }
    }
    [/ps]

    I then call the function like this:

    [ps]
    $log = Select-FileDialog -Title "Select an Apache logfile"
    [/ps]

    A little regex magic runs through the logfile and captures the different elements into named capturing groups; in this step I extract all the referrer values and put them into the $urls variable:

    [ps]
    $urls = Select-String '^(?<client>\S+)\s+(?<auth>\S+\s+\S+)\s+\[(?<datetime>[^]]+)\]\s+"(?:GET|POST|HEAD) (?<file>[^ ?"]+)\??(?<parameters>[^ ?"]+)? HTTP/[0-9.]+"\s+(?<status>[0-9]+)\s+(?<size>[-0-9]+)\s+"(?<referrer>[^"]*)"\s+"(?<useragent>[^"]*)"$' $log |
    Select -Expand Matches | Foreach { $_.Groups["referrer"].value }
    [/ps]
    I modified a script by Joel Bennett to get a little statistics as well. Since there can be thousands of hostnames, I have chosen to output only the top 15 by default (controlled by the $URLCount variable):

    [ps]
    $urls | group | ForEach -begin { $total = 0 } `
    -process { $total += $_.Count; $_ } |Sort Count | Select Count, Name |
    Add-Member ScriptProperty Percent { "{0,15:0.00}%" -f (100*$this.Count/$Total) } -Passthru | select -Last $URLCount
    [/ps]

    Then I loop through all the hostnames and extract the base domain names, using regex again. (Here I choose to ignore all traffic from my own domain, xipher.dk, and to only look at referral domains that have generated 100 hits or more.)

    [ps]
    $hosts = $urls | Select-String '\b[a-z][a-z0-9+\-.]*://([a-z0-9\-._~%!$&()*+,;=]+@)?(?<host>[a-z0-9\-._~%]+|\[[a-z0-9\-._~%!$&()*+,;=:]+\])' |
    Select -Expand Matches | Foreach { $_.Groups["host"].value } | group | sort count | where {($_.name -notlike "*xipher.dk*") -and ($_.Count -gt 100)} |
    ForEach -begin { $total = 0 } `
    -process { $total += $_.Count; $_ } | Sort Count | Select Count, Name |
    Add-Member ScriptProperty Percent { "{0,10:0.00}%" -f (100*$this.Count/$Total) } -Passthru
    [/ps]

    The script expects to find a .htaccess file in c:\temp containing at least the following two lines:

    RewriteEngine On
    RewriteRule (.*) http://%{REMOTE_ADDR}/$ [R=301,L]