
Determine the source site of Outlook clients on Exchange server

We have been toying with the idea of centralising our Exchange environment for a while now, and as part of this project, we needed to audit our Outlook clients, to determine which source site they were connecting from.

I was tasked with this and was able to quickly gather the information from both the Exchange 2003 and Exchange 2007 environments without too much hassle.

For Exchange 2007 simply use get-logonstatistics and select the information that you need. I added some additional spice, which exports each server to a separate CSV file.
foreach ($server in get-mailboxserver){
write-host "Current server: " $server
$filename = ".\" + $server + ".csv"
Get-LogonStatistics -server $server | select UserName, ClientIPAddress | sort UserName -Unique | Export-Csv $filename
}
Exchange 2003 is very similar, but as you can probably guess by now, you need to use WMI.
foreach ($server in (Get-ExchangeServer | Where {$_.IsExchange2007OrLater -eq $false})){
write-host "Current server: " $server
$filename = ".\" + $server + ".csv"
Get-Wmiobject -namespace root\MicrosoftExchangeV2 -class Exchange_Logon -Computer $server | select MailboxDisplayName, ClientIP | sort MailboxDisplayName -Unique | Export-Csv $filename
}
My job done, I sent the CSV files off to the project managers, only to find out that they thought it would be nice to see exactly which site each IP address belonged to.

This proved to be a little more tricky, but after a few minutes of probing the Interwebs, I found a post where Shay uses nltest to get the site information for a computer.

I assimilated this into my script with a little DNS lookup to find the host name and came up with a function which will retrieve the site information for each IP address on the fly and add that to the CSV file.
function Get-ComputerSite ($ip){
Write-Host "Current IP:" $ip
$site = $null
$computer = [System.Net.Dns]::gethostentry($ip) 
$site = nltest /server:$($computer.hostname) /dsgetsite
Return $site[0]
}

$ADSiteWMI = @{Name="ADSite";expression={Get-ComputerSite $($_.ClientIP)}}
$ADSite = @{Name="ADSite";expression={Get-ComputerSite $($_.ClientIPAddress)}}

foreach ($server in get-mailboxserver){
write-host "Current server: " $server
$filename = ".\" + $server + ".csv"
$LogonStats = Get-LogonStatistics -server $server | sort UserName -Unique 
$LogonStats | select UserName, ClientIPAddress, $ADSite | Export-Csv $filename 
}

foreach ($server in (Get-ExchangeServer | Where {$_.IsExchange2007OrLater -eq $false})){
write-host "Current server: " $server
$filename = ".\" + $server + ".csv"
$LogonStats = Get-Wmiobject -namespace root\MicrosoftExchangeV2 -class Exchange_Logon -Computer $server | sort MailboxDisplayName -Unique
$LogonStats | select MailboxDisplayName, ClientIP, $ADSiteWMI | Export-Csv $filename
}

This does take some time to complete on servers with many connections, but it gets the results required. I have already noticed a few issues, and the script could do with a little more refinement.

I will post these updates as soon as I get round to adding them. For now, I hope this script can help someone else with a similar problem.

The complete script can be downloaded from here:

Using SCL to prevent messages from going to Junk Mail

In our environment, we have a number of email addresses which are managed by automated programs and systems and even some home grown applications.

Most of these systems use POP3 to connect to the mailboxes and download incoming email. Obviously POP3 does not give you access to subfolders like “Junk Mail”. It has come to our attention recently that the Junk email rule has been flagging valid client messages as spam and sending them to the Junk Mail folder. The result is that these instructions and client information never make it to the back office workflow systems.

To prevent this from happening, you first need to understand SCL or Spam Confidence Level.

The SCL, in a nutshell, is a score based on a number of criteria which determine how likely a message is to be spam. The higher the score (maximum 9), the more confident Outlook is that the message is spam.

An awesome way to view the SCL for individual messages is to install a custom form, which displays an additional column with this information. More information about that here: http://msexchangeteam.com/archive/2004/05/26/142607.aspx

After installing the form, I needed to start sending some spam to myself, to establish whether the same message would be blocked or cleared by the Transport Rule. I grabbed an obvious spam message from my Gmail account and turned it into a PowerShell spambot:

$messageParameters = @{
Subject = 'Vicodin ES (Hydrocodone) 650mg x 30 pills $209 -VISA- tbrkl rqg'
Body = ' -== The Best Painkillers available ==- Buy Hydrocodone, Vicodin ES, Codeine, Phentermin, Norco, Valiuml, Xanaxl Online You pay & we ship, Absolute NO question asked No PrescriptionNeeded (No doctor approval needed!) 100% deliver your order to your house We have been in business since 1998 This is a rare bargain online to obtain these UNIQUE products. No prior order needed. Limited supply of these hard to get pills, so hurry! '
From = "spambot9k@spam-the-planet.com"
To = "spambots@spam-the-planet.com"
Bcc = "jean.louw@domain.com"
SmtpServer = "1.1.1.1"
}
Send-MailMessage @messageParameters -BodyAsHtml


Confirmed! My spam message was being trapped by the Junk Mail rule with SCL 9 and moved to the Junk Mail folder.

OK, next we needed to create the Transport Rule. Now, if you are new to Powershell / Exchange I would suggest creating the rule in the GUI, as the interface / wizard used in that process is similar to the Outlook rules wizard.

Once you have the rule created it is very easy to add additional addresses using Powershell. More about that later. For the purposes of this post, I will however create the rule using the shell.

$condition = Get-TransportRulePredicate SentTo 
$condition.Addresses = @((Get-Mailbox "*jean.louw*")) 
$action = Get-TransportRuleAction SetSCL
$action.SclValue = "-1"
$warning = "WARNING: Adding mailboxes to this rule will prevent the Junk Mail rule from detecting possible spam." 

New-TransportRule -name "Set SCL level to -1" -Conditions @($condition) -Action @($action) -Comments $warning

This script will create a rule which sets the SCL to -1 for all messages sent to the matching addresses. You can replace "(Get-Mailbox "*jean.louw*")" with any expression or command that returns the mailboxes you need to add to the rule.

Now that we have the rule in place, we need to confirm that it is working. Yet again, I sent a control “spam” message ala spambot9000.

This time the message SCL was -1, as we predicted, and the message was not moved to Junk Mail as before.

In future, should you need to add additional email addresses to your rule, you can use the following:
$condition = Get-TransportRulePredicate SentTo 
$condition.Addresses = @((Get-Mailbox "*system*")) 
$condition.Addresses += @((Get-Mailbox "*louw, jean*")) 
Set-TransportRule "Set SCL level to -1" -Conditions @($condition)

Remember that you have to add all of your address searches each time, as the conditions are overwritten by set-transportrule. This is a really easy way to get around the problem of false positives in mailboxes that are not managed by humans, where nobody would notice that valid emails are being sent to Junk Mail.

Bulk export calendars from Exchange mailboxes

I have never really had the need for a script like this, so when our catering manager at the office logged a support call, requesting an export of all calendars for all of our meeting rooms, I had to investigate the possibilities. He basically needed this information in order to determine how busy the individual meeting rooms were during the last year.

Following a quick, unsuccessful probe of the Internet for tools or scripts that could do this, my initial feeling was to say “No sorry, can’t be done, or if we do it, it is going to be a manual task.”

A manual task which involves granting access to the room mailbox, logging onto the mailbox using Outlook, and exporting the calendar data to Excel. Sounds easy, but doing that a hundred times is very unproductive and torturous to say the least.

I decided to attempt to script it, and the result is something I am both proud of and ashamed of at the same time, as I am convinced there must be a better way.

It’s a very rough method, which involves the following process: 

  • Get a list of rooms from a text file (as it was emailed to me). You could use get-mailbox instead.
  • Add-mailbox permission to the current user  
  • Create an Outlook profile 
  • Logon to the profile 
  • Export the Calendar to CSV 
  • Remove-MailboxPermission

I could automate most of the above, but creating new profiles on demand is something I’ve never had to do, and frankly, I had no idea how to get around this problem. After speaking to some of the developers at work, who promised me some dotnet code which could do it (which I am still waiting for might I add :)), I decided to use PRF files.

I have used PRF files very successfully in the past, on Terminal server deployments to automatically setup Outlook profiles. I downloaded the ORK and created a PRF which I used as a template for the script. The blank PRF is attached to this post to save you the time and effort of using ORK.

The script finds and replaces the UserName and HomeServer in the PRF, although any Exchange server should resolve you to your mailbox server. It then creates a PRF and starts Outlook with the /importPRF switch. Some extra information for anyone wanting to actually deploy or use the PRF file: the %HomeServer% variable in the PRF does not work the same way %UserName% does, so if you want to use the PRF, you need to specify one of your mailbox servers instead.

While Outlook is open on that profile, the script attaches to Outlook using a COM object and downloads the calendar for the specified date.

The calendar fields can be customised to suit your needs. In my case we simply needed the Start and End date, the duration, and the Organizer.
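For reference, a minimal sketch of the COM export step might look something like the following, assuming Outlook is already open on the temporary profile; the date range, field list and output path are example values rather than the exact code from the script:

$start = (Get-Date).AddYears(-1)
$end   = Get-Date

$outlook  = New-Object -ComObject Outlook.Application
$mapi     = $outlook.GetNamespace("MAPI")
$calendar = $mapi.GetDefaultFolder(9)     # 9 = olFolderCalendar

# Include recurring appointments and sort so Restrict behaves predictably
$items = $calendar.Items
$items.IncludeRecurrences = $true
$items.Sort("[Start]")

# Restrict the collection to the date range of interest
$filter = "[Start] >= '" + $start.ToString("g") + "' AND [End] <= '" + $end.ToString("g") + "'"

$items.Restrict($filter) |
    Select-Object Start, End, Duration, Organizer |
    Export-Csv ".\calendar-export.csv" -NoTypeInformation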

The export data is saved and the PRF is removed. Sadly, the swarm of profiles will remain, and you have to remove them manually. You could remove them from HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Windows Messaging Subsystem\Profiles but I have not added that to the script.

I hope this can help you, if you ever get a freaky request like this.

The script and the PRF template can be downloaded from here:





Measure the SMTP roundtrip time to an external email address

In an attempt to be more proactive about Internet email delays, whether caused by our systems, or those of our ISP, I have written a script which tests the roundtrip time on SMTP mail.

The basic idea behind the script is to send a message with a GUID, and wait for the return of that specific message. When that message returns, it measures the roundtrip time, and logs the result to disk. If the message is not returned within 30 mins, it will send you a warning message informing you of the problem.

Finally, the script creates a nice JPG with the results up to the last run.

Setting up and using this script is a little more complex than usual as it combines different technologies and resources to achieve its goal, which is to measure the roundtrip time on an actual SMTP message.

To start off, the script sends a message using a standard .NET relay. On Powershell V2 you could use send-mailmessage instead. At this point, the message is time stamped in the subject, with the current date and time. The message is also marked with a distinguishable word “SMTPPing”  for the reply rule, and a random GUID, which aids in recognising the message when it returns.
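To illustrate the idea, a minimal sketch of the outbound leg could look like this, using the PowerShell V2 send-mailmessage mentioned above; the addresses and relay host are placeholders rather than the script's actual values:

$guid      = [Guid]::NewGuid().ToString()
$timestamp = Get-Date
$subject   = "SMTPPing $guid " + $timestamp.ToString("yyyy-MM-dd HH:mm:ss")

# Send the stamped message out through the normal Internet mail path
Send-MailMessage -From "smtpping@yourdomain.com" `
                 -To "autoreply.robot@gmail.com" `
                 -Subject $subject `
                 -Body "SMTP roundtrip test message." `
                 -SmtpServer "relay.yourdomain.com"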

I use Gmail, as my “auto-reply” robot, as I am fairly certain that their infrastructure is robust and pretty stable. If you choose to use Gmail, you will need to setup a filter, which automatically forwards all mail with the word “SMTPPing” back to your email address, and then deletes it from Gmail.

Once you are sure that the auto reply is working, you can configure the script with your SMTP email addresses and relay host.

The return messages are collected from an Outlook mailbox using MAPI. You need to customise the script for the Outlook profile it needs to logon to. More details regarding this can be found as comments inside the script.
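As a rough sketch of the return leg, assuming Outlook is running with the profile that receives the auto-replies, the script essentially does something along these lines (the log file name is an example):

$outlook = New-Object -ComObject Outlook.Application
$inbox   = $outlook.GetNamespace("MAPI").GetDefaultFolder(6)   # 6 = olFolderInbox

# Look for the returned message by its GUID and measure the roundtrip
$reply = $inbox.Items | Where-Object { $_.Subject -like "*$guid*" } | Select-Object -First 1
if ($reply) {
    $roundtrip = (Get-Date) - $timestamp
    "{0},{1:N0}" -f (Get-Date), $roundtrip.TotalSeconds | Out-File .\smtpping.log -Append
    $reply.Delete()
}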

Outlook does not like strangers poking around in your stuff, so it will constantly warn you about this. To get around this problem, and also be a little selective about what you allow, you can download an awesome free tool from MapiLab called Advanced Outlook Security.

Lastly, the script needs Excel installed in order to create the chart and export it to JPG.


I am not sure why, but I currently have problems closing Excel. Although I issue the command to close the application, it sometimes remains running, so look out for excel.exe in the process monitor.

As usual, your comments and suggestions are always welcome.


The script can be downloaded from here:





Updated basic Exchange queue monitor

As an update to the queue monitor script, I have added a little tweak.

When the script has completed its run, it will now import the log file into Excel and create a chart displaying the message flow for the current log.

The Excel chart export seems to have a problem with the current folder. I tried using “.\” or even get-location and set location in variables, but it only works if I hard code the path.

This image can be used on a web page to display the queue information in a more friendly and accessible format. Excel is set to overwrite the existing image and spreadsheet every time the script runs.
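As a rough sketch of the Excel automation involved, the chart export boils down to something like the following, with hard-coded paths as mentioned above; the file names here are examples, not the ones from the actual script:

$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$excel.DisplayAlerts = $false

# Import the log file and build a line chart from its data
$workbook  = $excel.Workbooks.Open("C:\Scripts\queuelog.csv")
$worksheet = $workbook.Worksheets.Item(1)
$chart     = $workbook.Charts.Add()
$chart.ChartType = 4                          # 4 = xlLine
$chart.SetSourceData($worksheet.UsedRange)

# Export the chart as a JPG for the web page, then overwrite the spreadsheet
$chart.Export("C:\Scripts\queuelog.jpg", "JPG")
$workbook.SaveAs("C:\Scripts\queuelog.xls", 56)   # 56 = xlExcel8
$workbook.Close()
$excel.Quit()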

An awesome source of Excel related information can be located at the Excel Cookbook. This information saved me a lot of time and effort while working with Excel.

Information regarding Excel chart types and styles can be found here.

The complete script with the Excel chart export section can be downloaded from here:


Updated: Exchange Mailbox Cleaner


 I am ready to call this the final version of my Exchange Mailbox Cleaner script.

I have successfully used it in production and it saved us the effort of having to find and remove these mailboxes manually.

The GUI also makes it easy to hand this function over to the administrators for future cleanup tasks.

I have added one more search query button, the “Last Logon” button. This button will look for users on the selected server where the LastLogon is equal to $null, which finds accounts that have essentially never logged on. There is a small bug though: if the user name is not unique, it seems that the last logon is unreadable, and the account will also show up in the list.
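As a rough indication of the kind of query behind the button (this is a guess at the approach, not the exact code from the utility), it amounts to something like:

# Mailboxes on the selected server that have never been logged on to
Get-Mailbox -Server $server -ResultSize Unlimited |
    Where-Object { (Get-MailboxStatistics $_).LastLogonTime -eq $null } |
    Select-Object Name, Alias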

The non-unique case will, however, report an ERROR to the shell screen, while mailboxes which have never been logged on to will report the following warning:

WARNING: There is no data to return for the specified mailbox 'Bunny, Bugs', because it has not been logged on to.

For now, this is a manual method of verifying that the correct mailboxes will be removed. I am however looking for a way to avoid this and will post an update as soon as I have time to find the solution.

I have also permanently removed the Add-ADPermission from the Export-Mailbox section, as full mailbox access permissions are enough to export the mailbox.

I may build in a check later to see if the permissions are required before adding them.
#Add-ADPermission -Identity $actionItem -User $currentUser -Extendedrights "Send As" -whatif 
#Add-ADPermission -Identity $actionItem -User $currentUser -Extendedrights "Receive As" -whatif 
As always, any comments / suggestions with regards to the script are always welcome.

A little disclaimer / warning: This is a dangerous utility, and can wreck your Exchange system if you are not careful. Please test this in your test environment first, and adhere to your change control procedures before using this utility in the live environment. I take absolutely no responsibility for any damage caused by using this tool.

The utility requires the Exchange Management Shell, and if launched from Vista / Windows 7 it needs to be “Run as Administrator”.

This script was tested under Windows 7 with PowerShell v2. The script can be downloaded from here:


Legacy mailboxes on Exchange 2007

One of our users had a problem logging onto OWA today, and I noticed that the icon for his mailbox in the GUI displayed as a legacy mailbox, although he was located on an Exchange 2007 mailbox server.

After countless searches, I came across this article: http://support.microsoft.com/kb/941146. It explains that the msExchVersion property on the AD object is not set correctly, and that using set-mailbox -ApplyMandatoryProperties will resolve the problem. Looking at the help information on set-mailbox, this can also be caused by users being created on an Exchange 2007 server using the Exchange 2003 tools, although these particular users were migrated from Exchange 2003.

So, how to correct this? First get a list of all mailboxes on the Exchange 2007 server with the incorrect version. Using get-mailbox, the incorrect Exchange version displays as 0.0 (6.5.6500.0). The following command returns a list of these mailboxes by server (where SERVER1 is your Exchange 2007 server):

get-mailbox -server SERVER1 -resultsize unlimited | where {$_.ExchangeVersion -like "*0.0*"} | select Name, ExchangeVersion

Once you verify the list, pipe them to set-mailbox.

get-mailbox -server SERVER1 -resultsize unlimited | where {$_.ExchangeVersion -like "*0.0*"} | Set-Mailbox -ApplyMandatoryProperties

This resolved the problem for me, easily, on multiple Exchange mailboxes. Running the get-mailbox command again, returned no results after applying set-mailbox to the problematic mailboxes.

Cleanup unused Exchange 2007 mailboxes

I often use my orphaned home directory cleanup script at work, to recover unused space from our file and print clusters. So my manager recently suggested that I do something similar for Exchange. Knowing that the orphan folder cleanup utility is still my responsibility as the administrators are not too comfortable with running scripts, I decided to give this utility a nice GUI.

To generate the code for the forms, I used SAPIEN PrimalForms. What a beautiful tool. Very short learning curve, and very, very powerful. When the form loads, it will get a list of all the Exchange mailbox servers using get-mailboxserver.

This excludes Exchange 2003 servers as get-mailboxstatistics does not work with legacy mailboxes. I may develop a solution for that later. The three query buttons (Disabled, Hidden, Stale) will perform the following actions respectively:

  • Disabled – Find mailboxes linked to disabled AD accounts
  • Hidden – Find mailboxes hidden from the address book
  • Stale – Find mailboxes linked to accounts which have not logged on in the last 3 months
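As a rough guess at the kind of queries behind these buttons (the utility builds its own versions into the GUI, so treat this as an illustration rather than the actual code):

# Disabled: mailboxes whose AD account is disabled
Get-Mailbox -Server $server -ResultSize Unlimited |
    Where-Object { (Get-User $_.Identity).UserAccountControl -match "AccountDisabled" }

# Hidden: mailboxes hidden from the address book
Get-Mailbox -Server $server -ResultSize Unlimited |
    Where-Object { $_.HiddenFromAddressListsEnabled }

# Stale: mailboxes with no logon in the last 3 months
Get-Mailbox -Server $server -ResultSize Unlimited |
    Where-Object { (Get-MailboxStatistics $_).LastLogonTime -lt (Get-Date).AddMonths(-3) }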

This search may take a little time to complete, and this button is not supported against Exchange 2003 servers. These queries will populate the listbox with the names of the mailboxes. Besides the “Export List” button, the Action buttons at the bottom will act only on selected items.

You can select items using SHIFT or CTRL.

  • Export List will create a text file containing your search results.
  • Export PST will grant the current user Full Mailbox with Send As and Receive As permission, and then export the mailbox to the path specified.
  • Rename will change the display name based on the query performed. For mailboxes found with the “Disabled” button the display name will be prefixed with “DISABLED-MBXCleaner-“, for “Hidden” with “HIDDEN-MBXCleaner-“ and so forth. Users previously renamed will be excluded from subsequent searches.
  • The “Disable” action will remove Exchange attributes without deleting the AD account. The mailbox will be removed when the retention time expires.
  • Delete will remove the mailbox and AD account completely.

I have not had a chance to test the Delete button as I would need to submit a change control request before using the utility in our live environment. All of the Action buttons are set to -whatif mode by default. The “Go Hot” checkbox will activate the heavy hitters (Export PST, Rename, Disable and Delete), so you can safely test each button first before taking any action. The “Reserved” button currently does nothing. I plan to allow this button to read or build a custom search for users, either by Name or other criteria.
 
WARNING: This is a dangerous utility, and can wreck your Exchange system if you are not careful. Please test this in your test environment first, and adhere to your change control procedures before using this utility in the live environment. I take absolutely no responsibility for any damage caused by using this tool. The utility requires the Exchange Management Shell, and if launched from Vista / Windows 7 it needs to be “Run as Administrator”. The script can be downloaded from here:


Update: Exchange 2007 audit script

In an attempt to resolve some issues with regards to the event logs, I have made a few updates to the Exchange 2007 audit script:
* I now use [System.Diagnostics.EventLog]::GetEventLogs() to collect the remote event logs and entries instead of WMI (see the sketch below the list).
* The output to the host displays exactly which event log it is busy reading.
* The date range seems more accurate now when the event log contains a large amount of data.
* The physical memory on the basic server information is now displayed as GB and is neatly rounded.
* The Mailbox stores are sorted in alphabetical order by Store Name.
* Added more verbose output to the console while the script runs, to give a better indication of what the script is busy with.
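For reference, the GetEventLogs approach boils down to something like this, assuming the machine-name overload; $server and the date range are examples rather than the script's actual values:

$since = (Get-Date).AddDays(-1)
foreach ($log in [System.Diagnostics.EventLog]::GetEventLogs($server)) {
    Write-Host "Reading the $($log.Log) log on $server..."
    $log.Entries | Where-Object {
        $_.TimeGenerated -gt $since -and $_.EntryType -match "Error|Warning"
    } | Select-Object TimeGenerated, EntryType, Source, InstanceId, Message
}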
I hope this resolves most of the problems for now; comments / suggestions are always welcome. The script can be downloaded from here:

      This script has been replaced by a later version, please check the following link, or download the updated version below:




      Exchange 2007 Audit Report

      I had some extra time this week to complete the Exchange 2007 version of the Audit script, as I am going on leave for a week, and needed to have the process automated while I am gone.

      This version of the script still uses WMI for some of the items on the report, but uses the Exchange 2007 commandlets for most of the Exchange related information.

      The one tricky bit of information to retrieve was the installed Exchange rollups. These are not available via WMI or any other method I could find. I did find a very effective solution on flaphead.com. This little piece of magic, locates the installed patches in the remote registry, and loops through the keys to find and list the installed rollups.
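The remote registry lookup boils down to something like the following sketch; the exact key path and display-name filter here are my assumptions for illustration, so check the original post on flaphead.com for the real code:

$base     = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $server)
$products = $base.OpenSubKey("SOFTWARE\Microsoft\Windows\CurrentVersion\Installer\UserData\S-1-5-18\Products")

foreach ($product in $products.GetSubKeyNames()) {
    $patches = $products.OpenSubKey("$product\Patches")
    if ($patches) {
        foreach ($patch in $patches.GetSubKeyNames()) {
            # Each installed patch carries a DisplayName such as "Update Rollup x for Exchange Server 2007..."
            $display = $patches.OpenSubKey($patch).GetValue("DisplayName")
            if ($display -like "*Rollup*") { Write-Host $display }
        }
    }
}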


Unlike Exchange 2003, Exchange 2007 servers are installed with specific roles. This plays a part when checking things like queues and mailbox stores. For instance, there is no point in checking a pure Hub Transport server for mailbox stores. I initially built in a check which compared the ServerRole property of the server against a specific role, forgetting that one server could have multiple roles. I now match the role anywhere in the property string with this if statement: if ($exServer.ServerRole -notlike "*Mailbox*"). This will skip the mailbox related checks if the word “Mailbox” cannot be located anywhere in the string.

      To automate the running of the checks on a daily basis I setup a scheduled task on one of my Exchange 2007 servers as the script requires the commandlets.

      I really had no idea how to get the scheduled task to run in the Exchange management shell so, as a test I basically used the following command: C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe -PSConsoleFile "D:\Program Files\Microsoft\Exchange Server\bin\exshell.psc1" c:\scripts\ExchangeAudit2k7.ps1 .\servers.txt

      This did the trick and the entire check process now runs and completes before I even get to work. My version of the script, also creates an HTML menu and moves the reports to our departmental web server for my managers’ viewing pleasure. The mailbox stores now also indicate the last backup time, as we have had issues before where the backups aren't completed, and we don’t find out until it’s too late.


      I am busy working on a little piece of code, which will connect to the OWA site and simply test if the site is available, but that will have to wait until I am back from leave.

      This script has been replaced by a later version, please check the following link, or download the updated version below:



      http://powershellneedfulthings.blogspot.com/2009/11/exchange-2007-audit-script-version-3.html



      Update: Powershell Remote WMI Defrag


      As with most things in life, people are only happy with limited features for a little while, and then the enhancement requests pour in.
      The administration guys at the office have been using the remote defrag script for a couple of weeks, and soon realised that there was no way for them to show off the results of their labour to management. So inevitably, they requested that I add some sort of reporting to the script which they can send to management.

      Initially, I had all the results write out to a text file, for each volume, but this became a mess to manage after you defrag hundreds of servers with multiple volumes. Having recently completed the Exchange 2003 audit script with the use of Virtu-Al’s HTML template, I imagined it would be possible to report the defrag results using a similar format.

The script runs through a list of servers, contained in servers.txt, and starts a remote defrag using WMI. It waits for the process to complete and then moves on to the next volume. The script will also check whether dfrgntfs.exe is already running on the remote host, and if so, skip that server.
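The skip check is essentially a WMI process query along these lines (a sketch, assuming it runs inside the server loop):

# Skip the server if the defrag engine is already busy on it
$defragRunning = Get-WmiObject -Class Win32_Process -ComputerName $server -Filter "Name = 'dfrgntfs.exe'"
if ($defragRunning) {
    Write-Host "Defrag already running on $server, skipping..."
    continue
}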


      The script changes the colour of the drive on the report, based on whether a defrag was actually run or not. Green means it was skipped, orange that defrag was already running and red that it was defragged.


Finally, at the bottom of each drive's report, the script will give you a quick before and after result.

      The script can be downloaded from my Skydrive:


      Update Network Interface Card parameters using WMI

The following little function can be used if you need to manually override DNS and WINS addresses on a list of remote computers, where they may have already obtained addresses from a DHCP server. The code gets a list of IP enabled NICs from a remote computer using WMI; you can list the servers in a servers.txt file in the same folder.

The script updates the DNS server search list to add two manual entries and also adds two manual WINS servers. I had some fun with the SetWINSServer method, as it only accepts the variable as an array. Finally, the script modifies the registry to create a DNS suffix search list. Although this script only modifies a limited set of parameters, it can easily be adapted to update any of the others.
function updateNIC {
# Get all IP-enabled NICs on the remote computer
$NICs = gwmi -computer $server Win32_NetworkAdapterConfiguration | where {$_.IPEnabled -eq "TRUE"}

foreach ($NIC in $NICs) {

$DNS=("1.1.1.1","2.2.2.2")
$WINS=@("1.1.1.1","2.2.2.2")
$DOMAIN="acme.com"

# Override the DHCP-assigned DNS and WINS settings
$NIC.SetDNSServerSearchOrder($DNS)
$NIC.SetDynamicDNSRegistration("TRUE")
$NIC.SetWINSServer($WINS[0],$WINS[1])
$NIC.SetDNSDomain($DOMAIN)

# Create the DNS suffix search list directly in the remote registry
$baseKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $server)
$subKey=$baseKey.OpenSubKey("SYSTEM\\CurrentControlSet\\Services\\Tcpip\\Parameters",$true)
$subkey.SetValue('SearchList','acme.local,acme.com')

}
}

foreach($server in (gc .\servers.txt)){
updateNIC
}
      
      Here are some images of the results of the Advanced TCP/IP Settings page after running the script. Here is the WINS tab.

      Audit the local Administrators group on a list of remote computers

This is a very basic script which collects a list of server names from a local text file called servers.txt. The script reports the list of users, sorted by server name, to a local text file in the root of drive C.

I am working on cleaning up the results, as currently the "Adspath" is written to the text file in the following format: WinNT://DOMAIN/COMPUTER/Administrator. This was the only true distinction between local and domain users, as "Name" reports only the name of the user or group, so you are never really sure whether an entry is a domain or local one. Finally, I need to enable the script to report its results to Excel or HTML.

      $Result = @()
      
      foreach($server in (gc .\servers.txt)){
      
$computer = [ADSI]("WinNT://" + $server + ",computer")
$Group = $computer.psbase.children.find("Administrators")

function getAdmins
{$members = $Group.psbase.invoke("Members") | %{$_.GetType().InvokeMember("Adspath", 'GetProperty', $null, $_, $null)}
      $members}
      
      $Result += $server
      $Result += ( getAdmins )
      $Result += " "
      }
      
      $Result > c:\results.txt
      Invoke-Item c:\results.txt
      

I had a little extra time today, and managed to clean up the members using the -replace operator; replace "DOMAIN" with your domain name. The updated code looks something like this:
      $Result = @()
      
      foreach($server in (gc .\servers.txt)){
      
$computer = [ADSI]("WinNT://" + $server + ",computer")
$Group = $computer.psbase.children.find("Administrators")

function getAdmins
{$members = ($Group.psbase.invoke("Members") | %{$_.GetType().InvokeMember("Adspath", 'GetProperty', $null, $_, $null)}) -replace ('WinNT://DOMAIN/' + $server + '/'), '' -replace ('WinNT://DOMAIN/', 'DOMAIN\') -replace ('WinNT://', '')
      $members}
      
      $Result += Write-Output "SERVER: $server"
      $Result += Write-Output ' '
      $Result += ( getAdmins )
      $Result += Write-Output '____________________________'
      $Result += Write-Output ' '
      }
      
      
      
      $Result > c:\results.txt
      
      Invoke-Item c:\results.txt
      
You can simply add another -replace ('WinNT://DOMAIN/', 'DOMAIN\') for each domain in the system. I know it's a little hack 'n slash, but it will do for now.

      Automatically clean up orphaned user directories

      We've had a huge problem where users were removed from Active Directory, but somehow the administrators neglected to remove the home folder for the user from the file servers. This left someone with the nasty task of cleaning up the mess.

      This script will work through a directory of home folders and lookup the user in AD. This is assuming that the home folder and the user id are the same. If the user is not found, or the account is disabled, the folder will be renamed with a leading "orphan-" followed by the original name. The script requires a parameter, which is the path where the folders are located. e.g. "findorphans.ps1 c:\users" The script requires that the Quest Powershell Commandlets are installed, and they can be downloaded free, here.


      param($target)
      $folders=Get-ChildItem -Path $target | Where-Object {$_.Name -notlike "orphan*" -and ($_.PSISContainer)} | Select-Object name
      foreach ($folder in $folders){
      Write-Host ""
      $userid=""
      "PROCESSING FOLDER: {0} "   -f $folder.name
      write-host "Searching for a possible owner..."
      $user=Get-QADUser $folder.name
      $useracc=$user.AccountIsDisabled
      $userid=$user.samaccountname
      $newid="orphan-" + $folder.name
      $fullpath=$target + "\" + $folder.name
      $fullpath
      "Account Disabled: {0} "   -f $user.AccountIsDisabled
      
if ($userid.length -lt 1 -or $user.AccountIsDisabled -eq "True") {
      Write-Host "No owner found or account disabled, orphan folder renamed to" $newid -ForegroundColor Red
      rename-Item -Path $fullpath -NewName $newid
      }
      else {
      Write-Host "Owner found" $user -ForegroundColor Green
      }
      }
      

      Remote Defrag using WMI

This is a script I created to analyze and defrag Windows 2003 server volumes using the WMI win32_volume defrag method. The script will collect all volumes on a list of remote servers using WMI. Each volume is then analyzed for fragmentation using the FilePercentFragmentation property. If the fragmentation property is higher than 10, the script will initiate a remote defrag on the volume. You should see a process on the remote server called “dfrgntfs.exe” running while the defrag is in progress. Sadly I have not found a method to track the progress of the defrag process.

You can adjust the fragmentation percentage threshold at which a defrag is initiated by editing line 12. Replace "SERVER1" "SERVER2" with your server names. Comments or suggestions are always welcome.


      $servers="SERVER1", "SERVER2"
      foreach( $server in $servers){
      Write-Host ""
      $v=(gwmi win32_volume -computer $server)
      "CURRENT SERVER: {0} " -f $server
      "NUMBER VOLUMES: {0} " -f $v.length
      
      foreach( $volume in $v){
      Write-Host ""
      write-host "Analyzing fragmentation on" ($volume.DriveLetter) "..."
      $frag=($volume.defraganalysis().defraganalysis).FilePercentFragmentation
      if ($frag -gt "10") {
      write-host "Drive" ($volume.DriveLetter) "is currently" $frag "% fragmented." -foreground RED
      write-host "Starting remote defrag..."
      $volume.defrag($true)
      }
      else {
      write-host "Drive" ($volume.DriveLetter) "is not fragmented" -foreground GREEN
      Write-Host ""
      }
      }
      }
      