
Find specific group membership

One of our administrators needed to present a report showing a list of users and whether or not they belonged to the specific group or groups which control their browsing privileges.

Doing this manually would have been an immense task, but with PowerShell it becomes a job you can do while reading your newspaper.

The script example is a very basic solution, but it gets the job done. It uses the Quest ActiveRoles cmdlets inside a calculated expression, which once again shows how much time PowerShell can save an administrator when working with thousands of items this way.

Although this is not groundbreaking stuff, I share this in the hope that it could save someone else some time.

# Calculated property: list the Websense-related groups each user belongs to
$websense = @{Name="WebSense Group";expression={Get-QADMemberOf -SizeLimit 0 -Identity $_ | where {($_.Name -like "*Websense*") -or ($_.Name -match "Global Browsing")}}}
# Read the user list, resolve each account and export the result to CSV
(gc .\users.txt) | foreach {Get-QADUser -SizeLimit 0 -Identity $_} | select Name, $websense | Export-Csv .\websense.csv

Exchange 2007 Audit Script - Version 3

I have updated the Exchange 2007 audit script yet again!

Included in this update are two major changes. Firstly, the script now publishes its information using the new HTML format created by Virtu-Al.

This script, and the functions which create its HTML output, are far more efficient and cut the number of lines in the script down by half. Not only is the code leaner, it is also much more legible, and adding new tests to the current script is a breeze. This version of the HTML output is also compatible with multiple browsers, including Mozilla and Chrome.

Secondly, the script now detects pipeline input. You can still pass a server list as a parameter to the script, but you can also pipe content to it. This content can be your server list, or output from Exchange commands such as get-transportserver or get-mailboxserver. Be careful though, because commands like get-exchangeserver could include Exchange 2003 servers.
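If you want to pipe servers in, a hedged example of what that usage could look like (the script file name is the one from my earlier post and may differ for this version):

Get-MailboxServer | .\ExchangeAudit2k7.ps1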

If no server list is specified or piped, the script will still get all Exchange 2007 servers.

As another minor addition, I have added an additional test (Test-OutlookWebServices) to the CAS servers.

As always, your comments and feedback are welcome.

The script can be downloaded from here:



Updated: Exchange 2007 audit script (Version 2)

I have finally been able to complete the updates to my Exchange 2007 audit script. The script has some enhancements which include suggestions and comments from readers.

The new script includes a few checks against CAS servers, which I feel have been neglected in the past. These checks include test-owaconnectivity and test-activesyncconnectivity. These two commands need some additional work before they will run. To test whether they will work, you can run both test-owaconnectivity and test-activesyncconnectivity with the –ClientAccessServer switch. Additional information will be shown in the console if the commands are unable to run.
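A quick way to check the prerequisites on a given CAS server (CAS01 is a placeholder for one of your Client Access servers):

Test-OwaConnectivity -ClientAccessServer CAS01
Test-ActiveSyncConnectivity -ClientAccessServer CAS01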

I have been meaning to update the HTML format, as designed and used by Virtu-Al, but I have been unable to find the time. This is definitely high on the priority list, as the new format is supported by multiple browsers, and cuts down the number of lines of code significantly. I really wanted to include the new HTML in this release, but it would have delayed the release by weeks.

Here is a complete list of changes:

 - You can specify a list of servers to audit; if you don't, the script will use get-exchangeserver to find servers to audit.
 - Changed disk space values to gigabytes.
 - Added white space to the mailbox store report. (This is done with dotnet, and has been optimised to be really quick.)
 - Added MAPI connectivity test to mailbox server report.
 - Added OWA connectivity report for CAS servers.
 - Added ActiveSync connectivity report for CAS servers.
 - Cleaned up some variable names.

I will release a newer version soon, which will include a few additional checks, and will also use the latest HTML code.

Your comments and suggestions are always welcome.

The script can be downloaded from here:

This script has been replaced by a later version; please check the following link, or download the updated version below:

http://powershellneedfulthings.blogspot.com/2009/11/exchange-2007-audit-script-version-3.html




Exchange summary reports

Taking a cue from a post on the Windows Powershell Blog, by James Brundage, I decided to create a few notifications for my Exchange environment.

These little “scriptlets” will pop off a notification message in HTML format with a summary of information gathered by each.

The information is not server specific, as I tried to limit the number of instances required. And the content is very basic, but it gets the job done.

You can refer to James’ post above for more information on how to automatically schedule these to run.


Mailbox database summary:
#//Mailbox Database Reports
$messageParameters = @{
    Subject = "Exchange 2007 Database Report - $((Get-Date).ToShortDateString())"
    Body = Get-MailboxDatabase -Status |
        Select-Object Server, Name, Mounted, LastFullBackup |
        Sort-Object Server, Name |
        ConvertTo-Html |
        Out-String
    From = "exrept@domain.com"
    To = "you@domain.com"
    SmtpServer = "10.10.10.10"
}
Send-MailMessage @messageParameters -BodyAsHtml
Exchange 2007 queue summary:
#//Exchange 2007 Queue Report
$messageParameters = @{
    Subject = "Exchange 2007 Queue Report - $((Get-Date).ToShortDateString())"
    Body = Get-TransportServer |
        ForEach-Object { Get-Queue -Server $_ |
            Select-Object NextHopDomain, MessageCount, Status } |
        Sort-Object NextHopDomain |
        ConvertTo-Html |
        Out-String
    From = "exrept@domain.com"
    To = "you@domain.com"
    SmtpServer = "10.10.10.10"
}
Send-MailMessage @messageParameters -BodyAsHtml
Exchange 2003 queue summary:
#//Exchange 2003 Queue Report
$messageParameters = @{
    Subject = "Exchange 2003 Queue Report - $((Get-Date).ToShortDateString())"
    Body = Get-ExchangeServer |
        Where-Object {$_.IsExchange2007OrLater -eq $False} |
        ForEach-Object {
            Get-WmiObject -Class exchange_smtpqueue -Namespace ROOT\MicrosoftExchangev2 -ComputerName $_ |
            Where-Object -FilterScript {$_.MessageCount -gt 0} |
            Select-Object VirtualMachine, QueueName, MessageCount, Size } |
        Sort-Object VirtualMachine |
        ConvertTo-Html |
        Out-String
    From = "exrept@domain.com"
    To = "you@domain.com"
    SmtpServer = "10.10.10.10"
}
Send-MailMessage @messageParameters -BodyAsHtml
Exchange 2007 MAPI connectivity summary:
#//MAPI Connectivity Report
$messageParameters = @{
    Subject = "MAPI Connectivity Report - $((Get-Date).ToShortDateString())"
    Body = Get-MailboxServer |
        Where-Object {(Get-MailboxDatabase -Server $_).count -gt '0'} |
        ForEach-Object { Test-MAPIConnectivity -Server $_ |
            Select-Object Server, Database, Result, @{Name="Latency(MS)";expression={(([TimeSpan] $_.Latency).TotalMilliSeconds)}}, Error } |
        Sort-Object Server, Database |
        ConvertTo-Html |
        Out-String
    From = "exrept@domain.com"
    To = "you@domain.com"
    SmtpServer = "10.10.10.10"
}
Send-MailMessage @messageParameters -BodyAsHtml
Exchange server disk summary:
#//Disk Space Reports
$messageParameters = @{
    Subject = "Exchange Disk Space Report - $((Get-Date).ToShortDateString())"
    Body = Get-ExchangeServer |
        ForEach-Object { Get-WmiObject -Computer $_ Win32_LogicalDisk } |
        Where-Object {$_.DriveType -eq 3} |
        Select-Object SystemName, DeviceID, VolumeName, @{Name="Size(GB)";expression={[math]::round(($_.Size / 1073741824))}}, @{Name="Free(GB)";expression={[math]::round(($_.FreeSpace / 1073741824))}}, @{Name="Free(%)";expression={[math]::round(((($_.FreeSpace / 1073741824)/($_.Size / 1073741824)) * 100),0)}} |
        Sort-Object SystemName, DeviceID |
        ConvertTo-Html |
        Out-String
    From = "exrept@domain.com"
    To = "you@domain.com"
    SmtpServer = "10.10.10.10"
}
Send-MailMessage @messageParameters -BodyAsHtml
Exchange services summary:
#//Exchange Services Report
$messageParameters = @{
    Subject = "Exchange Services Report - $((Get-Date).ToShortDateString())"
    Body = Get-ExchangeServer |
        ForEach-Object {
            Get-WmiObject -ComputerName $_ -Query "select * from win32_service where Name like 'MSExchange%' or Name like 'IIS%' or Name like 'SMTP%' or Name like 'POP%' or Name like 'W3SVC%'" |
            Select-Object SystemName, DisplayName, StartMode, State } |
        Sort-Object SystemName, DisplayName |
        ConvertTo-Html |
        Out-String
    From = "exrept@domain.com"
    To = "you@domain.com"
    SmtpServer = "10.10.10.10"
}
Send-MailMessage @messageParameters -BodyAsHtml

New Look

If you are a regular visitor, you may have noticed that the site looks a little different.

I looked at the site last week and decided that I'd had enough of the standard old Blogger template. It was really boring, and half the Blogger sites out there use the same one.

I spent most of Friday getting the new look ready, and I am pretty sure I have it all working now. I have added some RSS feeds and a Feedburner email subscription page.

I think the new look is cleaner and a little more unique, and I really hope you like it too. 

Your comments and suggestions are always welcome.


View Performance data in a web browser

This is a little trick which I have been meaning to share for a while. It is a very simple way to view performance data for your server using a web browser. I am sure it is old news to some, but for those of you who are seeing this for the first time, I am sure you'll be able to use it in your environment.

Because I work primarily with Exchange server, I will be using some Exchange performance data for this post. However, you can use any performance counters you require; as far as I know they all work the same way.

To set up a basic HTML page with some performance data, open Performance Monitor and add a counter from a remote server. In my case I have selected % Processor Time. Once the graph starts populating with data, right-click anywhere on the graph and select "Save As". Save the HTML file either to your web server or anywhere on your disk.

If you open the file from disk, you have to manually start the logging again; I have noticed that this is not required when the page is loaded from a web server.

Microsoft included a very nice performance template in the Exchange 2007 Toolbox. I think we’ll start there, and open a performance counter with some pre-loaded information. You can access the performance data from the Exchange Management Console. Click the Toolbox and select Performance Monitor.

You can now save this data to an HTML document as before. There is one catch though: the performance data saved from this console points to the local machine. You have to open the HTML document in a text editor and do a find and replace of the string VALUE="\ with VALUE="\\MACHINENAME\ where MACHINENAME is your server name.
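If you prefer not to do the edit by hand, here is a small hedged sketch of the same find and replace in PowerShell (perf.htm and EXCH01 are placeholders for your saved file and server name):

(Get-Content .\perf.htm) | ForEach-Object { $_.Replace('VALUE="\', 'VALUE="\\EXCH01\') } | Set-Content .\perf.htm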

Now you should be able to load this html document from any computer or web server and have the selected performance data available.


Bulk export calendars from Exchange mailboxes

I have never really had the need for a script like this, so when our catering manager at the office logged a support call, requesting an export of all calendars for all of our meeting rooms, I had to investigate the possibilities. He basically needed this information in order to determine how busy the individual meeting rooms were during the last year.

Following a quick, unsuccessful Internet search for tools or scripts that could do this, my initial feeling was to say "No sorry, can't be done, or if we do it, it is going to be a manual task."

A manual task which involves granting access to the room mailbox, logging onto the mailbox using Outlook, and exporting the calendar data to Excel. Sounds easy, but doing that a hundred times is very unproductive and torturous to say the least.

I decided to attempt to script it, and the result is something I am both proud of and ashamed of at the same time, as I am convinced there must be a better way.

It’s a very rough method, which involves the following process: 

  • Get a list of rooms from a text file (as it was emailed to me). You could use get-mailbox instead.
  • Add-mailbox permission to the current user  
  • Create an Outlook profile 
  • Logon to the profile 
  • Export the Calendar to CSV 
  • Remove-MailboxPermission

I could automate most of the above, but creating new profiles on demand is something I’ve never had to do, and frankly, I had no idea how to get around this problem. After speaking to some of the developers at work, who promised me some dotnet code which could do it (which I am still waiting for might I add :)), I decided to use PRF files.

I have used PRF files very successfully in the past, on Terminal server deployments to automatically setup Outlook profiles. I downloaded the ORK and created a PRF which I used as a template for the script. The blank PRF is attached to this post to save you the time and effort of using ORK.

The script finds and replaces the UserName and HomeServer in the PRF, although any Exchange server should resolve you to your mailbox server. It then creates a PRF and starts Outlook with the /importPRF switch. Some extra information for anyone wanting to actually deploy or use the PRF file: the %HomeServer% variable in the PRF does not work the same way %UserName% does, so if you want to use the PRF, you need to specify one of your mailbox servers instead.

While Outlook is open on that profile, the script attaches to Outlook using a COM object and downloads the calendar for the specified date.

The calendar fields can be customised to suit your needs. In my case we simply needed the Start and End date, the duration, and the Organizer.
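For anyone curious, the COM part of the export is roughly along these lines (a hedged sketch, assuming Outlook is already running on the freshly imported profile; recurring appointments would need extra handling):

$outlook  = New-Object -ComObject Outlook.Application
$calendar = $outlook.GetNamespace("MAPI").GetDefaultFolder(9)   # 9 = olFolderCalendar
$start    = (Get-Date).AddYears(-1)

# Keep only the last year's appointments and export the fields we care about
$calendar.Items | Where-Object { $_.Start -ge $start } |
    Select-Object Start, End, Duration, Organizer |
    Export-Csv .\RoomCalendar.csv -NoTypeInformation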

The export data is saved and the PRF is removed. Sadly, the swarm of profiles will remain and you have to remove them manually. You could remove them from HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Windows Messaging Subsystem\Profiles, but I have not added that to the script.

I hope this can help you, if you ever get a freaky request like this.

The script and the PRF template can be downloaded from here:





Collecting Exchange database white space from the event log using .NET

A recent comment from a reader prompted me to do some updates and bug fixes to my Exchange 2007 audit script. As part of this process, I decided to add the white space count to the mailbox store check.

I discovered an extremely helpful post, as usual, from Shay Levy, which pointed me in the right direction.

Although this function gets exactly what I needed, I wanted to search for the white space by mailbox store name, so that the value could be retrieved as each mailbox store is passed during the script's processing.

I changed my script to use .NET instead of WMI for event logs so I decided to continue using this method for the white space as well.

The basic script to collect the white space sizes from the event log using .NET is as follows:
$now = Get-Date
$colMailboxStores = Get-MailboxDatabase -Server SERVER -Status | Sort-Object Name
$spaceLog = [System.Diagnostics.EventLog]::GetEventLogs('SERVER') | where {($_.LogDisplayName -eq "Application")}

foreach ($objMailboxStore in $colMailboxStores)
{
    Write-Host "..Getting database white space for" $objMailboxStore.Name
    $store = @{Name="store";Expression={$_.ReplacementStrings[1]}}
    $freeMB = @{Name="freeMB";Expression={[int]$_.ReplacementStrings[0]}}
    $whiteSpace = @()
    $whiteSpace += $spaceLog.entries | where {($_.TimeWritten -ge $now.AddDays(-1))} | where {($_.EventID -eq "1221")} | where {($_.ReplacementStrings[1] -match $objMailboxStore.Name)} | select $store,$freeMB -last 1
    $whiteSpace.freeMB
}

This method is very slow, as it has to dredge through the entire event log for every database. It’s really not a problem if you have a small number of databases, but in a large environment like ours, with multiple mailbox servers, this could take ages to complete.

It was painful during testing to wait for the above script to complete, and I really felt that the speed of this process should be increased, so I came up with the following solution instead:
$now = Get-Date
$spaceLog=[System.Diagnostics.EventLog]::GetEventLogs('SERVER') | where {($_.LogDisplayName -eq "Application")}
$db = @{Name="database";Expression={$_.ReplacementStrings[1]}}
$freeMB = @{Name="MB";Expression={[int]$_.ReplacementStrings[0]}}
$whiteSpace = $spaceLog.entries | where {($_.TimeWritten -ge $now.AddDays(-1))} | where {($_.EventID -eq "1221")} | select $db,$freeMB

($whitespace | where {$_.database -match $objMailboxStore.Name} | select -last 1).mb

The code above collects all of the Event ID 1221 entries for the last day and stores them in a variable with the customised placeholders from the expressions.

This happens only once per server, and any subsequent searches can be performed against the variable instead.

The select statement at the end also takes the last item in the list, to ensure that you look at the latest event for each database. This reduces the runtime of the script by roughly a factor equal to the number of databases on your server.

I will be posting an update to the Exchange 2007 audit script soon, so stay tuned.

Measure the SMTP roundtrip time to an external email address

In an attempt to be more proactive about Internet email delays, whether caused by our systems, or those of our ISP, I have written a script which tests the roundtrip time on SMTP mail.

The basic idea behind the script is to send a message with a GUID, and wait for the return of that specific message. When that message returns, it measures the roundtrip time, and logs the result to disk. If the message is not returned within 30 mins, it will send you a warning message informing you of the problem.

Finally, the script creates a nice JPG with the results up to the last run.

Setting up and using this script is a little more complex than usual as it combines different technologies and resources to achieve its goal, which is to measure the roundtrip time on an actual SMTP message.

To start off, the script sends a message using a standard .NET relay (on Powershell V2 you could use send-mailmessage instead). The subject is stamped with the current date and time, marked with the distinguishable word "SMTPPing" for the reply rule, and given a random GUID, which aids in recognising the message when it returns.
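The outbound leg boils down to something like this hedged sketch (addresses, relay host and subject layout are placeholders, using the Powershell V2 cmdlet mentioned above):

$guid    = [Guid]::NewGuid().ToString()
$subject = "SMTPPing $guid $(Get-Date)"
Send-MailMessage -From "smtpping@yourdomain.com" -To "robot@gmail.com" -Subject $subject -Body "SMTP round trip probe" -SmtpServer "relay.yourdomain.com"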

I use Gmail, as my “auto-reply” robot, as I am fairly certain that their infrastructure is robust and pretty stable. If you choose to use Gmail, you will need to setup a filter, which automatically forwards all mail with the word “SMTPPing” back to your email address, and then deletes it from Gmail.

Once you are sure that the auto reply is working, you can configure the script with your SMTP email addresses and relay host.

The return messages are collected from an Outlook mailbox using MAPI. You need to customise the script for the Outlook profile it needs to logon to. More details regarding this can be found as comments inside the script.

Outlook does not like strangers poking around in your stuff, so it will constantly warn you about this. To get around this problem, and to be a little selective about what you allow, you can download an awesome free tool from MapiLab called Advanced Outlook Security.

Lastly, the script needs Excel installed in order to create the chart and export it to JPG.


I am not sure why, but I am currently having problems closing Excel. Although I issue the command to close the application, it sometimes remains running, so look out for excel.exe in your process list.

As usual, your comments and suggestions are always welcome.


The script can be downloaded from here:





Updated basic Exchange queue monitor

As an update to the queue monitor script, I have added a little tweak.

When the script has completed its run, it will now import the log file into Excel and create a chart displaying the message flow for the current log.

The Excel chart export seems to have a problem with the current folder. I tried using ".\" and even Get-Location and Set-Location with variables, but it only works if I hard-code the path.

This image can be used on a web page to display the queue information in a more friendly and accessible format. Excel is set to overwrite the existing image and spreadsheet every time the script runs.

An awesome source of Excel-related information can be found at the Excel Cookbook. This information saved me a lot of time and effort while working with Excel.

Information regarding Excel chart types and styles can be found here.

The complete script with the Excel chart export section can be downloaded from here:


A very basic queue monitor

At my office we recently needed a method to quickly know if the queues on any of the Exchange servers were building up. We have monitoring in place, but these guys can sometimes miss a build-up which leaves us with the problem.

As a very rudimentary solution, I compiled the following script.
In a nutshell, it enumerates the message count of all the queues on all Exchange servers in the org, including Exchange 2003 and 2007. The script then measures the sum total of all messages; if it exceeds a predetermined amount, 1000 in my case, it will send a notification message to the administrators.
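A minimal sketch of the Exchange 2007 side of that check might look like this (the threshold, addresses and relay are placeholders; Send-MailMessage needs Powershell v2, and the 2003 queues are handled separately via WMI):

$total = (Get-TransportServer | ForEach-Object { Get-Queue -Server $_ } | Measure-Object MessageCount -Sum).Sum

if ($total -gt 1000) {
    Send-MailMessage -From "queuemon@yourdomain.com" -To "admins@yourdomain.com" -Subject "Queue alert: $total messages queued" -SmtpServer "relay.yourdomain.com"
}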

There is a catch-22 here if the server with the queue build-up happens to be your relay host or the server with the problem. As a workaround you could send a NET SEND message, or relay the notification through two SMTP servers. An alternative is to send an SMS to the administrators if you have the facility.

Initially the script waited with a while loop and polled the queues every 5 minutes. I have opted to change that, and launch the script with Task Scheduler instead, which means I don’t have to actually be logged onto the console for the script to run.

The script writes out the date and message count to a log. This log cycles daily.
I know this is very basic, but it gets the job done in terms of what we needed as an interim solution.

You can download the script here:



Updated: Exchange Mailbox Cleaner


 I am ready to call this the final version of my Exchange Mailbox Cleaner script.

I have successfully used it in production and it saved us the effort of having to find and remove these mailboxes manually.

The GUI also makes it easy to hand this function over to the administrators for future cleanup tasks.

I have added one more search query button, the "Last Logon" button. It looks for mailboxes on the selected server where the last logon is equal to $null, which finds accounts that have essentially never logged on. There is a small bug though: if the user name is not unique, the last logon appears to be unreadable and the account will also show up in the list.

This will however report an ERROR to the shell screen. Mailboxes which have not logged on will report the following warning to the shell:

WARNING: There is no data to return for the specified mailbox 'Bunny, Bugs', because it has not been logged on to.

For now, this is a manual method of verifying that the correct mailboxes will be removed. I am however looking for a way to avoid this and will post an update as soon as I have time to find the solution.
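For reference, the query behind the "Last Logon" button boils down to something like this (a simplified sketch; SERVER is a placeholder):

Get-MailboxStatistics -Server SERVER | Where-Object { $_.LastLogonTime -eq $null } | Select-Object DisplayName, ItemCount, TotalItemSize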

I have also permanently removed the Add-ADPermission from the Export-Mailbox section, as full mailbox access permissions are enough to export the mailbox.

I may build in a check later to see if the permissions are required before adding them.
#Add-ADPermission -Identity $actionItem -User $currentUser -Extendedrights "Send As" -whatif 
#Add-ADPermission -Identity $actionItem -User $currentUser -Extendedrights "Receive As" -whatif 
As always, any comments or suggestions with regard to the script are welcome.

A little disclaimer / warning: This is a dangerous utility, and can wreck your Exchange system if you are not careful. Please test this in your test environment first, and adhere to your change control procedures before using this utility in the live environment. I take absolutely no responsibility for any damage caused by using this tool.

The utility requires the Exchange Management Shell, and if launched from Vista / Windows 7 it needs to be "Run as Administrator".

This script was tested under Windows 7 with Powershell v2. The script can be downloaded from here:


Maximize My SendSize

Someone asked me the other day, "How could I go about using security groups to control users' send size limits?" He basically had a limit of 2 MB for all users, and wanted to allow users in a specific security group to send messages of up to 50 MB. Here is a basic breakdown of the process I suggested. Firstly, you need to confirm that the global transport limit is raised to 50 MB.

You can view and set these limits using get-transportconfig and set-transportconfig respectively:
Get-TransportConfig | select MaxSendSize  
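If the organisation-wide limit still sits at the default, one hedged way to raise it (adjust or omit the receive side as needed):

Set-TransportConfig -MaxSendSize 50MB -MaxReceiveSize 50MB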
The next step would involve setting the send connector to allow 50mb messages. You can use get-sendconnector to get a list of all send connectors, and their respective limits.
Get-SendConnector | Select Name, MaxMessageSize   
Then use set-sendconnector to set the connector's MaxMessageSize:

Set-SendConnector "Connector Name" -MaxMessageSize 50MB

Finally, you need to control the individual users' send limits. If you have to control this via groups, you can use the following command to enumerate the users in the group and pipe them to set-mailbox. Replace testsizegroup with the group you need to control the size limits for.
((get-group "testsizegroup").members) | foreach {set-mailbox -identity $_.Name -maxsendsize 52428800}
This sets the MaxSendSize for all users in that group to 50 MB. The command will have to be rerun every time users are added to the group, so it would be advisable to schedule it to run hourly or daily.

The OWA saga continues...

After solving the msExchVersion mystery, it has become apparent that even more of our Exchange 2007 users were unable to access OWA.

After logging onto the site, a very similar error is displayed:

Exception type: Microsoft.Exchange.Data.Storage.StoragePermanentException
Exception message: There was a problem accessing Active Directory.

My first step was obviously to verify the msExchVersion.

After ensuring that this was correct, and that the users were still unable to use OWA, I had to do more digging. Deeper delving into this issue yielded the following KB from Microsoft: http://support.microsoft.com/kb/949527

To use OWA, the Exchange Servers group must have write permission to the msExchUserCulture attribute. This is easy to resolve: just allow inheritable permissions from the parent to propagate to the faulty object or objects, as per the KB article.

That is easy enough on one account, but if you have to change this setting on multiple accounts, you can use Set-QADObjectSecurity –UnlockInheritance to accomplish the task. For more information, see Dimitri's blog.
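A hedged sketch of doing this in bulk, assuming the Quest snap-in is loaded, users.txt holds the affected accounts, and the cmdlet accepts piped identities as I expect:

Get-Content .\users.txt | ForEach-Object { Get-QADUser $_ | Set-QADObjectSecurity -UnlockInheritance }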

Legacy mailboxes on Exchange 2007

One of our users had a problem logging onto OWA today, and I noticed that the icon for his mailbox in the GUI displayed as a legacy mailbox, although he was located on an Exchange 2007 mailbox server.

After countless searches, I came across this article: http://support.microsoft.com/kb/941146. It explains that the msExchVersion property on the AD object is not set correctly, and that running set-mailbox –ApplyMandatoryProperties resolves the problem. According to the help for set-mailbox, this can also be caused by users being created on an Exchange 2007 server using the Exchange 2003 tools, although in our case these users were migrated from Exchange 2003.

So, how to correct this? First, get a list of all mailboxes on the Exchange 2007 server with the incorrect version. In get-mailbox output, the incorrect Exchange version displays as 0.0 (6.5.6500.0). The following command returns a list of these mailboxes by server (where SERVER1 is your Exchange 2007 server):

get-mailbox -server SERVER1 -resultsize unlimited | where {$_.ExchangeVersion -like "*0.0*"} | select Name, ExchangeVersion

Once you verify the list, pipe them to set-mailbox.

get-mailbox -server SERVER1 -resultsize unlimited | where {$_.ExchangeVersion -like "*0.0*"} | Set-Mailbox –ApplyMandatoryProperties  

This resolved the problem for me, easily, on multiple Exchange mailboxes. Running the get-mailbox command again, returned no results after applying set-mailbox to the problematic mailboxes.

Updated: Exchange Mailbox Cleaner

I have really been busy lately and have not had much time to spend on scripting. I did however find a little time to update this utility. Here is a list of changes:

 - After completing a query, the utility will now show you the total amount of data used by the mailboxes. (this obviously ignores single instance storage etc.)
 - You can now use the utility to move selected mailboxes to another store (This was a request from Aaron)
 - I have force-removed the mandatory "confirm" on the Exchange verbs (move, disable and delete).

WARNING: This is a dangerous utility, and can wreck your Exchange system if you are not careful. Please test this in your test environment first, and adhere to your change control procedures before using this utility in the live environment. I take absolutely no responsibility for any damage caused by using this tool.

The utility requires the Exchange Management Shell, and if launched from Vista / Windows 7 it needs to be "Run as Administrator". The script can be downloaded from here:



Cleanup unused Exchange 2007 mailboxes

I often use my orphaned home directory cleanup script at work to recover unused space from our file and print clusters, so my manager recently suggested that I do something similar for Exchange. Knowing that the orphan folder cleanup utility is still my responsibility, as the administrators are not too comfortable with running scripts, I decided to give this utility a nice GUI.

To generate the code for the forms, I used SAPIEN PrimalForms. What a beautiful tool: a very short learning curve, and very, very powerful. When the form loads, it gets a list of all the Exchange mailbox servers using get-mailboxserver.

This excludes Exchange 2003 servers, as get-mailboxstatistics does not work with legacy mailboxes; I may develop a solution for that later. The three query buttons perform the following actions (sketched roughly below):

 - Disabled – find mailboxes linked to disabled AD accounts.
 - Hidden – find mailboxes hidden from the address book.
 - Stale – find mailboxes linked to accounts which have not logged on in the last 3 months.
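Roughly, the Hidden and Stale queries do something like this under the hood (simplified sketches; SERVER is a placeholder, and the Disabled query works along similar lines against the AD account):

# Hidden - mailboxes hidden from the address lists
Get-Mailbox -Server SERVER -ResultSize Unlimited | Where-Object { $_.HiddenFromAddressListsEnabled }

# Stale - no logon recorded in the last three months
Get-MailboxStatistics -Server SERVER | Where-Object { $_.LastLogonTime -lt (Get-Date).AddMonths(-3) }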

The Stale search may take a little time to complete, and that button is not supported against Exchange 2003 servers. These queries populate the listbox with the names of the mailboxes. Apart from the "Export List" button, the Action buttons at the bottom act only on selected items.

You can select items using SHIFT or CTRL. Export List creates a text file containing your search results. Export PST grants the current user Full Mailbox access with Send As and Receive As permissions, and then exports the mailbox to the path specified. Rename changes the display name based on the query performed: mailboxes found with the "Disabled" button are prefixed with "DISABLED-MBXCleaner-", "Hidden" with "HIDDEN-MBXCleaner-", and so forth. Users previously renamed are excluded from subsequent searches. The "Disable" action removes the Exchange attributes without deleting the AD account; the mailbox will be removed when the retention time expires. Delete removes the mailbox and the AD account completely.

I have not had a chance to test the Delete button, as I would need to submit a change control request before using the utility in our live environment. All of the Action buttons are set to -whatif mode by default. The "Go Hot" checkbox activates the heavy hitters (Export PST, Rename, Disable and Delete), so you can safely test each button before taking any action. The "Reserved" button currently does nothing; I plan to let it read or build a custom search for users, either by name or by other criteria.
 
WARNING: This is a dangerous utility, and can wreck your Exchange system if you are not careful. Please test this in your test environment first, and adhere to your change control procedures before using this utility in the live environment. I take absolutely no responsibility for any damage caused by using this tool.

The utility requires the Exchange Management Shell, and if launched from Vista / Windows 7 it needs to be "Run as Administrator". The script can be downloaded from here:


What's going on here?

As part of the Exchange audit scripts, I recently changed the 2007 version of the script to use .NET to collect the event logs instead of WMI. Virtu-Al made an interesting suggestion: find out which of these methods is quicker at collecting the logs. So in order to do this I needed to set up a race.

This race would basically involve the two methods of retrieval collecting a large list of events from a selected server. The basic commands to accomplish this are as follows. For WMI one would simply use:

$wmi = Get-WmiObject -computer SERVER1 Win32_NTLogEvent  

Using .NET, it retrieves the actual Event Logs, so the entries have to be enumerated with a quick bit of code:

$eventLogs = [System.Diagnostics.EventLog]::GetEventLogs('SERVER1')
ForEach ($eventLog in $eventLogs){ $dotNet += ($eventLog.entries) }

In both cases, SERVER1 is the name of the remote server you need to collect the events from. Now, in order to make sure that there was no cheating, I had to count how many objects are returned by each method. This could be done by simply saving the collection to a variable and counting the total. In this scenario, .NET would return approximately 56000 items and WMI would return fewer, about 500+ fewer every time.

From here I went down a crazy path of checking date and time formats and so on, and in the end I came to the conclusion that it had to be the Security log: either entries were being written into the Security log so quickly that by the time the second script ran the number of entries had changed, or special permissions were needed to read certain Security log entries. Or so I thought.

So I decided to exclude the Security log from my collection. This was easy enough, but the totals were still inconsistent. In an effort to eliminate where the problem could be, I decided to include only one log at a time, starting with the Application log. Here is the script used to collect the Application log from a remote server using WMI:

$d1 = get-date

$wmiDate = [System.Management.ManagementDateTimeConverter]::ToDmtfDateTime([DateTime]::Now.AddDays(-1))
$WMI = Get-WmiObject -computer SERVER1 -query ("Select * from Win32_NTLogEvent Where Logfile = 'Application' and TimeWritten >='" + $WmiDate + "'")

$wmiCount = ($WMI).Count

$wmiDT = [System.Management.ManagementDateTimeConverter]::ToDateTime($wmiDate)
Write-Host From Date $wmiDT
Write-Host Total $wmiCount
$d2 = Get-Date
$d2 - $d1

WMI script results:

From Date 09/06/2009 01:28:49 PM
Total 317

Here is the script used to collect the same event log entries from the same server, using .NET instead:

$d1 = get-date
$dotNetDate = ([DateTime]::Now.AddDays(-1))
$eventLogs=[System.Diagnostics.EventLog]::GetEventLogs('SERVER1') | where {$_.LogDisplayName -eq "Application"}
ForEach ($eventLog in $eventLogs ){

$dotNet += ($eventLog.entries) | where {($_.TimeWritten -ge $dotNetDate)}
}

$dotnetCount = ($dotNet).count

Write-Host From Date $dotNetDate
Write-Host Total $dotnetCount
$d2 = Get-Date
$d2 - $d1

.NET Script Results:

From Date 09/06/2009 01:28:49 PM Total 650

This was still very confusing, so to see exactly at which record the problem occurs, I had both scripts display the record numbers of the first and last records in each respective collection, by adding the following to each script. For the .NET script:

$dotNet | Select-Object -First 1
$dotNet | Select-Object -Last 1

For the WMI script:

$WMI | Select-Object RecordNumber, TimeWritten, Type, SourceName, EventCode -First 1
$WMI | Select-Object RecordNumber, TimeWritten, Type, SourceName, EventCode -Last 1

Now I could see that at least they were starting at the same record, but for some odd reason WMI was quitting before the job was done. .NET record results:

Index Time          Type  Source                 EventID
----- ----          ----  ------                 -------
51    Jun 09 14:55  Warn  MSExchange Availa...   4004
705   Jun 10 14:51  Warn  MSExchange Active...   1008

WMI results:

RecordNumber TimeWritten              Type     SourceName
------------ -----------              ----     ----------
353          20090610012624.00000...  Warning  MSExchange ActiveSync
51           20090609145522.00000...  Warning  MSExchange Availability

To make sure this problem wasn't specific to the current server, I started collecting logs from other servers and recording the results. I also used Add-Member in the WMI script to convert the time and date back for easier reading, with the following string:

ForEach-Object { Add-Member -inputobject $_ -Name myTime -MemberType NoteProperty -Value ([System.Management.ManagementDateTimeConverter]::ToDateTime($_.TimeWritten)) -Force -PassThru} 

Over a number of servers this still made no difference, WMI still did not return all the results. This seems to be a problem specific to the Application and Security Log, and could well be related to the WMI impersonation or authentication which will be available in version 2.

This I have not had time to investigate. I decided to re-write the WMI script to collect all results and then filter out the unwanted events with “where-object”. At this point I also changed the selected log to the system event log, as someone cleared the application logs on the selected servers.

This worked great for most of the servers, and finally I was getting similar results from both scripts. I did however find that servers with large numbers of events generate a WMI Quota Violation, which seems to imply that there are too many items in the list, which is yet another blow to WMI.

This could also explain the incomplete results from previous attempts. The Quota Violation is a known problem and there is a resolution for it posted here: http://support.microsoft.com/kb/828653. To get around this problem, I changed the script again, to use the WMI query. So now that we were getting results, it was time to start testing the speed of each method.

I decided to test the speed against 3 different servers, and increment the number of records retrieved until I could not collect anymore, or up to a maximum of 240 days worth of events.

I also decided to give each method an average read time over 3 attempts.

Here are some of the results:

As the number of days, and therefore the number of records, increases, the read speed of WMI starts to decrease.

In summary, WMI scales nicely when using a WMI query directly in the Get-WmiObject command. It does however lose speed as the number of records to retrieve increases.

It has to be mentioned that WMI slows down to a crawl if all records are retrieved and the result is filtered with Where-Object.
Although WMI is faster with fewer records, I am going to base all my event log queries on .NET for now, as WMI proved to be inconsistent and erroneous in what it retrieves, at least in my testing.

I hope that this problem is related to impersonation, and that it is resolved in Powershell v2. The final scripts I used to retrieve the information can be downloaded from here:

 

Update: Exchange 2007 audit script

In an attempt to resolve some issues with regards to the event logs, I have made a few updates to the Exchange 2007 audit script:
* I now use [System.Diagnostics.EventLog]::GetEventLogs() to collect the remote event logs and entries instead of WMI
* The output to the host displays exactly which event log it is busy reading.
* The date range seems more accurate now when the event log contains a large amount of data.
* The physical memory on the basic server information is now displayed as GB and is neatly rounded.
* The Mailbox stores are sorted in alphabetical order by Store Name.
* Added more verbose output to the console while the script runs, to give a better indication of what the script is busy with.
I hope this resolves most of the problems for now; comments and suggestions are always welcome. The script can be downloaded from here:

This script has been replaced by a later version; please check the following link, or download the updated version below:




Exchange 2007 Audit Report

I had some extra time this week to complete the Exchange 2007 version of the audit script, as I am going on leave for a week and needed to have the process automated while I am gone.

This version of the script still uses WMI for some of the items on the report, but uses the Exchange 2007 commandlets for most of the Exchange-related information.

The one tricky bit of information to retrieve was the installed Exchange rollups. These are not available via WMI or any other method I could find. I did find a very effective solution on flaphead.com. This little piece of magic locates the installed patches in the remote registry, and loops through the keys to find and list the installed rollups.


Unlike Exchange 2003, Exchange 2007 servers are installed with specific roles. This plays a part when checking things like queues and mailbox stores; for instance, there is no point in checking a pure Hub Transport server for mailbox stores. I initially built in a check which compared the ServerRole property of the server to a specific role, forgetting that one server could have multiple roles. I now match the role anywhere in the property string with this if statement:

if ($exServer.ServerRole -notlike "*Mailbox*")

This skips the mailbox-related checks if the word "Mailbox" cannot be found anywhere in the string.
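In context, the pattern looks roughly like this (a minimal sketch; the real script does far more per server):

foreach ($exServer in Get-ExchangeServer) {
    if ($exServer.ServerRole -notlike "*Mailbox*") { continue }   # skip servers without the Mailbox role
    # mailbox-related checks go here, for example:
    Get-MailboxDatabase -Server $exServer.Name -Status
}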

To automate the running of the checks on a daily basis, I set up a scheduled task on one of my Exchange 2007 servers, as the script requires the Exchange commandlets.

I really had no idea how to get the scheduled task to run in the Exchange Management Shell, so as a test I basically used the following command:

C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe -PSConsoleFile "D:\Program Files\Microsoft\Exchange Server\bin\exshell.psc1" c:\scripts\ExchangeAudit2k7.ps1 .\servers.txt

This did the trick, and the entire check process now runs and completes before I even get to work. My version of the script also creates an HTML menu and moves the reports to our departmental web server for my managers' viewing pleasure. The mailbox stores now also indicate the last backup time, as we have had issues before where the backups weren't completed and we didn't find out until it was too late.


I am busy working on a little piece of code which will connect to the OWA site and simply test if the site is available, but that will have to wait until I am back from leave.

This script has been replaced by a later version; please check the following link, or download the updated version below:



http://powershellneedfulthings.blogspot.com/2009/11/exchange-2007-audit-script-version-3.html



First version of my PowerPack uploaded

I really wanted to see how complicated it would be to write a very basic PowerPack for PowerGUI.

I had recently updated my WMI defrag script to report to HTML, so I thought this would be an excellent candidate for a PowerPack.


I must admit that at first it is not very straightforward, but the more time you spend on it, the easier it becomes. I am really looking forward to re-creating some of my favorite apps and scripts as PowerPacks. I think this is an excellent platform from Quest.



Suggestions and comments are always welcome. Have mercy, it's my first attempt at a PowerPack.

Update: Automatically clean up orphaned user directories

As an update to the automatic cleanup script, this script does essentially the same, but will also attempt to take ownership of the folder before renaming it. This was added because some of our users had full control of their folders, and removed all other permissions from the folder.

There is a slight problem with Get-Acl and Set-Acl: taking ownership will fail if you can't read the ACL. If that happens, as a last resort the script writes the failures out to a text file, which can be used as a batch file with a little utility called takeown.exe, which takes ownership of the folders regardless. On a re-run of the script, the rename will succeed.
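The fallback amounts to writing takeown commands for the failed folders into a batch file; a hedged sketch of what that could look like ($failedFolders is a stand-in for whatever paths the rename could not touch):

# Build a batch file that takes ownership of each failed folder (recursively, assigning to Administrators)
$failedFolders | ForEach-Object { "takeown.exe /F `"$_`" /R /A /D Y" } | Out-File .\takeown-failures.cmd -Encoding ASCII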

You can download the script from here:



Update: Powershell Remote WMI Defrag


As with most things in life, people are only happy with limited features for a little while, and then the enhancement requests pour in.
The administration guys at the office have been using the remote defrag script for a couple of weeks, and soon realised that there was no way for them to show off the results of their labour to management. So inevitably, they requested that I add some sort of reporting to the script which they can send to management.

Initially, I had all the results write out to a text file for each volume, but this became a mess to manage after you defrag hundreds of servers with multiple volumes. Having recently completed the Exchange 2003 audit script using Virtu-Al's HTML template, I imagined it would be possible to report the defrag results in a similar format.

The script runs through a list of servers contained in servers.txt and starts a remote defrag using WMI. It waits for the process to complete and then moves on to the next volume. The script also checks whether dfrgntfs.exe is already running on the remote host, and if so, skips that server.


The script changes the colour of the drive on the report based on what happened: green means the volume was skipped, orange that a defrag was already running, and red that it was defragged.


Finally, at the bottom of each drive's report, the script gives you a quick before and after result.

The script can be downloaded from my Skydrive:


Exchange WMI Audit

I recently needed to automate my Exchange 2003 server daily checks. I have done some basic work on this before, but I really needed to automate the process and write the results to HTML to make them more "manager friendly".

While searching the web for something I could use as a basic starting point, I came across an awesome script by Virtu-Al. This script uses WMI to audit a list of remote computers, and reports in a very neat HTML format. It was exactly the platform I needed, and it meant not having to re-invent the wheel.

I did however have some trouble with WMI and the mailbox stores, specifically finding a method for reporting the number of users and whether the store is mounted or not. I managed to find a workaround for the number of users, but it seems that checking the store status would have to be done with CDOEXM. This felt like a little too much effort, as we are in the middle of our migration to Exchange 2007.

Speaking of Exchange 2007: the script cannot be used against Exchange 2007 servers, as Exchange 2007 does not include any WMI providers.
I am however working on an Exchange 2007 version, or an Exchange version check, for this script. All credit for the HTML template and the original script should go to Alan Renouf; I merely took a great script and adapted it for use with Exchange.

The script shows only Exchange-related information on the report, including hotfixes, services and event log entries. The version of the script which I use myself creates an HTML menu with a list of all of the servers processed and links to their individual reports. It also moves the files to a web server, which makes it much more automated. Comments and suggestions are always welcome.

This script is not displayed in a code window, but can be downloaded from here:


Update Network Interface Card parameters using WMI

The following little function can be used if you need to manually override DNS and WINS addresses on a list of remote computers which may have already obtained addresses from a DHCP server. The code gets a list of IP-enabled NICs from a remote computer using WMI; you can list the servers in a servers.txt file in the same folder.

The script updates the DNS server search list to add two manual entries and also adds two manual WINS servers. I had some fun with the SetWINSServer method, as it only accepts the variable as an array. Finally, the script modifies the registry to create a DNS suffix search list. Although this script only modifies a limited set of parameters, it can easily be adapted to update any of the others.
function updateNIC {
    # Get the IP-enabled adapters on the current server
    $NICs = gwmi -computer $server Win32_NetworkAdapterConfiguration | where {$_.IPEnabled -eq "TRUE"}

    foreach ($NIC in $NICs) {

        $DNS    = ("1.1.1.1","2.2.2.2")
        $WINS   = @("1.1.1.1","2.2.2.2")
        $DOMAIN = "acme.com"

        $NIC.SetDNSServerSearchOrder($DNS)
        $NIC.SetDynamicDNSRegistration("TRUE")
        $NIC.SetWINSServer($WINS[0],$WINS[1])
        $NIC.SetDNSDomain($DOMAIN)

        # Create the DNS suffix search list via the remote registry
        $baseKey = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $server)
        $subKey  = $baseKey.OpenSubKey("SYSTEM\CurrentControlSet\Services\Tcpip\Parameters",$true)
        $subKey.SetValue('SearchList','acme.local,acme.com')
    }
}

foreach ($server in (gc .\servers.txt)) {
    updateNIC
}
Here are some images of the results on the Advanced TCP/IP Settings page after running the script, including the WINS tab.

Audit the local Administrators group on a list of remote computers

This is a very basic script which collects a list of server names from a local text file called servers.txt. The script reports the list of users, sorted by server name, to a text file in the root of drive C.

I am working on cleaning up the results, as currently the Adspath is written to the text file in the following format: WinNT://DOMAIN/COMPUTER/Administrator. This was the only true distinction between local and domain users, as Name reports only the name of the user or group, so you are never really sure if it is a domain or local entry. Finally, I need to enable the script to report its results to Excel or HTML.

$Result = @()

foreach ($server in (gc .\servers.txt)) {

    $computer = [ADSI]("WinNT://" + $server + ",computer")
    $Group = $computer.psbase.children.find("Administrators")

    # Return the Adspath of each member of the local Administrators group
    function getAdmins {
        $members = $Group.psbase.invoke("Members") | %{$_.GetType().InvokeMember("Adspath", 'GetProperty', $null, $_, $null)}
        $members
    }

    $Result += $server
    $Result += ( getAdmins )
    $Result += " "
}

$Result > c:\results.txt
Invoke-Item c:\results.txt

I had a little extra time today, and managed to clean up the members using the -replace operator; replace "DOMAIN" with your domain name. The updated code looks something like this:
$Result = @()

foreach ($server in (gc .\servers.txt)) {

    $computer = [ADSI]("WinNT://" + $server + ",computer")
    $Group = $computer.psbase.children.find("Administrators")

    # Return the members, stripping the WinNT:// prefixes so local and domain entries are readable
    function getAdmins {
        $members = ($Group.psbase.invoke("Members") | %{$_.GetType().InvokeMember("Adspath", 'GetProperty', $null, $_, $null)}) -replace ('WinNT://DOMAIN/' + $server + '/'), '' -replace ('WinNT://DOMAIN/', 'DOMAIN\') -replace ('WinNT://', '')
        $members
    }

    $Result += Write-Output "SERVER: $server"
    $Result += Write-Output ' '
    $Result += ( getAdmins )
    $Result += Write-Output '____________________________'
    $Result += Write-Output ' '
}

$Result > c:\results.txt

Invoke-Item c:\results.txt
You can simply add another -replace ('WinNT://DOMAIN/', 'DOMAIN\') for each domain in the system. I know it's a little hack 'n slash, but it will do for now.

Automatically clean up orphaned user directories

We've had a huge problem where users were removed from Active Directory, but somehow the administrators neglected to remove the user's home folder from the file servers. This left someone with the nasty task of cleaning up the mess.

This script works through a directory of home folders and looks up each user in AD, assuming that the home folder name and the user ID are the same. If the user is not found, or the account is disabled, the folder is renamed with a leading "orphan-" followed by the original name.

The script requires a parameter, which is the path where the folders are located, e.g. "findorphans.ps1 c:\users". The script requires that the Quest Powershell commandlets are installed; they can be downloaded free here.


param($target)
$folders = Get-ChildItem -Path $target | Where-Object {$_.Name -notlike "orphan*" -and ($_.PSISContainer)} | Select-Object name
foreach ($folder in $folders) {
    Write-Host ""
    $userid = ""
    "PROCESSING FOLDER: {0} " -f $folder.name
    Write-Host "Searching for a possible owner..."
    $user = Get-QADUser $folder.name
    $useracc = $user.AccountIsDisabled
    $userid = $user.samaccountname
    $newid = "orphan-" + $folder.name
    $fullpath = $target + "\" + $folder.name
    $fullpath
    "Account Disabled: {0} " -f $user.AccountIsDisabled

    # No samaccountname found, or the account is disabled: rename the folder
    if ($userid.length -lt 1 -or $user.AccountIsDisabled -eq "True") {
        Write-Host "No owner found or account disabled, orphan folder renamed to" $newid -ForegroundColor Red
        Rename-Item -Path $fullpath -NewName $newid
    }
    else {
        Write-Host "Owner found" $user -ForegroundColor Green
    }
}

Remote Defrag using WMI

This is a script I created to analyze and defrag Windows 2003 server volumes using the WMI Win32_Volume Defrag method. The script collects all volumes on a list of remote servers using WMI. Each volume is then analyzed for fragmentation using the FilePercentFragmentation property. If the fragmentation is higher than 10 percent, the script initiates a remote defrag on the volume. You should see a process called "dfrgntfs.exe" running on the remote server while the defrag is in progress; sadly I have not found a method to track the progress of the defrag process.

You can adjust the fragmentation percentage threshold at which a defrag is initiated by editing line 12. Replace "SERVER1" and "SERVER2" with your server names. Comments or suggestions are always welcome.


$servers = "SERVER1", "SERVER2"
foreach ($server in $servers) {
    Write-Host ""
    $v = (gwmi win32_volume -computer $server)
    "CURRENT SERVER: {0} " -f $server
    "NUMBER VOLUMES: {0} " -f $v.length

    foreach ($volume in $v) {
        Write-Host ""
        Write-Host "Analyzing fragmentation on" ($volume.DriveLetter) "..."
        $frag = ($volume.defraganalysis().defraganalysis).FilePercentFragmentation
        if ($frag -gt "10") {
            Write-Host "Drive" ($volume.DriveLetter) "is currently" $frag "% fragmented." -foreground RED
            Write-Host "Starting remote defrag..."
            $volume.defrag($true)
        }
        else {
            Write-Host "Drive" ($volume.DriveLetter) "is not fragmented" -foreground GREEN
            Write-Host ""
        }
    }
}