Blog

Exchange 2016 Tracking Log Searches

Exchange 2010 was the last iteration that included a decent message tracking log interface. Microsoft intentionally retired that wonderful search tool with 2013 in order to push admins toward PowerShell. So, when I had to track down the internal source of an email that was being sent by a job that did not appear to exist (we have dozens of applications that route mail through Exchange), I had to use PowerShell to find the source IP.


This proved not to be the easiest thing to do. I was not sure where the original source IP even resided, so I decided to start with everything in the first query and work my way back from there. I also decided to use the wonderful | Out-GridView feature to pipe the output to a GUI so that I could filter from there.

Here is the query to use:

Get-MessageTrackingLog -Server mail01 -Start "Aug 7 2017" -Sender "customerservice@domain.com" -ResultSize Unlimited |
Select-Object TransportTrafficType,SchemaVersion,RunspaceId,Timestamp,ClientIp,ClientHostname,
ServerIp,ServerHostname,SourceContext,ConnectorId,Source,EventId,InternalMessageId,MessageId,
NetworkMessageId,Recipients,RecipientStatus,TotalBytes,RecipientCount,RelatedRecipientAddress,
Reference,MessageSubject,Sender,ReturnPath,Directionality,
TenantId,OriginalClientIp,MessageInfo,MessageLatency,MessageLatencyType | Out-GridView

 

This gave me the information that I needed. Too much information, in fact, but that was easily remedied by Out-GridView's "Add criteria" feature. In the end, it was the OriginalClientIp field that gave me what I needed!
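If you already know which fields matter, you can also skip the GUI and filter in the pipeline. Here is a minimal sketch along the same lines (the server, date, and sender are the same placeholders as above, and narrowing to the RECEIVE event is just one reasonable way to trim the output):

# Keep only the RECEIVE events and the fields that identify the submitting host
Get-MessageTrackingLog -Server mail01 -Start "Aug 7 2017" -Sender "customerservice@domain.com" -ResultSize Unlimited |
    Where-Object { $_.EventId -eq "RECEIVE" } |
    Select-Object Timestamp, ClientIp, OriginalClientIp, ClientHostname, MessageSubject |
    Sort-Object Timestamp | Format-Table -AutoSize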

Converting a 2016 Evaluation to Production

I recently had a client for whom I had to build a new Windows Server 2016 server. He had not received the software yet, but I wanted to proceed anyway, so I downloaded an eval copy and installed from that. Later, I received the production key for the OS. But simply adding the key is not enough to activate an eval copy; it must be converted to ServerStandard via the command line.

Fortunately, this is easy to do. Just open an Administrative Command Prompt and type:

DISM /online /Set-Edition:ServerStandard /ProductKey:XXXXX-XXXXX-XXXXX-XXXXX-XXXXX /AcceptEula


This will convert the product and install the key at the same time! Please note that the install will appear to “hang” at 10% for a while and will then proceed pretty quickly. Just be patient! Then, reboot once and you are good to go.
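If you want to see what edition you are starting from, and which editions the eval can legally become, DISM will tell you before you commit to the conversion:

DISM /online /Get-CurrentEdition

DISM /online /Get-TargetEditions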

https://serverfault.com/questions/808878/convert-evaluation-to-volume

 

Use Azure to build a domain based RDS infrastructure

The Client required Remote Desktop Sessions to be available to 17 contractors via a hosted server. They also wanted that server to be hosted within their existing Azure/365 portal.

The first thing was to connect their AD to the cloud. AD Connect had already been installed and had done the initial seeding of the accounts for Office 365, but it had not been enabled for password hash synchronization, so SSO was not in place. Further, the issue with users not syncing properly was simply that the user accounts were sitting in the default Users container. AD Connect doesn't like this; accounts need to be elsewhere.
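Relocating the affected accounts is quick with the ActiveDirectory module. This is only a sketch, assuming a dedicated OU (here called "Synced Users") has already been created; the OU and domain names are placeholders:

Import-Module ActiveDirectory

# Move everything in the default Users container, except the built-ins, into a syncable OU
$targetOU = "OU=Synced Users,DC=client,DC=com"
Get-ADUser -SearchBase "CN=Users,DC=client,DC=com" -Filter * |
    Where-Object { $_.SamAccountName -notin "Administrator","Guest","krbtgt" } |
    ForEach-Object { Move-ADObject -Identity $_.DistinguishedName -TargetPath $targetOU }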

Domain Joined machines in the cloud

In order to join virtual servers and PCs in the cloud to the client.com domain and use domain creds for logins to these machines, we had to extend the AD to the cloud. There are a number of ways to do this; the easiest and simplest is to use Azure Active Directory Domain Services (AAD DS). This extends the domain to the cloud with no site-to-site VPN (which can be done within Azure) and no virtual server running in the cloud as a domain controller.
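Once AAD DS is up and the VM's virtual network can reach it, joining a cloud VM looks like any other domain join. A minimal sketch from an elevated PowerShell prompt on the VM (the domain name is a placeholder, and the credential you supply should be a member of the AAD DC Administrators group described below):

# Join the VM to the managed domain and restart to finish the join
Add-Computer -DomainName "client.com" -Credential (Get-Credential) -Restart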

Domain Admins for AAD DS

In the AAD Dashboard, go to Groups and note the AAD DC Administrators group.

This is a global security group that only exists in the AAD DS world. It contains the AD accounts that are allowed to act as the equivalent of the Domain Admin role in Azure. You must be in this group to perform certain high-level functions, such as joining Azure VMs to the Client.com domain.
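Group membership can be managed in the portal, or with the AzureAD PowerShell module. A hedged sketch (the admin UPN below is a placeholder):

Connect-AzureAD

# Look up the built-in group and add the account that will manage domain joins
$group = Get-AzureADGroup -SearchString "AAD DC Administrators"
$user  = Get-AzureADUser -ObjectId "admin@client.com"
Add-AzureADGroupMember -ObjectId $group.ObjectId -RefObjectId $user.ObjectId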


AD Connect on the on-premises domain controller

AD Connect controls the replication of all configured information from the on-prem DC to AAD DS. It is straightforward to use and will sync any OU's data except for the default, built-in Users container.
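When you move accounts into a new OU or change what is in scope, you do not have to wait for the next scheduled sync. On the server running AD Connect:

Import-Module ADSync

# Delta picks up recent changes; use -PolicyType Initial for a full resync
Start-ADSyncSyncCycle -PolicyType Delta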

Azure Virtual Machines

Azure VMs are mainly managed in the classic portal and not the new one.

 

Client-RDP01.CLIENT.COM

Here are the server information readings as configured by Microsoft on the server desktop.

The server's RDS configuration is set via Add Roles and Features as a session-based RDS deployment.
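For reference, the same kind of session-based deployment can also be stood up with the RemoteDesktop PowerShell module. This is only a sketch with placeholder server names, not the client's actual configuration:

Import-Module RemoteDesktop

# Single-server deployment: broker, web access, and session host all on Client-RDP01
New-RDSessionDeployment -ConnectionBroker "Client-RDP01.client.com" -WebAccessServer "Client-RDP01.client.com" -SessionHost "Client-RDP01.client.com"

# Add the licensing role and set the deployment to Per User licensing
Add-RDServer -Server "Client-RDP01.client.com" -Role RDS-LICENSING -ConnectionBroker "Client-RDP01.client.com"
Set-RDLicenseConfiguration -LicenseServer "Client-RDP01.client.com" -Mode PerUser -ConnectionBroker "Client-RDP01.client.com"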

Migrate DFS Namespaces from 2000 to 2008 mode

Recently I had a client for whom I had migrated AD services from 2008 R2 to 2012 R2. That part of the project went well, and when it was completed, I needed to upgrade their DFS infrastructure to 2008 mode.

There were issues though and I had to do a few things that I didn’t really like to do. Let’s just say that when you have to use ADSIEDIT.MSC, it’s always a cause for concern!


Here are a few of the references that I used:

https://technet.microsoft.com/en-us/library/cc753875.aspx – procedure for upgrading DFS namespaces from 2000 mode to 2008 mode

https://social.technet.microsoft.com/Forums/office/en-US/d77b1057-7854-45a8-9449-20f64bcc2f48/tried-migrating-dfsr-namespace-from-2000-to-2008?forum=winserverfiles – Use ADSIEDIT.MSC to clean up DFS in AD.

https://blog.workinghardinit.work/2011/05/24/move-that-dfs-namespace-to-windows-2008-mode/ – Good article on the process

These are the steps I took to upgrade the DFS, the errors that came up, and the remediations I found I needed to use.

1: Open an admin CMD prompt:

  • Dfsutil root export \\domain.local\xyz c:\xyz.xml

2: Write down the path for each namespace server:

\\Server03\xyz

\\Server01\xyz

\\Server02\xyz

3: In DFS Management, right-click the namespace and then click Delete

  • Could not delete the entire DFS. Server03 gave an error about the element not being found: "DFS – The namespace cannot be queried. Element not found"

https://social.technet.microsoft.com/Forums/office/en-US/b64ee068-653d-4f71-8cd7-a693c955bed9/dfs-the-namespace-cannot-be-queried-element-not-found?forum=winserverfiles

  • Bounced the DFS services on all three DCs
  • Able to delete the DFS namespace, but it gave an error on Server03 and left a red X on the namespace
  • Deleted Namespace from the AD using ADSIEDIT.MSC

 

  • Open ADSIedit.msc.
  • Connect to Default Naming Context (the domain name)
  • Expand and locate the container, which shows the DFS root information
  • CN=<name_of_the_DFS replication group>,CN=DFSR-GlobalSettings,CN=System,DC=<name_of_your_domain>

 

4: In DFS Management, recreate the namespace with the same name, but use the Windows Server 2008 mode

  • At first, gave additional error about Server03 – Access is denied.
  • Could not stop and restart DFS, access is denied
  • Rebooted Server03
  • Now able to stop and start DFS
  • Added namespace back to the DFS successfully

 

5: Import the namespace from the export file

  • Dfsutil root import merge c:\pps.xml \\Domain.local\xyz

Now, all appears to be well.
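On the 2012 R2 servers you can confirm the namespace really did come back in 2008 mode with the DFSN module; the path below is the placeholder namespace from the steps above:

# A Windows Server 2008 mode namespace reports a domain V2 type
Get-DfsnRoot -Path "\\domain.local\xyz" | Select-Object Path, Type, State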

Upgrade Windows 10 Home to Pro

I have a client who recently wanted to get a Windows 10 Pro PC for their office. He went to Best Buy, but all they had was the Home edition. This is not an issue, since you can upgrade Home to Pro with a simple license key upgrade. But it turned out that when I went to install the new key, I got an error. Nice! Follow MS's procedure and get an error…

BUT, I was able to track down this solution. Install a temp license key that will allow the actual upgrade to work, THEN install the newly purchased key.

VK7JG-NPHTM-C97JM-9MPGT-3V66T – Install this key first, Reboot, then install the key that you purchased.

support.microsoft.com/en-us/help/12384/windows-10-upgrading-home-to-pro – This is where you go, but you have to install the temp key first, then the correct one.
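If you would rather run the two-step key swap from an elevated command prompt instead of the Settings app, this is the general idea; treat it as a sketch, and note the second key is just a placeholder for the one you purchased:

changepk.exe /ProductKey VK7JG-NPHTM-C97JM-9MPGT-3V66T

After the reboot, install and activate the purchased key:

slmgr.vbs /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX

slmgr.vbs /ato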

Backup Hyper-V VMs with PowerShell

Backing up your Hyper-V VM environment using PowerShell has never been easier! PowerShell allows you to back up your VMs, log the results, and even email you a report with a few easy commands.

In this case, I have two physical hosts, VH-SERVER01 and VH-SERVER02. Each of them has quite a bit of HD space, so for the purposes of recovery from a simple hardware failure, backing up the VMs on one host to the other is a great, simple way to go.

I created shares on each server for the other to use as a repository: \\VH-SERVER0x\exports. I map the Z: drive to this share for simplicity's sake.

One caveat that I found is that a Domain Controller VM will not export to an SMB share, even when it is mapped with a drive letter. I researched this and found all sorts of suggestions about permissions related to computer accounts, but despite my efforts I was unable to get the direct export working. So, for the DCs, I export them locally and then move the files to the same share where the other exports are sent.

After all the exports are done, I call the script Get-DirStats.ps1 to calculate the sizes of the exported files and output the results to a file. The script code for that is located below the main script. I then use the great Send-MailMessage cmdlet to email me the results! Note that you need an SMTP server that will allow you to relay email. Since this is a corporate client, I have an Exchange receive connector that allows me to send SMTP traffic without authentication.

Some helpful links:

technet.microsoft.com/en-us/library/hh848491.aspx

social.technet.microsoft.com/Forums/scriptcenter/en-US/11213a59-e84b-47da-a505-dd6c59ce18de/powershell-troubles-with-bulk-moveitem-script?forum=ITCG

technet.microsoft.com/en-us/library/hh849925.aspx

stackoverflow.com/questions/25917637/create-folder-with-current-date-as-name-in-powershell

blogs.technet.microsoft.com/heyscriptingguy/2012/05/25/getting-directory-sizes-in-powershell/

The script code!

#deletes and adds Z: as a persistent drive since this script needs to run whether someone is logged in or not

Get-PSDrive Z | Remove-PSDrive
New-PSDrive -Name "Z" -PSProvider "FileSystem" -Root "\\VH-SERVER01\exports" -Persist
Remove-Item D:\Exp\ADSERVER02 -Recurse
New-Item -ItemType directory -Path "D:\Exp\ADSERVER02"

#define export paths

$ExportPath_D = "D:\Exp\ADSERVER02"
$ExportPath_Z = "Z:\"
$date = Get-Date
$date = $date.ToString("yyyy-MM-dd")

#Deletes files-folders that are older than 20 days

$limit = (Get-Date).AddDays(-20)
$path = $ExportPath_Z

# Delete files older than the $limit.

Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force

# Delete any empty directories left behind after deleting the old files.

Get-ChildItem -Path $path -Recurse -Force | Where-Object { $_.PSIsContainer -and (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null } | Remove-Item -Force -Recurse

New-Item -ItemType directory -Path "$ExportPath_Z$date"

#Exports are here

Export-VM -Name "ADSERVER02" -Path $ExportPath_D

Move-Item $($ExportPath_D) Z:\$($date)

Export-VM -Name "SERVER03","SERVER04","SERVER05","SERVER06" -Path $ExportPath_Z$date

#Report creation and email

c:\scripts\Get-DirStats.ps1 -Path Z:\$($date) -Every > C:\Backup-Logs\VH-SERVER02-$date.csv

Send-MailMessage -From "Backups <backups@YOURDOMAIN.com>" -To "Alerts <alerts@YOURDOMAIN.com>" -Subject "VH-SERVER02 Backup Status for $date" -Body "This is the VM backup report for VH-SERVER02." -Attachments "C:\Backup-Logs\VH-SERVER02-$date.csv" -Priority High -dno onSuccess, onFailure -SmtpServer "MAIL.YOURDOMAIN.COM"

exit
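Since the exports need to run whether anyone is logged in or not, the script gets registered as a scheduled task. A hedged sketch, assuming the script above was saved as C:\scripts\Backup-VMs.ps1 (the script path, schedule, and account are placeholders to adjust):

# Run the export script nightly at 11 PM under SYSTEM, logged on or not
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\scripts\Backup-VMs.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 11pm
Register-ScheduledTask -TaskName "Hyper-V VM Export" -Action $action -Trigger $trigger -User "SYSTEM" -RunLevel Highest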

Get-DirStats.ps1
<#
.SYNOPSIS
Outputs file system directory statistics.

.DESCRIPTION
Outputs file system directory statistics (number of files and the sum of all file sizes) for one or more directories.

.PARAMETER Path
Specifies a path to one or more file system directories. Wildcards are not permitted. The default path is the current directory (.).

.PARAMETER LiteralPath
Specifies a path to one or more file system directories. Unlike Path, the value of LiteralPath is used exactly as it is typed.

.PARAMETER Only
Outputs statistics for a directory but not any of its subdirectories.

.PARAMETER Every
Outputs statistics for every directory in the specified path instead of only the first level of directories.

.PARAMETER FormatNumbers
Formats numbers in the output object to include thousands separators.

.PARAMETER Total
Outputs a summary object after all other output that sums all statistics.
#>

[CmdletBinding(DefaultParameterSetName="Path")]
param(
[parameter(Position=0,Mandatory=$false,ParameterSetName="Path",ValueFromPipeline=$true)]
$Path=(get-location).Path,
[parameter(Position=0,Mandatory=$true,ParameterSetName="LiteralPath")]
[String[]] $LiteralPath,
[Switch] $Only,
[Switch] $Every,
[Switch] $FormatNumbers,
[Switch] $Total
)

begin {
$ParamSetName = $PSCmdlet.ParameterSetName
if ( $ParamSetName -eq "Path" ) {
$PipelineInput = ( -not $PSBoundParameters.ContainsKey("Path") ) -and ( -not $Path )
}
elseif ( $ParamSetName -eq "LiteralPath" ) {
$PipelineInput = $false
}

# Script-level variables used with -Total.
[UInt64] $script:totalcount = 0
[UInt64] $script:totalbytes = 0

# Returns a [System.IO.DirectoryInfo] object if it exists.
function Get-Directory {
param( $item )

if ( $ParamSetName -eq "Path" ) {
if ( Test-Path -Path $item -PathType Container ) {
$item = Get-Item -Path $item -Force
}
}
elseif ( $ParamSetName -eq "LiteralPath" ) {
if ( Test-Path -LiteralPath $item -PathType Container ) {
$item = Get-Item -LiteralPath $item -Force
}
}
if ( $item -and ($item -is [System.IO.DirectoryInfo]) ) {
return $item
}
}

# Filter that outputs the custom object with formatted numbers.
function Format-Output {
process {
$_ | Select-Object Path,
@{Name="Files"; Expression={"{0:N0}" -f $_.Files}},
@{Name="Size"; Expression={"{0:N0}" -f $_.Size}}
}
}

# Outputs directory statistics for the specified directory. With -recurse,
# the function includes files in all subdirectories of the specified
# directory. With -format, numbers in the output objects are formatted with
# the Format-Output filter.
function Get-DirectoryStats {
param( $directory, $recurse, $format )

Write-Progress -Activity "Get-DirStats.ps1" -Status "Reading '$($directory.FullName)'"
$files = $directory | Get-ChildItem -Force -Recurse:$recurse | Where-Object { -not $_.PSIsContainer }
if ( $files ) {
Write-Progress -Activity "Get-DirStats.ps1" -Status "Calculating '$($directory.FullName)'"
$output = $files | Measure-Object -Sum -Property Length | Select-Object `
@{Name="Path"; Expression={$directory.FullName}},
@{Name="Files"; Expression={$_.Count; $script:totalcount += $_.Count}},
@{Name="Size"; Expression={$_.Sum; $script:totalbytes += $_.Sum}}
}
else {
$output = "" | Select-Object `
@{Name="Path"; Expression={$directory.FullName}},
@{Name="Files"; Expression={0}},
@{Name="Size"; Expression={0}}
}
if ( -not $format ) { $output } else { $output | Format-Output }
}
}

process {
# Get the item to process, no matter whether the input comes from the
# pipeline or not.
if ( $PipelineInput ) {
$item = $_
}
else {
if ( $ParamSetName -eq "Path" ) {
$item = $Path
}
elseif ( $ParamSetName -eq "LiteralPath" ) {
$item = $LiteralPath
}
}

# Write an error if the item is not a directory in the file system.
$directory = Get-Directory -item $item
if ( -not $directory ) {
Write-Error -Message "Path '$item' is not a directory in the file system." -Category InvalidType
return
}

# Get the statistics for the first-level directory.
Get-DirectoryStats -directory $directory -recurse:$false -format:$FormatNumbers
# -Only means no further processing past the first-level directory.
if ( $Only ) { return }

# Get the subdirectories of the first-level directory and get the statistics
# for each of them.
$directory | Get-ChildItem -Force -Recurse:$Every |
Where-Object { $_.PSIsContainer } | ForEach-Object {
Get-DirectoryStats -directory $_ -recurse:(-not $Every) -format:$FormatNumbers
}
}

end {
# If -Total specified, output summary object.
if ( $Total ) {
$output = "" | Select-Object `
@{Name="Path"; Expression={"<Total>"}},
@{Name="Files"; Expression={$script:totalcount}},
@{Name="Size"; Expression={$script:totalbytes}}
if ( -not $FormatNumbers ) { $output } else { $output | Format-Output }
}
}

The BEST Web Hosting

I have dealt with many web hosting companies over the years, but by far IX Web Hosting is the best out there. They have the same prices as the rest of the hosting companies, but with EXCELLENT support, even for issues with WordPress and Joomla. Give them a look if you need hosting for your site! Lande Technologies can assist with a transfer of an existing site or the creation of a totally new one! www.ixwebhosting.com/

Exchange 2016 CU1 Bug

From the AWESOME blog of Mr. Cunningham

Exchange 2016 CU1 Edge Transport Recipient Validation Bug

Hi there, Exchange 2016 Cumulative Update 1 was released two weeks ago, and news has emerged of a bug impacting Edge Transport servers. In short, the bug will affect Edge Transport servers that are performing recipient validation and may result in legitimate email being rejected. Microsoft has acknowledged the bug in the release notes for Exchange 2016 and provided steps to mitigate the issue. Read more at the link below. – Paul Cunningham, Microsoft MVP, Office Servers and Services

http://exchangeserverpro.com/exchange-2016-cumulative-update-1-cause-edge-transport-reject-email-valid-recipients/?awt_l=4VCygt&awt_m=3cl8.6s9oPG0vZC

Migrating from Exchange 2010 to 2016?

If this is what you are up against, take a look at this great PDF by Scott Schnoll at Microsoft. www.advis.ch/Documents/InfoNetDay/2015/Deploying%20Exchange%20Server%202016%20-%20Microsoft%20Corporation.pdf

http://www.msexchange.org/articles-tutorials/exchange-2016-articles/migration-deployment/migrating-small-organization-exchange-2010-exchange-2016-part4.html

Disable Typing Animations – Office 2016

I just upgraded to Office 2016 (not very happy about it either) and I can't stand the typing animations in Outlook. I'm in email all day, and when the cursor lags behind my typing it drives me nuts, because I touch type and I am pretty fast with it. To disable this, follow this great article! www.404techsupport.com/2012/11/disable-cursor-animation-word-2013/
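For Office 2016 the commonly cited equivalent is a DisableAnimations value under the 16.0 Graphics key; the article above covers the 2013 (15.0) path, so treat this key path as an assumption and test it on one machine first:

# Assumed Office 2016 key path; creates the Graphics key and disables animations for the current user
New-Item -Path "HKCU:\Software\Microsoft\Office\16.0\Common\Graphics" -Force | Out-Null
New-ItemProperty -Path "HKCU:\Software\Microsoft\Office\16.0\Common\Graphics" -Name "DisableAnimations" -Value 1 -PropertyType DWord -Force | Out-Null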