Calling x64 CLI Tools in x86 Scripting Tools and Processes

Every now and then I get the same question from people who have only recently decided to make the switch to a 64-bit Windows operating system. I’ve been running x64 since Vista RTM and I’m very happy with it. When those people start scripting with their tools, which are 32-bit, and call some CLI tool in %windir%\System32, they can run into an annoying issue that expresses itself in the correct yet somewhat misleading “WshShell.Exec: The system cannot find the file specified.”. But you know it’s there in %windir%\System32, you checked and double-checked!

When your scripting tool is 32-bit and you run your script, it usually launches the 32-bit version of the CLI tool you’re calling. This behavior is a result of File System Redirection. This is a transparent process that’s part of the Windows-on-Windows 64-bit (WOW64) subsystem that is used to run 32-bit apps. When a 32-bit application calls a CLI tool in the %windir%\System32 directory, the call is silently redirected to %windir%\SysWOW64, where 32-bit apps can happily run without a worry on a 64-bit operating system. Yes, indeed: %windir%\System32 is for x64 code only and %windir%\SysWOW64 is for 32-bit code.
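To make the mechanism concrete, here is a minimal sketch in Python. It is purely an illustrative model of the path remapping, not the actual WOW64 implementation, and it ignores details such as the exempt subdirectories (e.g. drivers\etc) and the %windir% location:

```python
# Illustrative model of WOW64 File System Redirection -- a simplification,
# not the real implementation, and ignoring exempt subdirectories.
SYSTEM32 = r"c:\windows\system32"

def effective_path(path, process_is_32bit, os_is_64bit):
    """Return the location a file access is actually served from."""
    if process_is_32bit and os_is_64bit and path.lower().startswith(SYSTEM32):
        # 32-bit callers on x64 Windows are transparently redirected to SysWOW64.
        return r"C:\Windows\SysWOW64" + path[len(SYSTEM32):]
    return path

print(effective_path(r"C:\Windows\System32\ping.exe", True, True))
# -> C:\Windows\SysWOW64\ping.exe
```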

What’s in a name 🙂 Some people argue they should have used System32 for 32-bit and System64 for x64 code, but I’m sure they had their reasons for what they did (i.e. it would have been compatibility hell otherwise, I guess). Other suggestions have also been made by people who are far better qualified than I am, for example by Mark Russinovich, a hard-core systems developer, in http://blogs.technet.com/b/markrussinovich/archive/2005/05/07/running-everyday-on-64-bit-windows.aspx.

Now all this can happen transparently for the user if the tools used have both an x64 and an x86 version. Cmd.exe and ping.exe are fine examples. If you run some VBScript in my favorite scripting tool, for example (SAPIEN PrimalScript), which is 32-bit, it will launch a 32-bit cmd.exe, which launches the 32-bit version of cscript.exe, which in turn will launch ping.exe (using WScript.Shell) in %windir%\SysWOW64 by silently redirecting your %windir%\System32 path. No worries, you don’t know any better and the result is the same. So it’s usually not a problem if there is both an x64 and an x86 version of the CLI tool, as you have seen in the ping.exe example. When a 32-bit process calls a tool in %windir%\System32, it’s redirected to %windir%\SysWOW64 and uses the 32-bit version. No harm done.

The proverbial shit hits the fan when you call a CLI tool that only has an x64 version. As the scripting tool is x86, its call is redirected to %windir%\SysWOW64 and the script fails miserably, as the CLI tool can’t be found there. This can be pretty annoying when writing and testing scripts. The CLI backup tool of Windows Backup, wbadmin.exe, is a prime example: it does not have a 32-bit version. Consider this little script for example:

Option Explicit

Dim oShell
Dim oExecShell
Dim sBackupCommandString
Dim sText

Set oShell = CreateObject("WScript.Shell")
'sBackupCommandString = "%windir%\Sysnative\wbadmin get disks"
sBackupCommandString = "%windir%\system32\wbadmin get disks"

Set oExecShell = oShell.Exec(sBackupCommandString)

Do While oExecShell.Status = 0
    Do While Not oExecShell.StdOut.AtEndOfStream
        sText = oExecShell.StdOut.ReadLine()
        WScript.Echo sText
    Loop
    WScript.Sleep 100 'Avoid busy-waiting while the tool is still running
Loop

Set oShell = Nothing
Set oExecShell = Nothing

There is a lot of File System Redirection going on here to %windir%\SysWOW64 when running this code in the 32-bit scripting tool. That tool launches the 32-bit cmd.exe and thus the 32-bit cscript.exe, which then launches a 32-bit shell and tries to run "%windir%\system32\wbadmin get disks", which is also redirected to %windir%\SysWOW64, where wbadmin cannot be found, throwing the error: “WshShell.Exec: The system cannot find the file specified.”. If you don’t have a 32-bit code editor, just launch the script manually from a 32-bit command prompt to see the error.

The solution, as demonstrated here, is to use “%windir%\Sysnative\wbadmin.exe get disks”. Uncomment that line and comment out the line with sBackupCommandString = "%windir%\system32\wbadmin get disks". Do the same test again and voilà: it runs. So there you have it, you can easily test your script now. Just make sure that when the time comes to put it out in the wild you replace it with the real path if the calling process is x64, which for example wscript.exe and cscript.exe are when you launch them from an x64 shell (explorer.exe or cmd.exe), which is the default on an x64 operating system. The x86 versions run when you launch them from an x86 shell. But remember: the default on 64-bit operating systems is x64, and Sysnative only functions when called from a 32-bit process (it’s a virtual directory that doesn’t really exist).
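If you’d rather have the script pick the right path at run time than swap lines by hand, one common approach is to check for the PROCESSOR_ARCHITEW6432 environment variable, which WOW64 only sets for a 32-bit process running on 64-bit Windows. A minimal sketch of that decision in Python (passing the environment in as a dictionary so the logic is easy to see and test):

```python
# Sketch: choose the directory that reaches the NATIVE system tools.
# Assumption: PROCESSOR_ARCHITEW6432 is only present in the environment of a
# 32-bit (WOW64) process running on 64-bit Windows.
def native_system_dir(environ):
    if "PROCESSOR_ARCHITEW6432" in environ:
        # 32-bit process on x64: Sysnative bypasses File System Redirection.
        return r"%windir%\Sysnative"
    # Native process (32-bit on x86 Windows, or 64-bit on x64): System32 is fine.
    return r"%windir%\System32"

command = native_system_dir({"PROCESSOR_ARCHITEW6432": "AMD64"}) + r"\wbadmin get disks"
print(command)  # -> %windir%\Sysnative\wbadmin get disks
```

In a real script you would pass os.environ; the same environment check can be done from VBScript via the WScript.Shell Environment collection.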

Sysnative was introduced in Windows Vista / Windows Server 2008 x64. Not only 32-bit script editor users are affected by this; all 32-bit processes launching tools in %windir%\System32 are. See more on MSDN via this link: http://msdn.microsoft.com/en-us/library/aa384187(VS.85).aspx. For the folks running Windows XP or Windows Server 2003 x64, it is perhaps time you consider upgrading to Windows Server 2008 R2 or Windows 7 x64? If you can’t, no need to worry, you’re in luck: Microsoft did create a hotfix for you (http://support.microsoft.com/?scid=kb;en-us;942589) that introduces Sysnative on those platforms. So welcome to the x64 universe, beware of File System Redirection in WOW64, and happy scripting 🙂

Setting Dates on Folders With PowerShell

A friend of mine at a Business Intelligence company asked me a favor. They have a lot of data (files & folders) that has to be copied around in the lab, at clients, etc. This often leaves the modified date on the folders not reflecting the date of the most recent modification in that folder’s substructure. This causes a lot of confusion in their processes, communication and testing.

They needed a script to correct that. Now, they wanted a script, not an application (no installations, editable code). The good news: they had a Windows machine (XP or higher) to run the code on, and file sharing on Linux used Samba, so we could use PowerShell. VBScript/JScript can only change dates on files, via the Shell.Application object, but NOT on folders. They also can’t directly call Windows APIs. First of all that’s “unmanaged code to the extreme”, and using a COM dll to get access to the Windows API violates the conditions set out from the start. But luckily PowerShell came to the rescue!

To accomplish the request we sort of needed to walk the tree backwards, from all its branches back to the root. I’m no math guru, so writing that sort of reverse recursive algorithm wasn’t really an option. I decided to use plain good old recursion and count the depth of the folder structure, so I’d know how many times I needed to recursively parse through it to get the correct modified date to “walk up” the folder structure. Here’s a snippet as a demo:


# Demo snippet

$root = "E:\TestRoot\TestDataStructure" # The folder structure to parse
$DeepestLevel = 0 # A counter to persist the deepest level found so far

# Loop through the folder structure recursively to determine the deepest level.
foreach ($folder in Get-ChildItem $root -Recurse | Where-Object {$_.PsIsContainer})
{
    $search = $folder.FullName
    Write-Host "Folder: $search"
    # Sort the returned objects by modified date and select the most recent (last) one
    $Return = Get-ChildItem $search | Sort-Object LastWriteTime | Select-Object -Last 1
    Write-Host "Child file/subfolder most recently modified: $Return"
    if ($Return -ne $null)
    {
        # Check how deep the current level is
        $LevelCheck = $Return.FullName.Split("\").Count - 1
        # Compare with the deepest level found so far and raise it if needed.
        if ($LevelCheck -gt $DeepestLevel) {$DeepestLevel = $LevelCheck}
        Write-Host "LevelCheck: $LevelCheck"
        Write-Host "DeepestLevel: $DeepestLevel"
    }
}

# Now actually walk the folder structure recursively x times, where x = $DeepestLevel
do {
    foreach ($folder in Get-ChildItem $root -Recurse | Where-Object {$_.PsIsContainer})
    {
        $search = $folder.FullName
        # Sort the returned objects by modified date and select the most recent (last) one
        $Return = Get-ChildItem $search | Sort-Object LastWriteTime | Select-Object -Last 1
        Write-Host "Child file or folder most recently modified: " $Return.FullName
        # Set the modified date on the parent folder to that of the most recently modified child object
        if ($Return -ne $null) {$folder.LastWriteTime = $Return.LastWriteTime}
        Write-Host "Parent folder " $search " last modified date set to " $Return.LastWriteTime
    }
    $DeepestLevel-- # Counter - 1
} until ($DeepestLevel -le 0)

Going through the folder structure too often is OK; going through it too few times is bad, as it doesn’t accomplish the goal. So the logical bug in the code that loops once too many due to the “\” in the UNC path isn’t an issue. Not really elegant, but very effective. The speed is also acceptable: it ran through 30,000 files, 20 GB in all, in about a minute. Quick & dirty does the trick sometimes.
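For comparison, the same end result can be had in a single bottom-up pass, because a depth-first walk can visit children before their parents, so each folder’s date is already correct by the time its parent is stamped. A sketch of that idea in Python (using the standard library’s os.walk with topdown=False; not a translation of the PowerShell above, just the same goal):

```python
import os

# Single bottom-up pass: os.walk(topdown=False) yields child directories
# before their parents, so one traversal replaces the repeated passes.
def touch_parents(root):
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        children = [os.path.join(dirpath, n) for n in dirnames + filenames]
        if not children:
            continue  # Empty folder: leave its timestamp alone
        # Most recent modified date among the direct children
        newest = max(os.path.getmtime(c) for c in children)
        # Keep the access time, set the folder's modified date to the newest child's
        os.utime(dirpath, (os.path.getatime(dirpath), newest))
```

Like the snippet above, this is demo code without error handling; test it on a copy of your data first.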

The code will work with PowerShell 1.0/2.0 against a local and a UNC path as long as you have the correct permissions.

This is just a code snippet, not the production code with error handling, so please test it in a lab & understand what it does before letting it rip through your folder structures.

Cheers