Wednesday, December 16, 2009

Translating Data With PowerShell

I was recently confronted with a situation where I needed to transform a list of developer-friendly filenames contained in a text file into a user-friendly list of report names. In the past I probably would have written a console app to do this work, but PowerShell is a much more lightweight solution to this problem.

The filenames were the result of a query against TFS version control and were in the format of $/[TFS Project Name]/Reports/[Codeline]/[Visual Studio Project Name]/*.rdl. A given *.rdl file, for example "UpcomingBdayRpt.rdl", would be represented in the application as "Upcoming Birthdays Report". I needed to provide the names that the tester was familiar with from the application. Providing a list of cryptic names like "UpcomingBdayRpt.rdl" would be as useful as providing a list in cuneiform (assuming that the tester doesn't read cuneiform).

The first step was to get the list into PowerShell. (For the purposes of this demonstration assume that I've already navigated in PowerShell to the directory that contains the source files.)
$allReportFiles = Get-Content .\MyInputFile.txt

Because I had a predictable path pattern I could use Split on the forward slash to get a list of only the filename. I removed the extension on the filename with -replace.
$rdlNames = $allReportFiles | ForEach-Object { $_.Split("/")[5] -replace "\.rdl$", "" }

The database that supports our application has a cross-reference table that relates the *.rdl name to the user-friendly name. I exported the two columns that I needed into a comma-delimited file named "reports.txt". The format of the text file looked like this:
DocumentLetter,Document Letter
InterviewReport,Interview Report
OutstandingItemsLetter,Outstanding Items Letter

I got lazy during the next step. Pipelining the contents of $rdlNames to ForEach-Object, I used Get-ChildItem to get a reference to my data file, searched that file for a line containing the filename in "$_", and, when the line was found, grabbed the FileName, Pattern, and Line properties. Note that I don't need the FileName property, but I included it so that I could confirm that my data was coming from where I expected. I'm paranoid about things like that.
$data = $rdlNames | ForEach-Object { Get-ChildItem * -include reports* | Select-String -pattern $_.Trim() -SimpleMatch} | Select-Object -property FileName, Pattern, Line -unique
So why do I consider this lazy? Well, there are almost certainly more elegant solutions to this problem. But then again, this is just a trivial script meant to solve a unique problem. The solution doesn't have to stand the test of time nor be incredibly efficient; it just needs to get the job done.
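For instance, a slightly tidier approach might have been to read reports.txt into a hashtable once and then do the lookups in memory. Here's a rough, untested sketch of that idea:

# Build a lookup table from reports.txt: rdl name -> user-friendly name.
$lookup = @{}
Get-Content .\reports.txt | ForEach-Object {
    $parts = $_.Split(",")
    $lookup[$parts[0].Trim()] = $parts[1].Trim()
}
# Translate each rdl name, keeping the developer-friendly name in brackets.
$rdlNames | ForEach-Object { $lookup[$_.Trim()] + " [" + $_.Trim() + "]" } | Sort-Object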

The final step is just to extract the user-friendly names from the objects in $data using $_.Line.split(",")[1]. In case I need to discuss one of the reports with the tester, I provide the developer-friendly filename in brackets so that we can translate between the two names (developer and tester).
$data | ForEach-Object{$_.Line.split(",")[1] + " [" + $_.Pattern + "]" } | Sort-Object

So that's it. I copied and pasted the output from the final line into an email and I was done. The tester had the information that she needed, all was right in the world, and I could leave on time for once.

Thursday, November 19, 2009

Phantom TNSNAMES entries

I have a PowerShell script that uses Oracle.DataAccess. It retrieves its connection information from an existing configuration file on a server. The connection information is as basic as you can get: Data Source, User Id, Password. I had already successfully installed and executed the script on a development server but received the following error when I attempted to run the script on my test server: "ORA-12505: TNS:listener does not currently know of SID given in connect descriptor".
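For context, the data-access portion of the script boils down to something like this (the data source and credentials shown here are placeholders, not the real values pulled from the configuration file):

# Load ODP.NET and open a basic connection.
[System.Reflection.Assembly]::LoadWithPartialName("Oracle.DataAccess") | Out-Null
$connectionString = "Data Source=MYDB;User Id=myuser;Password=mypassword"
$connection = New-Object Oracle.DataAccess.Client.OracleConnection($connectionString)
$connection.Open()
# ... queries go here ...
$connection.Close()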

I was baffled. The test server is an application server that has been running in good order for a couple of years. The application that runs on it (let's call it SERVICE) successfully talks to the "missing" database almost continuously. The TNSNAMES file on the test server contains a single entry, and I knew that entry was valid because SERVICE was up and running.

I immediately started checking event logs, experimenting with case sensitivity in the connection string and the PowerShell script, and tweaking environment variables. I executed the relevant steps from the script in the PowerShell console. I scoured the server's file system for extra TNSNAMES.ORA files but found only the one that I expected. I stopped SERVICE in case it was somehow blocking my script's database calls. To make the issue even more confounding, I could specify a Data Source that wasn't even listed in the TNSNAMES file and I could then open the OracleConnection!

After several hours I gave up. I returned to the problem over a week later with a strange notion to check any mapped network drives. I've got a default mapped drive created (I assume) when my domain account was created. Looking in this drive I had a Eureka! moment: an old TNSNAMES file from my development machine that was full of entries. Suddenly it all made sense: I could connect to databases not present in the test server's TNSNAMES file because the entries were present in the 'network' TNSNAMES file. And, conversely, I couldn't connect to the test server's TNS entry because the port value had changed in the past couple of months and my 'network' TNSNAMES file had the old, invalid port.
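Had I thought to look for stray copies up front, a sweep of every filesystem drive, mapped network drives included, would have pointed at the culprit much sooner. A rough sketch (slow, but thorough):

# Check where Oracle is told to look first, then hunt for every tnsnames.ora in reach.
$env:TNS_ADMIN
Get-PSDrive -PSProvider FileSystem | ForEach-Object {
    Get-ChildItem -Path $_.Root -Filter tnsnames.ora -Recurse -ErrorAction SilentlyContinue
} | Select-Object -Property FullName, LastWriteTime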

What I had expected to be a trivial smoke test turned out to be much, much more. There's a lesson in here somewhere. The obvious one is that I probably should have started off by configuring the script with the credentials it will use in production. Another lesson is that software development can be maddeningly frustrating and that sometimes you just have to walk away. We don't all have the luxury of time that I had during this exercise, but sometimes some distance from a problem really brings clarity.

Friday, October 23, 2009

Visual Studio 2010 Beta1 uninstall

There's a new Visual Studio 2010 beta available. I wanted to see if the problems I've experienced using TeamFuze with beta 1 were still present in the new bits. I hit what seemed to be a showstopper uninstalling beta 1 from Windows 7 Professional x64: a prompt for the media installation path so that the 'TFS Object Model' could be removed. Uh-oh, I thought: I had just deleted the installation .iso file AND emptied the Recycle Bin. I thought for a moment that I was either going to have to pull out a file undelete tool or download beta 1 again.

Luckily, I found Scott Hanselman's post and was able to uninstall beta 1 without resorting to extraordinary measures.

Thursday, October 15, 2009

Windows 7 is satisfactory

I won't state anything profound or even moderately interesting in this post. I've been using various flavors of Windows 7 RTM for almost two months now and can report no problems. A friend of mine thought that I should have a more enthusiastic view of Windows 7. My response: "It's an OS. I've seen them come and go." But don't take my lack of excitement for a negative viewpoint. In fact, lack of excitement is a good thing when it comes to an operating system. I want something stable that allows me to get my work done. I want an OS that doesn't annoy or frustrate me whether it is at home or work. Windows 7 fulfills those requirements. It is pleasingly adequate.

Monday, October 5, 2009

When "Copy Local" doesn't copy locally

There are days when software development is fun, challenging, rewarding. Then there are days like today when frustration makes me want to hurl my monitors against the wall and generally destroy my office like Keith Moon.

The day started innocently enough as I tried to wrap up some final release details by executing my stable old deployment scripts. By 'stable' and 'old' I mean they haven't had to change in over a year and have staged several successful releases. One of the scripts reported that it couldn't find Oracle.DataAccess.dll. That's odd, I thought, and proceeded to waste 2 hours investigating why my reference wasn't copying locally. Because the scripts had been so seemingly reliable for so long, I had forgotten their flow. I couldn't remember who was responsible for copying Oracle.DataAccess.dll to the proper deployment location so I had to take a trip down memory lane and get reacquainted.

The bottom line is that even though the reference is set to "Copy Local", its presence in the GAC causes MSBuild not to copy it to the output directory. Here's a great description of the problem and solution. And here's a second post on the subject.
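As I understand it, the usual fix is to make the intent explicit by adding a <Private>True</Private> element to the reference in the project file, which overrides the GAC-based default. Here's a rough PowerShell sketch of patching that in (the project path is just a placeholder):

# Hypothetical sketch: force Copy Local for the Oracle.DataAccess reference by
# adding an explicit <Private>True</Private> element to the project file.
$projectFile = "C:\Workspace\MyProject\MyProject.csproj"
$proj = New-Object System.Xml.XmlDocument
$proj.Load($projectFile)
$nsUri = "http://schemas.microsoft.com/developer/msbuild/2003"
$nsMgr = New-Object System.Xml.XmlNamespaceManager($proj.NameTable)
$nsMgr.AddNamespace("msb", $nsUri)
$reference = $proj.SelectSingleNode("//msb:Reference[starts-with(@Include,'Oracle.DataAccess')]", $nsMgr)
if ($reference -ne $null -and $reference.SelectSingleNode("msb:Private", $nsMgr) -eq $null) {
    $private = $proj.CreateElement("Private", $nsUri)
    [void]$private.AppendChild($proj.CreateTextNode("True"))
    [void]$reference.AppendChild($private)
    $proj.Save($projectFile)
}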

I don't know how long the build and deployment system has had this problem. Probably a long time. There are a few key files, such as Oracle.DataAccess.dll, that get injected into the build process. The deployment scripts package everything and were probably obscuring the issue so that it didn't surface until now. Ultimately the problem is mine since I am the de facto build engineer here. I think that the best course of action is to rebuild the build and deployment environments after this release. There could be other lurking problems like this one that would be discovered by a fresh build server.

Monday, September 28, 2009

Dumbing down work environments

Our LOB application has a couple of different codelines. Occasionally two will be in use simultaneously for a short period of time. This requires developers to think about where they are working. The TFS task that is assigned contains all relevant information, such as the codeline in which the developer should be working, but there are a couple of people who just always pick the wrong one.

Maybe I need to dumb down the work environment. Right now we've got separate VS2008 solution and project files for each codeline. We've got separate databases related to each codeline. And the multitude of reports are also similarly segregated by codeline. I suppose that I could spend time getting the codeline name somehow worked into all object names, but this feels unnecessarily infantile to the other developers who are able to consistently pick the right place to work.

We have no junior developers on staff. No one is new to our organization either, so it's not like our software development processes are novel to anyone. Maybe I'm being too hard on them, but my expectation is that if your title is 'Senior Software Engineer' then you should be able to successfully analyze and execute work assignments. Knowing the personalities of the individuals involved, I don't think that this behavior is intentional. That leads me to the conclusion that it's either apathy or sloppiness. Or both.

I suppose that this is a sore spot for me because I resent spending so much time merging and synchronizing. When someone works in the wrong place I've got to spend even more time sorting it all out. And then I also feel really petty when I have to scold someone who is a highly skilled, highly compensated professional for making such a rookie mistake. Again.

Friday, September 4, 2009

Windows 7 x64, Toad 9.7.2.5, Oracle 11g ODAC

After too much time and frustration, I finally got Toad (version 9.7.2.5) and Oracle Data Access Components to work on Windows 7 Professional x64.

For hours Toad wouldn't even acknowledge that an Oracle client was installed. I tried the 11g x86 client, the Instant Client, and the 10g x64 client; finally, Oracle 11g ODAC and Oracle Developer Tools for Visual Studio (AKA ODTwithODAC1110621.zip) started to work. At one point while trying to get Instant Client to work I manually set the ORACLE_HOME, TNS_ADMIN, and PATH environment variables. I don't know if that had any effect on my success, and I'm too scared to remove them to find out, considering the fragility of the Toad / Oracle client interaction. For now I'm just happy that my Dell Latitude E6500 can play a role other than netbook.
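For reference, setting those machine-wide from an elevated PowerShell prompt looks roughly like this (the Oracle home path is only an example; yours will differ):

# Point ORACLE_HOME and TNS_ADMIN at the client install and put its bin folder on the PATH.
$oracleHome = "C:\Oracle\product\11.1.0\client_1"
[Environment]::SetEnvironmentVariable("ORACLE_HOME", $oracleHome, "Machine")
[Environment]::SetEnvironmentVariable("TNS_ADMIN", "$oracleHome\network\admin", "Machine")
$machinePath = [Environment]::GetEnvironmentVariable("Path", "Machine")
[Environment]::SetEnvironmentVariable("Path", "$oracleHome\bin;$machinePath", "Machine")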

Wednesday, August 19, 2009

Corrupt TNSNAMES.ORA file

I just spent three hours troubleshooting this error: "ORA-12154: TNS:could not resolve the connect identifier specified." I had three TNSNAMES.ORA files with identical information. Oracle.DataAccess interaction with two of the three resulted in the ORA-12154 error. After scouring Google, I finally came across this nugget of wisdom:

Just a note, sometimes the TNSNAMES.ORA will get messed up. Nothing will jump out but it will not work and you cannot see the file through the net config. Just re-create it using notepad or [s]omething.

I replaced the two misbehaving files with the 'good' file and guess what? ORA-12154 magically went away.
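If it ever happens again, a quick byte-level comparison of a 'good' file against a 'bad' one should expose whatever invisible junk is lurking in there. A rough sketch (the paths are illustrative):

# Compare the files byte-for-byte; a BOM or stray control character will show up
# here even though the files look identical in a text editor.
$good = [System.IO.File]::ReadAllBytes("C:\oracle\good\TNSNAMES.ORA")
$bad  = [System.IO.File]::ReadAllBytes("C:\oracle\bad\TNSNAMES.ORA")
"Good file: $($good.Length) bytes; bad file: $($bad.Length) bytes"
for ($i = 0; $i -lt [Math]::Min($good.Length, $bad.Length); $i++) {
    if ($good[$i] -ne $bad[$i]) { "First difference at byte offset $i"; break }
}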

Wednesday, August 5, 2009

ClickOnce and NHibernate

I've been working on a WPF MVVM intranet application that uses NHibernate for data access. As I expected, the initial ClickOnce deployment failed due to a missing configuration file and Castle assemblies. My View project used a Post-build event to get the required NHibernate files so that everything would run smoothly. Here's what that command line looks like:


C:\Windows\Microsoft.NET\Framework\v3.5\MSBuild.exe
   "$(ProjectDir)DeploymentUtil\CopyNHibernateStuff.xml"
   "/p:TargetCompileDirectory=$(TargetDir);TargetProjectDirectory=$(ProjectDir)"


Originally it just used the $(TargetDir) macro as an input to an MSBuild project file to copy the NHibernate files to the bin\Debug directory. In consideration of ClickOnce, I modified it to also use the $(ProjectDir) macro to copy the files to the View project's root directory. Here are the contents of the CopyNHibernateStuff.xml file:


<?xml version="1.0" encoding="utf-8"?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetCompileDirectory>.</TargetCompileDirectory>
    <TargetProjectDirectory>.</TargetProjectDirectory>
    <HibernateCfgPath>C:\SourceTfs\Patient Safety Organization\Eligibility Manager\PsorgEligibilityData\hibernate.cfg.xml</HibernateCfgPath>
  </PropertyGroup>
  <ItemGroup>
    <NHFiles Include="C:\Program Files\NHibernate\Required_For_LazyLoading\Castle\Castle.Core.dll" />
    <NHFiles Include="C:\Program Files\NHibernate\Required_For_LazyLoading\Castle\Castle.DynamicProxy2.dll" />
    <NHFiles Include="C:\Program Files\NHibernate\Required_For_LazyLoading\Castle\NHibernate.ByteCode.Castle.dll" />
    <NHFiles Include="C:\Program Files\NHibernate\Required_For_LazyLoading\Castle\NHibernate.ByteCode.Castle.pdb" />
    <NHFiles Include="$(HibernateCfgPath)" />
  </ItemGroup>
  <Target Name="CopyNH">
    <Message Text="NHFiles: @(NHFiles)" />
    <Message Text="TargetCompileDirectory: $(TargetCompileDirectory)" />
    <Copy SourceFiles="@(NHFiles)" DestinationFolder="$(TargetCompileDirectory)" />
    <!-- Now copy files (again) to project directory so that they can be referenced as a publishable file by ClickOnce. -->
    <Copy SourceFiles="@(NHFiles)" DestinationFolder="$(TargetProjectDirectory)" />
  </Target>
</Project>


I did some spelunking in the View project file to determine how items appeared in the Application Files dialog on the Publish tab. I figured out that I could manually add my required NHibernate references in the following manner and they would be available to ClickOnce for publishing.


<ItemGroup>
  <Content Include="hibernate.cfg.xml" />
  <Reference Include="Castle.Core.dll" />
  <Reference Include="Castle.DynamicProxy2.dll" />
  <Reference Include="NHibernate.ByteCode.Castle.dll" />
</ItemGroup>


Of course after I did all of this work I discovered this post which described my process in detail. At least it confirmed that I wasn't terribly abusing MSBuild.

Monday, July 20, 2009

NHibernate and Oracle 9i

Spent 1.5 hours troubleshooting the following error when using NHibernate 2.1:

"NHibernate.HibernateException: Could not instantiate dialect class NHibernate.Dialect.Oracle9Dialect ---> System.TypeLoadException: Could not load type NHibernate.Dialect.Oracle9Dialect. Possible cause: no assembly name specified."

I was looking at outdated documentation. The correct dialect is "NHibernate.Dialect.Oracle9iDialect" (note the "i"). "NHibernate.Dialect.Oracle10gDialect" is also valid.

Friday, April 24, 2009

The Case of the Disappearing Drive

I made a big mistake recently when reinstalling Windows XP Professional SP3: I didn't disconnect my secondary hard drive. After installation was complete, the second hard drive appeared as unallocated space in Disk Management. The NTFS-formatted disk had plenty of stuff on it. Well, it did prior to my lapse in judgment. TestDisk saved me. Great tool. These two tutorials (thanks Herman!) walked me through recovering my partition, which obviously I had blown away during the XP install process. I actually had to use the 'dig deeper' method to get my partition back.

Friday, April 3, 2009

VS2008 WPF Intellisense no longer works after SDK install

Another in the "note to self" series...

Intellisense stopped working for VS2008 WPF projects after installing the .NET 3.5 SDK. Resolution is here. This fixed my problem without a machine reboot.

Tuesday, March 31, 2009

Determine installed MS hotfixes

This is really just a note to self.

Enter the following on the command line:
wmic qfe list full /format:htable > C:\hotfixes.htm
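A rough PowerShell equivalent, if a shell is already open (the output path is just an example):

# Same idea via WMI: dump the installed hotfixes to an HTML table.
Get-WmiObject -Class Win32_QuickFixEngineering |
    Select-Object -Property HotFixID, Description, InstalledOn, InstalledBy |
    ConvertTo-Html | Out-File C:\hotfixes.htm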

Wednesday, February 25, 2009

Error while installing Systems Management Server 2003 Service Pack 3 on Windows Vista

I hope that this saves someone some time.

If your attempt to install SMS SP3 on Windows Vista fails with the message "not enough storage to process this command" then follow the steps enumerated in this support article.

Monday, January 12, 2009

FreeNAS Server

Built a network-attached storage server using FreeNAS and a Dell OptiPlex 280 based on this guide from Maximum PC. Setup was unbelievably easy and the machine works great. The only problem I ran into was creating my shares. This post helped me understand the configuration required.

Friday, January 2, 2009

Editing multiple VS2008 solution files with PowerShell

Our huge LOB application is made up of 30+ Visual Studio 2008 projects. One of these projects, which I'll call the 'Framework', is used as a reference by many of the other projects. We use several different solution files, each containing a subset of the 30+ projects, in order to improve Visual Studio performance as well as reduce Solution Explorer visual clutter.

I recently needed to move the Framework project to a new location in Team Foundation Version Control (TFVC). This move meant that some of the solution files needed to be altered because the TFVC references inside the file would now be invalid. Specifically, the SccProjectNameNN reference needed to be altered, where NN is the ordinal value of the project reference in the solution file.

Doing this in Visual Studio would be extremely tedious: each solution file would have to be opened, the invalid Framework project removed (which would also cause the reference to be automatically removed from the other projects in the solution), the valid Framework project added back to the solution, and the Framework project reference added back to each affected project. Time was also a factor. I didn't want to hold up developers by having solution and project files checked out. We allow shared checkouts, but I was concerned about developer TFS workspaces if I made any mistakes.

Recently I had used PowerShell to alter reference paths in a Visual Studio project file. Could I do the same with the solution files? (Hint: yes, I could.)


PS H:\> $projectLocation = "C:\Workspace"
PS H:\> $files = Get-ChildItem $projectLocation -Include *.*sln -Recurse |
    Select-String -Pattern "/OldVersionControl" -SimpleMatch |
    Select-Object -Property Path
PS H:\> # CheckOutTfsFile is a helper function (not shown) that checks the file out of TFS.
PS H:\> $files | ForEach-Object { CheckOutTfsFile($_.Path) }
PS H:\> $match = "/OldVersionControl"
PS H:\> $replacement = "/NewVersionControl"
PS H:\> $files | ForEach-Object {
    (Get-Content -Path $_.Path -Encoding UTF8) -replace $match, $replacement |
        Set-Content $_.Path -Encoding UTF8 }


As you can see, there's no real magic here: just a simple replace operation inside each file whose content matches '/OldVersionControl'. The pipeline allowed me to avoid the dreaded manual tedium of updating multiple solution files.