PGP Single Pass Sign and Encrypt with Bouncy Castle

Bouncy Castle is a great open source resource. However, the off-the-shelf PGP functionality is severely lacking in real-world usability. Most of what you need is easy enough to code up yourself (and I would love to contribute what I've done if I could). One thing that you really need that doesn't come built-in, and is actually quite hard to get right, is PGP single-pass sign and encrypt. Here's a class in the style of csharp\crypto\test\src\openpgp\examples\KeyBasedFileProcessor.cs that does exactly that.

using System;
using System.IO;
using Org.BouncyCastle.Bcpg;
using Org.BouncyCastle.Bcpg.OpenPgp;
using Org.BouncyCastle.Security;

namespace PgpCrypto
{
	public class PgpProcessor
	{
		public void SignAndEncryptFile(string actualFileName, string embeddedFileName,
			Stream keyIn, long keyId, Stream outputStream,
			char[] password, bool armor, bool withIntegrityCheck, PgpPublicKey encKey)
		{
			const int BUFFER_SIZE = 1 << 16; // should always be power of 2

			if (armor)
				outputStream = new ArmoredOutputStream(outputStream);

			// Init encrypted data generator
			PgpEncryptedDataGenerator encryptedDataGenerator =
				new PgpEncryptedDataGenerator(SymmetricKeyAlgorithmTag.Cast5, withIntegrityCheck, new SecureRandom());
			encryptedDataGenerator.AddMethod(encKey);
			Stream encryptedOut = encryptedDataGenerator.Open(outputStream, new byte[BUFFER_SIZE]);

			// Init compression
			PgpCompressedDataGenerator compressedDataGenerator = new PgpCompressedDataGenerator(CompressionAlgorithmTag.Zip);
			Stream compressedOut = compressedDataGenerator.Open(encryptedOut);

			// Init signature
			PgpSecretKeyRingBundle pgpSecBundle = new PgpSecretKeyRingBundle(PgpUtilities.GetDecoderStream(keyIn));
			PgpSecretKey pgpSecKey = pgpSecBundle.GetSecretKey(keyId);
			if (pgpSecKey == null)
				throw new ArgumentException(keyId.ToString("X") + " could not be found in specified key ring bundle.", "keyId");
			PgpPrivateKey pgpPrivKey = pgpSecKey.ExtractPrivateKey(password);
			PgpSignatureGenerator signatureGenerator = new PgpSignatureGenerator(pgpSecKey.PublicKey.Algorithm, HashAlgorithmTag.Sha1);
			signatureGenerator.InitSign(PgpSignature.BinaryDocument, pgpPrivKey);
			foreach (string userId in pgpSecKey.PublicKey.GetUserIds())
			{
				PgpSignatureSubpacketGenerator spGen = new PgpSignatureSubpacketGenerator();
				spGen.SetSignerUserId(false, userId);
				signatureGenerator.SetHashedSubpackets(spGen.Generate());
				// Just the first one!
				break;
			}
			// The one-pass signature header must precede the literal data
			signatureGenerator.GenerateOnePassVersion(false).Encode(compressedOut);

			// Create the Literal Data generator output stream
			PgpLiteralDataGenerator literalDataGenerator = new PgpLiteralDataGenerator();
			FileInfo embeddedFile = new FileInfo(embeddedFileName);
			FileInfo actualFile = new FileInfo(actualFileName);
			// TODO: Use lastwritetime from source file
			Stream literalOut = literalDataGenerator.Open(compressedOut, PgpLiteralData.Binary,
				embeddedFile.Name, actualFile.LastWriteTime, new byte[BUFFER_SIZE]);

			// Open the input file
			FileStream inputStream = actualFile.OpenRead();

			byte[] buf = new byte[BUFFER_SIZE];
			int len;
			while ((len = inputStream.Read(buf, 0, buf.Length)) > 0)
			{
				literalOut.Write(buf, 0, len);
				signatureGenerator.Update(buf, 0, len);
			}

			inputStream.Close();
			literalDataGenerator.Close();
			// Append the signature after the literal data, then close everything up
			signatureGenerator.Generate().Encode(compressedOut);
			compressedDataGenerator.Close();
			encryptedDataGenerator.Close();

			if (armor)
				outputStream.Close();
		}
	}
}
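For context, a hypothetical call site might look like the sketch below. The file names, passphrase, key ID, and recipient-key loading are all placeholders for illustration, not part of the class above:

```csharp
// All names below are placeholders for illustration only.
long signingKeyId = 0x123456789ABCDEF0L;        // ID of your signing secret key
PgpPublicKey recipientKey = LoadRecipientKey(); // hypothetical helper to load the recipient's public key

using (Stream keyIn = File.OpenRead("secring.gpg"))
using (Stream outStream = File.Create("report.txt.asc"))
{
	new PgpProcessor().SignAndEncryptFile(
		"report.txt",       // actual file to sign and encrypt
		"report.txt",       // name to embed in the literal data packet
		keyIn, signingKeyId, outStream,
		"passphrase".ToCharArray(),
		true,               // ASCII armor
		true,               // integrity check
		recipientKey);
}
```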

TortoiseSVN: Tweaking the Context Menu; Global Ignore

If you use TortoiseSVN a lot like I do, then you've probably often wished you could put more frequently used items in the first context menu that comes up when you right-click on a file in Windows Explorer. By default, only update, commit, and checkout are there. You can choose which items appear in the first menu and which appear in the TortoiseSVN sub-menu under Settings, Look and Feel. There you can check and uncheck items to determine whether or not they appear on the sub-menu. I got rid of Checkout and included add, delete, rename, and diff.

Another time saver available in settings is Global Ignore Pattern. Items matching this pattern will automatically be ignored when you commit. I currently have mine set to:

*/[Bb]in [Bb]in */obj obj */[Rr]elease *.user *.suo *.resharper */_ReSharper.* _ReSharper.* *.bak *.dll *.pdb

Compare Files Method in C# Unit Testing

I've been working on a class that wraps the low-level PGP crypto capabilities of Bouncy Castle and presents a higher-level, directly usable interface similar to a command-line utility. One obvious test is to start with a file, sign and encrypt it, then decrypt it, and compare the result with the original. I've gotten so spoiled by having rudimentary things like this already available in the .NET Framework that I was surprised when I didn't find a Compare method on the File class, or anything like it anywhere else. If there is something already built in, please post a comment. If there isn't, hopefully this small method will save a few people a few minutes.

private bool FileCompare(string srcFileName, string dstFileName)
{
	const int BUFFER_SIZE = 1 << 16;
	FileInfo src = new FileInfo(srcFileName);
	FileInfo dst = new FileInfo(dstFileName);
	if (src.Length != dst.Length)
		return false;
	using (Stream srcStream = src.OpenRead(), dstStream = dst.OpenRead())
	{
		byte[] srcBuf = new byte[BUFFER_SIZE];
		byte[] dstBuf = new byte[BUFFER_SIZE];
		int len;
		while ((len = srcStream.Read(srcBuf, 0, srcBuf.Length)) > 0)
		{
			// Stream.Read may return fewer bytes than requested,
			// so keep reading until we have the full chunk to compare.
			int dstRead = 0;
			while (dstRead < len)
			{
				int n = dstStream.Read(dstBuf, dstRead, len - dstRead);
				if (n <= 0)
					return false;
				dstRead += n;
			}
			for (int i = 0; i < len; i++)
				if (srcBuf[i] != dstBuf[i])
					return false;
		}
		return true;
	}
}
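As a quick sanity check, here's a hypothetical way to exercise the method from within the same class that defines it, using throwaway temp files (Path.GetTempFileName creates real zero-byte files on disk):

```csharp
// Hypothetical usage from the same class that defines FileCompare.
string src = Path.GetTempFileName();
string dst = Path.GetTempFileName();
try
{
	File.WriteAllText(src, "identical contents");
	File.WriteAllText(dst, "identical contents");
	bool same = FileCompare(src, dst);     // expect true

	File.WriteAllText(dst, "different contents!");
	bool differ = !FileCompare(src, dst);  // expect true (lengths differ)
}
finally
{
	File.Delete(src);
	File.Delete(dst);
}
```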

How Many Programming Languages Have You Used?

I've been hearing all the buzz for the last several years over dynamic languages like Ruby and Python. I always like to try the latest and greatest from time to time and sometimes even move myself professionally in that direction. I guess because I started out working primarily with interpreted, weakly typed languages and I'm now horribly spoiled by IntelliSense, I'm having trouble even taking a look at either Ruby or Python (yes, I know they are *technically* not weakly typed, but the "dynamic" part equates in my feeble mind). I was perusing a list of programming languages trying to find an interesting alternative to Ruby or Python to play with, and thought it would be interesting to list the ones that I've actually written programs in and what level of experience I have with them.

BASIC — Hobby (Like many others, lowly BASIC got me hooked on programming at an early age)
dBaseIII — Production
Clipper — Production, Commercial
TurboPascal — Academic
DOS Batch — Production
8086 Assembly — Production, Commercial
Modula-2 — Academic
PDP-11 Assembly — Academic
Lisp — Academic
Prolog — Academic
Smalltalk — Academic
Ada — Academic
Actor — Hobby (Am I forgetting the name or how it’s spelled? Can’t find a link.)
C — Academic, Production, Commercial
C++ — Academic, Production
Protel (Nortel proprietary — I actually worked on the DMS 250) — Production
Rexx — Production
Visual Basic — Production
T-SQL — Production
Java — Production
PL-SQL — Production
C# — Production
PHP — Production

One language I left out was the pet project of a professor of mine in college. It was a Prolog-like language that he had us implement as our major assignment in compilers class. I actually won a copy of his book (aren't you jealous :p) for having the best compiler in the class. Unfortunately, I threw the book out a few years ago and I can't recall the name of the professor or the language for the life of me.

So which languages have you used and which ones are you interested in learning?

RAID 5 Sucks on Intel Matrix ICH8 (82801 SATA RAID) Integrated Controller

I started using RAID 1 as a matter of course after suffering some major downtime due to hard drive failures on my desktop and home machines a long time ago. First, I used the software RAID available in Windows NT (or was it WIN2K?). As I started building machines with integrated (fake raid, as some people call it) RAID controllers, I took advantage of them instead of using software RAID which was, in my experience, susceptible to frequent rebuilds due to lock-ups and power outages.

In the last machine I built, I decided to give RAID 5 a try, mainly because 320GB drives were so cheap at the time (and I guess I didn't want to buy four larger ones). I develop on this machine, so write performance is very important. I've been running RAID 5 for the last 10 months or so. I've never been particularly happy with the performance of this machine. It always spent a lot of time hammering the hard drives, and I noticed that things would get sluggish even though my CPU utilization never cracked 5 or 10 percent.

Things finally came to a head when the family and I went snowboarding for spring break. I really wanted an offsite backup of two of my machines in case someone broke into our house and stole the computers. One is my development machine (RAID 5) and the other is our household server for photos, music, and movies (RAID 1) which also doubles as the second gaming machine for my son’s friends, believe it or not.

I had forgotten that I used RAID 5 on my development machine. I ran up to Best Buy and picked up two hard drives that I was planning to exchange with the existing hard drives in the machines. The existing HDs I removed would be my instant offsite backups. On the machine running RAID 1, this worked beautifully. On the machine running RAID 5, it obviously didn’t work at all. Since this was the morning before we were leaving, I barely had time to copy important files from my development machine to my laptop, which was coming with us and would serve as an offsite backup of sorts.

When we got back from vacation, I wanted to salvage the money I spent on the extra drive for the machine using RAID 5. I disconnected one of the drives in the RAID 5 array as the first step in trying to migrate to some different configuration. When I reattached it, my machine was unusable for 22 hours! I mean so slow it was truly unusable! Turns out, RAID 5 suffers up to 80% performance degradation when rebuilding. You can see the gory details here. Not only that, but on the ICH8 controller, you can’t migrate from RAID 5 to any other configuration. I even tried turning on array write caching to boost performance. This helped a little but be prepared for that hideous rebuild every time you hit the reset button or suffer a power outage.

Now here's where I really saw how badly RAID 5 sucks. I still had the 500GB drive I bought to mirror my dev machine to. So I swapped it into my household server machine's RAID 1 array, which had only 2GB free (my wife takes an insane number of photos). Twenty minutes later, the array was already rebuilt. Not only that, but the machine was actually usable while the array rebuilt. I then bought a second 500GB drive, swapped that in, and voilà: I had just increased the capacity of my household file server in less than an hour with no pain whatsoever.

Meanwhile, my dev machine is still on RAID 5. I do some research and realize that RAID 5 suffers from really bad write performance. Combine that with the fact that it's literally unusable (at least on ICH8) while a rebuild is underway, and that its only advantage is space efficiency, and you must conclude that RAID 5 on the ICH8 (82801 SATA RAID) is a terrible choice for a development machine.

Now, how in the world do you recover from your horrible choice of RAID 5 on ICH8 without doing a complete reinstall (which is incredibly painful)? Not easily! Since I had RAID 5, the array was larger than the individual drives that made it up, so you can't just take a drive out of the array and copy over to it. So now you need to shrink an NTFS partition, assuming your usage of the array, like mine, is still less than the size of the component drives.

If you weren’t using RAID 5, shrinking your partition would be free and easy using Ubuntu 7.1. Unfortunately, even if you go to the trouble of trying dmraid, it doesn’t work with RAID 5. After hours of research, I finally ended up buying Acronis Disk Director which worked perfectly. I was able to shrink my partition down to 200GB and then copy it to another drive.

Since I now had 6 320GB drives, I converted my newly liberated 200GB Windows XP NTFS active partition to a RAID 10 array. Just in the time it took me to write this post, the rebuild is already 40% complete and the machine has been completely usable during that time.

In conclusion, RAID 5 on the ICH8 SUCKS!!! Don't use it under any circumstances. Disk space is so cheap there's no reason to use RAID 5 instead of RAID 1 or RAID 10 on a workstation. Either buy one more drive (RAID 10) or buy two bigger drives (RAID 1).

Declarative Nullability In Programming Language Constructors

I've often thought shallowly about, but never had much chance to explore, the idea of a new programming language (or enhancements to existing languages) designed for unit testing. Unit testing has come a long way over the past 5 years, but I still feel like there's a lot of room for improvement.

I'm a big fan of constructor dependency injection because it makes the dependencies of your class crystal clear. If you add a dependency, you have to review all of the code that instantiates your class, which is usually a very good thing. I frequently add a check in the constructor that throws a null argument exception if the actual parameter passed in is null. ReSharper makes this easier, but I'd really like to see a way to do this declaratively, like field nullability in table declarations in SQL DDL. E.g., in SQL you have:

CREATE TABLE Blah (counter int not null)

Wouldn't it be nice to have something like this in a C# constructor?

public SomeClass(notnull IDataAccess access)

Instead of adding a check for null and throwing an exception yourself (or having ReSharper do it), the compiler would take care of this for you. You also wouldn't need to test that the constructor throws an exception when a null actual parameter is passed in.

Actually, the more I think about this, the more I'd prefer that constructor arguments default to not being nullable, with a nullable keyword for the exceptions.
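For reference, here's the guard-clause boilerplate such a notnull modifier would eliminate; the IDataAccess interface below is just a stand-in for any injected dependency:

```csharp
using System;

// Stand-in dependency for illustration.
public interface IDataAccess { }

public class SomeClass
{
	private readonly IDataAccess access;

	public SomeClass(IDataAccess access)
	{
		// This is the check a 'notnull' modifier could generate for us.
		if (access == null)
			throw new ArgumentNullException("access");
		this.access = access;
	}
}
```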

Has anyone thought of any other things that you could do to tweak a language to make it more unit test friendly?

Terry Goodkind on Philosophy

My first undergraduate degree was in “Plan II” (my second was in C.S. after 3 semesters in law school) which is an honors liberal arts degree program at UT Austin that tried to provide an “ivy league” experience — top professors and small class sizes. As a result, I ended up taking 15 hours in philosophy. Although I never would have taken any philosophy on my own, I have to say that these classes were some of the most influential and enlightening that I took.

I'm also a big fan of the sci-fi and fantasy genres, and I recently started reading Terry Goodkind's "Wizard's First Rule" on an acquaintance's recommendation. I've really enjoyed the book, and as I was reading it, I recognized some Ayn Rand-ish concepts (i.e., Objectivism), so I googled Terry Goodkind and was browsing around his very nice website, where I found this:

Some people use big words to try to make their beliefs sound scholarly and important or, worse, to hide the fact that their beliefs don’t make any sense. Don’t ever allow such people to bully you with their attempts to make philosophy impossibly complex, or intimidate you into accepting what they say. What this kind of person wants is for you to blindly believe them; they don’t want you to think for yourself.

Reason demands clarity. Whenever presenting your views it is essential to be clear about those beliefs and to frame them rationally. When confronted with reason, some of those without rational arguments or beliefs will frequently switch to personal attacks. Obscenities and insults are the product of an ineffectual mind, merely the crude tools of the enemies of reason and thus the enemies of life. You cannot reason with this type of person; they are incapable of reason. Clarity and reason are tools of truth. Use them to better your own life.

Now whether you agree or disagree with objectivism (I find it incomplete and overly simple, personally), you have to wholeheartedly agree with his sentiments about philosophy and clarity of thought in general. The world would be a much better place if more people took Goodkind’s advice to heart.

It's the People, Stupid!

Jeff Atwood recently posted the table of contents of "Facts and Fallacies of Software Engineering." I'm sure it's no coincidence that the section on "People" comes first:


The most important factor in software work is the quality of the programmers.
The best programmers are up to 28 times better than the worst programmers.
Adding people to a late project makes it later.
The working environment has a profound impact on productivity and quality.

I wish more people understood these four facts. In 24 years, I've watched upper management pretend these things weren't true over and over again, with disastrous results. If you've ever been fortunate enough to work with great people in a well-run organization, and also suffered the misfortune of working with talentless people in a poorly run enterprise, you know that it's the people, stupid! You can't make up for bad people with "good" process or methodology. The rest of the 50-item list is interesting and useful, but you could go a long way with just these first four "facts."

I've often said that the best programmers are 10 times better than the worst, but I can easily believe 28 (I've never actually tried to measure). The sad part is that, be it 10 or 28 times, you never see the best programmers making even 10 times what the worst make, at least not in my experience.

Counting Messages in an MSMQ MessageQueue from C#

Surprisingly, this is not as easy as MessageQueue.Count. When I searched for how to do this, I kept seeing solutions that required COM interop or performance counters. I didn't really like either of those suggestions, so I figured I'd just try Peek with a cursor. I was pleasantly surprised to find that this method was more than adequate for my needs. I ran it against a queue with 1000 (relatively small) messages and, including the time to open the queue, it took only 0.04 seconds. I was using private, transactional queues and haven't tested any other scenarios, but for my needs this worked fine.

protected Message PeekWithoutTimeout(MessageQueue q, Cursor cursor, PeekAction action)
{
	Message ret = null;
	try
	{
		ret = q.Peek(new TimeSpan(1), cursor, action);
	}
	catch (MessageQueueException mqe)
	{
		// Swallow only the expected peek timeout; rethrow anything else
		if (!mqe.Message.ToLower().Contains("timeout"))
			throw;
	}
	return ret;
}

protected int GetMessageCount(MessageQueue q)
{
	int count = 0;
	Cursor cursor = q.CreateCursor();

	Message m = PeekWithoutTimeout(q, cursor, PeekAction.Current);
	if (m != null)
	{
		count = 1;
		while ((m = PeekWithoutTimeout(q, cursor, PeekAction.Next)) != null)
			count++;
	}
	return count;
}
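A hypothetical call site, assuming the methods above are in scope and you have a private transactional queue (the queue path below is a placeholder):

```csharp
// The queue path is a placeholder -- point it at one of your own private queues.
using (MessageQueue q = new MessageQueue(@".\private$\ordersQueue"))
{
	int count = GetMessageCount(q);
	Console.WriteLine("Messages in queue: " + count);
}
```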


Healthcare Costs Out of Control — My Simple Solution

Back at the beginning of February, my son sliced his finger with a linoleum carving tool in art class. Here's a sketch he did of it later:

Alex Cut Sketch

I picked him up and took him to the ER at a local hospital. The cut required four stitches. I was reviewing the claims for this online and noticed that the hospital tried to charge my insurance company $362 for "Misc Services," and the doctor staffing the ER charged $898 for "Op Services" and "Surgery." I'm supposed to pay a $150 deductible for ER visits. My insurance company reduced the hospital charge to $139 and had me pay it as my deductible. They paid the $898 charge to the doctor without reduction.

The primary motivation of my call to the insurance company was to determine why I had to pay the $139 (which turned out to be the lesser of the allowable charge and the $150 deductible). I was also amazed that getting four stitches costs nearly $900. That's outrageous: about $225 per stitch! So the total cost associated with taking my son to the ER to get four stitches in his finger was $1037!!!

Granted, I only paid $139 but I realize there’s no free lunch. All I have to do is look at what we’re paying through my wife’s employer for health insurance — $421/month to cover a family of four. I know from my former employer that the total cost of my health insurance was over $1300 a month to cover a family of four. These costs are out of control and unjustified. There have to be other factors at work here.

One that I can easily point to is ER treatment of illegal aliens for non-emergency care. Unfortunately, I've been to the ER three times in the past 18 months. Every time, a majority of the people waiting were Spanish-only-speaking brown-skinned people with sick babies. I didn't ask for papers, but it's not a huge leap to infer that the majority of them were, in fact, illegal. Illegal aliens don't pay payroll taxes (they present a strong argument for consumption-based taxes, actually). In fact, I see the whole illegal immigration issue as a ridiculous subsidy for certain industries. They get cheap, exploitable labor at very low cost while all the rest of us foot the bill for social services that the employer should be paying for. I've always been puzzled by "liberals" who support illegal immigration. They are on the wrong side of this from the "I-hate-the-evil-corporations" standpoint. In fact, they really like cheap landscaping, maid, and nanny services.

Another problem I can easily identify is other people (legal aliens and citizens) opting out of paying for the health care system while at the same time relying on it via ER care. If you don't have access to health care via an employer, it's very expensive. So people who are on the edge, so to speak (and even some rich, greedy folks), just decide not to pay for health insurance, even though they will still have access to it via emergency care. This should be addressed with a catastrophic health care payroll tax. Illegal immigrants and people who voluntarily or involuntarily "opt out" of paying for their health care should not be allowed to be freeloaders (in the classical economic sense) any more. This is a clear tragedy of the commons and needs to be addressed with federal intervention.

Another major problem is the whole "employer-provided" health care concept. People don't see how much they are actually paying for health insurance. People also don't see how much health care providers charge for services. Sure, you get an explanation of benefits, but who really reads those? I know I don't, most of the time. People with health insurance just don't have an incentive to keep costs down because they are shielded from those costs. The other big problem with this system is that insurance providers only have to sell to corporate execs; they don't have to provide good service. If you want health insurance, it's almost always cheapest to get it through your employer, especially since it's not tax deductible if you buy it yourself. So if you don't like the service you get, it's tough shit for you, and the insurance companies know it.

My solution to these problems is very simple and requires only three things.

The first is the simplest. Make all healthcare expenses tax deductible for everyone, not just corporations. Why we put a 7.5% AGI minimum on deductible health care expenses is a mystery to me. There are Flexible Spending Accounts, but these are really just a bone thrown to plan administrators (insurance companies). Why do I need to commit a certain amount of dollars up front? Why should I have to try and guess what my medical expenses for the year will be and why shouldn’t the money roll over if I don’t spend it all? FSAs as currently structured are a joke. Just make all healthcare expenses tax deductible. Simple and effective with no corporate welfare thrown in.

Second, a new federal payroll tax that would fund catastrophic health care policies for all. If you have a job, you will pay this tax. You will pay it no matter how much you make and it will be the same rate for all people. I’m a firm believer in everyone paying some taxes no matter how little they make. Everyone that earns income (no matter how small) should have a stake in how the government spends tax money. The government will set minimum policy requirements and allow private insurers to offer policies. Each taxpayer will be able to select their policy of choice each year.

Third, lock down the border and implement a real guest worker/immigration program. By lock down the border, I mean do whatever it takes to shut it down. If they have to build a fence and man it with machine-gun-toting guards, fine. I'm tired of hearing how this is impossible and will cost too much. It's quite possible and wouldn't cost much at all. Do the math yourself and then compare your computed costs to the national budget. Then compare them to the cost of social services, education, and health care for those workers. You'll see it really is possible and not that expensive.

Now, by a real guest worker/immigration program, I mean allowing millions of people a year to work, immigrate, and apply for citizenship. If there's demand for 10 million cheap laborers, then do the paperwork and get 10 million cheap laborers here legally. I'm even for a path to citizenship. See, I'm not a racist, so don't pull that tired argument out. I'm for law and order. I'm for screening out people with criminal backgrounds or contagious diseases. I'm for creating a non-exploitive environment for "undocumented workers" that isn't a huge corporate subsidy. And notice that I'm not even mentioning terrorism here. This is a simple dollars-and-cents issue with the bonus of creating a less exploitive environment for manual laborers.

The sad thing is, I know I'm not the smartest person in the world. Not even close. In fact, I'm sure many of the people running for various offices, including the presidency, could think of all this themselves. I'm sure they have. You could tweak the details of my plan to make it look more "liberal" or "conservative." But the fundamentals are sound and plainly visible to anyone who looks at this problem with some care. We are paralyzed on this issue because of the special interests that would be hurt by this plan, namely health insurance companies, health care providers, corporations exploiting cheap labor, and hypocritical, racist Latino groups that promote illegal immigration to grow their power base.