The evilness of ApplicationException

This post explains what’s wrong with ApplicationException and why it should not be used.

For over three years the Framework Design Guidelines (also available online on MSDN) have made a clear statement about the use of ApplicationException:

DO NOT throw or derive from System.ApplicationException.

The reason for the existence of ApplicationException is explained by Jeffrey Richter in the Framework Design Guidelines:

The original idea was that classes derived from SystemException would indicate exceptions thrown from the CLR (or system) itself, whereas non-CLR exceptions would be derived from ApplicationException.

There are two problems with this idea, however. First, it is questionable whether there is any use in separating CLR exceptions from non-CLR exceptions: you hardly ever want to catch all non-CLR exceptions or, inversely, all CLR exceptions, because both categories are simply too general to catch. Second, not all exceptions in the .NET Framework follow this pattern; some CLR exceptions do inherit from ApplicationException (such as TargetException, TargetInvocationException, and WaitHandleCannotBeOpenedException). This defeats the whole idea of having an ApplicationException in the first place.
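This inconsistency is easy to verify with reflection. The console program below is my own minimal sketch (not from the guidelines), showing that a CLR-thrown exception type sits on the "application" side of the hierarchy:

```csharp
using System;
using System.Reflection;

static class HierarchyCheck
{
    static void Main()
    {
        // TargetInvocationException is thrown by the CLR's reflection
        // machinery, yet it derives from ApplicationException — breaking
        // the "application exceptions only" rule described above.
        Console.WriteLine(
            typeof(TargetInvocationException).IsSubclassOf(typeof(ApplicationException)));
        // Prints: True
    }
}
```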

While the Framework Design Guidelines are very clear about this, outdated documents and guidelines that recommend using ApplicationException are still floating around the web. Some were even published by Microsoft!

ApplicationException in itself isn't considered harmful; inheriting from it is merely considered useless. It becomes harmful when you catch ApplicationException. Not only does the .NET Framework itself throw exceptions that derive from ApplicationException, but many popular frameworks you might use in your production code do so as well (such as tools from Telerik, Aspose, LLBLGen, Microsoft's Enterprise Library, Log4net, MySQL, ICSharpCode, aspNetEmail, Html-to-pdf and many, many more).

That makes catching an ApplicationException so general that I consider it similar to catching the Exception base class. You simply never know what you're catching.
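To make this concrete, here is a minimal sketch of such an over-general catch block. The `Demo` class and its method names are hypothetical stand-ins for a business-layer call:

```csharp
using System;

static class Demo
{
    // Hypothetical business-layer call; a third-party library could throw
    // its own ApplicationException-derived type from somewhere in here.
    static void DoBusinessOperation()
    {
        throw new ApplicationException("simulated third-party failure");
    }

    public static string HandleOperation()
    {
        try
        {
            DoBusinessOperation();
            return "ok";
        }
        catch (ApplicationException ex)
        {
            // Intended to catch only "our" exceptions, but this also swallows
            // ApplicationException-derived types thrown by the CLR and by
            // third-party libraries — you never know what you're catching.
            return "business error: " + ex.Message;
        }
    }

    static void Main() => Console.WriteLine(HandleOperation());
}
```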

While inheriting from ApplicationException is not evil in itself, the problem starts when you, as a framework or application developer, derive all your exceptions directly from ApplicationException. For instance, a few years ago on a project for one of my clients, the developers defined exceptions that were thrown by the business layer and contained information intended for the end user. These exceptions were caught at the presentation level and subsequently displayed in the user interface. While there's nothing wrong with doing this, the problem was that these 'end user exceptions' all inherited directly from ApplicationException and the presentation layer caught ApplicationException. Not surprisingly, once in a while technical exception messages were displayed to the end user. Showing technical information to your users will not only annoy them; it can also pose a security risk, because you're leaking implementation details.
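A safer layering is sketched below with hypothetical names (`EndUserMessageException` and `PresentationLayer` are my own illustration): the user-facing base exception derives directly from Exception, so the presentation layer can catch exactly that type and nothing else.

```csharp
using System;

// Hypothetical base class for exceptions whose message is safe to show to
// end users. It derives directly from Exception, not ApplicationException,
// so a catch block for it can never pick up unrelated framework exceptions.
public class EndUserMessageException : Exception
{
    public EndUserMessageException(string message) : base(message) { }
}

public static class PresentationLayer
{
    public static string Render(Action businessCall)
    {
        try
        {
            businessCall();
            return "ok";
        }
        catch (EndUserMessageException ex)
        {
            // Only messages explicitly written for end users end up on screen.
            return ex.Message;
        }
        // Anything else bubbles up to a generic "something went wrong"
        // handler, so technical details are never leaked to the user.
    }

    static void Main()
    {
        Console.WriteLine(Render(() =>
            throw new EndUserMessageException("The order quantity must be positive.")));
    }
}
```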

Another client recently went even further. While the developers did actually define their own exception base class for business layer exceptions (which is a good thing), they decided to name it 'ApplicationException'. What they didn't realize was that about 75% of the time they actually caught the real System.ApplicationException instead of their own business layer exception. This caused all sorts of subtle, hard-to-find bugs. The remedy was to rename the exception and fix all lines where System.ApplicationException was caught.
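The following sketch (all namespaces and names are hypothetical) shows why such a name clash is so treacherous: which type an unqualified `catch (ApplicationException)` resolves to depends entirely on the using directives in each file.

```csharp
using System;

namespace MyCompany.Business
{
    // Custom base class, unfortunately sharing its name with the framework type.
    public class ApplicationException : Exception
    {
        public ApplicationException(string message) : base(message) { }
    }
}

namespace MyCompany.Presentation
{
    using MyCompany.Business;

    public static class Handler
    {
        public static string Handle(Action call)
        {
            try
            {
                call();
                return "ok";
            }
            // Because of the inner 'using MyCompany.Business', this unqualified
            // name resolves to the custom type. In a file without that using,
            // the same line would catch System.ApplicationException instead.
            catch (ApplicationException ex)
            {
                return "business error: " + ex.Message;
            }
            catch (Exception ex)
            {
                return "unexpected: " + ex.Message;
            }
        }

        static void Main()
        {
            // A System.ApplicationException (e.g. thrown by a third-party
            // library) slips past the 'business error' handler here.
            Console.WriteLine(Handle(() =>
                throw new System.ApplicationException("thrown by a library")));
        }
    }
}
```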

While the O/R mapper framework LLBLGen uses ApplicationException, its design is better than in the cases above. LLBLGen defines a base exception (called ORMException) from which all its other exceptions inherit, and ORMException in turn inherits from ApplicationException. Catching an LLBLGen exception, however, can be done by catching ORMException, so it's not a real problem that ORMException inherits from ApplicationException. Still, it's useless, and although it would be a breaking change, I would advise LLBLGen to let ORMException inherit directly from Exception. That way LLBLGen exceptions cannot accidentally be caught when an application developer catches ApplicationException.

If you’re maintaining an existing code base, especially the code base of a reusable framework, you have to be careful when making breaking changes, like removing ApplicationException from the inheritance hierarchy. However, when building a new framework or application, just don’t use ApplicationException.




two comments:

Can you elaborate on why ApplicationException shouldn't be used? The scenario you described was related to a specific case where the developers named their custom exception 'ApplicationException'. If we are creating a custom exception, that means we are well aware of the application and the exceptions it raises, and we have defined it for a purpose.
Kris - 29 10 10 - 01:26

You can throw as many ApplicationExceptions as you want without a problem, as long as you don't have any logic in your application that depends on handling thrown ApplicationExceptions. Like I said in the post, once you catch an ApplicationException you never know what you're catching, because besides the exceptions you've thrown yourself in your application, it can basically be anything.
Steven (URL) - 29 10 10 - 08:40