Why can’t I have a string? aka Nullable&lt;string&gt;

It’s not often that I write a post to rant about something that is by design rather than an actual bug or something that’s just hard to accomplish, but one thing that really bothers me is the fact that I can’t declare a string nullable.

The number one response to that is “surely you jest, as string is already nullable.” Which it obviously is; however, it was also designed to behave like a primitive type. That makes it the only primitive-like type you can’t declare as nullable. At which point people usually reiterate, “exactly, it’s already nullable.”

If string is nullable, why can’t I perform .GetValueOrDefault()?

This is one of the most useful methods available, as it lets you easily build frameworks around your primitive types without ever needing to worry about triggering a null reference exception. It also clearly marks every object that you expect to be uninitialized and null. This is why I hate that strings cannot be declared nullable: it’s impossible to tell the intent of a string. Is null an accepted value because it could imply string.Empty? This always has to be handled by convention and validation; there’s no way to declare it clearly, whereas if you could write string?, that would definitively tell you that null is an expected and acceptable value.
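For comparison, here is what GetValueOrDefault() gives you on a genuine Nullable&lt;T&gt;, next to the null-coalescing workaround that string forces on you at every call site (a minimal sketch; the variable names are mine):

```csharp
using System;

class Program
{
    static void Main()
    {
        int? maybeCount = null;

        // Nullable<int> carries its null handling with it:
        int count = maybeCount.GetValueOrDefault();       // 0
        int countOr10 = maybeCount.GetValueOrDefault(10); // 10

        // string has no such method; the closest idiom is the
        // null-coalescing operator, applied by convention each time:
        string maybeName = null;
        string name = maybeName ?? string.Empty;

        Console.WriteLine(count);       // 0
        Console.WriteLine(countOr10);   // 10
        Console.WriteLine(name.Length); // 0
    }
}
```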

Further evidence is the static method on the string class, string.IsNullOrEmpty(string). This method exists because the C# team clearly knew that strings, being complex objects treated as reference types, are ambiguous in this respect, and chose to expose a standard way to check it. Why does no other class provide a type.IsDefault(type) method? So in conclusion, I should be able to write string?, and sadly I can’t, and I would have very little recourse other than perhaps downloading Mono and probably needing to alter its source code.
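To make the ambiguity concrete: string.IsNullOrEmpty collapses two distinct states into one answer. The IsDefault helper below is hypothetical, the kind of method the post wishes every type exposed; it is not part of the BCL:

```csharp
using System;

static class TypeDefaults
{
    // Hypothetical helper: true when the value equals the type's default
    // (null for reference types, zero-equivalent for value types).
    public static bool IsDefault<T>(T value) =>
        Equals(value, default(T));
}

class Program
{
    static void Main()
    {
        Console.WriteLine(string.IsNullOrEmpty(null)); // True
        Console.WriteLine(string.IsNullOrEmpty(""));   // True -- two states, one answer

        Console.WriteLine(TypeDefaults.IsDefault<string>(null)); // True
        Console.WriteLine(TypeDefaults.IsDefault(""));           // False -- "" is not null
        Console.WriteLine(TypeDefaults.IsDefault(0));            // True
    }
}
```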


12 thoughts on “Why can’t I have a string? aka Nullable&lt;string&gt;”

  1. “string?” is weird: would that mean that “string” cannot be null anymore?

    For the GetValueOrDefault() method, you can easily accomplish this yourself:

    public static class StringExtensions
    {
        public static string GetValueOrDefault(this string s)
        {
            return s ?? string.Empty; // coalesce null to the default you choose
        }
    }

    • IMO strings should never have been nullable and should have been true primitive types, but under the original conventions of the language that would have been problematic before nullables were introduced in 2.0

  2. Yes, both of those are ways around the limitation, but the reason neither is good enough is that I couldn’t base type checking on whether the object is Nullable or not. I suppose I could also check whether it’s Nullable OR a string, but it’s absurd that I need to make that kind of special-case assertion. The reason a lot of this becomes important is that later you could build your framework around your domain model and write convention rules based on whether the type is nullable or not. A perfect example of this is using that as the basis of a not-null verification for NHibernate Validation.
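    A sketch of the convention rule described above (the ConventionRules and Customer names are illustrative, and the string special case is exactly the assertion being complained about; no NHibernate APIs are used here):

    ```csharp
    using System;
    using System.Reflection;

    static class ConventionRules
    {
        // A property is "optional" by convention when its type admits null
        // intentionally: Nullable<T>, plus the forced special case of string,
        // since string? cannot be declared.
        public static bool IsOptional(PropertyInfo p) =>
            Nullable.GetUnderlyingType(p.PropertyType) != null
            || p.PropertyType == typeof(string); // the special-case assertion

        public static bool IsRequired(PropertyInfo p) => !IsOptional(p);
    }

    class Customer
    {
        public int Id { get; set; }              // required by convention
        public DateTime? DeletedOn { get; set; } // optional: declared Nullable
        public string Name { get; set; }         // ambiguous: forced to "optional"
    }

    class Program
    {
        static void Main()
        {
            foreach (var p in typeof(Customer).GetProperties())
                Console.WriteLine($"{p.Name}: required={ConventionRules.IsRequired(p)}");
        }
    }
    ```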

    • It’s not a mistake; it SHOULD be a primitive type. How .NET chooses to pass strings by reference and how it actually allocates them are implementation details. Strings are faux-immutable: there are still things you can do to cause them to mutate, albeit not obvious things. That’s where the mistake is: making string behave like a primitive type with pseudo-immutability when it should be a primitive with true immutability.

      • I don’t agree that it should be a primitive type. If string were a primitive type, then we would need to pass strings by reference for efficiency reasons. And what about methods returning strings? You couldn’t use them in the normal way, because again that would not be efficient; you would have to pass the result as a method parameter with an “out” or “ref” modifier. String is a reference type but is NOT “passed by reference” in a method call: it is passed by value unless you use “ref” or “out” (don’t confuse the expression “pass by reference” with the reference type). What that means is that the reference to the actual data is copied into the method (it gets a new copy of the pointer to the actual data), so if you assign a new string to that reference, the calling method can’t see the change. To pass a string by reference, you would have to use “ref” or “out”, and then the caller can see the new allocation made by the method (think of it as a pointer to a pointer).

      • Em, you’re mistaken. Strings are most certainly passed by reference. The difference with ref/out is whether the REFERENCE is passed by reference or whether the REFERENCE is passed by value. For citation see Jon Skeet: http://stackoverflow.com/a/1096456/37055

        If strings were cloned and passed by value, that would waste a lot of memory. That was the core reason strings were designed to behave as fake primitive types through the appearance of immutability.

      • Yes, of course I’m referring to the reference to the object, not the actual object; see my explanation above with the pointer analogy.
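The distinction this thread is circling can be shown in a few lines: the reference itself is copied by value unless ref is used, so both commenters are describing the same semantics (the method names below are illustrative):

```csharp
using System;

static class Demo
{
    // Receives a COPY of the reference; reassignment is invisible to the caller.
    public static void Reassign(string s) { s = "changed"; }

    // Receives the caller's variable itself; reassignment is visible.
    public static void ReassignRef(ref string s) { s = "changed"; }
}

class Program
{
    static void Main()
    {
        string a = "original";

        Demo.Reassign(a);
        Console.WriteLine(a); // original -- the callee only got a copy of the reference

        Demo.ReassignRef(ref a);
        Console.WriteLine(a); // changed -- ref exposed the caller's variable
    }
}
```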
