Optionals cause more problems than they solve

Which member was nil? Which member could be nil?

Writing b? is unwrapping b. If you write if a.b?.c.d != nil, the compiler will be happy.
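A minimal sketch of that point (the types here are made up just for illustration):

```swift
struct D { let value = 1 }
struct C { let d = D() }
struct B { let c = C() }
struct A { let b: B? = B() }

let a = A()
// Because `b` is Optional, the whole chain `a.b?.c.d` has type `D?`,
// so comparing it against nil compiles without any further unwrapping.
if a.b?.c.d != nil {
    print("d is reachable")
}
```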

I think this discussion would move forward if you could provide some concrete examples where Optionals are awkward. It's understandable that the constant corrections are frustrating when you're not used to them, but that eases with practice. The awkwardness won't go away, and we could be more productive attempting to tackle those spots than bashing on the concept of Optionals in modern programming languages.


Many languages have had an equivalent for years; some have added an equivalent recently.

Languages like C++, C# and Java have to make major concessions for backward compatibility. Languages like Scala and Kotlin have to make concessions to interoperate with Java libraries but try to stick to the Optional-equivalents in their β€œnative” code. Languages like Haskell and Rust are as uncompromising as Swift.


…and in many (most, in my experience) cases, the compiler will offer to fix it for you, and Xcode will automatically insert the ? if you type a.b.c. Let's acknowledge that you often don't have to remember which properties are Optional, because you get a lot of help from the tools.


The Apple community had this discussion when Swift was announced 6 years ago, so there's nothing really new coming up here. There are a few Obj-C holdouts, but most have embraced explicit nullability and the safety it brings. Oh, some Obj-C devs will say Obj-C was nil-safe, but it really wasn't. You could safely message nil, but if you passed nil to a method not designed to handle it, or tried to add it to a collection, or did a number of other things, you'd quickly find your app mysteriously crashing. Once I learned to embrace Optionals and properly design with them in mind, things became so much easier.


Do you recall which ones don't have nesting issues? It would be nice to check out alternative implementations.

I think "optional" means different things in different contexts. Sometimes it refers to class/function definitions where a property isn't required. The two may be similar in nature, but they have very different motives: optional properties exist to allow less concrete class/function definitions, increasing flexibility, while the Swift kind tries to prevent unexpected nil values, decreasing flexibility. I use optional properties all the time, and they share a lot of similar syntax.

The unwrapping aspect of Swift's optionals is harder to deal with. Unwrapping isn't something I've seen in any other language.

The outcome would be the same no matter which is nil. Putting the question mark at the end means you don't have to think about it. That's one of the main things I don't like about having it enforced everywhere: it forces me to verify each and every optional in my own code and in 3rd-party library code.

People often say that, but it hasn't been my experience that the compiler's fixes are helpful, and that's a point I was going to mention. The error messages especially are unclear about why things are wrong and why the suggested fix is right, such as recommending ?? instead of ? and then, after you insert ??, saying that it's not being unwrapped properly. It leaves you going in circles trying to figure out whether it's the unwrapping that's wrong or the declaration.

Even if I get a prompt while typing, I still have to double-check the references in my code and 3rd party code to make sure that the prompt is what I want and there are times where it will insert the '?' and then raise another error caused by that change.

This also makes it near impossible to develop Swift code without an editor like Xcode that understands these references, which hinders its adoption cross-platform.

I'm open to embracing the use of optional types. I see the benefit of preventing unexpected nil types and the person who invented the nil reference said he regretted adding it:

What I don't like is when implementations cause problems as the complexity grows. This is the case with Swift's implementation. This is due to it being enforced everywhere and allowing nested optional types, which have unwrapping issues. Getting rid of nested optionals would be an improvement.

I also had difficulty wrapping my head around optionals in the beginning. Your concern about productivity is genuine. However, once I gained experience with the language, it made me more productive. A strong type system like Swift's gives me the confidence to do huge refactors without fear. I'd like to hear whether there are stories from engineers with Swift experience having issues with Optionals.


Not necessarily. Getters are allowed to mutate state, but optional chaining has a short-circuit effect. So the resulting state of the program could be very different depending on which member in the chain happened to be nil.
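A small sketch of that short-circuiting behavior (the names here are hypothetical):

```swift
var sideEffects = 0

struct Leaf {
    var value: Int {
        sideEffects += 1   // a getter that mutates program state
        return 42
    }
}

struct Node {
    var leaf: Leaf? = Leaf()
}

let present: Node? = Node()
let absent: Node? = nil

_ = present?.leaf?.value   // runs the getter: sideEffects becomes 1
_ = absent?.leaf?.value    // short-circuits at the first nil: the getter never runs

print(sideEffects)         // 1
```

So the resulting program state really does depend on where in the chain the nil occurred.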

The alternative, that everything could implicitly be nil, looks simpler, but decades of practice tell us that in reality it scales much worse.

You have to keep track of this anyway. Optionals actually make it easier than tracking these things in your head.


Personally, I have some things that annoy me, but they don't outweigh the benefit of having them in the language. One example is having to wrap try? expressions with parentheses when they are followed by dot notation access. Not even sure if we could say this is related to Swift's optional design, though. Probably more related to try syntax.
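For anyone following along, here's roughly what I mean (the throwing function is a toy, not a real API):

```swift
struct EmptyInput: Error {}

func parse(_ s: String) throws -> [Int] {
    guard !s.isEmpty else { throw EmptyInput() }
    return s.split(separator: ",").compactMap { Int($0) }
}

// Without parentheses, `try?` covers the whole expression,
// including the `.count` access:
let a = try? parse("1,2,3").count        // Int?

// To call a member on the Optional produced by `try?` itself,
// you have to parenthesize first and then chain:
let b = (try? parse("1,2,3"))?.count     // also Int?
```

Both spellings happen to produce the same value here; the annoyance is purely that the parentheses become mandatory once you want to chain on the `try?` result itself.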

The concern about Xcode's fix-its not existing in other editors is a big issue, but nothing stands in the way of other editors' plugins. As far as I know, they could implement that today.


Part of embracing Optionals is adapting your system design to account for them. That is, minimizing the number of optional values you deal with or finding better ways to model something instead of an Optional. Once that happens, there's little friction left since your system only uses Optionals in the cases where it truly makes sense, rather than a common default.
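One common illustration of that modeling idea (the names here are mine, not from the thread):

```swift
// Two independent Optionals permit nonsense states,
// e.g. both set at once, or both nil when work is "done":
struct FetchStateBad {
    var result: String?
    var error: Error?
}

// An enum models exactly the states that can occur,
// so downstream code never has to unwrap blindly:
enum FetchState {
    case idle
    case loading
    case loaded(String)
    case failed(Error)
}

let state = FetchState.loaded("payload")
if case .loaded(let payload) = state {
    print(payload)
}
```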


Optional concepts have been welcomed into most modern languages, compromising and uncompromising alike.

You could check out the articles below and get used to them.

Ah, yeah changes like that have been happening as the language has evolved. Sounds like that's an additional pain point for you, as are the nested optional/nil coalescing stuff that you're working through.

You posted the link to SE-0054, which was implemented in Swift 4.2 a while ago. This makes me wonder how much of the sharp edges you're running into might have more to do with the upgrade path to later versions of the language. What version of Swift was this app on before? And what version are you trying to upgrade to now?


At the enterprise level, with millions of lines of code, perhaps, because minor issues are costly. But for most average projects, and even open-source projects with millions of lines of code, it's a minor issue. Just browse any large GitHub repo's issue tracker or commit log and see how many bugs are caused by null references. If it were such a big problem for most projects, they would switch to a language that explicitly prevents null references. Nobody is doing this.

Not quite: in other languages, you only need to care about the possibility of nil as the result of an execution statement. You don't need to think about which variables a library developer defined as optional or not.

Right but you can't do that when you deal with untyped data like JSON or database data. It's held as strings and dictionaries.

I ran into some issues upgrading a minor project from Swift 3 to 5. It's a ridiculously simple project but had over 300 errors in the migration because of all the Swift changes. It's mostly fixed now; the migration isn't the real pain point, but rather that Swift 5 hasn't really improved dealing with optionals much.

The biggest pain point is what's linked to in some of the articles @BigSur posted above. Things like this:

let nameAndAges: [String:Int?] = ["Antoine van der Lee": 28]
let antoinesAge = nameAndAges["Antoine van der Lee"]
print(antoinesAge) // Prints: "Optional(Optional(28))"
print(antoinesAge!) // Prints: "Optional(28)"
print(antoinesAge!!) // Prints: "28"

Like I say, I normally deal a lot with untyped dictionary data and I don't like dealing with multiple wrappings on optionals. In the first print statement above, a typical language result would be 28.

That's the result I want because I know it's not nil. I don't want to have to think about unwrapping everything multiple times with all kinds of crazy syntax all through the code.

Because the craziest part of it all is that those guides on how to deal with optionals even say: hey, why not just implicitly unwrap it? Ok, well, what was the point of wrapping everything if I can break the safety feature at any time and introduce a potential bug? Having it explicitly declared can help make it easier to track down, but it's not exactly hard to track down when something is nil, and if it doesn't prevent the bug from happening, then it's of little value compared to the complexity.

If I could only ever deal with single level optionals and unwrap them simply and safely without too much code bloat or confusing syntax like nil coalescing, that would be fine.

if let antoinesAge = nameAndAges["Antoine van der Lee"] {
    print(antoinesAge)
}


guard let antoinesAge = nameAndAges["Antoine van der Lee"] else {
  print("Where'd her age go?!")
  return
}
But your example is contrived. Why did you define the dictionary to hold optional values? What's the use-case for that? If you have one, how often do you print the value retrieved and care how many levels of Optional it's wrapped in?

Even in other languages, you have to deal with the fact that a dictionary might not hold a value for the queried key, so what's the difference between

if let v = d[k] {
    // use v
}


v = d[k]
if v != nil {
    // use v
}



Sorry but why are you going for [String:Int?] instead of just [String:Int]? What you're telling Swift here is that you can define a key whose value is either Int or nil. So yeah when looking for a key you now have Optional<Optional<Int>>. Going with [String:Int] you would just get Optional<Int>.
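Concretely (same dictionary shape as the example above):

```swift
let withOptionalValues: [String: Int?] = ["Antoine van der Lee": 28]
let withPlainValues: [String: Int] = ["Antoine van der Lee": 28]

// Subscripting always adds one Optional for "key might be missing",
// on top of whatever the value type already is:
let doubly = withOptionalValues["Antoine van der Lee"]  // Int??
let singly = withPlainValues["Antoine van der Lee"]     // Int?

print(type(of: doubly))  // Optional<Optional<Int>>
print(type(of: singly))  // Optional<Int>
```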

No you don't :man_shrugging: It's a dictionary: you might add or remove keys, or pass it to a function where you don't know what's been defined in it. You might even mistype your string, so are you 100% sure there's a value for "Antoine van der lee"?

You can unwrap any amount of nesting using identical syntax, and you could since at least Swift 2.0.

That's what if let v = optional ... does.


This in itself is a pretty outrageous statement. Essentially you've moved the bar so low that even reliability isn't important.

Actually it's a lot easier than that - I can just point to your own post showing how the person who invented null references regrets them. Why do you think that is? Because they were too useful? Maybe he missed a great patent opportunity? Let's take a look - here is the quote from your link:

My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.


A language like... Swift, perhaps? Or Rust? Or Kotlin? Seriously. All I can do is point you to @mayoff's excellent post literally just a few posts above yours. Everybody is doing this.


If you ask me, OP is just a troll. Don't waste your time answering him.




Not my example, it was on the following site:

The use case is untyped data like from a JSON data structure.

You keep using basic examples that are one level deep.

Which out of the options on the following page is the preferred method that will unwrap multiple or single optional levels:

Is there a syntax that will unwrap variables not having to know the level of nesting or do I always need to use the appropriate syntax depending on if it's single or multi-level optionals?
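For what it's worth, Swift does have spellings for this, though the pattern form does require knowing the nesting depth (the example values here are made up):

```swift
let nameAndAges: [String: Int?] = ["Antoine van der Lee": 28]
let nested: Int?? = nameAndAges["Antoine van der Lee"]

// Optional pattern: one `?` per level of nesting.
if case let age?? = nested {
    print(age)                       // 28
}

// Nil-coalescing with `nil` collapses one level, then a plain `if let` works:
if let age = nested ?? nil {
    print(age)                       // 28
}

// `flatMap` also collapses one level of wrapping:
let flattened: Int? = nested.flatMap { $0 }
```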

I don't see how you reach that conclusion. Good developers write nil-safe code by design; it's a rare exception that nil errors reach production code.

He's talking about the errors in all projects in the world over 40 years as well as mission critical software. I'm talking about on a per project basis.

These languages are not the most popular production languages, not by a long shot, and when you say everybody, you're talking about new language designers. I'm talking about developers. Which developers are migrating their codebases to nil-safe languages?

I'm honestly confused by your entire first post. Maybe I'm not completely understanding your point, but it seems like you're advocating for the possibility of writing incorrect code faster. If this is your goal, and it's the same for "the rest of you", the only thing I can hope for the future of software development in general is that this misguided mindset (push garbage out in the world faster) will eventually disappear from the field.

Also, quickly copy-pasting code between languages is a non-goal. Maybe Swift is simply not the language that you want to use; I'd advise against trying to bend your mindset around a language that's fundamentally different from many other languages you might be used to. Just to consider a popular language as a means of comparison: a language like Java is at the polar opposite of Swift.

The very idea that the concept of "optionals" is a distraction is akin to thinking that, when designing an engine, the laws of thermodynamics are a distraction.

No, I can't provide stats; there are no stats. When comparing programming language features, stats make little sense, due both to the impossibility of controlling for other factors and to substantial differences in culture and patterns between languages. What I can say is that the presence of algebraic data types, like optionals, has completely eliminated, for me, an entire class of bugs, and the most popular one at that: incorrect representation of state. You simply can't correctly represent most states of a system without algebraic data types.

Without the ability to correctly control the possible states of a system, developing would be a mess, which it actually is in languages that don't allow for it, but many simply don't see this because incorrect-by-design development seems normal: it's like the fish in the sea that doesn't see "water" because water is all there is. Also, I was that very fish, many years ago, before understanding that pushing out incorrect code is (professionally) unethical, and that there are alternatives.


I fear you are correct.