i came across this article a while back, which i thought was an illuminating read:
https://stevelosh.com/blog/2013/09/teach-dont-tell/
Act II summarizes pretty well why tests are not an effective form of documentation.
Ditto, here.
I vote 1024+ for this.
I agree too, with one nuance about the "requirement".
I'm not a professional documentation writer, but in my own experience APIs I design have no usage value until I can write sensible documentation for them.
Putting myself in the shoes of the user allows me to teach a feature along with use cases the reader can relate to. Nothing too complex, but not limited to toy examples either: I want the reader to grow a reasonable mental model, and the API to be usable.
The sweet spot is when the API becomes a good fit for the growth in complexity that has to be expected from real code bases. Dealing with new or shifting requirements is a normal task of a developer. In this process, a good API limits the threshold effects (i.e. the need to learn a completely new technique). Instead, it allows the developer to use the previously acquired knowledge in order to reach a working solution. This working solution might not be the global optimum, but it is satisfying because it works, and it is correct. The time spent learning was rewarded.
Future work and knowledge acquisition may eventually allow the developer to come up with a better approach, which may or may not then be applied, depending on external factors (time, technical relevance, etc.).
A specific mindset is needed to put oneself in the shoes of the user, and to come up with several reasonable use cases for a new API, in order to see 1. how it fits, and 2. how it resists expected shifts in requirements. It is a mindset that is not satisfied with a solution that works on paper. It is a mindset that is only satisfied by the reasonable confidence that other people, who are not the initial author, can profit from the solution.
I do not think this mindset is required to write a proposal. But I think it would be welcome in the roles of the review manager or the language steering group. They have the authority to ask a proposal author to extend the proposal with case studies, when they think it is necessary. They have the authority to request a proposal rework, or to require a new proposal that provides the missing pieces identified during the "fitting test".
The Swift 6 language mode comes with many of the "threshold effects" I was describing above, which require large refactorings. The working solutions developers come up with are difficult to design, and sometimes hard to evaluate. The stdlib lacks building blocks, forcing developers to deal with the very problem that was supposed to be solved in the first place, which is synchronization: the sync/async bridge is still in its infancy, there's no built-in support for ordering... (these forums are full of recurring questions that are begging for solutions).
More documentation effort in the proposals could have improved this state of affairs. Identifying user needs would have helped the stdlib ship temporary tools, i.e. tools that are known to be sub-optimal, but also known to be necessary in the current state of the language. If the stdlib must remain pristine, then other libraries such as apple/swift-async-algorithms should handle this task.
I do not find it sane that my AsyncSemaphore has those download statistics. I suppose @mattie's Queue has high stats as well. Please sherlock us, or come up with better solutions. The role of these libraries is not to provide optimal solutions. Their role is to help people build correct solutions that they can eventually improve, when they want, and when they can, as their knowledge grows.
On the topic of "own mental models": To me, it is greatly helpful to understand which problems a new language feature solves. A book like this could cover a variety of legacy code that demonstrates the complexity and potential for bugs of asynchronous code before Swift concurrency.
I'd argue a majority of developers never had to deal with concurrency bugs. Most iOS app development used to be synchronous code, the occasional callback hell, and odd data races that were so hard to reproduce that project management stopped caring about them. There was no need for an elaborate understanding of locking mechanisms and thread hops.
Explaining to developers why the Swift language was moved in a certain direction could be equally important as, if not more important than, explaining how the features work at a lower level.
Please excuse me, if that is what you were implying with your suggestion. It made me think about what I would like to have answered, and that is "why do we need all of this, and how was this even done before Swift concurrency?".
I would like to understand how many teams are switching to Swift 6 today. It is anecdotal, but even some engineers I know who tend to be a bit more CV-oriented than others are postponing Swift 6 updates. I think in a lot of enterprises (App Store top apps are not just small indies), Swift 6 is entering a "we will refactor when we have time" phase.
I would not be surprised if Swift 5 mode turned out to be quite long-lived.
Right now, given the learning curve and the refactoring needed (some teams have issues with documentation and best practices; some still struggle with Structured Concurrency, actor reentrancy, and the general async/concurrent programming introduced in Swift 5.x; once burned, twice shy, as they say), Swift 6 is starting to compete with React Native, Flutter, and KMP (with Compose Multiplatform, soon reaching Web, Android, and iOS) in CTOs' and VPs' minds.
Swift as a great universal language is a testament to the skills of the language and compiler teams, for sure. But my concern is that the complexity this brings on might ultimately hinder native apps that deeply integrate with the OS and new APIs, in favour of multiplatform tools (some of them still much more geared towards Objective-C and UIKit internally… which, to be fair, are rock solid and let you get 99% of what you need from the OS and the system frameworks). Even library devs having to choose might take the path of lowest resistance; and is Swift as popular as we hope outside of iOS apps?
+1 on documentation being part of the pitch/evolution process
People learn a lot about what they're making by having to write a guide or other educational material about it.
Also watching/having to hear the results of another person following it is an important part of that process.
It's why writing a tutorial was one of the assessments I liked to use a lot in the classroom, and the second part was having fellow students have to follow it. Instructive pain and hilarity ensues.
In the real world a collaborative team approach is probably better, because proper documentation/andragogy is a real skill and not one that every developer has the specific chops for.
I think the WWDC videos, attendant example code and tutorials are AMAZING, btw. Some of the best in the biz. It's just a shame that people seem to have a hard time finding them and they only come out once a year, and can go out of date... Also... if Swift is serious about moving beyond just Apple hardware it needs its own culture of valuing docs.
I feel like we're getting signaling from the tech side that Swift 6 isn't "for everyone" yet (for example, new Xcode app projects still default to Swift 5 mode), but that's never been explicitly said. As a result, a lot of people (including myself, at first) got a feeling that we need to move to Swift 6 ASAP or we'd be accumulating tech debt by the day. Reading these forums and getting more understanding of the technical facts has disabused me of that notion.
I'm super-excited about Swift 6: feeling confidence in the correctness of my concurrent code feels like when I first started using Swift and finally had confidence in my code's memory safety. I expect to be moving to it as soon as some of this settles down, and I'm not feeling anxiety about falling behind while I keep most of my old code in 5. But it seems like a whole lot of people are feeling that anxiety and I wonder how much of that is more about messaging than technical issues. I've found it uncommon that people even realize that Swift 5 and 6 modules can coexist.
I think this wouldn't solve the problem. Finding such a tutorial wouldn't tell you if the behaviour described in the tutorial is the current state of the language or not, in the same way that reading the documentation for an arbitrary evolution proposal currently doesn't tell you if some other later proposal changed the described behaviour. The requirement should be to update the canonical documentation, so that when someone reads the documentation as at a particular language version, it is always a complete and accurate representation of the behaviour of the corresponding version of the language.
I agree with a lot of the sentiment in this thread. I work on some simple iOS apps on the side that, although they were not terribly hard to convert to Swift 6, were also made very carefully such that I never had a single data race issue, so this isn't solving a problem I ever had.
I also work on a relatively large Mac app for my day job, and that probably does have some data races, but transitioning will take significant time and effort. At the end of the day, we need to build things that people want to pay money for, and zero users care what version of Swift we target. The amount of effort here is making me consider pushing for writing our apps in cross-platform JavaScript/C++, or going back to Objective-C, and abandoning Swift. Nothing should ever make people want to write JavaScript!! (kidding, sort of)
I do appreciate that Swift 6 has made me think about concurrency in a new way, but it does feel too strict. For example, I'm processing CIImages in the background, because I'm applying AI models to them and that takes time, but then naturally I want to send those to the main thread to display them. Why is that nearly impossible? I get that there are risks, but I just want to acknowledge that I shouldn't mess with it and let me go be careful and get my actual job done. I know there's sending, but that appears to have no effect in a completion closure.
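For context, here is a minimal sketch (my own toy types, not the poster's CIImage pipeline) of how SE-0430's sending is meant to let a freshly created, non-Sendable value hop to the main actor:

```swift
// Hypothetical sketch: `ProcessedImage` stands in for a non-Sendable
// result such as a CIImage produced by background processing.
final class ProcessedImage {          // deliberately not Sendable
    var pixels: [UInt8] = []
}

// `sending` on the result promises the value is in its own region,
// so the caller may move it to another isolation domain.
func process() async -> sending ProcessedImage {
    let image = ProcessedImage()      // freshly created, unshared
    image.pixels = [0, 1, 2]          // pretend this is the slow AI work
    return image
}

@MainActor
func display(_ image: ProcessedImage) {
    _ = image.pixels                  // safe: ownership was transferred here
}

func pipeline() async {
    let image = await process()       // `image` is in a disconnected region
    await display(image)              // so handing it to the main actor is OK
}
```

As the replies below note, the 6.0 compiler is conservative about exactly this pattern when closures are involved, so whether a given variant compiles depends on your toolchain.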
Another point I want to make is that watching WWDC I was sold on the idea that Swift 6 would make your code more stable. I have found this to not be the case at all. First off, any project that contains Objective-C will start to crash if the ObjC code calls a MainActor-isolated method from the background. Probably a smell you should clean up, but in a lot of cases that was fine before. You could have a method on a view controller that is perfectly safe to call from any thread, but Swift 6 explodes because you touched a main actor from the background. Even in pure Swift I've seen cases where Swift 6 compiles successfully and yet there are concurrency-related crashes. I have a completion closure that is passed in from a view subclass and called from a background queue; there are no warnings or errors about it, but it crashes at runtime. That could be related to AVFoundation not being updated, but "Apple frameworks are behind" has been covered by other posts here.
Anyways, for the folks who are involved with the future of Swift, do you want a language that's logically perfect and prevents 100% of bugs, or do you want something that people can actually use effectively to build real-world projects with? Maybe we can have both someday, but right now it feels like Swift solved a relatively rare problem at the expense of basically ruining the language for any large-scale codebase.
Just to touch on two things that came up here.
While sending is a really powerful concept, the 6.0 compiler is overly conservative, particularly when combined with closures. It will become much more useful with 6.1.
Second, the Swift 6 language mode is completely intolerant of incorrectly-annotated closures, be they from Swift 5 or ObjC. This is a particularly painful issue because I know of no (reasonable) way to automate an audit of these. However, I have noticed that numerous missing @Sendable annotations have appeared within Xcode 16 point releases. So, please do file bugs when you find them.
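A minimal sketch of the mismatch being described (hypothetical API names): the same unsafe capture is rejected when a closure is annotated @Sendable, and silently accepted when the annotation is missing, as with many un-audited Swift 5 or ObjC declarations:

```swift
import Dispatch

final class Model {                   // not Sendable: mutable reference type
    var items: [String] = []
}

// Correctly annotated: Swift 6 rejects closures passed here that capture
// non-Sendable state, so the bug is caught at compile time.
func loadAnnotated(_ completion: @escaping @Sendable () -> Void) {
    DispatchQueue.global().async(execute: completion)
}

// Missing @Sendable, as in many legacy declarations: the compiler has no
// idea the closure escapes to another thread, so an unsafe capture like
// `model.items.append(...)` inside it compiles cleanly and can only fail
// at runtime.
func loadUnannotated(_ completion: @escaping () -> Void) {
    DispatchQueue.global().async(execute: completion)
}
```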
(I realize that neither of these pieces of information help with either the migration pains or the general philosophical questions.)
the problem this is trying to solve is responsibility; if it's nobody's Job to write Documentation, no Documentation will get written.
like you pointed out, it's not perfect, those docs would still need to be updated over time, and nobody is doing that. at best you get random one-off contributions that have no coherent voice or vision.
but until Apple contracts, hires, or reassigns someone internally to work on Writing Documentation as their Actual Job, it's the best we can really do to incentivize any docs to get written at all.
it's not really reasonable to expect Evolution proposers to scrutinize every piece of Swift documentation that has ever been written and update it, especially since the existing documentation is scattered across multiple websites and repositories, including (as it stands) other Swift Evolution proposals.
I wonder about this thread's subject a lot… Sure, Swift has evolved the way it did, but I wonder: if we were making a brand new language from scratch, and the very first thing designed in that language was concurrency, would we end up with a significantly simpler concurrency-safe language after adding all or most of Swift's other bells and whistles?
The Swift Concurrency Manifesto, the foundation of the concurrency vision for Swift, was published around 8 years ago, in 2017. Surely, ideas from there have been revised, discussed, and updated since then, yet the core concepts seem to remain unchanged. So I'd say it is pretty close to designing from the beginning.
Another example one can think of is Go: they did exactly this, designing with concurrency in mind. But I have trouble deciding whether it's simpler and safer. Well, maybe simpler in some cases, as the compiler won't stop you from writing unsafe code, but then clearly it is not that safe.
So...it sounds like the problem we're identifying here with Swift's concurrency model is that it's not solved the problem of concurrently implementing the feature and writing documentation?
There is a bit of inherent complexity in Swift concurrency that is a consequence of Swift's value types model. Unlike a GC language, in Swift, reading a reference counted pointer from arbitrary memory is not guaranteed to be safe, because you have to increment the reference count, which can race with another task that overwrites the reference and deallocates the object. To maintain memory safety, you need some set of rules around exclusive access to memory, and thus Sendable checking, isolation, etc. If you're willing to give up on memory safety, then yeah, you probably don't need most of this, but that seems like a step backwards to me.
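A tiny illustration of that retain/release race (my own example, with the racy lines left as comments, since the Swift 6 language mode refuses to compile them):

```swift
import Foundation

final class Box {                     // mutable reference: not Sendable
    var object: AnyObject = NSObject()
}

let shared = Box()

// In the Swift 6 language mode, both captures below are compile errors,
// because the read (which must retain `object`) could race with the
// write (which releases the previous object):
//
//   Task { _ = shared.object }            // task A reads the reference
//   Task { shared.object = NSObject() }   // task B overwrites it
//
// Without Sendable checking, task B could free the object while task A
// is still in the middle of incrementing its reference count.
```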
It helps to think of "data race safety" as two related but distinct goals:
1. Preventing high-level races: tasks should not be able to observe or corrupt shared state in ways that produce deadlocks, non-determinism, or broken invariants.
2. Preserving memory safety: racy code must not be able to corrupt memory or crash the runtime in undefined ways.
So you could define away 2 by accepting the tradeoffs of a GC, and just give up on 1, and then end up with basically the legacy Java model: your buggy multithreaded code won't crash the JVM in an unsafe way, but it can still deadlock, or exhibit non-determinism, etc.
There are also languages that address 1 by picking one particular abstraction -- actors, channels, array parallelism -- and building everything around that. You end up with a very clean and elegant design, at the cost of making anything that doesn't fit the chosen abstraction hard to express. These languages also don't have an existing body of "legacy" concurrent code -- either Objective-C, or an earlier version of the language -- to interoperate with. So really there's a third goal up there in that list, which is that the high-level abstractions should compose with what came before. So it's a hard problem, because moving further in any direction requires giving up something else.
Swift Concurrency Manifesto, the foundation of the concurrency vision for Swift, was published around 8 years ago, in 2017.
That's still some 7 years after Swift's inception… Would the end result be significantly simpler/safer/different if concurrency had been designed as the very first thing in the language (before var/let, etc.)?
I think a concurrency-safe language should at least promote functional programming since FP itself is already safe enough. Historically, Swift struck a nice balance between FP and OOP that allowed you to move the slider towards one end or the other. At least it used to be the case.
The fact that SwiftUI is evolving and becoming mainstream in parallel with the concurrency features is sort of a wake-up call that the language is irreversibly drifting towards FP, and that you may in fact need to change your approach to programming fundamentally.
And I think a lot of misunderstanding on the one hand and complexity in Swift 6 on the other stem from the fact that Swift's paradigm shift happened gradually and without explicitly stating that it's in fact a big shift towards FP (that might require some bigger incompatible changes in the language).
Does the Manifesto even mention FP? (Read it a long time ago, can't say)
Can you elaborate here? I don't really understand the connection but I'd like to!
Erlang? Not sure if it can be called simple though.
In short, and the way I see it, classes and OOP in general are not concurrency-friendly, whereas structs and all the sendable stuff in the language very much are.
It is why Apple officially recommends using structs by default and whenever possible in this document, even though back when it was published it wasn't very obvious why. In fact this was a shift from the original "use classes by default because they are not copied by value" widely accepted in the industry pre-concurrency.
Also, let is preferred over var in both the FP and concurrency paradigms.
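A small sketch of that point: a value type built from lets is Sendable essentially for free, while a class with var state is not:

```swift
// Immutable value type: conforms to Sendable trivially, because every
// copy is independent and nothing can be mutated out from under a task.
struct Point: Sendable {
    let x: Double
    let y: Double
}

// Mutable reference type: declaring a `Sendable` conformance here would
// be a compile error, since `var` stored properties on a class can be
// mutated concurrently through shared references.
final class MutablePoint {
    var x = 0.0
    var y = 0.0
}
```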
But on top of that, you also want to give pure functions (i.e. functions with no side effects, and therefore safe to call from anywhere) first-class status, because marking them as such simplifies both the compiler's work and yours. This is still missing in Swift.