Psychology and sociology of Swift feature proposals.

On Dec 21, 2015, at 7:35 AM, Amir Michail via swift-evolution <swift-evolution@swift.org> wrote:

> Should proposals take into account psychology and sociology (e.g., peer pressure) so as to increase Swift adoption?
>
> In particular, a feature may be objectively good (as demonstrated by statistically significant experiments) but most developers may think it is bad regardless. What should be done in that case?

We want Swift to be adopted widely by a diverse audience. We don't want developers to reject it merely because they dislike it.

I see two possible solutions to that kind of design problem:

1. Add the feature anyway and educate developers, hoping to convince them that it is the right thing to do, or at least to accept it as a necessary evil. Swift's String type does this. Some traditional string operations, such as subscripting by integer index, are deliberately hard with Swift.String because they are not compatible with Unicode strings.
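For instance, here is a minimal sketch in current Swift (the index APIs have been renamed since this thread, but the design is the same). There is no `flags[0]`; you must walk `String.Index` values, which makes the non-constant cost of Unicode indexing visible instead of hiding it:

```swift
let flags = "🇨🇦🇯🇵"

// flags[0] does not compile: String has no Int subscript, because each
// Character is an extended grapheme cluster of varying size (each flag
// here is two Unicode scalars rendered as one Character).
let first = flags[flags.startIndex]                      // "🇨🇦"
let second = flags[flags.index(after: flags.startIndex)] // "🇯🇵"
print(first, second)

print(flags.count)      // 2 Characters...
print(flags.utf8.count) // ...but 16 UTF-8 bytes
```

The friction is intentional: operations that are cheap on ASCII buffers but misleading on Unicode text are made to look as costly as they really are.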

2. Change the feature so it is more palatable to the developers who dislike it. Swift's nullability does this. We believe nullability should be used only when necessary. But Swift also has syntactic additions like optional chaining and the `nil` literal itself, along with semantic additions like implicitly unwrapped optionals, to make it fairly reasonable to use nullable references if you have to, or want to, do so.
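A minimal sketch of those conveniences in present-day Swift (spellings such as `uppercased()` postdate this thread):

```swift
// Nullability is opt-in: a plain String can never be nil.
var nickname: String? = nil          // the `nil` literal is valid only for optionals

// Optional chaining short-circuits to nil instead of trapping:
let upper = nickname?.uppercased()   // upper is String?; nil here

// nil-coalescing supplies a default when the chain produced nothing:
print(upper ?? "<no nickname>")

// An implicitly unwrapped optional trades safety for convenience when a
// value is guaranteed to be set before first use (e.g., two-phase setup):
var label: String! = nil
label = "ready"
print(label.count)                   // unwraps automatically; traps if still nil
```

The point is that the safe default (non-nullable) stays the default, while the escape hatches keep the nullable path from feeling punitive.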


--
Greg Parker gparker@apple.com Runtime Wrangler