I'm trying to understand the new Swift 3 (4?) pointer API and Swift's
memory model.
More specifically, I'd like to know more about what exactly it means for a
pointer to be initialized or not.
For example, I suppose the following code doesn't satisfy the precondition in
the subscript documentation (i.e., floatsPtr is not initialized when its
subscript is used):
let numFloats = 123
let floatsPtr = UnsafeMutablePointer<Float>.allocate(capacity: numFloats)
for i in 0 ..< numFloats { floatsPtr[i] = Float(i) * 0.1 } // Setting values
for i in 0 ..< numFloats { print(floatsPtr[i]) } // Getting values
floatsPtr.deallocate(capacity: numFloats)
I'd like to understand why/how this could lead to undefined behavior, and
what exactly it means for a pointer to be initialized or not.
I've read the proposal, but I don't feel that I fully understand what it means
for a pointer to be initialized, or bound, or whether the preconditions and
rules for undefined behavior are the same regardless of whether Pointee is a
trivial type or a class type.
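(For reference, a sketch of the fully documented lifecycle using the same Swift 3 era calls as the snippet above: initialize before use, deinitialize before deallocate. The zero passed to initialize(to:count:) is just a placeholder value.)
let count = 123
let ptr = UnsafeMutablePointer<Float>.allocate(capacity: count) // raw, uninitialized memory
ptr.initialize(to: 0, count: count)              // formally initialize every element first
for i in 0 ..< count { ptr[i] = Float(i) * 0.1 } // assignment now mutates initialized memory
for i in 0 ..< count { print(ptr[i]) }
ptr.deinitialize(count: count)                   // undo the initialization (effectively a no-op for trivial Float)
ptr.deallocate(capacity: count)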
I think it’s common practice to initialize trivial types via subscript assignment. Earlier versions of the proposal actually showed examples of this and claimed that it was a valid pattern. However, during review those examples were removed because they encouraged bad practice and complicated the issue.
The fact is, code like this is not going to break anything in the compiler, and it’s common enough that any memory model verifier is going to need to special-case trivial types. I think it would be fine to rewrite the subscript precondition as follows:
/// - Precondition: the pointee at `self + i` is initialized.
should read
/// - Precondition: either the pointee at `self + i` is initialized
/// or `Pointee` is a trivial type.
-Andy
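(A minimal sketch of why the trivial/class distinction matters here; the Thing class is made up for illustration, the rest is the standard allocate/initialize API.)
class Thing {}

// Trivial Pointee: the subscript setter just stores bits, there is no old value to destroy.
let ints = UnsafeMutablePointer<Int>.allocate(capacity: 1)
ints[0] = 42                                // the pattern the relaxed precondition would permit
ints.deallocate(capacity: 1)

// Class Pointee: assignment must release the previous value, but uninitialized
// memory holds garbage bits, so that release would be undefined behavior.
let things = UnsafeMutablePointer<Thing>.allocate(capacity: 1)
// things[0] = Thing()                      // wrong: would try to release a garbage reference
things.initialize(to: Thing(), count: 1)    // correct: initialization assumes no old value
things[0] = Thing()                         // safe now: the old value is released properly
things.deinitialize(count: 1)
things.deallocate(capacity: 1)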
···
Sorry to pick nits, but... every pointer that exists is already
initialized. What you're asking about, I think, is whether the *memory
referenced by the pointer* is initialized.
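(In code, the distinction might read like this: the pointer value itself is fine, it is the pointee that starts out uninitialized.)
let p = UnsafeMutablePointer<Float>.allocate(capacity: 1)
// `p` is already a perfectly good, initialized pointer value: it holds a valid address.
// The Float *at* that address is what has not been initialized yet.
p.initialize(to: 1.0, count: 1)   // now the referenced memory is initialized as well
p.deinitialize(count: 1)
p.deallocate(capacity: 1)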
···
Depending on where you intend to make this change, you may be implicitly
adding the requirement that every possible bit pattern is a valid
representation of a trivial type. It's fine if you're just changing the
setters for pointee and subscript, but it shouldn't apply to the
getters, IMO.
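(A sketch of the getter-side hazard, assuming Bool as the trivial Pointee: only the bit patterns 0 and 1 are valid Bool representations, so even a trivial load from garbage memory is not well defined.)
let flags = UnsafeMutablePointer<Bool>.allocate(capacity: 1)
// let garbage = flags[0]   // getter on uninitialized memory: the byte might be 0x02,
//                          // which is not a valid Bool, trivial type or not
flags[0] = true             // setter on a trivial type: just stores a valid bit pattern
print(flags[0])             // fine: reads memory we wrote ourselves
flags.deallocate(capacity: 1)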
···
We do not want to allow reading trivial values from uninitialized memory.
I took a crack at the comments:
-Andy
···
On Aug 5, 2016, at 10:42 PM, Dave Abrahams via swift-dev <swift-dev@swift.org> wrote: