I mean, not really. Any attempt to define reasonableness in programming tends to fall apart in the face of the reality of programming. For example, consider this hypothetical C data structure library:
```c
#include <stdlib.h>
#include <string.h>

typedef struct {
    size_t count;
    size_t capacity;
    char bytes[];   /* flexible array member: storage tail-allocated after the header */
} MyTailAllocArray;

MyTailAllocArray *MakeArray(size_t size) {
    MyTailAllocArray *array = malloc(sizeof(MyTailAllocArray) + size);
    if (array == NULL) { return NULL; }
    array->count = 0;
    array->capacity = size;
    memset(array->bytes, 0, size);
    return array;
}

/* Passes out an interior pointer to the elements; returns the count. */
size_t ArrayElements(MyTailAllocArray *array, char **elements) {
    if (array == NULL) { return 0; }
    *elements = array->bytes;
    return array->count;
}
```
This defines a simple tail-allocated data structure. Here it's an array of `char`, but it could be nearly anything; an array of `char` is just the simplest case. Imagine that `MyTailAllocArray` is an opaque type (that is, the structure definition is not in the public headers), so that the library can evolve the type going forward.
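The opaque-type arrangement can be sketched as a public header that only forward-declares the struct (the header name and the struct tag are hypothetical; the article's typedef is anonymous, but an opaque type needs a tag for the forward declaration):

```c
/* my_array.h -- hypothetical public header. Callers see only a forward
   declaration, so they cannot depend on the struct's layout, and the
   library is free to change the fields in a later version. */
#include <stddef.h>

typedef struct MyTailAllocArray MyTailAllocArray;

MyTailAllocArray *MakeArray(size_t size);
size_t ArrayElements(MyTailAllocArray *array, char **elements);
```

The full `struct MyTailAllocArray { ... };` definition then lives only in the library's own source file.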
We then have a simple function that passes out an interior pointer to the elements through an out-parameter. The question is: is it safe to assume that if the `elements` pointer is non-null, it's safe to dereference? The answer is no. There are two cases where it is not safe to dereference: the first is if you passed a null pointer for `array`, in which case the function returns 0 without ever writing to the out-parameter; the second is if `array->capacity` is 0 (that is, it was originally allocated as a zero-sized array). In that case, the pointer is pointing off the "back" of the data structure, and is therefore out of bounds and not safe to dereference.
Now, it isn't unreasonable to say that this C library is not defensive enough against this kind of situation, but it's an unfortunate reality that C code tends not to be. It's really common to find functions that require you to check the returned length before you go accessing a pointer.
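To make that checking discipline concrete, here is a self-contained sketch. `SumBytes` is a hypothetical caller, and the out-parameter is spelled `char **` so the pointer actually reaches the caller; the point is that the returned count, not the pointer's null-ness, gates every dereference.

```c
#include <stdlib.h>
#include <string.h>

typedef struct {
    size_t count;
    size_t capacity;
    char bytes[];   /* flexible array member */
} MyTailAllocArray;

MyTailAllocArray *MakeArray(size_t size) {
    MyTailAllocArray *array = malloc(sizeof(MyTailAllocArray) + size);
    if (array == NULL) { return NULL; }
    array->count = 0;
    array->capacity = size;
    memset(array->bytes, 0, size);
    return array;
}

size_t ArrayElements(MyTailAllocArray *array, char **elements) {
    if (array == NULL) { return 0; }
    *elements = array->bytes;
    return array->count;
}

/* Hypothetical caller: bound every access by the returned count, so a
   past-the-end pointer from a zero-capacity array is never dereferenced. */
size_t SumBytes(MyTailAllocArray *array) {
    char *elems = NULL;
    size_t n = ArrayElements(array, &elems);
    size_t sum = 0;
    for (size_t i = 0; i < n; i++) {   /* loop body never runs when n == 0 */
        sum += (unsigned char)elems[i];
    }
    return sum;
}
```

For `MakeArray(0)`, `elems` ends up pointing just past the allocation, yet `SumBytes` is still correct because the count of 0 keeps it from ever reading through that pointer.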