What Belongs in an Educational Tool to Demonstrate the Unwarranted Assumptions People Make in C/C++

Evaluation of operands in assignment operation

It is exactly the same situation. The only difference is that in your second example, a, b, and c have no side effects. No matter which way your compiler's implementation decides to evaluate them, the result will still be the same, since none of the evaluations has any observable side effect.

Is there a Way to Get Warned about Misbehaving Designated Initializers?

...no warning that my initialization order was not respected.

A particular initialization order is an expectation based on something other than what is stated in the standard (as pointed out in the comments).

C11 §6.7.9p23 (C99 has equivalent wording at §6.7.8p23):

The evaluations of the initialization list expressions are indeterminately sequenced with respect to one another and thus the order in which any side effects occur is unspecified.

[emphasis mine]

There is therefore no bug here, only unspecified behavior. This is very similar to other C behaviors, such as the order of evaluation of function arguments.

EDIT

C99 has this to say about that:

from C99 §6.5.2.2p10:

The order of evaluation of the function designator, the actual arguments, and subexpressions within the actual arguments is unspecified, but there is a sequence point before the actual call.

[emphasis mine]


That you would prefer a warning (which you stated well, +1) is another matter. I am not sure how practical it would be, though, to provide a warning for every unspecified or undefined behavior in the C and C++ languages.

It is interesting to note some of the stated assumptions and opinions in this discussion about why the C++ standard does not (yet) include designated initializers:

...C++ is more interested in putting the flexibility on the side of
the designer of a type instead, so designers can make it easy to use a
type correctly and difficult to use incorrectly.

Is this a C variable?

In C, that would be the declaration of a function that returns a pointer-to-int and takes an unspecified number of arguments. The declaration of a function that takes no arguments would be int *x(void);

Assignment to std::vector element leads to memory corruption

Seems really easy to understand after all the investigation you already did.

If the vector is resized, its storage is reallocated, and the assignment then writes to already-freed memory.

//this is essentially the left side in your original code
int32_t* leftSide = &(nodes.at(currNodeId).childrenId.at(QUADRANT::RU));
int RU = makeSubtree(pointsByQuadrant.at(QUADRANT::RU),
                     maxX,
                     maxY,
                     minX + (maxX - minX) / 2,
                     minY + (maxY - minY) / 2);
//this is the left side as it should be
int32_t* leftSide2 = &(nodes.at(currNodeId).childrenId.at(QUADRANT::RU));
assert(leftSide2 == leftSide); //this will fail if the vector resizes

As an aside, I'm not sure that this failure is still conforming under newer standards, because the C++17 standard introduced this rule:

In every simple assignment expression E1 = E2 and every compound assignment expression E1 @= E2, every value computation and side effect of E2 is sequenced before every value computation and side effect of E1.

Why do I get different outputs from different compilers for this code?

Simply put, there is no rule that guarantees that "4 4" is the right output, and the same goes for "4 10".

As others have mentioned in the comments, you have a case of undefined behaviour here. Even if this were defined behaviour, code like this is difficult to understand. So I recommend writing

cout << Change(a);
cout << a;

or

int b = a;
cout << Change(a);
cout << b;

depending on which behaviour you actually want.


