Why Do Constant Expressions Have an Exclusion For Undefined Behavior

Why does a consteval function allow undefined behavior?

This is a compiler bug. Or, to be more precise, this is an "underimplemented" feature (see the comment in bugzilla):

Yup - seems consteval isn't implemented yet, according to: https://clang.llvm.org/cxx_status.html

(the keyword's probably been added but not the actual implementation support)
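
For illustration, here is a minimal sketch (not the code from the original question) of the kind of call a fully conforming implementation must reject: evaluation of the call runs into signed overflow, so the call is not a constant expression and a diagnostic is required.

#include <limits>

// Hypothetical example: evaluating this call overflows a signed int,
// which is undefined behavior, so the call is not a constant expression.
consteval int overflow()
{
    int i = std::numeric_limits<int>::max();
    return i + 1; // signed overflow during constant evaluation
}

// int x = overflow(); // should be ill-formed: a consteval call must be a
//                     // constant expression, and this one is not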

Is the compiler allowed leeway in what it considers undefined behavior in a constant expression?

Since Johannes does not seem to want to convert his comment into an answer, I will self-answer. I had an offline conversation with Howard Hinnant, and he confirmed the consensus in the comments: gcc's behavior in this context is indeed conforming.

The relevant section of the draft standard is 1.4 Implementation compliance, which says in paragraph 2:

Although this International Standard states only requirements on C++ implementations, those requirements are often easier to understand if they are phrased as requirements on programs, parts of programs, or execution of programs. Such requirements have the following meaning:

and has the following bullet (emphasis mine):

If a program contains a violation of any diagnosable rule or an occurrence of a construct described in this Standard as “conditionally-supported” when the implementation does not support that construct, a conforming implementation shall issue at least one diagnostic message.

A diagnostic can be either a warning or an error. So once gcc provides a warning about the undefined behavior, it is not required to issue a subsequent diagnostic about the constexpr itself.

Although this is conforming behavior, generating an error for the constexpr (and thereby allowing SFINAE) would seem more robust. Considering that gcc issues errors in other instances of undefined behavior in a constexpr, this does not look like the intended behavior, or if it was intended, it is at least inconsistent, so I filed a bug report.
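
As a hedged sketch of the kind of code at issue (not the exact example from the question): the addition below overflows a signed int, which is undefined behavior, so the initializer is not a constant expression. A warning of the kind gcc issued in the case above already satisfies the requirement for at least one diagnostic.

#include <limits>

// Signed overflow is undefined behavior, so this initializer is not a
// constant expression; a warning alone still counts as a diagnostic.
constexpr int n = std::numeric_limits<int>::max() + 1;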

constexpr undefined behaviour

§5.19/2 (on the second page; it really should be split into many paragraphs) forbids constant expressions containing

— an lvalue-to-rvalue conversion (4.1) unless it is applied to

— a glvalue of integral or enumeration type that refers to a non-volatile const object with a preceding initialization, initialized with a constant expression, or

— a glvalue of literal type that refers to a non-volatile object defined with constexpr, or that refers to a sub-object of such an object

str[1000] translates to *(str + 1000), which does not refer to a subobject of str, in contrast with an in-bounds array access. So this is a diagnosable rule, and the compiler is required to complain.

EDIT: It seems there's some confusion about how this diagnosis comes about. The compiler checks an expression against §5.19 when it needs to be constant. If the expression doesn't satisfy the requirements, the compiler is required to complain. In effect, it is required to validate constant expressions against anything that might otherwise cause undefined behavior.* This may or may not involve attempting to evaluate the expression.

 * In C++11, "a result that is not mathematically defined." In C++14, "an operation that would have undefined behavior," which by definition (§1.3.24) ignores behavior that the implementation might define as a fallback.
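
A minimal sketch of the out-of-bounds case discussed above, assuming str is a small array (the names are illustrative, not taken from the original question):

constexpr char str[] = "hi"; // three elements, indices 0..2

// constexpr char c = str[1000]; // ill-formed: *(str + 1000) does not refer
//                               // to a subobject of str, so the initializer
//                               // is not a constant expression and a
//                               // diagnostic is required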

Can an expression using pointers that causes unspecified (not undefined!) behaviour be used in a constexpr context?

In addition to the catch-all case regarding UB, towards the end of the list of forbidden expressions in [expr.const] is,

— a relational or equality operator where the result is unspecified

This also appears in the cppreference list, currently numbered #19.
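
A short sketch of that bullet in action (illustrative names): comparing the addresses of two unrelated objects with a relational operator has an unspecified result, so it is rejected in a constant expression, while the corresponding equality comparison is well-defined and accepted.

constexpr int a = 0;
constexpr int b = 0;

constexpr bool eq = &a == &b;   // OK: distinct complete objects compare unequal
// constexpr bool lt = &a < &b; // ill-formed: the result is unspecified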

Does an expression with undefined behaviour that is never actually executed make a program erroneous?

If a side effect on a scalar object is unsequenced relative to etc

Side effects are changes in the state of the execution environment (1.9/12). A change is a change, not an expression that, if evaluated, would potentially produce a change. If there is no change, there is no side effect. If there is no side effect, then no side effect is unsequenced relative to anything else.

This does not mean that any code which is never executed is UB-free (though I'm pretty sure most of it is). Each occurrence of UB in the standard needs to be examined separately. (The preceding two sentences are probably overly cautious; see below.)

The standard also says that

A conforming implementation executing a well-formed program shall produce the same observable behavior
as one of the possible executions of the corresponding instance of the abstract machine with the same program
and the same input. However, if any such execution contains an undefined operation, this International
Standard places no requirement on the implementation executing that program with that input (not even
with regard to operations preceding the first undefined operation).

(emphasis mine)

This, as far as I can tell, is the only normative reference that says what the phrase "undefined behavior" means: an undefined operation in a program execution. No execution, no UB.
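
A minimal sketch of that point (illustrative code, not from the question): the expression with unsequenced side effects sits in a branch that is never taken, so the execution contains no undefined operation and the program's behavior is fully defined.

#include <cstdio>

int main()
{
    int x = 0;
    if (false)
    {
        x = ++x + ++x; // would be undefined behavior if evaluated:
                       // two unsequenced side effects on x
    }
    std::printf("%d\n", x); // prints 0; no undefined operation is executed
}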

Signed integer overflow undefined behavior

Since a pull request was opened for this case, it seems that the reference to chapter 7.2 was wrong and it should have been 7.1 instead.

Is this failing test that adds zero to a null pointer undefined behaviour, a compiler bug, or something else?

This looks like an MSVC bug. The C++14 draft standard explicitly allows adding or subtracting the value 0 to or from a pointer, with the result comparing equal to the original pointer value, from [expr.add]p7:

If the value 0 is added to or subtracted from a pointer value, the result compares equal to the original pointer value. If two pointers point to the same object or both point one past the end of the same array or both are null, and the two pointers are subtracted, the result compares equal to the value 0 converted to the type std::ptrdiff_t.

It looks like CWG defect 1776 led to p0137, which adjusted [expr.add]p7 to explicitly say null pointer.

The latest draft makes this even more explicit in [expr.add]p4:

When an expression J that has integral type is added to or subtracted from an expression P of pointer type, the result has the type of P.

- If P evaluates to a null pointer value and J evaluates to 0, the result is a null pointer value.

- Otherwise, if P points to element x[i] of an array object x with n elements, the expressions P + J and J + P (where J has the value j) point to the (possibly-hypothetical) element x[i+j] if 0 ≤ i+j ≤ n and the expression P - J points to the (possibly-hypothetical) element x[i−j] if 0 ≤ i−j ≤ n.

- Otherwise, the behavior is undefined.

This change was made editorially; see the associated GitHub issue and PR.

MSVC is inconsistent here in that it allows adding and subtracting zero in a constant expression, just as gcc and clang do. This is key because undefined behavior in a constant expression is ill-formed and therefore requires a diagnostic. Given the following:

constexpr int *p = nullptr;
constexpr int z = 0;
constexpr int *q1 = p + z;
constexpr int *q2 = p - z;

gcc, clang and MSVC all allow it in a constant expression (live godbolt example), although sadly MSVC is doubly inconsistent in that it accepts a non-zero value as well. Given the following:

constexpr int *p = nullptr;
constexpr int z = 1;
constexpr int *q1 = p + z;
constexpr int *q2 = p - z;

both clang and gcc say it is ill-formed, while MSVC does not (live godbolt).
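
One way to see the same rule in action (a hedged sketch with illustrative names) is to force the arithmetic into a constant expression via static_assert: a conforming compiler must accept the zero case and reject the non-zero one, because the latter has undefined behavior during constant evaluation.

constexpr const int *base = nullptr;

static_assert(base + 0 == base, "adding zero to a null pointer yields a null pointer");
// static_assert(base + 1 == base, ""); // ill-formed: undefined behavior in a
//                                      // constant expression must be diagnosed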


