Silent Elimination of Bounds Checks
The C language supports pointer arithmetic where one operand is a pointer to an object and the other operand is an integer. When an expression that has an integer type is added to a pointer, the result has the type of the pointer operand.
Pointer arithmetic in C is primarily meant to support the manipulation of a pointer within an array of elements, as in the following example:
int a[2] = {1, 2};

int main(void) {
  int j = 0;
  for (int *pi = &a[0]; pi < &a[2]; ++pi)
    j += *pi;
  return j;
}
This same program can be rewritten in an equivalent form using array notation:
int a[2] = {1, 2};

int main(void) {
  int j = 0;
  for (int i = 0; i < 2; ++i)
    j += a[i];
  return j;
}
Array indexing is equivalent to pointer arithmetic, but it is pointer arithmetic performed by the compiler, not the coder. An expression such as a[i], for example, is translated to *(a + i).
Provided the pointer operand and the result point to elements of the same array object during pointer arithmetic, or to one past the last element of the array object, the evaluation is well defined; otherwise, the behavior is undefined.
Undefined Behaviors
C programmers are not always entirely familiar with the implications of undefined behavior. The C Standard [ISO/IEC 2011] defines undefined behavior as behavior, upon use of a nonportable or erroneous program construct or of erroneous data, for which the standard imposes no requirements. Conforming implementations can deal with undefined behavior in a variety of fashions, such as:
- Ignoring the situation completely, with unpredictable results
- Behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message)
- Terminating a translation or execution (with the issuance of a diagnostic message)
Undefined behaviors are extremely dangerous, both because they need not be diagnosed and because they can result in any behavior. Most of the security vulnerabilities described in my book, Secure Coding in C and C++, Second Edition, are the result of exploiting undefined behavior in code.
Another problem with undefined behavior is compiler optimizations. Because compilers are not obligated to generate code for undefined behavior, these behaviors are candidates for optimization. By assuming that undefined behavior will not occur, compilers can generate code with better performance characteristics.
Compiler Optimizations
The basic design of an optimizer for a C compiler is largely the same as that of an optimizer for any other procedural programming language. The fundamental principle of optimization is to replace a computation with a more efficient method that computes the same result.
However, some optimizations change behavior. In some cases, these optimizations are beneficial and eliminate undefined behaviors and vulnerabilities; in other cases, these optimizations may inadvertently introduce vulnerabilities.
Compiler developers may choose between several behavior models when translating code. However, compiler writers are not obligated to follow any particular model and are often opportunistic about which model to apply in any given situation.
The first of these is the hardware behavior model. Compilers that follow this model generate code as if the undefined behavior were defined to be whatever action the hardware instructions would take if the undefined behavior occurred. For example, if the hardware instruction wrapped around on signed integer overflow, the compiler would simply treat the wraparound as the defined behavior for integer overflow. For many years, this policy was nearly universal, so several generations of C and C++ programmers have assumed that all compilers behave this way.
Another model is super debug. Compiler vendors that follow this model provide an intensive debugging environment to trap (nearly) every undefined behavior. This policy severely degrades the application’s performance, so it is seldom used for building applications.
Finally, the total license model allows compiler writers to treat any possible undefined behavior as a “can’t-happen” condition, permitting aggressive optimizations.
Vulnerabilities Resulting from Undefined Behaviors
Increasingly, compiler writers are taking advantage of undefined behavior in the C programming language to improve optimizations. These optimizations frequently interfere with the ability of developers to perform cause-effect analysis on their source code—that is, to analyze the dependence of downstream results on prior results. Consequently, these optimizations eliminate causality in software and increase the probability of software faults, defects, and vulnerabilities. One such occurrence afflicted Plan 9’s sprint() function and is further documented in CERT Vulnerability Note VU#162289, “C compilers may silently discard some wraparound checks.”
In general, a programmer might code a bounds-check such as
char *ptr;  // ptr to start of array
char *max;  // ptr to end of array
size_t len;

if (ptr + len > max)
  return EINVAL;
No matter what behavior model is used, there is a bug. If len is very large, it can cause ptr + len to overflow, which creates undefined behavior. Under the hardware behavior model, the result would typically wrap around, pointing to an address that is actually lower in memory than ptr.
In attempting to fix the bug, the experienced programmer (who has internalized the hardware behavior model of undefined behavior) might write a check like this:
if (ptr + len < ptr || ptr + len > max)
  return EINVAL;
However, compilers that follow the total license model may optimize out the first part of the check, leaving the whole bounds check defeated. This optimization is allowed because ptr plus (an unsigned) len can compare less than ptr only when an undefined behavior occurred during the calculation of ptr + len. The compiler legitimately assumes that undefined behavior never happens; consequently, ptr + len < ptr is dead code and can be removed.
Optimizations may be performed for comparisons between P + V1 and P + V2, where P is the same pointer and V1 and V2 are variables of some integer type. The total license model permits this expression to be reduced to a comparison between V1 and V2.
In our example,
if (ptr + len < ptr || ptr + len > max)
  return EINVAL;
this optimization translates as follows:
ptr + len < ptr
ptr + len < ptr + 0
len < 0          (impossible; len is unsigned)
However, if V1 or V2 is such that the sum with P overflows, then the comparison of V1 and V2 will not yield the same result as actually computing P + V1 and P + V2 and comparing the sums. Because of possible overflows, computer arithmetic does not always obey the algebraic identities of mathematics.
This problem is easy to remediate once it is called to the attention of the programmer. For example, if it is known that ptr is less than or equal to max, then the programmer could write
if (len > max - ptr)
  return EINVAL;
This conditional expression eliminates the possibility of undefined behavior.
In the following example, the expression buf + n may wrap for large values of n, resulting in undefined behavior.
int f(char *buf, size_t n) {
  return buf + n < buf + 100;
}
When compiled using GCC 4.3.0 with the -O2 option, for example, the expression
buf + n < buf + 100
is optimized to n < 100, eliminating the possibility of wrapping. This behavior is usually not an issue unless one expression wraps but the other does not. Even so, this code example is still incorrect, because it is not safe to rely on compiler optimizations for security.
The undefined behavior can be eliminated by performing the optimization by hand:
int f(char *buf, size_t n) {
  return n < 100;
}
Recommendations
The pointer overflow behavior that caused the vulnerability in Plan 9’s sprint() function changed as of GCC versions 4.2.4, 4.3.1, and 4.4.0, and in all subsequent releases in each of those branches.
The optimization is performed by default at -O2 and above, including -Os, but it is not performed by default at -O1 or -O0. The optimization may be enabled for -O1 with the -fstrict-overflow option, or disabled for -O2 and above with the -fno-strict-overflow option. Cases where the optimization occurs may be detected by using -Wstrict-overflow=N, where N >= 3.
C and C++ programmers must be very careful to avoid undefined behaviors in their code. Compilers that implement the total license model are becoming increasingly common; such compilers have the latitude to change how they compile undefined behaviors at any time, and they frequently do so to further optimize code. Consequently, the results produced by code that depends on undefined behaviors can change arbitrarily without notice.
Programmers frequently overreact by disabling optimizations in their code. Optimizations are just as likely to eliminate vulnerabilities from code as they are to introduce them. Diagnostics such as -Wstrict-overflow can be very useful to identify cases where the compiler optimizes based on the assumption that signed overflow does not occur. Frequently these optimizations are harmless; sometimes they can be surprising.