Why Use #define Instead of a Variable

Why use #define instead of a variable?

Mostly stylistic these days. When C was young, there was no such thing as a const variable. So if you used a variable instead of a #define, you had no guarantee that somebody somewhere wouldn't change its value, causing havoc throughout your program.

In the old days, FORTRAN passed even constants to subroutines by reference, and it was possible (and headache-inducing) to change the value of a constant like '2' to be something different. One time this happened in a program I was working on, and the only hint that something was wrong was that we'd get an ABEND (abnormal end) when the program hit the STOP 999 that was supposed to end it normally.

Why do most C developers use define instead of const?

There is a very solid reason for this: const in C does not mean something is constant. It just means a variable is read-only.

In places where the compiler requires a true constant expression (such as the size of a non-VLA array), using a const variable such as fieldWidth is simply not possible.
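
A minimal C sketch of the distinction (fieldWidth is the name used above; FIELD_WIDTH is a hypothetical macro added for contrast):

const int fieldWidth = 10;      /* read-only variable, not a constant expression in C */

/* static int a[fieldWidth]; */ /* error if uncommented: file-scope array sizes must be true constants */

#define FIELD_WIDTH 10
static int b[FIELD_WIDTH];      /* OK: the preprocessor substitutes the literal 10 */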

What is the benefit of using #define to declare a constant?

(This is a C++ answer. In C, there is a major advantage to using macros, which is that they are pretty much the only way you can get a true constant-expression.)


What is the benefit of using #define to declare a constant?

There isn't one.

I have seen a lot of programs using #define at the beginning.

Yes, there is a lot of bad code out there. Some of it is legacy, and some of it is due to incompetence.

Why shouldn't I declare a constant global variable instead?

You should.

A const object is not only immutable, but has a type and is far easier to debug, track, and diagnose, since it still exists after preprocessing (and, crucially, has a name in a debug build).

Furthermore, if you abide by the one-definition rule, you don't have to worry about causing an almighty palaver when you change the definition of a macro and forget to recompile literally your entire project, along with any code that depends on it.

And, yes, it's ironic that const objects are still called "variables"; of course, in practice, they are not variable in the slightest.

Why would someone use #define to define constants?

#define has many different applications, but your question seems to be about one specific application: defining named constants.

In C++ there's rarely a reason to use #define to define named constants.

#define is widely used in C code, since the C language is significantly different from C++ when it comes to defining constants. In short, const int objects are not constant expressions in C, which means that the primary way to define a true constant in C is #define. (Also, for int constants one can use enums.)
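
A short C illustration of both options (MAX_ROWS and MAX_COLS are hypothetical names):

#define MAX_ROWS 10        /* macro: works for constants of any type */
enum { MAX_COLS = 20 };    /* enum constant: a true int constant expression in C */

static int grid[MAX_ROWS][MAX_COLS];  /* both are legal as file-scope array sizes */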

#define vs. Variable

Well, there is a great difference. You can change the value of width, you can take its address, you can ask for its size, and so on. WIDTH, by contrast, is simply replaced with the constant 10 everywhere it appears, so the expression ++WIDTH makes no sense. On the other hand, you can declare a non-VLA array with WIDTH items, whereas you cannot do that with width items.

Summing it up: the value of WIDTH is known at compile time and cannot be changed, and the compiler allocates no memory for it. By contrast, width is a variable with the initial value 10; its later values are not known at compile time, and the compiler does allocate memory for it.
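
A small C sketch of these differences, using the names above:

#include <stdio.h>

#define WIDTH 10
int width = 10;

int main(void)
{
    int table[WIDTH];   /* OK: WIDTH is a compile-time constant */
    int *p = &width;    /* OK: width is an object with an address */
    ++width;            /* OK: width is mutable */
    /* ++WIDTH; */      /* error if uncommented: expands to ++10 */
    printf("%zu %d\n", sizeof table, *p);
    return 0;
}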

In C++, is it better to use #define or const to avoid magic numbers?

Don't worry about efficiency in this case, since all of them will be computed at compile time.

You should stop using macros (at least to define constants) whenever you can. Macros ignore namespaces and scopes. const objects, on the other hand, have a type, which helps catch unintended mistakes.

It's always worth reading Stroustrup's advice: "So, what's wrong with using macros?"

Advantage of #define instead of creating a function in embedded

It may be related to performance.

A function call has some overhead (calling, saving things on the stack, returning, etc.), while a macro is a direct substitution of the macro name with its contents (i.e., no overhead).

In this example, the functions foo and bar do exactly the same thing: foo uses a macro, while bar uses a function call.
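
The original code appears only in the screenshot below; here is a minimal sketch of what it plausibly looked like (printY is the function named in the answer, while the macro name PRINT_Y and the bodies are assumptions):

#include <stdio.h>

#define PRINT_Y(y) printf("y = %d\n", (y))   /* macro: body expanded in place */

void printY(int y)                           /* function: a real call at run time */
{
    printf("y = %d\n", y);
}

void foo(int y) { PRINT_Y(y); }  /* the macro body is substituted directly into foo */
void bar(int y) { printY(y); }   /* compiles to a call into printY */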

[Image from the original answer: compiler output showing the instructions generated for foo versus bar and printY.]

As you can see, bar and printY together require more instructions than foo.

So by using a macro, the performance gets a little better.

But... there are downsides to this approach:

  • Macros are hard to debug, as you can't single-step through a macro

  • Extensive use of a macro increases the size of the binary (compared to using a function call), which can itself hurt performance

Also notice that modern compilers (with optimization on) are really good at figuring out when it's a good idea to inline a function automatically (i.e., your code is written with a function call, but the compiler decides to inline the function as if it were a macro). So you might get the same performance using a function call.

Further, you can use the inline keyword as a hint to the compiler that you think it would be good to inline a function. But even with that keyword, the compiler may decide not to inline. In portable C, the only way to make sure the code gets inlined is to use a macro (though many compilers offer non-standard forced-inline attributes).
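
For instance, a sketch of the hint (square is a hypothetical function):

/* 'inline' is only a request; the compiler is free to ignore it */
static inline int square(int x)
{
    return x * x;
}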

What is the difference between global variables and #define in C?

Well, global variables can be modified from anywhere.

Basically, at a low level, a variable is stored in RAM and created when your application starts; it always has an address in memory. Defines are just macros: the preprocessor replaces each define's name with its value before compilation proper.

A #define can't be modified; it's just a macro. And #define is not just for values: you can define almost anything you want, for example:

// Defining a constant
#define PI 3.14

// Defining an expression (note: x or y is evaluated twice, so avoid side effects)
#define MIN(x,y) (((x)<(y))?(x):(y))

// Defining a piece of code (assumes __t0 was set earlier, e.g. by a matching START_TIMER)
#define STOP_TIMER \
    clock_gettime(CLOCK_REALTIME, &__t1); \
    __lasttime = \
        (double) (__t1.tv_sec - __t0.tv_sec) + \
        (double) (__t1.tv_nsec - __t0.tv_nsec) / 1000000000.0;

And, in many situations, defines are used to select OS-specific or hardware-specific values. This is a really powerful facility, because it lets you change things conditionally at compile time. For example:

// Example with OS
#ifdef __linux__
#define WELCOME_STRING "welcome to Linux!"
#else
#define WELCOME_STRING "welcome to Windows!"
#endif

// Example with hardware
#if __x86_64__ || __ppc64__
#define USING_64BIT
#else
#define USING_NOT64BIT
#endif

Alternatives to using #define in C++? Why is it frowned upon?

Why is this code bad?

Because VERSION can be silently redefined (for example, after an #undef) and the compiler won't tell you.
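
A sketch of the failure mode (the later redefinition is a hypothetical scenario):

#define VERSION "1.2"

/* ... later, perhaps deep inside another header ... */
#undef VERSION
#define VERSION "9.9"   /* silently wins: every use after this point sees "9.9" */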

Is there an alternative to using #define?

const char * VERSION = "1.2";

or

const std::string VERSION = "1.2";

Why are constants used instead of variables?

A variable, as the name implies, can vary over time, and it usually occupies memory. When you declare that a value will not change, the compiler can perform a series of optimizations (for example, no stack space is allocated for the constant), and this is the foremost advantage of constants.

Update

You may ask: why do we use constants at all?

It's a good question. We could use literal numbers instead of constants; it makes no difference to the compiler, which treats both the same. However, to keep the code readable (a good programming practice), it's better to use constants.

Using constants can also save you time. To be more specific, consider the following example:

Suppose some products in a shopping system share a rate value (rate value = 8.14). Your system has worked with this constant for several months, but eventually you need to change the rate. What are you going to do? You have one awful option: changing every literal 8.14 in the code! But if you declare the rate as a constant, you only need to change its value once, and the change propagates throughout the code. By using a constant, you don't have to hunt down every literal 8.14 and change them one by one.
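
A tiny C sketch of the idea (RATE and priceWithRate are hypothetical names):

#define RATE 8.14   /* change the rate here, once */

double priceWithRate(double base)
{
    return base * RATE;   /* every use picks up the new value on recompile */
}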


