Enumerated types... what's the point?

Fred Christianson flc at n.sp.cs.cmu.edu
Sat Mar 24 04:45:34 AEST 1990


In article <159 at caslon.cs.arizona.edu> dave at cs.arizona.edu (David P. Schaumann) writes:
>For example, say you had a list of 100 symbols you wanted to use as constants.
>Of course, order is important.  So, you type out 100 #defines.  Ok, fine.
>Suddenly, you discover you need a new symbol, which should go between the
>third and fourth symbols.  If you use defines, you are pretty much stuck
>retyping the values on 97 defines.  If you use enum, all you have to do
>is insert the name in the list, and continue on your merry way.

#define a1	1
#define a2	(a1 + 1)	/* parenthesized so the expansion is safe in expressions */
#define a3	(a2 + 1)
...
#define a100	(a99 + 1)

would allow you to easily add a new symbol anywhere.

#define a1	1
#define a2	(a1 + 1)
#define new	(a2 + 1)	/* inserted symbol */
#define a3	(new + 1)
...
#define a100	(a99 + 1)

I think this would be better done with enums, simply because I prefer the
way an enum definition looks to the way 100 #defines look.  But if
order is important the #defines may be better, because the order is
immediately apparent from looking at the #defines, whereas you (or at
least I) don't tend to think of enum symbols as ordered.
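
For comparison, a rough sketch of what the enum version might look like
(the tag and symbol names here are just placeholders, not from the
original article):

enum symbol {
	a1 = 1,		/* start at 1 to match the #defines above */
	a2,		/* 2 */
	new,		/* 3 -- inserted later; everything after it shifts up */
	a3,		/* 4 */
	/* ... a4 through a99 ... */
	a100
};

Inserting "new" renumbers a3 through a100 automatically at compile time;
no other line has to change.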

>Also, if you use #defines, you may be tempted by your explicit knowledge of
>the value of that symbol to use it in some way that would break if you
>change the value.

I've seen this done with enums also, although it is less likely to happen.
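
A hypothetical sketch of the sort of value-dependent code I mean (the
names and values are made up):

#include <stdio.h>

enum color { RED = 1, GREEN, BLUE };	/* currently 1, 2, 3 */

/* index happens to equal the enum value -- only by luck */
static char *color_name[] = { "", "red", "green", "blue" };

int main()
{
	enum color c = BLUE;

	if (c == 3)			/* "is it BLUE?" -- written from
					   knowledge of the value; breaks if
					   a color is ever inserted before BLUE */
		printf("last color is %s\n", color_name[c]);
	return 0;
}

Both the comparison and the table still compile after someone inserts,
say, YELLOW before BLUE, but both are then silently wrong.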

----
Fred
