Re: too many warnings from Bison CVS for Pike

From: Hans Aberg
Subject: Re: too many warnings from Bison CVS for Pike
Date: Sat, 25 Feb 2006 14:48:09 +0100

On 24 Feb 2006, at 22:24, Akim Demaille wrote:

On 23 Feb 2006, at 21:34, Hans Aberg wrote:

In summary, if one does not need a dynamic polymorphic variable in the extra-parser computations, then a discriminated union will be useful. But it does not serve as a replacement when a dynamic polymorphic variable is needed.

There is no problem with storing a polymorphic pointer in a
variant.  I fail to see the point of your comment.

There are two different styles of C++ polymorphic programming. The first is the static, template-based style, which attempts to resolve as much typing as possible at compile time, but is limited to the types provided through the template system. The second is the dynamic, "virtual" class-hierarchy style, which provides greater generality of types and makes template programming largely unnecessary, but requires free-store allocations, which are often relatively slow in compiler implementations (though this could be overcome the day C++ facilitates GC implementations). The latter style centers on designing polymorphic types with a specific interface; template programming centers largely on forcing objects to become treatable as classes, which can in turn be handled by the static typing of the template system. In this respect, template programming is very cumbersome relative to the polymorphic hierarchy.

The use of a union is an intermediate, it seems: it avoids free-store allocation, but it cannot be used with every dynamic type, for example recursive types or those requiring a pointer or handle (an optimizing implementation may mix unions, pointers, and handles). A handle is required if the types should be able to self-mutate, or if one is implementing a tracing GC. The latter is difficult to implement in current C++, because it is difficult to extract runtime information about the root set, but I think this should be resolved in the next major C++ revision. There is also the base-class question: when adding a new derived class to a C++ polymorphic class hierarchy, the base class need not be changed.

Now, when I look at the link you gave, it gives the class definition:

template<typename T1, typename T2 = unspecified, ...,
         typename TN = unspecified>
class variant;

So this will serve as a generalized union, but it will not cover the aspects of a C++ polymorphic class hierarchy. For example, recursive types are not covered; they require special template extensions. And when adding new classes, this variant must be changed. The page also gives an example of how to combine the two types 'int' and 'std::string':

class my_visitor : public boost::static_visitor<int>
{
public:
    int operator()(int i) const
    {
        return i;
    }

    int operator()(const std::string& str) const
    {
        return str.length();
    }
};
And if one has more types, one will have to go back and add further overloads of operator().

By contrast, in the class hierarchy model, one would have classes like:
  struct object {
    virtual data operator()(const data&);
  };

  struct integer : object {
    data operator()(const data&);
  };

  struct string : object {
    data operator()(const data&);
  };
Then a class 'data' would maintain a pointer to an object. Now, if the classes integer and string were recursive, operator() would automatically be able to handle that recursion. If I add new classes, that automatically extends the recursion to them, without changing the interface of the class object. This is useful when working with massively recursive objects, such as mathematical expressions requiring generality.

Returning to the question of Bison: it could well be that variants are very suitable in the implementation of a compiler, where the set of variant types is known beforehand and one opts for outputting static code. But I use essentially an interpreter setup, meaning that the parser produces C++ objects that will be computed on the fly. There the C++ polymorphic class hierarchy serves very well, because I only use the parser to give the user a chance to construct the object that will later be computed; the computational engine is in fact much more general than the actual parser, and the parser will be extended as needed as the project moves along.

  Hans Aberg
