Classes that have custom destructors, copy/move constructors or copy/move assignment operators should deal exclusively with ownership. Other classes should not have custom destructors, copy/move constructors or copy/move assignment operators.

By "not have," Martinho really means "not declare," because all classes have destructors, one way or another, and the blog post makes clear that Martinho expects a non-resource-handling class to have the copying and moving behavior exhibited by its data members (and hence to have the corresponding functions, assuming such functionality is employed). If we revise the wording of the rule in accord with s/have/declare/g, we get:

Classes that declare custom destructors, copy/move constructors or copy/move assignment operators should deal exclusively with ownership. Other classes should not declare custom destructors, copy/move constructors or copy/move assignment operators.

This advice makes me a bit uncomfortable.
In concept, it's great. Morally, I'm on board. I agree that classes that don't manage resources should be designed so that the compiler-generated functions for copying, moving, and destruction do the right things. I'm just not sure that taking advantage of that by not declaring any of these functions is a good idea.
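To make the dichotomy concrete before getting to my concern, here is an illustrative sketch (the Buffer and Session names are mine, not from Martinho's post): a class that owns a resource declares all of the special functions, while a class built from well-behaved members declares none of them.

    #include <algorithm>
    #include <cstddef>
    #include <memory>
    #include <string>
    #include <utility>
    #include <vector>

    // Owns a raw resource, so it deals with ownership and declares the Big Five.
    class Buffer {
    public:
        explicit Buffer(std::size_t n) : data_(new int[n]), size_(n) {}
        ~Buffer() { delete[] data_; }
        Buffer(const Buffer& rhs) : data_(new int[rhs.size_]), size_(rhs.size_)
        { std::copy(rhs.data_, rhs.data_ + size_, data_); }
        Buffer(Buffer&& rhs) noexcept : data_(rhs.data_), size_(rhs.size_)
        { rhs.data_ = nullptr; rhs.size_ = 0; }
        Buffer& operator=(Buffer rhs) noexcept     // copy-and-swap covers both
        { std::swap(data_, rhs.data_); std::swap(size_, rhs.size_); return *this; }

    private:
        int*        data_;
        std::size_t size_;
    };

    // Manages no resources itself; per the Rule of Zero, it declares none of the five.
    class Session {
    private:
        std::string          name;
        std::vector<int>     scores;
        std::shared_ptr<int> shared_state;
    };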
Consider a class Widget that doesn't do resource management. Per the Rule of Zero, it declares none of the five special functions covered by the rule. Further assume that its data members are both copyable and movable. Widget objects are therefore copyable and movable, too.

    class Widget {
    public:
        ...                            // no dtor or copy or move functions
    };

Wonderful. Life is good.
Now assume something in the software doesn't work the way it should. It could be a behavioral problem (i.e., your run-of-the-mill bug) or it could be a performance problem. Either way, debugging ensues. Let's assume that during debugging, it becomes convenient to temporarily add a destructor to do something like produce a log message for tracing purposes:
    class Widget {
    public:
        ~Widget();                     // temporary destructor
        ...                            // no copy or move functions
    };

The addition of the destructor has the side effect of disabling generation of the move functions, but because Widget is copyable, all the code that used to generate moves will now generate copies. In other words, adding a destructor to the class has caused presumably-efficient moves to be silently replaced with presumably-less-efficient copies. That strikes me as the kind of thing that (1) is likely to surprise people and (2) could really complicate debugging. Hence my discomfort.
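To see the hazard concretely, here is a self-contained sketch (the Payload member and the driver are my own illustration, not from the post). As written it prints "Payload moved"; uncomment the destructor and the very same line silently prints "Payload copied" instead.

    #include <cstdio>
    #include <utility>

    struct Payload {                   // stand-in for a copyable, movable data member
        Payload() = default;
        Payload(const Payload&) { std::puts("Payload copied"); }
        Payload(Payload&&)      { std::puts("Payload moved"); }
    };

    class Widget {
    public:
        // ~Widget() {}                // the temporary "tracing" destructor
    private:
        Payload p;
    };

    int main() {
        Widget w1;
        Widget w2 = std::move(w1);     // moved now; copied once the dtor is declared
    }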
I'm inclined to recommend that a better way to rely on the compiler-generated copy and move functions is to expressly say that they do the right thing--to define them via =default:

    class Widget {
    public:
        Widget(const Widget&) = default;
        Widget(Widget&&) = default;
        Widget& operator=(const Widget&) = default;
        Widget& operator=(Widget&&) = default;
        ...
    };

With this approach, the spirit of the Rule of Zero remains: classes that don't manage resources should be designed so that the compiler-generated functions for copying, moving, and destruction do the right things. But instead of expressing this by not declaring those functions, it's expressed by declaring them explicitly and equally explicitly opting in to the compiler-generated implementations.
What do you think? Does this advice make sense? Should the Rule of Zero perhaps be the Rule of the Five defaults?

Thanks,

Scott
25 comments:
I want to say two things about this.
First, I would want to have a "chat" with anyone on my team who willy-nilly adds destructors to classes and doesn't think about the Ro3 ;)
Then there's the fact that generation of the copy functions when a destructor is present has been deprecated since C++11. Clang already warns about this behaviour, and GCC has an outstanding issue about it (http://gcc.gnu.org/bugzilla/show_bug.cgi?id=58407). Hopefully compiler writers will catch up on this.
If debugging ensues, why would you ever want to look at the Widget destructor? It was compiler generated after all. As long as all data ownership is delegated to proper classes, you should only have to look at those data-owning class destructors.
I forgot to mention: this warning is enabled in clang with the flag -Wdeprecated.
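For the curious, this is roughly the kind of code that triggers the warning (the exact diagnostic text varies by clang version; the snippet is illustrative):

    // Compile with: clang++ -std=c++11 -Wdeprecated -c deprecated_copy.cpp
    struct S {
        ~S() {}        // user-declared destructor
    };

    void f() {
        S a;
        S b = a;       // implicit copy ctor of a class with a user-declared
                       // destructor is deprecated, so clang warns here
    }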
@Anonymous: As I said, perhaps the motivation for adding the destructor was to have it produce debugging-related messages, e.g., "In Widget destructor."
@Martinho Fernandes: Thanks for chiming in.
@Martinho Fernandes: Regarding feature deprecation, I think it's important to recognize that the committee is extremely conservative about actually getting rid of deprecated features. C++98 deprecated operator++ on bools, for example, but such code remains legal (albeit deprecated) in C++14. Applying operator++ to bools with gcc 4.8.1 with -Wall yields no warning, and the most recent CTP compiler from Microsoft has the same result with -W4. (I didn't check Clang.) I believe it's unwise to assume that because something is deprecated, it won't be used, especially if such use is accidental (as might well be the case when using the deprecated generation of copy operations for a class with a user-declared destructor).
Meh. It can be a rule of five defaults, it's a bit of boilerplate but if one wants to be sure...
But I prefer to follow the rule of zero and use the warnings (like Martinho said).
In multithreaded code, the destructor is often an interesting case to study. When a bug occurs, it's a nice «breakpoint» spot to have. Having gone through some tricky cases moving from C++03 to C++11 code for my PhD, I'm inclined to side with Scott on this one. Even though the bugs I've had were rarely due to the default destructor code, the side effects that stemmed from that method were quite often interesting to check in a debugger for analysis purposes.
If you need tracing you can easily add a tracer class to your class members.
Hi Scott, as Martinho said, moves are not _silently_ replaced by copies anymore.
Furthermore, IIRC declaring constructors/destructors/etc. as defaulted and not declaring them at all is not the same for POD-like types.
That could be surprising too, depending on your domain.
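A concrete instance of that difference, as I understand the C++11/14 rules (the example is mine; double-check against your compiler): even an explicitly defaulted copy constructor is a user-declared constructor, so it suppresses the implicit default constructor, and a class without a default constructor is no longer trivial, hence no longer a POD.

    struct P1 {
        int x;
    };                             // P1 p;  and  P1 p{42};  both compile

    struct P2 {
        int x;
        P2(const P2&) = default;   // user-declared, even though defaulted
    };

    void demo() {
        P1 a;                      // fine
        P1 b{42};                  // fine: aggregate initialization
        // P2 c;                   // error: declaring the copy ctor suppressed
                                   //        the implicit default constructor
        P2 d{42};                  // still OK in C++11/14: P2 remains an
                                   // aggregate, but it is no longer a POD
    }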
I'm inclined to side with Scott here.
If you can explicitly mark a function to be default then you should. Then changes to the class later have zero effect on the defaulted functions. No matter what the change is. Which is a good guarantee to have.
Oh, I also doubt the old behaviour will be dropped any time soon, but I hope compiler writers follow on clang's footsteps and do add warnings about it so that at least we can avoid accidental use.
GCC even has the same -Wdeprecated flag as clang, but while -Wdeprecated in clang also warns about both copy constructors and ++ on bools, sadly in GCC -Wdeprecated warns about neither.
Strongly in favor of fixing this concern in the compiler, rather than confuse every class declaration with 5 default empty functions. Assume the people reading your C++ code are capable C++ programmers.
Regarding debuggers, my opinion is the same as with compilers: whatever anyone chooses to do to cope now, the tools still need fixing.
Whenever you find yourself needing to change your code somehow so that you can debug it, you should be filing a feature request (and I should follow my own advice :D).
It's not really a new rule, the rule of 5 defaults, is it? At least not in this case. While the destructor in the example as shown is empty, the implication is that it will actually hold some code (logging). In that case wouldn't the "normal" rule of 5 apply? You've explicitly defined one, so you should define all 5, even if you are just declaring the other 4 as default.
I understand the original post was about whether the rule of 5 applies to non-resource controlling classes. In regards to that, I tend to agree with Scott that it should be more guidance than rule. But in the counter-example described, I don't think we are defining a new rule. It's just a very simple case of the existing rule.
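In code, applying the existing rule to the tracing scenario would look something like this (a sketch; note the extra defaulted default constructor, which is needed because declaring any constructor suppresses the implicit one):

    class Widget {
    public:
        ~Widget();                                    // the temporary tracing dtor
        Widget() = default;                           // restore the default ctor
        Widget(const Widget&) = default;              // per the Rule of Five, default
        Widget(Widget&&) = default;                   // the rest explicitly so that
        Widget& operator=(const Widget&) = default;   // moves stay enabled
        Widget& operator=(Widget&&) = default;
        // ...
    };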
@Karsten said:
If you need tracing you can easily add a tracer class to your class members.
The downside of that is that it (likely, unless you can slip it into an existing padding region) changes the footprint of your class. That might also perturb performance measurements, although admittedly less than disabling moves would.
FWIW, as I was reading this piece I was mentally asking myself, why doesn't Scott recommend that we explicitly qualify these implicit functions as default/delete as appropriate? As I continued reading I was happy to see that is what you eventually recommended.
A huge +1 from my side.
I think the explicitly defaulted versions signal that the implementer was aware of them and explicitly wanted to state that they are fine. So for complex classes I think it might be a good idea to add them. OTOH, if you just have a struct with a few public members and no fancy methods, they are likely more distracting than helping.
I think the advice on using =default for these are great.
It gives clear documentation to the reader of the code that you want to have the compiler generated versions of these constructors and operators.
What I think people are missing here is the fact that AFAIK no IDE/compiler gives you a nice view of which functions are compiler generated, or lets you put a breakpoint in them. TBH I find it kind of dumb that if I have a class with a member vector, I can't put a breakpoint that will be fired when that vector starts to be destructed.
But luckily, IDK if I've ever needed that feature, or needed to know which functions are compiler generated (except ctor, dtor, assignment), so it's not that bad. But then again, I'm not writing high-quality code, TBH.
@Mike G., @Karsten: You could as well create a small "tracer" class with a destructor, and temporarily add it to the inheritance list of your "widget" class. If this tracer class has no members, it will not change the memory layout of the widget. If you need to pass the name of your widget into the tracer's log, you could make the tracer a template with a char* as a template parameter (or pass the widget type itself and use RTTI).
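Something like this, perhaps (the Tracer name and the typeid variant are one illustrative rendering of that suggestion; typeid(...).name() returns an implementation-defined, often mangled, string). Note that because the destructor lives in the empty base rather than in Widget itself, Widget still declares nothing, so its own move functions remain generated.

    #include <cstdio>
    #include <typeinfo>

    template <typename Tagged>
    struct Tracer {                  // empty: no data members, so the empty-base
        ~Tracer()                    // optimization keeps the widget's layout intact
        { std::printf("destroying a %s\n", typeid(Tagged).name()); }
    };

    class Widget : private Tracer<Widget> {   // temporary, for debugging only
    public:
        // ... Widget's normal interface ...
    private:
        int x = 0;
    };

    int main() {
        Widget w;                    // logs "destroying a ..." at scope exit
    }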
Given all this, what do people make of http://isocpp.org/files/papers/n3578.pdf? It proposes to prevent the very problem Scott has pointed out.
Hi Scott,
I am also a proponent of "Rule of Zero" as opposed to "Rule of Five".
However, my arguments differ slightly from Martinho's. I tell my students and listeners to write their classes in a way that the compiler-provided destructor, copy/move constructors, and copy/move assignment operators automatically are correct (either defaulted or deleted).
All resource management should be put in the hand of library classes and corresponding members. new/delete are verboten (with the exception of initializing a unique_ptr until C++14). Polymorphic types that need to be kept on the heap should always be instantiated via make_shared(). This avoids the need to declare a virtual dtor, but only works if applied consistently (and doesn't work with unique_ptr).
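The make_shared() point deserves a concrete illustration, because it surprises people (the example is mine): the control block created by make_shared remembers a deleter for the most derived type at creation time, so the correct destructor runs even though the base destructor is not virtual.

    #include <cstdio>
    #include <memory>

    struct Base {
        virtual void f() {}
        // note: no virtual destructor
    };

    struct Derived : Base {
        ~Derived() { std::puts("~Derived"); }
    };

    int main() {
        // The control block stores a deleter for Derived at creation time,
        // so ~Derived runs even though ~Base is not virtual.
        std::shared_ptr<Base> p = std::make_shared<Derived>();
    }   // prints "~Derived"

    // With unique_ptr the same pattern is undefined behavior:
    // std::unique_ptr<Base> q(new Derived);   // deletes through Base*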
To fill the gap that unique_ptr must be bent hard to fill (i.e., caring for resources that aren't pointers), I promote a library addition of further resource-management classes in N3949 (scope_guard, unique_resource).
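A minimal sketch of the scope_guard idea (N3949's actual interface differs in detail; this only shows the shape of it). Fittingly, the class manages a responsibility and therefore declares its special members itself:

    #include <cstdio>
    #include <utility>

    template <typename F>
    class ScopeGuard {
    public:
        explicit ScopeGuard(F f) : cleanup_(std::move(f)) {}
        ~ScopeGuard() { if (active_) cleanup_(); }
        ScopeGuard(ScopeGuard&& rhs) : cleanup_(std::move(rhs.cleanup_)),
                                       active_(rhs.active_)
        { rhs.active_ = false; }
        ScopeGuard(const ScopeGuard&) = delete;              // owns its cleanup
        ScopeGuard& operator=(const ScopeGuard&) = delete;   // action: no copying
        void release() { active_ = false; }                  // disarm: skip cleanup
    private:
        F    cleanup_;
        bool active_ = true;
    };

    template <typename F>
    ScopeGuard<F> make_scope_guard(F f) { return ScopeGuard<F>(std::move(f)); }

    int main() {
        std::FILE* fp = std::fopen("log.txt", "w");
        if (!fp) return 1;
        auto guard = make_scope_guard([fp]{ std::fclose(fp); });
        // ... use fp; the file is closed on every exit path ...
    }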
The debugging argument is IMHO a weak one, and a deficit of the tooling. Code that doesn't need to be written usually cannot be written wrongly ("less code = more software"), and in my view developers shouldn't debug interactively, but should write good unit tests to avoid the need for debugging (I know, that is another hard-to-learn-and-follow topic).
To sum up: even for the average C++ developer the "rule of zero" is much easier to follow than anything else, because "things just work". Just don't use plain pointers, or self-written resource management classes. Rely on library experts to provide such for you.
Regards
Peter.
I don't agree with the "rule of five defaults". In short, the advice reads like adding Big Five boilerplate to any class that doesn't otherwise need it, just because under certain circumstances someone some day might want to add one of them and could forget to add the others.
I concur with Martinho that if anyone writes a destructor, constructor or op= for whatever reason, it should be a reflex to at least consider the others.
Maybe a compromise would suit, a "Rule of All or Nothing": As long as you can, stick to the Rule of Zero, but if you have to write at least one of the Big Five, default the rest.
@Arne Mertz: I think "The Rule of All or Nothing" would be a pretty good rule.