
Description
_Summary_
I propose a space optimization for variables of type `Option<E>` when `E` is a nullary, integral enum type.
_Motivation_
There is no need to waste memory on a separate tag in variables of type `Option<E>` if `E` is an integral enum type and the set of valid values of `E` does not cover all possible bit patterns. Any bit pattern (of the size of `E`) that does not represent a valid value of type `E` could be used by the compiler to represent the `None` value of type `Option<E>`.
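
As an illustration, current rustc already applies this kind of "niche" optimization to fieldless enums, so the claim can be checked directly with `size_of`. The `Color` enum below is a made-up example; its three variants leave 253 of the 256 one-byte patterns unused, and `Option<Color>` occupies no extra space for the tag:

```rust
use std::mem::size_of;

// A nullary (fieldless) enum: only three of the 256 possible
// one-byte patterns are valid values of this type.
enum Color {
    Red,
    Green,
    Blue,
}

fn main() {
    // One of the 253 unused bit patterns encodes `None`,
    // so no separate tag byte is required.
    println!("size_of::<Color>()         = {}", size_of::<Color>());
    println!("size_of::<Option<Color>>() = {}", size_of::<Option<Color>>());
}
```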
_Details_
Given a nullary, integral enum type `E`, the compiler should check whether some bit pattern exists that does not represent a valid value of type `E` (the only valid values are the ones determined by the nullary enum variants of `E`). If such "invalid" bit patterns are found, the compiler should use one of them to represent the `None` value of type `Option<E>` and omit the tag in variables of type `Option<E>`. If more than one such "invalid" bit pattern exists, the language should define a deterministic rule for choosing which of them represents the `None` value. I think the bit pattern of `None` should be language-defined rather than implementation-defined, so that `Option<E>` values serialized to disk remain stable across different compilers and compiler versions.
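
One possible deterministic rule (my own suggestion, not part of any existing specification) is to pick the numerically smallest bit pattern that is not a valid discriminant. A sketch of that selection, for one-byte enums:

```rust
// Hypothetical rule: the `None` pattern is the smallest byte value
// that is not a valid discriminant of `E`. For an enum with
// discriminants 0, 1 and 2 this picks 3.
fn smallest_invalid_pattern(valid_discriminants: &[u8]) -> Option<u8> {
    (0..=u8::MAX).find(|b| !valid_discriminants.contains(b))
}

fn main() {
    // A three-variant nullary enum: discriminants 0, 1, 2.
    assert_eq!(smallest_invalid_pattern(&[0, 1, 2]), Some(3));

    // An enum covering all 256 byte patterns has no niche,
    // so the tag cannot be omitted and a fallback layout is needed.
    let all: Vec<u8> = (0..=u8::MAX).collect();
    assert_eq!(smallest_invalid_pattern(&all), None);
    println!("ok");
}
```

The `None` result in the second case also shows where the optimization does not apply: when every bit pattern is a valid value of `E`, `Option<E>` must keep a separate tag.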
To determine whether a value of such a space-optimized type `Option<E>` is `None`, the generated code simply checks whether the binary representation of the value equals the language-defined "invalid" bit pattern.
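
In other words, `is_none` on the optimized representation reduces to a single integer comparison. A minimal sketch, assuming a one-byte enum whose language-defined `None` pattern is 3 (`NONE_PATTERN` is an illustrative value, not a specified one):

```rust
// Assumed `None` encoding for a one-byte enum with discriminants 0..=2.
const NONE_PATTERN: u8 = 3;

// The check the compiler would emit for `opt.is_none()`:
// one comparison against the language-defined invalid pattern.
fn is_none_repr(stored: u8) -> bool {
    stored == NONE_PATTERN
}

fn main() {
    assert!(is_none_repr(3));   // the invalid pattern encodes `None`
    assert!(!is_none_repr(0));  // 0 encodes a valid variant, i.e. `Some(_)`
    println!("ok");
}
```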