Description
TypeScript Version: 2.3.2
I bumped into some strange behaviour with regard to tuples after spotting a difference in the way type inference works for value and reference types.
Code
```ts
// Value types are inferred to their most specific type.
const foo = 42;     // infers foo: 42 instead of foo: number

// Reference types are inferred to their most generic type.
const bar = [1, 2]; // infers bar: number[] instead of bar: [number, number]
```
This is understandable, because `const`/`readonly` work the same way as their JS equivalents (values of reference types stay mutable), so inferring the most specific type without having a way of enforcing actual immutability would cause more harm than good.
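For illustration, mutability here means something like the following (the variable names are made up for this example):

```ts
const xs = [1, 2]; // inferred as number[]
xs.push(3);        // allowed: const only prevents reassigning the binding
xs[0] = 42;        // element writes are allowed as well
// xs = [4, 5];    // error: cannot assign to 'xs' because it is a constant
```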
At that point I expected that explicitly typing my tuples as `[number, number]` would not only make the compiler comply (no more "`number[]` is not assignable to `[number, number]`" errors), but also give me additional type guarantees. Then I discovered that I can do all of the following with no complaints:
```ts
const foo: [number, number] = [1, 2, 3];

const quux: [number, number] = [10, 11];
quux[3] = 12;

const bar: [number, number] = [4, 5];
bar.pop();

const baz: [number, number] = [6, 7];
baz.shift();

const qux: [number, number] = [8, 9];
qux.splice(0, 1);
```
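For comparison, the assignability error mentioned above does fire in the opposite direction; this snippet is only an illustration and not part of the original repro (the `pair` name is made up):

```ts
declare const pair: number[];
const p: [number, number] = pair; // error: type 'number[]' is not assignable to type '[number, number]'
```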
Expected behaviour:
There are probably two things that could be done here:
- The exact length of tuples is enforced, and the property accessor disallows accessing elements outside the tuple's range (currently everything outside the range is typed as the union of the element types; see the snippet after this list). This is documented behaviour, so it is probably harder to amend (unless it is implemented behind a flag).
- Mutating `Array.prototype` methods are disallowed on tuples. Virtually all of them, when called on a tuple, guarantee that you end up with a value incompatible with the tuple type. The exceptions would be calling them with arguments that do nothing (e.g. `[1, 2].splice(0, 0)`) and calling those which add an element (e.g. `[1, 2].push(3)`); see the `ReadonlyArray` comparison after this list.
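To make the two points above more concrete, here is a small illustration. The second half uses `ReadonlyArray`, which is not a tuple type but already exposes the kind of method surface point 2 asks for; it is only a comparison, not a proposed implementation:

```ts
// Point 1: indexing outside the tuple's range is currently typed as the
// union of the element types instead of being rejected.
const t: [number, string] = [1, "a"];
const v = t[5];     // today: v is number | string, no error

// Point 2: ReadonlyArray already hides the mutating Array.prototype members.
const ro: ReadonlyArray<number> = [1, 2];
ro.slice(0, 1);     // non-mutating methods remain available
// ro.push(3);      // error: property 'push' does not exist on ReadonlyArray<number>
// ro.pop();        // error: property 'pop' does not exist on ReadonlyArray<number>
```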
Actual behaviour:
Tuples behave like arrays with only the minimum length being enforced.
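To ground that last sentence: the minimum length really is checked, so only tuples that are too short get rejected (illustrative snippet):

```ts
const tooShort: [number, number] = [1];      // error: '[number]' is not assignable to '[number, number]'
const tooLong: [number, number] = [1, 2, 3]; // accepted today, as shown above
```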