Description
This doesn't compile, although it should:

```rust
trait Identity { type Ty; }
impl<'a, 'b> Identity for &'a &'b () { type Ty = &'a &'b (); }
fn test<'a, 'b>(_: <&'a &'b () as Identity>::Ty) {}
//~^ ERROR lifetime mismatch
```
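For comparison, the same signature with the reference type written out directly does compile, because well-formedness of the parameter type `&'a &'b ()` yields the implied bound `'b: 'a` (the function name below is just for illustration, not from the original report):

```rust
// Compiles: the implied bound `'b: 'a` comes straight from the
// well-formedness of the parameter type `&'a &'b ()`.
fn test_direct<'a, 'b>(_: &'a &'b ()) {}
```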
It does pass if the impl is changed to:

```rust
impl<T> Identity for T { type Ty = T; }
```
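Put together, a minimal sketch of the passing variant (assembled from the snippets above):

```rust
trait Identity { type Ty; }

// Blanket impl: `T` carries the lifetimes through unchanged, so
// normalizing `<&'a &'b () as Identity>::Ty` yields `&'a &'b ()`
// without introducing fresh region variables.
impl<T> Identity for T { type Ty = T; }

fn test<'a, 'b>(_: <&'a &'b () as Identity>::Ty) {}
```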
The issue here is that when we try to normalize `<&'a &'b () as Identity>::Ty`, the result is `&'?1 &'?2 ()`, where `['?1, '?2]` are new region variables that are unified with `['a, 'b]`, respectively. However, when we then try to compute the implied bounds after normalization, we get `'?2: '?1` instead of the desired bound `'b: 'a`.
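To see what the desired bound licenses, here is a hypothetical illustration (not from the report) of the implied bound `'b: 'a` in the direct case:

```rust
// The parameter type `&'a &'b ()` is well-formed only if `'b: 'a`,
// so the compiler lets the body rely on that outlives relation:
fn outlives_demo<'a, 'b>(x: &'a &'b ()) -> &'a () {
    // `*x` has type `&'b ()`; returning it as `&'a ()` needs `'b: 'a`,
    // which the implied bound provides.
    *x
}
```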
@rustbot label C-bug T-types A-implied-bounds