
loss of implied bounds after normalization #106567

@aliemjay

Description


This doesn't compile, although it should:

trait Identity { type Ty; }
impl<'a, 'b> Identity for &'a &'b () { type Ty = &'a &'b (); }
fn test<'a, 'b>(_: <&'a &'b () as Identity>::Ty) {}
//~^ ERROR lifetime mismatch
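
For comparison, here is a minimal sketch (the function name test_unprojected is made up for illustration) of the implied bound working when the parameter type is written without the projection:

// The well-formedness of `&'a &'b ()` implies `'b: 'a`, so a `&'b u32`
// can be returned at the shorter lifetime `'a`.
fn test_unprojected<'a, 'b>(_: &'a &'b (), x: &'b u32) -> &'a u32 { x }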

It does pass if the impl is changed to

impl<T> Identity for T { type Ty = T; }
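
For completeness, a sketch of the passing variant as a whole, with the same signature as above; presumably it is accepted because the blanket impl introduces no fresh region variables during normalization:

trait Identity { type Ty; }
impl<T> Identity for T { type Ty = T; }

// With the blanket impl, `<&'a &'b () as Identity>::Ty` normalizes to
// `&'a &'b ()` directly, and the implied bound `'b: 'a` is still available.
fn test<'a, 'b>(_: <&'a &'b () as Identity>::Ty) {}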

The issue here is that when we try to normalize <&'a &'b () as Identity>::Ty, the result is &'?1 &'?2 (), where ['?1, '?2] are new region variables that are unified with ['a, 'b], respectively. However, when we compute the implied bounds after normalization, we get '?2: '?1 instead of the desired bound 'b: 'a, so the outlives relationship between the named lifetimes 'a and 'b is lost.
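A likely workaround, not taken from the report: declaring the outlives relation explicitly so it no longer needs to be implied. A self-contained sketch (the name test_explicit is hypothetical):

trait Identity { type Ty; }
impl<'a, 'b> Identity for &'a &'b () { type Ty = &'a &'b (); }

// Stating `'b: 'a` explicitly supplies the bound that would otherwise have
// to be implied by the parameter type, so this version should compile.
fn test_explicit<'a, 'b: 'a>(_: <&'a &'b () as Identity>::Ty) {}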

@rustbot label C-bug T-types A-implied-bounds
