
Default number of jobs is unsafe (cabal-install) #5776

Open
@hasufell

Description


Just ran cabal new-install pandoc-include-code without really thinking, and it blew up my 16 GB RAM machine very quickly. earlyoom was unable to kill any of the processes, so I had to wait 15 minutes for the machine to become responsive again, potentially corrupting files and breaking my recent work.

Using the number of processors as the job count is almost always the wrong choice with GHC. That convention comes from the gcc world, and even there it's sometimes wrong when the objects are too big.

The default should be safe, not fast. We shouldn't make random assumptions about the memory of the user's machine.
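As a workaround until the default changes, the parallelism can be capped explicitly. A minimal sketch, assuming a user-level config at ~/.cabal/config (the jobs field and the --jobs/-j flag are the standard knobs; the exact config path may differ per setup):

    # Per-invocation: limit cabal to 2 parallel package builds
    cabal new-install --jobs=2 pandoc-include-code

    # Or persistently, in ~/.cabal/config:
    # jobs: 2

This only bounds how many packages cabal builds concurrently; it doesn't change how much memory a single GHC invocation can use on a large module.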
