
bug#64775: /run should be cleaned on boot


From: Hilton Chain
Subject: bug#64775: /run should be cleaned on boot
Date: Sun, 06 Aug 2023 21:18:35 +0800

Hi all,

On Sat, 22 Jul 2023 04:24:17 +0800,
Saku Laesvuori via Bug reports for GNU Guix wrote:
>
> > > I vote for tmpfs, since that would also reduce flash wear.
> > > Honestly, I don't get why it's not already using tmpfs.
> >
> > One argument could be how much RAM it takes:
> >
> >   $ du -sc /run/*
> >   12      /run/blkid
> >   0       /run/booted-system
> >   0       /run/current-system
> >   1312    /run/setuid-programs
> >   524     /run/udev
> >   1848    total
> >
> > That is with no explicit setuid programs configured, on a machine with a
> > fairly minimal configuration.
> >
> > Not a *huge* amount of RAM, but not nothing, either...
>
> I'd say it's effectively nothing for almost all devices capable of
> running Guix. On my laptop the size of /run is 4804 KiB (4.7M). In a
> quick test, one terminal window with only zsh running in it took
> almost 10 times as much RAM.

I'm currently using tmpfs for /tmp, /run and /var/run on my Guix
System machines.

If you are interested, here are my base file systems:
--8<---------------cut here---------------start------------->8---
;; Mount /tmp, /run and /var/run as tmpfs so they start empty on every
;; boot; check? is #f because a tmpfs has nothing on disk to fsck.
(cons* (file-system
         (device "none")
         (mount-point "/tmp")
         (type "tmpfs")
         (check? #f))

       (file-system
         (device "none")
         (mount-point "/run")
         (type "tmpfs")
         ;; /run is populated at system activation time, so it must
         ;; be mounted early in the boot process.
         (needed-for-boot? #t)
         (check? #f))

       (file-system
         (device "none")
         (mount-point "/var/run")
         (type "tmpfs")
         (needed-for-boot? #t)
         (check? #f))

       ;; Drop %debug-file-system (debugfs on /sys/kernel/debug)
       ;; from the defaults.
       (delete %debug-file-system
               %base-file-systems))
--8<---------------cut here---------------end--------------->8---
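
For completeness, here is a minimal sketch of how such a list can be
plugged into a full configuration.  The host name, boot device and
root label below are hypothetical placeholders, not values from my
actual setup:
--8<---------------cut here---------------start------------->8---
(use-modules (gnu))

(define %tmpfs-file-systems
  ;; The list shown above.
  (cons* (file-system
           (device "none")
           (mount-point "/tmp")
           (type "tmpfs")
           (check? #f))
         (file-system
           (device "none")
           (mount-point "/run")
           (type "tmpfs")
           (needed-for-boot? #t)
           (check? #f))
         (file-system
           (device "none")
           (mount-point "/var/run")
           (type "tmpfs")
           (needed-for-boot? #t)
           (check? #f))
         (delete %debug-file-system
                 %base-file-systems)))

(operating-system
  (host-name "example")                         ;hypothetical
  (timezone "Etc/UTC")
  (bootloader (bootloader-configuration
               (bootloader grub-bootloader)
               (targets (list "/dev/sda"))))    ;hypothetical
  (file-systems
   ;; Root file system plus the tmpfs-augmented base list.
   (cons (file-system
           (device (file-system-label "root"))  ;hypothetical
           (mount-point "/")
           (type "ext4"))
         %tmpfs-file-systems)))
--8<---------------cut here---------------end--------------->8---
Note that newly added file-system entries normally take effect on the
next reboot after reconfiguring.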

Thanks