bug-gnu-emacs

bug#26011: 26.0.50; tramp should respect large-file-warning-threshold


From: Robert Marshall
Subject: bug#26011: 26.0.50; tramp should respect large-file-warning-threshold
Date: Wed, 08 Mar 2017 12:46:57 +0000
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/26.0.50 (gnu/linux)

Michael Albinus <michael.albinus@gmx.de> writes:

> Robert Marshall <robert.marshall@codethink.co.uk> writes:
>
> Hi Robert,
>
> >> If from Dired you attempt to copy a file to a Tramp ssh Dired buffer (I
> >> have dired-dwim-target set to t) and that file is very large, Emacs will
> >> pause for some time and eventually stop with:
>>
>> tramp-file-name-handler: Memory exhausted--use C-x s then exit and
>> restart Emacs
>>
> >> If Tramp is going to open the file and it is large, I think it should
> >> warn the user (respecting large-file-warning-threshold?) rather than
> >> going ahead without confirmation and failing with an alarming message!
>
> Well, this happens when Tramp inserts the file into a temporary
> buffer.  What about this patch:
>
> diff --git a/lisp/tramp-sh.el b/lisp/tramp-sh.el
> index 071ef79..8561962 100644
> --- a/lisp/tramp-sh.el
> +++ b/lisp/tramp-sh.el
> @@ -2147,6 +2147,11 @@ file names."
>  First arg OP is either `copy' or `rename' and indicates the operation.
>  FILENAME is the source file, NEWNAME the target file.
>  KEEP-DATE is non-nil if NEWNAME should have the same timestamp as FILENAME."
> +  ;; Check, whether file is too large.  Emacs checks in `insert-file-1'
> +  ;; and `find-file-noselect', but that's not called here.
> +  (abort-if-file-too-large
> +   (tramp-compat-file-attribute-size (file-attributes (file-truename filename)))
> +   (symbol-name op) filename)
>    ;; We must disable multibyte, because binary data shall not be
>    ;; converted.  We don't want the target file to be compressed, so we
>    ;; let-bind `jka-compr-inhibit' to t.  `epa-file-handler' shall not
>
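For reference, the check the patch adds amounts to something like the
following standalone sketch.  `abort-if-file-too-large' and
`file-attribute-size' are standard Emacs functions; the wrapper name
`my-copy-with-size-check' is invented here for illustration and is not part
of Tramp.

  ;; Sketch: look up the source file's size and let
  ;; `abort-if-file-too-large' compare it against
  ;; `large-file-warning-threshold' before any data is read into a
  ;; buffer.  If the size exceeds the threshold, the user is asked
  ;; whether to proceed, and the operation aborts on a negative answer.
  (defun my-copy-with-size-check (filename newname)
    "Copy FILENAME to NEWNAME, asking first if FILENAME is large."
    (let ((size (file-attribute-size
                 (file-attributes (file-truename filename)))))
      (abort-if-file-too-large size "copy" filename)
      (copy-file filename newname)))

The point of placing the check in the copy handler is that it runs before
Tramp inserts the file into a temporary buffer, so the user gets a prompt
instead of the memory-exhausted error.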

Yes, that's better, thank you.  I still get the transient error if I
continue (my file was 4 Gig), but I guess it's too late to do anything
else by the time it gets there.


Robert
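As a footnote on the variable named in the subject line:
large-file-warning-threshold is a byte count, so the prompt introduced by
the patch can be tuned or switched off.  A plain-Emacs example, not
specific to Tramp:

  ;; `large-file-warning-threshold' is measured in bytes (the default is
  ;; 10000000, roughly 10 MB).  Raising it silences the prompt for files
  ;; below the new limit; setting it to nil disables the size check.
  (setq large-file-warning-threshold (* 100 1024 1024)) ; warn only above 100 MiB
  ;; (setq large-file-warning-threshold nil)            ; never warn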




