gnu-misc-discuss

Re: Web versions


From: Schanzenbach, Martin
Subject: Re: Web versions
Date: Sun, 14 Mar 2021 20:39:18 +0100


> On 14. Mar 2021, at 19:25, Alfred M. Szmidt <ams@gnu.org> wrote:
> 
>   The same is true for JS/Webassembly. In fact, one could argue that this
>   is a significant part of the value offering (offline use of the web
>   application). You can copy the whole site offline and continue using it.
>   Yes, there MAY be interaction with a REST API, but that is a completely
>   different story not directly related to webassembly at all.
> 
> The same is absolutely not true for Javascript or Webassembly, it is
> nigh impossible to download the full set of scripts and other code to
> run it locally.  And, again -- it is running code (binary, obfuscated,
> or source) from someone else's machine.

Why is it a problem that it is binary?
.deb packages also contain only binaries. You can download the source, but that
is not part of apt/deb; it can be done out of band thanks to the freedoms you
(hopefully) have. You can also include non-free repositories, in which case you
cannot. You may even install a free program whose dependencies are
ABI-compatible proprietary libraries. As a user there is nothing you can do
about that except change the distribution channel. In the browser you would
call such a change "browsing to an alternative website".
But most importantly, it is not the fault of the program that it was
distributed with ABI-compatible proprietary libraries, or that it is compiled
and dynamically linked!

Can you explain the difference to me: why is the browser, as an HTTP client,
bad for downloading binary compiled free software, while the apt HTTP client is
just fine?

And regarding the difficulty of copying the scripts: how does the browser do it
then?
I suspect you have not written a web application yourself that actually
"obfuscates" its JS/WA (e.g. using webpack, webassembly, or both). What happens
there is that the toolchain "packs" all the code into a single file and
optimizes it for size and performance. Transpilation and packing are not so
different from compilation and linking. For webassembly, the distinction is
really just semantics.
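To make that concrete, here is a minimal sketch of what such a "packing"
configuration looks like with webpack; the entry and output names are made up
for the example, and production mode is what triggers the minification people
perceive as obfuscation:

// webpack.config.js -- minimal bundling sketch; file names are hypothetical.
const path = require('path');

module.exports = {
  mode: 'production',           // enables minification ("obfuscation")
  entry: './src/index.js',      // the application's main module
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'app.bundle.js',  // the whole dependency graph ends up here
  },
};

The result is a single file, analogous to a linked executable, which the
browser then downloads as a unit.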

That "blob" (sic) is then downloaded by the browser, or not. It may be cached 
(=already downladed) or it may
be a webextension. I guess nothing is really keeping you from copying the 
program and running it on another machine or browser
either.
This is because you can very well download a copy any time and run it basically 
anywhere.
That is exactly what the browser does and why it is done.
Which brings me to the second point I made: For most JS applications ("single 
page applications")
the fact that you can run the program offline without "downloading it from 
another persons server" all the time
is the core value offering compared the older style of web applications using 
application servers.
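As a sketch of how that offline use works in practice (the asset paths here are
hypothetical), a single page application typically registers a service worker
that keeps a local copy and answers requests from it, so the server is not
contacted again:

// sw.js -- minimal offline caching sketch; asset names are hypothetical.
const CACHE = 'app-v1';
const ASSETS = ['/', '/app.bundle.js', '/style.css'];

// On install, store a local copy of the application.
self.addEventListener('install', (event) => {
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(ASSETS)));
});

// On fetch, answer from the local copy; the network is only a fallback.
self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});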

And then of course there are webextensions, which are even less distinguishable
from other programs running on your system.
Theoretically, you could have a webextension that ships a compiled emacs and
replaces all input fields with it, so you can edit in emacs everywhere. There
is not even a connection to a server. (Except, of course, for downloading the
webextension, which might have happened through apt; I am not sure whether you
find that problematic, since the deb repository is certainly somebody else's
server.)
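To illustrate, a minimal content-script sketch of such a webextension (the
emacs-in-wasm part is purely hypothetical, so the script below only marks the
fields it would replace) runs entirely locally:

// content.js -- content script of a hypothetical webextension.
// manifest.json would declare it along the lines of:
//   { "content_scripts": [{ "matches": ["<all_urls>"], "js": ["content.js"] }] }
// Everything below runs locally in the browser; no server is contacted.
document.querySelectorAll('textarea').forEach((field) => {
  // A hypothetical emacs-in-wasm extension would swap the field for an
  // editor compiled to webassembly here; this sketch only tags it.
  field.dataset.editor = 'local-extension';
});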

>  You have no idea if the code
> you got is free software or not, it is a binary blob that
> automatically runs on your computer.  I suggest you read the
> Javascript trap.
> 
>   Have you tried running emacs on a C64 recently?
> 
> Emacs has never run on a C64.
> 
>> The suggestion in this thread was to make GNU port to webassembly, and
>> then be run in a web browser, from someone else's machine.
> 
>   No, it was not:
> 
>   " ince WebAssembly is now a reality, maybe you guys should get to
>   making the browser versions of AAAAAAAALL your software?  "
> 
> In other words, exactly what I wrote.

You said the issue is that the code comes from "somebody else's server". I was
making the point that the OP said no such thing, and I tried to define "browser
version" for you (in the text that would follow if you hadn't removed it).
Note that the OP did not even say "web applications" and hence did not imply a
download from a website/server at all.

Please understand that I am trying to understand the logic behind the general
rejection of webassembly as a target platform, or JS for that matter.
It is a relevant question for packages such as GNUnet and GNU Taler; both have
features that require the use of a webextension, for example.
The article on the JavaScript trap is sensible and I agree with it. It also
proposes a mechanism that gives us a technical means to determine whether the
binary is free software. That is something actual binaries often lack, and it
is why reproducible builds are a good thing (how do YOU determine whether your
emacs was actually compiled from free software sources?). The reasoning behind
the need for such a mechanism in the case of web applications (JS/WA or
otherwise) is well explained in the article.
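For reference, the mechanism that grew out of the article is the GNU LibreJS
annotation convention: stylized comments that state the license (and, via Web
Labels, the source) so the browser can verify it mechanically. A minimal
sketch, assuming a GPL-3.0-licensed script (the magnet hash is elided here),
looks roughly like this:

// @license magnet:?xt=urn:btih:<gpl-3.0 hash>&dn=gpl-3.0.txt GPL-3.0-or-later
function greet(name) {
  // Trivial placeholder code; the point is the surrounding annotation,
  // which a LibreJS-capable browser checks before running the script.
  return 'Hello, ' + name;
}
// @license-end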

Seeing the OP, I initially thought it was completely uncontroversial to add a
compilation target such as WA to a GNU package if there is a use case for it,
especially since the platform is conceivably free software (a browser) --
unlike, say, the M1 platform.
But apparently it is not. The reasoning behind that is, however, unclear to me.
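For what it is worth, consuming such a target from a browser is a few lines
with the standard WebAssembly API; in this sketch the module name and the
exported 'main' function are assumptions:

// Load and run a hypothetical wasm build of a package; names are made up.
async function run() {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch('package.wasm'),  // the compiled artifact, served like any other file
    {}                      // imports the module expects (none assumed here)
  );
  instance.exports.main();  // assumed exported entry point
}
run();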


- Martin


