discuss-gnustep

Re: Some GNUstep discussions in other forums


From: H. Nikolaus Schaller
Subject: Re: Some GNUstep discussions in other forums
Date: Thu, 27 Dec 2018 12:56:49 +0100

Hi David,

> On 27.12.2018 at 12:15, David Chisnall <gnustep@theravensnest.org> wrote:
> 
> On 26/12/2018 16:08, Patryk Laurent wrote:
>> Hi David,
>>> a language (which is somewhat dated even in its latest incarnation). 
>> I would love to know your thoughts with respect to that point, if you'd care 
>> to share (off list if you'd prefer). Or might you have a talk/article you 
>> could point me to?
> 
> A programming language is intended as a compromise between the abstract 
> algorithms in the programmer's head and the concrete hardware on which it 
> runs.  Ideally, it should be easy to map in both directions: from the human, 
> to the programming language, and then to the target hardware. Roughly 
> speaking, a high-level language is one that optimises for the 
> human-to-programming-language translation; a low-level language is one that 
> optimises for the programming-language-to-machine translation.
> 
> Objective-C is a good late '80s programming language.  It has an abstract 
> machine that is very close to late '80s hardware: flat memory, a single 
> processor.

Really? First of all, even the good old 680x0 had a coprocessor concept and 
multitasking support. AFAIR it was even multiprocessor-capable, but I am not 
sure whether any machines were actually built with several processors.

And ObjC has NSThread, and there are Distributed Objects, which are very old 
but powerful concepts. If DO is used properly, you have a single set of 
communicating objects that can be spread over multiple processors, machines, 
even distant internet nodes...

Unfortunately, Apple has almost abandoned the concept and did not take much 
care of binary compatibility or an open protocol specification (GNUstep DO is 
not compatible with OS X DO). Therefore it is rarely used.
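
For illustration, here is a minimal DO sketch using the classic NSConnection
API (still present in GNUstep-base, deprecated on macOS); the service name and
the Adder protocol are invented for this example:

    #import <Foundation/Foundation.h>

    // Shared protocol, known to both ends (invented for illustration).
    @protocol Adder
    - (int)addNumber:(int)a toNumber:(int)b;
    @end

    @interface AdderService : NSObject <Adder>
    @end
    @implementation AdderService
    - (int)addNumber:(int)a toNumber:(int)b { return a + b; }
    @end

    int main(int argc, char **argv)
    {
        @autoreleasepool {
            if (argc > 1) {                      // "server": vend one root object
                NSConnection *conn = [NSConnection defaultConnection];
                [conn setRootObject:[AdderService new]];
                [conn registerName:@"Adder"];
                [[NSRunLoop currentRunLoop] run];
            } else {                             // "client": message its proxy
                id <Adder> adder = (id <Adder>)
                    [NSConnection rootProxyForConnectionWithRegisteredName:@"Adder"
                                                                      host:nil];
                // The proxy behaves like a local object; the real one may live
                // in another process, on another machine, or across the network.
                NSLog(@"%d", [adder addNumber:2 toNumber:3]);
            }
        }
        return 0;
    }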

>  This isn't the world that we're currently living in.  Computers have 
> multiple, heterogeneous, processors.  My ancient phone (Moto G, first 
> generation) has four ARM cores, some GPU cores, and a bunch of specialised 
> accelerators.  It has a fairly simple memory hierarchy, but my laptop and 
> desktop both have 3 layers of caches between the CPU cores and main memory 
> and have private DRAM attached to the GPU.
> 
> A modern language has to expose an abstract machine that's similar to this.

Hm. What do you mean by this? IMHO a modern (high-level) language should not 
expose but rather hide all this heterogeneity and the 3 layers of cache, if 
they exist in the physical HW.

Usually you don't program for four ARM cores and a specific number of GPU cores 
and accelerators. Rather, you expect that your high-level (application) program 
(game, physics simulator, CAD, word processor, web browser, video editor, ...) 
is compiled in a way that magically makes the best use of what is available.

Of course, ObjC is not ideal for implementing system-level libraries or even a 
kernel.

>  A good language also has to remove mechanical work from the programmer.  
> Objective-C does some nice things with reflection here: for example, you need 
> a lot less boilerplate in Objective-C to wire up a GUI to its controller than 
> in Java.
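
For illustration, a sketch of the reflective wiring meant here (the controller
class, outlet and action names are invented): a nib loader can connect an
outlet through Key-Value Coding and bind an action through a selector looked up
by name, with no generated glue code.

    #import <AppKit/AppKit.h>

    // Invented controller with one outlet and one action.
    @interface AppController : NSObject
    @property (strong) NSButton *okButton;
    - (void)okPressed:(id)sender;
    @end
    @implementation AppController
    - (void)okPressed:(id)sender { NSLog(@"OK pressed"); }
    @end

    // Roughly what nib loading does per connection: the names alone are
    // enough at run time, no per-class glue code is generated.
    static void wire(AppController *controller, NSButton *button)
    {
        [controller setValue:button forKey:@"okButton"];         // outlet via KVC
        [button setTarget:controller];                           // target/action via
        [button setAction:NSSelectorFromString(@"okPressed:")];  // a named selector
    }
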
> 
> Objective-C more or less gives you memory safety if you avoid the C subset of 
> the language (though that's pretty hard).  Unfortunately, even if you avoid C 
> in your own code, all non-trivial Objective-C programs link to a load of 
> complex (and, therefore, buggy) C libraries. They have no protection from 
> these libraries: a single pointer bug in the C code can violate all of the 
> invariants that the Objective-C runtime depends on (as you can see from a lot 
> of previous posts in this list).
> 
> Modern Objective-C, with ARC, at least gives you temporal memory safety, 
> though it also gives you memory leaks if you have cyclic data structures and 
> don't explicitly break memory cycles.  Classes such as NSArray give you 
> spatial memory safety if you use them instead of C arrays (and don't call 
> methods like -data).  With Objective-C++, you can use lower-overhead things 
> like std::string and std::vector for primitive types and get memory safety if 
> you use .at() instead of operator[], but it's somewhat clunky (memory safety 
> is possible, but it isn't the easiest option).
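
For illustration, a minimal sketch of both points (class names invented): under
ARC a weak back-pointer breaks a parent/child cycle, and NSArray raises an
exception on an out-of-range index where a C array would silently read past the
end.

    #import <Foundation/Foundation.h>

    @interface Node : NSObject
    @property (strong) Node *child;
    @property (weak)   Node *parent;   // weak back edge: no retain cycle
    @end
    @implementation Node
    @end

    int main(void)
    {
        @autoreleasepool {
            Node *root = [Node new];
            root.child = [Node new];
            root.child.parent = root;   // does not retain, so nothing leaks

            NSArray *a = @[ @1, @2, @3 ];
            NSLog(@"%@", a[1]);
            // a[7] would raise NSRangeException instead of reading past the
            // end of the buffer -- the "spatial safety" mentioned above.
        }
        return 0;
    }
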
> 
> C++ has evolved a lot in the last 7 years.  With std::shared_ptr and 
> std::unique_ptr, you get the same level of memory safety as ARC, with similar 
> overheads.  ARC integrates nicely with Objective-C++, so you can put 
> Objective-C object pointers into C++ structs safely (including, for example, 
> having a std::vector<id>).  Objective-C and C++ have very different 
> strengths: Objective-C provides high-level abstractions for late binding, C++ 
> provides tight coupling for low-level compile-time specialised data 
> structures.
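
A minimal Objective-C++ sketch of that interoperability (compiled as ObjC++
with ARC): id elements in a std::vector are retained and released for you, and
.at() gives the bounds-checked access mentioned above.

    #import <Foundation/Foundation.h>
    #include <memory>
    #include <vector>

    int main()
    {
        @autoreleasepool {
            // Under ARC, __strong id is a non-trivial C++ type, so the vector
            // retains each element and releases them all on destruction.
            std::vector<id> objects;
            objects.push_back(@"hello");
            objects.push_back(@42);

            // Plain C++ ownership alongside: unique_ptr frees the buffer, and
            // .at() throws std::out_of_range instead of scribbling on memory.
            auto buffer = std::make_unique<std::vector<int>>(16);
            buffer->at(3) = 7;

            NSLog(@"%@ %d", objects.at(0), buffer->at(3));
        }
        return 0;
    }
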
> 
> If you have to write Objective-C now, I'd recommend Objective-C++ with ARC as 
> the default base language.  It's no surprise that this was Microsoft's choice 
> for WinObjC and apparently Apple also uses Objective-C++ extensively in their 
> own frameworks.  GNUstep is somewhat crippled by using neither ARC nor 
> Objective-C++ internally.  Both significantly improve developer productivity.
> 
> The three big challenges in language design for modern requirements are:
> 
> - Concurrency (including heterogeneous multiprocessing)
> - Error handling
> - Safe isolation (sandboxing / compartmentalisation)
> 
> Be suspicious of any 'new' language that doesn't have a good story for all of 
> these.  If you can't express the idea of a graph of objects that the GPU now 
> has exclusive access to, then your language isn't suitable for modern 
> hardware.
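
For context, a minimal sketch of how an Objective-C program expresses CPU-side
concurrency today, via libdispatch (which the thread does not name): the
scheduling is a library call, and nothing in the language itself ties the data
to a particular core, cache level, or accelerator.

    #import <Foundation/Foundation.h>
    #include <dispatch/dispatch.h>

    int main(void)
    {
        @autoreleasepool {
            NSMutableArray *results = [NSMutableArray array];
            dispatch_queue_t q =
                dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
            dispatch_group_t group = dispatch_group_create();

            for (NSUInteger i = 0; i < 4; i++) {
                dispatch_group_async(group, q, ^{
                    NSNumber *n = @(i * i);       // some "work"
                    @synchronized (results) {     // manual locking; the compiler
                        [results addObject:n];    // cannot check any of this
                    }
                });
            }
            dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
            NSLog(@"%@", results);
        }
        return 0;
    }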

Why should the programmer (or a modern high-level language) have to care that a 
thing called a "GPU" exists besides a "CPU" in modern hardware? If they have 
to, there seems to be something wrong with the abstraction level.

>  If you have to think about memory safety and can't easily integrate with 
> sandboxed C libraries, then it isn't suitable for modern security 
> requirements.
> 
> David

BR,
Nikolaus



