Re: Help building Pen.el (GPT for emacs)

From: Shane Mulligan
Subject: Re: Help building Pen.el (GPT for emacs)
Date: Fri, 23 Jul 2021 18:51:19 +1200

Hi Jean and GNU friends,

GPT is potentially the best thing to happen to emacs in a very long time. It can take power back from the corporations and return it to your own computer: open source, transparent, and offline.

Please consider including a collaborative, open source prompts repository in the next version of emacs.

So far I have yet to see anything like it in the open, but I see commercial products everywhere claiming full domain over this new type of code.
I am trying to build relationships through my project Pen.el with others who value open source (gptprompts.org, for example), in order to create a catalogue for Pen.el. One thing we have just introduced
is a field to specify a licence for each prompt. However, I must say
that prompts are more like functions. *Soft prompts* are very granular
prompts that have been reduced to a minimal number of characters using optimisation.

Therefore, in my opinion, there must be support for prompting at
the syntax level of emacs. It is also clear that, since a prompt looks more like binary code,
this is a new type of function definition, and a new type of programming is emerging.

A prompt function is a function defined by a
version of a Language Model (LM) together with a prompt (its input).
Most prompt functions will be parameterised
and have an arity greater than one, but,
as is the case in Haskell, every such function may
be curried into one that takes a single input
and returns a single output.
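The idea above can be sketched outside emacs. Here is a minimal, hypothetical illustration in Python: the names `pen_defprompt` and `echo_model` are invented for this sketch, and the echo stub stands in for a real language-model backend.

```python
from functools import partial

def pen_defprompt(model, template):
    """Hypothetical helper: turn an LM plus a prompt template into a
    'prompt function' whose parameters fill the template's slots."""
    def prompt_fn(*args):
        return model(template.format(*args))
    return prompt_fn

def echo_model(prompt):
    # Stub standing in for a real language-model call.
    return "LM continuation of: " + prompt

# A two-parameter prompt function, defined by a model plus a prompt.
translate = pen_defprompt(echo_model, "Translate '{0}' into {1}:")
print(translate("bonjour", "English"))

# As with currying in Haskell, the two-argument prompt function can be
# reduced to one that takes a single input and returns a single output.
to_english = partial(translate, "bonjour")
print(to_english("English"))
```

The same shape would apply whatever the backing model is; only the `model` callable changes.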

I am building a collaborative imaginary
programming environment in emacs. This is an
editing environment where people can integrate
LMs into emacs, extending emacs with prompt
functions. The power of this is profound and beyond belief.
I have coined the term "prompt functions", so don't expect to find
it online if you go searching.

Here is a new corporation which is creating a prompt engineering environment.
However, they do not have their own operating system to integrate prompting into; that is why emacs is potentially years ahead.
A prompt is merely a function with a language model as a parameter. Without integration, it is quite useless.


I think a prompts database -- something like
Datomic or another RDF-like, immutable store --
should be added to the GNU organisation to hold
selected prompts and generations, and a GPL-licensed or EleutherAI GPT model
should ultimately be integrated into core emacs via
some low-level syntax, through partnership with EleutherAI.
I would expect in the future to download emacs
along with an open-source GPT model, and to be
able to create prompt functions as easily as
creating macros.

A 1:1 prompt:function database of sorts is a
good starting point in my opinion, but
remembering the generations is also important,
and the scale is immense. This is why a p2p
database that can remember immutably is
important, in my opinion. If this seems too
grand in scale, then at the very least
consider a GNU prompts repository.
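As a rough illustration of the immutable-storage idea, here is a toy content-addressed store in Python. `PromptStore` and its methods are invented for this sketch; a real system would sit on a p2p or Datomic-like substrate rather than an in-memory dict.

```python
import hashlib
import json

class PromptStore:
    """Toy append-only, content-addressed store: each record is keyed
    by the hash of its content and is never overwritten."""
    def __init__(self):
        self._facts = {}

    def add(self, prompt, licence, generation=None):
        record = {"prompt": prompt, "licence": licence,
                  "generation": generation}
        key = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self._facts.setdefault(key, record)  # first write wins: immutable
        return key

    def get(self, key):
        return self._facts[key]

store = PromptStore()
k = store.add("Translate '{0}' into {1}:", "GPL-3.0")
print(k[:12], store.get(k)["licence"])
```

Because the key is derived from the content, adding the same prompt twice yields the same key, and a record, once stored, cannot be silently changed.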

> Sounds like a replacement for a programmer's mind.
Yes, it is. It trivialises the implementation and requires that programmers now be more imaginative, supported by the language model.
Rather than writing an implementation, a function is defined by its
input types, a Language Model, and the version of that language model.

> Where is definition of the abbreviation NLP?
NLP stands for Natural Language Processing. Until recently, code was not considered part of that domain, but the truth is that NLP algorithms are extremely useful for code generation, code search, and code understanding.

> What is definition of the abbreviation LM?
LM stands for Language Model. It is a statistical model of language, rather than one based on formal grammars. Emacs lisp functions and macros do not have a syntax for stochastic/probabilistic programming.

> Good, but is there a video to show what it really does?

Here is an online catalogue of GPT tools. Pen.el is among the developer tools.

=Pen.el= and emacs have the potential to do all the things that all of
the products on =gpt3demo.com= do.

I would like to demonstrate Pen.el with this particular video, which I have created to demonstrate a new type of programming, collaborative within a language model:
https://mullikine.github.io/posts/caching-and-saving-results-of-prompt-functions-in-pen-el/

> Do you mean "exemplary" or "examplary", is it spelling mistake?
The spelling 'examplary' is deliberate. I am building a DSL of that name for encoding prompt design
patterns, to generate prompt functions for example-oriented programming.


> Pen.el creates functions 1:1 for a prompt to an emacs lisp function.

What this means is that a prompt may be
parameterised to define a relation (i.e. a
function), and therefore code, and I have chosen
to create one parameterised function per prompt.
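The 1:1 mapping can be sketched as follows, in Python rather than elisp and with a stub model; the catalogue entries and names are invented for illustration.

```python
# Toy prompt catalogue, standing in for Pen.el's prompt description files.
catalogue = {
    "translate": "Translate '{0}' into {1}:",
    "summarise": "Summarise the following text:\n{0}\nSummary:",
}

def stub_model(prompt):
    # Stands in for a real language-model call.
    return "<continuation of: " + prompt + ">"

# One parameterised function per prompt: the 1:1 mapping.
functions = {}
for name, template in catalogue.items():
    def prompt_fn(*args, _template=template):  # bind template per function
        return stub_model(_template.format(*args))
    functions[name] = prompt_fn

print(functions["translate"]("chat", "English"))
```

In Pen.el itself each entry would become a named emacs lisp command rather than a dict entry, but the shape of the mapping is the same.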

Once associated with an LM, the prompt text
becomes a type of query (i.e. code), so
prompts should not be discounted as anything
less than that, and they qualify for the GPLv3 licence.

> I understand that it is kind of fetching information, but that does not solve licensing issues, it sounds like licensing hell.
This is exactly why a GPL or GPL-compatible LM
is absolutely crucial and needs to be
integrated; otherwise all imaginary code will
be violating and harvesting open source for
the foreseeable future as there is no


Shane Mulligan

On Tue, Jul 20, 2021 at 5:04 AM Jean Louis <bugs@gnu.support> wrote:
* Shane Mulligan <mullikine@gmail.com> [2021-07-18 11:01]:
> Pen.el stands for Prompt Engineering in emacs.
> Prompt Engineering is the art of describing what you would
> like a language model (transformer) to do. It is a new type of programming,
> example oriented;
> like literate programming, but manifested automatically.

Sounds like a replacement for a programmer's mind.

> A transformer takes some text (called a prompt) and continues
> it. However, the continuation is the superset of all NLP tasks,

Where is definition of the abbreviation NLP?

> as the generation can also be a classification, for instance. Those
> NLP tasks extend beyond world languages and into programming
> languages (whatever has been 'indexed' or 'learned') from these
> large LMs.

What is definition of the abbreviation LM?

> Pen.el is an editing environment for designing 'prompts' to LMs. It
> is better than anything that exists, even at OpenAI or at
> Microsoft. I have been working on it and preparing for this for a
> long time.

Good, but is there a video to show what it really does?

> These prompts are example- based tasks. There are a number of design
> patterns which Pen.el is seeking to encode into a domain-specific
> language called 'examplary' for example- oriented programming.

Do you mean "exemplary" or "examplary", is it spelling mistake?

I have to ask as your description is still pretty abstract without
particular example.

> Pen.el creates functions 1:1 for a prompt to an emacs lisp function.

The above does not tell me anything.

> Emacs is Grammarly, Google Translate, Copilot, Stackoverflow and
> infinitely many other services all rolled into one and allows you to
> have a private parallel to all these services that is completely
> private and open source -- that is if you have downloaded the
> EleutherAI model locally.

I understand that it is kind of fetching information, but that does
not solve licensing issues, it sounds like licensing hell.

> ** Response to Jean Louis
> - And I do not think it should be in GNU ELPA due to above reasons.
> I am glad I have forewarned you guys. This is my current goal. Help
> in my project would be appreciated. I cannot do it alone and I
> cannot convince all of you.

Why don't you tell about licensing issues? Taking code without proper
licensing compliance is, IMHO, not an option. It sounds like a problem

> > Why don't you simply make an Emacs package as .tar as described in Emacs
> Lisp manual?

> Thank you for taking a look at my emacs package. It's not ready yet
> for Melpa merge. I hope that I will be able to find some help in
> order to prepare it, but the rules are very strict and this may not
> happen.

I did not say to put it in Melpa. You can make a package for yourself
and for users, so that users can M-x package-install-file

That is really not related to any online Emacs package repository. It
is a way to install Emacs packages no matter where one gets them.

> > How does that solve the licensing problems?
> The current EleutherAI model which competes with GPT-3 is GPT-Neo.
> It is MIT licensed.

That is good.

But the code that is generated and injected requires proper

> Also the data it has been trained on is MIT licensed.

Yes, and then the program should also solve the proper contributions
automatically. You cannot just say "MIT licensed", this has to be
proven, source has to be found and proper attributions applied.

Why don't you implement proper licensing?

Please find ONE license that you are using from code that is being
used as a database for generation of future code, and provide a link to
it. Then show how that license is complied with.

> The current EleutherAI model which competes with Codex is GPT-J.
> It is licensed under the Apache-2.0 License

That is good, but I am referring to the generated code.


