By 1990 or so, we could see the writing on the wall that Sun wasn't going to support NeWS for very much longer, no matter what they claimed.
https://donhopkins.com/home/images/X11NeWSPorscheHitsSunSoft...
So there was an ongoing discussion about the best way to do it all over again from scratch. Scheme was obviously an excellent language to use instead of PostScript. Here are some notes from February of 1990 (before the abomination that is CORBA was thrust upon the world), discussing how to apply John Warnock's "linguistic motherboard" ideas to other languages like Scheme, and how to implement them on top of something like Xerox PARC's "PCR" (Portable Common Runtime, essentially the virtual operating system runtime that Cedar and other Xerox software like Interpress required to run on other platforms like Unix).
(But first here's some other stuff I wrote recently about Cedar and PCR, for context. And I've inserted some links into the notes.)
https://news.ycombinator.com/item?id=22378457
>I believe that stuff is the port of Cedar to the Sun. Xerox PARC developed "Portable Common Runtime", which was basically the Cedar operating system runtime, on top of SunOS (1987 era SunOS, not Solaris, so no shared libraries or threads, which PCR had to provide). He demonstrates compiling a "Hello World" Cedar shell command, and (magically behind the scenes) dynamically linking it into the running shell and invoking it.
>Experiences Creating a Portable Cedar.
>Russ Atkinson, Alan Demers, Carl Hauser, Christian Jacobi, Peter Kessler, and Mark Weiser.
>CSL-89-8 June 1989 [P89-00DD6]
http://www.bitsavers.org/pdf/xerox/parc/techReports/CSL-89-8....
>Abstract: Cedar is the name for both a language and an environment in use in the Computer Science Laboratory at Xerox PARC since 1980. The Cedar language is a superset of Mesa, the major additions being garbage collection and runtime types. Neither the language nor the environment was originally intended to be portable, and for many years ran only on D-machines at PARC and a few other locations in Xerox. We recently re-implemented the language to make it portable across many different architectures. Our strategy was, first, to use machine dependent C code as an intermediate language, second, to create a language-independent layer known as the Portable Common Runtime, and third, to write a relatively large amount of Cedar-specific runtime code in a subset of Cedar itself. By treating C as an intermediate code we are able to achieve reasonably fast compilation, very good eventual machine code, and all with relatively small programmer effort. Because Cedar is a much richer language than C, there were numerous issues to resolve in performing an efficient translation and in providing reasonable debugging. These strategies will be of use to many other porters of high-level languages who may wish to use C as an assembler language without giving up either ease of debugging or high performance. We present a brief description of the Cedar language, our portability strategy for the compiler and runtime, our manner of making connections to other languages and the Unix operating system, and some measures of the performance of our "Portable Cedar".
>PCR implemented threads in user space as virtual lightweight processes on SunOS by running several heavyweight Unix processes memory-mapping the same main memory. And it also supported garbage collection. Mark Weiser worked on both PCR and the Boehm–Demers–Weiser garbage collector.
https://donhopkins.com/home/archive/NeWS/linguistic-motherbo...
Linguistic motherboard metaphor
hardware/software metaphor
A motherboard is a nicer metaphor than a ball of wax.
PostScript as a linguistic motherboard
We need a non-proprietary bus in order for this idea to succeed.
Vendor-supplied cards - software modules. The motherboard extends over the net. PCR allows tightly coupled modules on the same card (or compatible, closely linked cards) to communicate in one address space, through local procedure calls. PostScript allows cards to communicate in memory through PCR and over the net through remote procedure calls. Polylith is a software bus. Generalization of the client-server model: it goes both ways — send code, not just data.
PCR is like the data, address, and control lines.
PCR:
http://www.bitsavers.org/pdf/xerox/parc/techReports/CSL-89-8...
https://news.ycombinator.com/item?id=22378457
https://news.ycombinator.com/item?id=22456720
PostScript is like Mitch Bradley's Forth for the S-Bus.
Open Firmware:
https://en.wikipedia.org/wiki/Open_Firmware
Allows device-independent bootstrapping (binding of hardware on cards to software on the net). Device drivers. Jack Callahan's paper describes how, once you are bootstrapped and communicating with your virtual hardware modules, you can generate and put into place optimized device-specific drivers (automatic dynamic stub generation).
Many of the ideas originated at Adobe, and have made the rounds throughout the industry and academia.
Adobe has proven PostScript's applicability to page description. Sun has proven PostScript's applicability to network extensibility.
Adobe and Sun have demonstrated that these concepts work for particular applications, using a proprietary bus, like DEC's BI bus, i.e. a proprietary language implementation. They are not nearly as useful as they could be were they based on an open bus: a public domain language implementation. PCR is a tightly coupled open bus; we need a public domain PostScript (and Scheme, and other languages) interpreter to go along with PCR.
Witness all the people complaining about problems with NeWS (like non-availability, not running on particular pieces of hardware, server core dumps), and with Adobe PostScript (like the 9600 baud 7-bit printable ASCII data bottleneck), because of problems that would be easy to fix if they had the sources. The failure of NeWS as a window system has demonstrated that it can fail simply because it's controlled by Sun. The failure of PostScript as a general purpose programming language, some people might argue, is due to poor design, but I don't think most people I have heard criticize PostScript know what they're talking about: they have attacked the cosmetic problem of "backward" syntax, but not addressed the real problem of dynamic binding. PostScript is very well designed, and very powerful. Postfix notation is as backward as prefix (cf. the big/little endian battles), and both are simpler than infix. PostScript's main problem is dynamic binding. Reverse Polish notation is a cosmetic problem, which is addressed by LispScript. There are extensions to PostScript that would make it a lot more useful, and the dynamic binding problem might even be solvable. PIX and NetScript come very close, and are (or will be) in the public domain.
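To make the dynamic binding problem concrete, here is a minimal sketch (in Python, not real PostScript) of how PostScript-style name lookup works: names in a procedure body are resolved at call time by searching the dictionary stack, so a later redefinition silently changes the behavior of an already-defined procedure. The names here (dict_stack, lookup, double_x) are illustrative, not from any actual interpreter.

```python
# Sketch of dynamic binding via a PostScript-style dictionary stack:
# names are looked up at *call* time, top of stack first.

dict_stack = [{"x": 1}]

def lookup(name):
    # Search the dictionary stack from top to bottom, as PostScript does.
    for d in reversed(dict_stack):
        if name in d:
            return d[name]
    raise KeyError(name)

# A "procedure body" defers name resolution until execution.
double_x = lambda: lookup("x") * 2

print(double_x())             # 2: finds x in the bottom dictionary
dict_stack.append({"x": 10})  # someone pushes a dict that shadows x
print(double_x())             # 20: the same procedure now behaves differently
```

A lexically scoped language like Scheme would resolve `x` at definition time, so pushing a shadowing dictionary could not change the procedure's meaning behind its back.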
PIX:
https://ieeexplore.ieee.org/document/301934/
https://news.ycombinator.com/item?id=17637483
https://news.ycombinator.com/item?id=15327211
Right now, Scheme addresses this issue beautifully, and I think it may be quite applicable to the problem, given a good enough implementation (designed for the task). ELK comes close.
ELK:
https://en.wikipedia.org/wiki/Extension_Language_Kit
Scheme and PostScript in combination (in different address spaces, in the same address space, or a synthesis of the two) might be an interesting approach. Scheme with PostScript data types and NeWS extensions. Scheme on a PostScript virtual machine (the byte code that Scheme compiles into), so you can have device-independent compiled Scheme code.
Try to pinpoint what it is about PostScript that makes it better than Scheme for this problem. First of all, I think it's easier to interpret: less overhead. Easier for machines to generate? Or is that BS? The data types. Magic dictionaries as an interface to data structures (like memory-mapping device registers instead of having I/O instructions). Dynamic binding is simpler, but you could implement closures on top of it, I think (by making it extremely easy to switch dictionary stacks, the way class.ps does), and even continuations (by switching execution stacks).
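The idea of building closures on top of dynamic binding by switching dictionary stacks can be sketched as follows (again in Python as pseudo-PostScript; make_closure and the other names are illustrative, not from class.ps): a "closure" snapshots the dictionary stack at definition time and reinstalls it for the duration of each call.

```python
# Sketch: closures on top of a PostScript-style dictionary stack,
# by capturing and temporarily switching the stack at call time.

dict_stack = [{"n": 0}]

def lookup(name):
    for d in reversed(dict_stack):
        if name in d:
            return d[name]
    raise KeyError(name)

def make_closure(body):
    saved = list(dict_stack)       # capture the stack at definition time
    def run(*args):
        global dict_stack
        outer = dict_stack
        dict_stack = saved         # switch to the captured stack...
        try:
            return body(*args)
        finally:
            dict_stack = outer     # ...and switch back afterwards
    return run

dict_stack.append({"n": 42})
get_n = make_closure(lambda: lookup("n"))  # captures the stack with n = 42
dict_stack.pop()                           # the defining dict is popped...
print(get_n())                             # ...but the closure still sees 42
```

Continuations would be the analogous trick applied to the execution stack rather than the dictionary stack.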
Disadvantages of PostScript: dynamic binding, addressed above; polymorphic operators, so built-in operators have to do type checking. Provide type-specific operators as primitives that the polymorphic ones are built in terms of, not unlike Crispin Goswell's PostScript interpreter.
Crispin Goswell's PostScript interpreter:
https://news.ycombinator.com/item?id=13198492
http://computer-programming-forum.com/36-postscript/46e6f5fc...
http://www.chilton-computing.org.uk/inf/se/mmi/p004.htm
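The type-specific-primitives idea above can be sketched like this (a hypothetical Python illustration; add_int, add_string, and the dispatch table are my names, not from Goswell's interpreter): the cheap, unchecked primitives do the work, and the polymorphic operator is a thin layer that does the type checking once and dispatches.

```python
# Sketch: a polymorphic operator built on top of type-specific primitives,
# so type checking lives in one place and primitives stay unchecked.

def add_int(a, b):        # type-specific primitive: no checking needed
    return a + b

def add_string(a, b):     # type-specific primitive for strings
    return a + b

# The polymorphic 'add' does the type checking and dispatches.
DISPATCH = {
    (int, int): add_int,
    (str, str): add_string,
}

def add(a, b):
    try:
        return DISPATCH[(type(a), type(b))](a, b)
    except KeyError:
        raise TypeError(f"add: unsupported types {type(a)} and {type(b)}")

print(add(2, 3))        # 5
print(add("ab", "cd"))  # abcd
```

A compiler (or a programmer who knows the types) can then call the primitives directly and skip the dispatch entirely.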
Advantages of NeWS, PIX, NetScript: lightweight processes, event queues, interprocess communication, magic dictionaries, garbage collection, ...
PCR would be an excellent base. PostScript would be a good first step, and could be extended toward Scheme in an upward-compatible manner. Proprietary vendor-supplied (or public domain) cards could plug right in (Cedar graphics, Folio fonts, the Andrew editor, the X protocol, etc.).