
Interview with Carl Sassenrath
(Interview conducted by Philippe Lourier; excerpted from TechnetCast - November 1998)


In an earlier life, as OS Manager for Commodore, Carl Sassenrath designed and implemented the multitasking Amiga OS kernel. Unsatisfied with existing programming languages, which he characterizes as non-intuitive and unnecessarily complex, he used his experience to design a new programming dialect. The result is REBOL - a language with a cause.

Our guest today is Carl Sassenrath, founder and CEO of Rebol Technologies and a leading innovator in software technology for over 15 years. He has worked most notably at HP and Apple, and he's probably best known as the designer and implementor of the Amiga OS kernel.

When the original Amiga came out in 1985, it was the first multimedia personal computer on the market, at a time when IBM PCs were limited to 16 colors. It featured a graphical user interface, a fast graphics subsystem, and support for animation and sound.

And all this was made possible by a highly efficient, modular, multitasking operating system with small memory requirements - an operating system that featured a loadable file system and shared libraries and, more importantly, was open and designed to be extensible. Carl, welcome to the program.

- It's great to have you on. Obviously the Amiga is the domain where you spent most of your life until you started REBOL. How did you get involved with Amiga?

Well, the Amiga was a start-up company way back in 1983. At that time, I was at Hewlett-Packard, where I had been working on a bitmap display system, implementing a graphical user interface. And it was on a prototype of the Sun workstation that we got directly from Stanford University.

I was very much into bitmap displays and I thought their future was bright. And there was an invitation to get involved with Amiga: if I came to work there, I could write whatever operating system I wanted.

- What products did Amiga offer at the time?

Well, Amiga was a company in disguise - one of those early stealth operations. They were making game controllers for the Atari, but that was just a front for the Amiga computer that was being worked on behind the scenes. Jay Miner and Dave Morse founded the company. Jay was employee number three at Atari, so he had a lot of experience with video game machines.

- So they hire you and you have a blank check to write an operating system. Did you have any experience writing operating systems before? What was your software programming experience at that point?

At that point, I was very much into operating systems and languages. Prior to Amiga, I had worked at HP on the MPE kernel, the operating system for HP3000 computers. It's a fully robust commercial operating system, and it's still in use today. And while I was there I studied essentially every operating system that was out there. I was sort of a fanatic about operating systems.

- So you were the right guy for the job.

Well, I had been dreaming about it for so many years.

- What were some of the design constraints that guided your decisions as you started to build this new operating system?

Well, the constraints were pretty extreme, because the first objective of the Amiga was originally to be a video game machine. That's why the company was formed. In those days most video game programmers went right to the metal. What I wanted to do was implement an operating system that was just as efficient, yet also provided things like multitasking.

The balancing act was to try to make an operating system that would be useful for game programmers, but that also had multitasking, dynamically shared libraries and devices - essentially, a real operating system architecture - while being efficient enough that most games would go ahead and use it.

- I guess multitasking is essential for a platform that would support animations, graphics, sounds, everything going on at the same time.

Right.

- How did you go about implementing multitasking in the Amiga OS kernel?

Well, back in those days, most multitasking systems were pretty large. They ran on IBM and DEC machines, etc. What we needed to do was create something that was very efficient. So, I came up with what I guess was one of the very first micro-kernel designs for a multitasking kernel. It's very thin, very lean and mean, to get the job done quickly.
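
To make the idea concrete, here is a minimal sketch, in C, of the kind of lean scheduling core such a micro-kernel is built around: a priority-sorted ready list plus a dispatcher, and almost nothing else. The real Exec kernel was written in 68000 assembly and did full register context switching; all names here are hypothetical.

    /* Minimal sketch of a lean multitasking core: a priority-sorted
     * ready list and a dispatcher. Illustrative only - not Exec. */
    #include <stddef.h>
    #include <stdio.h>

    struct Task {
        struct Task *next;      /* singly linked ready list */
        int          priority;  /* higher value runs first  */
        void       (*entry)(void);
    };

    static struct Task *ready_list = NULL;

    /* Insert a task in priority order -- O(n), fine for tens of tasks. */
    void make_ready(struct Task *t)
    {
        struct Task **p = &ready_list;
        while (*p && (*p)->priority >= t->priority)
            p = &(*p)->next;
        t->next = *p;
        *p = t;
    }

    /* Pop the highest-priority task and run it. A real kernel would
     * save/restore register context here instead of calling entry(). */
    void dispatch(void)
    {
        struct Task *t = ready_list;
        if (t) {
            ready_list = t->next;
            t->entry();
        }
    }

    static void task_a(void) { puts("task A"); }
    static void task_b(void) { puts("task B"); }

    int main(void)
    {
        struct Task ta = { NULL, 5, task_a }, tb = { NULL, 10, task_b };
        make_ready(&ta);
        make_ready(&tb);
        dispatch();  /* runs task B (priority 10) */
        dispatch();  /* then task A */
        return 0;
    }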

- Micro-kernels are now the rage, but at the time they were really a new concept.

Yes, that's right.

- What are some of the features of a micro-kernel as opposed to an operating system kernel that includes everything but the kitchen sink?

Well, the idea is that a kernel of an operating system can be specifically oriented towards providing a relatively small number of services on which you can build layers.

What you want to do at the heart of an operating system is share resources - whether those are memory resources, processor resources or I/O resources. What you're really after is to organize, manage and share that stuff. So, the smaller the code for that, the better off you are in terms of a micro-kernel. And then you build layers of all the other stuff around it.

- You had memory constraints that larger systems didn't have. What were some of the conditions that you were working in as far as memory?

The Amiga was an interesting machine because it was a 32-bit machine. It had 25 DMA (Direct Memory Access) channels that were all operating simultaneously. It had a bit blitter and a copper, which were essentially running in parallel with the processor.

Pieces of memory had to be used for instance for the sound or for screen graphics, etc. The OS had to orchestrate all of that.

- What CPU was used in the original machine?

The original machine was a Motorola 68000.

- A 32-bit processor?

Yes, that was a 32-bit machine. And we had auto-configuration - essentially what they call plug-and-play today. There was no interrupt configuration necessary. And accelerated graphics, as I said - bitmapped, with 4,096 colors.

- That was a huge amount at the time.

Yes, oh yes, back in those days there were four colors on the PC.

- What was the video sub-system like? Was there dedicated video memory?

No, the video memory actually came out of main memory - a split buffer, so that the machine could run in parallel. You could have the processor running in one area of memory while, for instance, the video was being fetched from another area of memory. And all of this was highly multiplexed with the DMA channels that were going on.

- Now, Carl, you also mentioned - just to go back to the software side - shared libraries. How did that work?

When we started building the system, there were other people that were working on the graphics part and the GUI and the audio library.

The traditional way of doing that back in those days was to statically link everything together into a single ROM. And it became clear to me that the projects were so different and so far out of sync that there was no way it would ever come together as essentially a static ROM build.

Besides, that would have slowed down the entire development process. So, I began to think about how we could use some mechanism of separating out all those modules into pieces that could be loaded dynamically. And as a matter of fact, the ROM is not even statically linked on the Amiga.

When the machine boots, it actually scans the entire ROM, finds all of the different libraries and modules that are within the ROM and then links them all together. And that same model applies once the machine comes up off a disk: it scans the disk and looks for other kinds of libraries, etc. that are on it.
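
A rough sketch of what such a boot-time scan can look like, modeled on the Amiga's published ROM-tag (Resident) mechanism: walk the ROM on 16-bit boundaries looking for a magic match word whose accompanying pointer refers back to the tag itself. The match value and the first two fields follow the documented Amiga structure; the rest of the structure is abridged here and the function names are illustrative.

    /* Scan a "ROM" image for resident module tags, in the spirit of
     * the Amiga's Resident mechanism. Abridged and illustrative. */
    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>

    #define RT_MATCHWORD 0x4AFC  /* 68000 illegal opcode used as magic */

    struct Resident {
        uint16_t         rt_MatchWord; /* must equal RT_MATCHWORD      */
        struct Resident *rt_MatchTag;  /* must point back at this tag  */
        const char      *rt_Name;      /* module name (abridged layout)*/
        int8_t           rt_Pri;       /* initialization priority      */
    };

    /* Walk [base, base+size) on 16-bit boundaries, reporting each
     * valid tag. A real kernel would collect and init them by rt_Pri. */
    void scan_rom(const uint8_t *base, size_t size)
    {
        for (size_t off = 0; off + sizeof(struct Resident) <= size; off += 2) {
            const struct Resident *rt = (const struct Resident *)(base + off);
            if (rt->rt_MatchWord == RT_MATCHWORD && rt->rt_MatchTag == rt)
                printf("found module: %s (pri %d)\n", rt->rt_Name, rt->rt_Pri);
        }
    }

    int main(void)
    {
        static struct Resident tag;  /* a fake module in our "ROM" */
        tag.rt_MatchWord = RT_MATCHWORD;
        tag.rt_MatchTag  = &tag;
        tag.rt_Name      = "graphics.library";
        tag.rt_Pri       = 10;
        scan_rom((const uint8_t *)&tag, sizeof tag);
        return 0;
    }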

- Were these ROM libraries written in C or assembly?

The micro-kernel is written totally in assembly code.

- How were these libraries dynamically loaded? What did the interface between the kernel and these libraries look like?

Each of the libraries had a set of standard vectors that were standardized by the operating system. In addition, each library could have additional vectors - entry points, essentially - into the library. And every library could also have its own private data area, in case it needed to keep track of information.

- Static information?

Yes - information about what things it had allocated, what it had running; it could keep track of that. It had its own data segment for that too. The device model was actually built on top of the library model, and really the only difference between the two is that the device model was asynchronous. It was message-based, whereas the library was more of a C-function-call kind of mechanism.
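
As an illustration of the calling convention just described, here is a hedged C model of a library base with standard entry vectors and a private data area. On the real Amiga the vectors sat at negative offsets from the library base pointer, and the standard ones (Open, Close, Expunge) were defined by Exec; this sketch approximates that with an ordinary function-pointer table, and all the names are invented.

    /* A "library" as a base structure of entry vectors plus private
     * data - a rough stand-in for the Amiga's negative-offset jump
     * table. Illustrative names throughout. */
    #include <stdio.h>

    struct Library {
        /* standard vectors every library must provide */
        void *(*open)(struct Library *);
        void  (*close)(struct Library *);
        /* library-specific entry points follow ... */
        int   (*add)(int, int);
        /* private per-library data area */
        int    open_count;
    };

    static void *my_open(struct Library *lib)  { lib->open_count++; return lib; }
    static void  my_close(struct Library *lib) { lib->open_count--; }
    static int   my_add(int a, int b)          { return a + b; }

    int main(void)
    {
        struct Library mathlib = { my_open, my_close, my_add, 0 };
        struct Library *base = mathlib.open(&mathlib); /* like OpenLibrary() */
        printf("2 + 3 = %d\n", base->add(2, 3));       /* call through vector */
        base->close(base);
        return 0;
    }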

- You needed an asynchronous model to deal with latency. Did the drivers run in kernel mode?

No, everything ran in user mode. Only a very small portion of the kernel actually ran in supervisor mode - the core scheduler and dispatcher of tasks. Interrupt processing, for instance, would also run in supervisor mode, as it was called back then. Everything else ran in user mode, so it used its own resources, not the kernel's, for what it was doing.

- And the source code was available for these internals?

Well, not in those days. Actually, I don't think the kernel source code is still available to this day. It was proprietary.

- I've seen postings from users that were recompiling the kernel. Are those sources variants of the original Amiga OS?

I think what has happened is that people have actually gone in and disassembled it, picked it apart and reverse-engineered it in the 13-14 years since its creation.

- In retrospect, how successful was this extensible shared library architecture? Did it work the way you originally envisioned it when you designed it?

It was much more than I envisioned. When you design something like this, when you design anything, especially when you're on the cutting edge, you're not quite sure and have some doubts here and there about certain elements of your design.

And you know that only history will tell whether those were the right decisions to make. And, yes, history has proven that the Amiga was a very good design.

- What were some of the other major characteristics of the Amiga operating system?

One of the things I felt strongly about was that it was a message-based kernel. In other words, device drivers would be separate tasks that would be sent messages with their I/O requirements. There are usually 20 or 30 independent tasks running right after you boot an Amiga, for instance.

Even back in 1985, Amigas were running printer, graphics and sound drivers, and disk I/O, all as separate tasks. Passing these messages around was very important.

- How was message passing implemented? I guess, the first requirement was that it be very fast.

The message overhead was very, very low which allowed the machine to run efficiently. On the very first machine, a 7MHz 68000, the kernel processed on the order of 10,000 messages a second.

There was a standard message interface and a standard structure for communicating to device drivers.
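
A minimal sketch of why that overhead could stay so low: a port is just a linked queue of message headers, so passing a message is a handful of pointer operations. This single-threaded C model is loosely patterned after Exec's PutMsg/GetMsg and omits the task signaling and waiting a real kernel performs; all names are illustrative.

    /* Message ports as linked queues - a toy model of cheap,
     * asynchronous message passing between a client and a driver. */
    #include <stddef.h>
    #include <stdio.h>

    struct MsgPort;

    struct Message {
        struct Message *next;
        struct MsgPort *reply_port; /* where the reply should go */
        int             payload;    /* e.g. an I/O command code  */
    };

    struct MsgPort {
        struct Message *head, *tail;
    };

    /* Append a message: two or three pointer writes, no copying. */
    void put_msg(struct MsgPort *port, struct Message *m)
    {
        m->next = NULL;
        if (port->tail) port->tail->next = m;
        else            port->head = m;
        port->tail = m;
    }

    struct Message *get_msg(struct MsgPort *port)
    {
        struct Message *m = port->head;
        if (m) {
            port->head = m->next;
            if (!port->head) port->tail = NULL;
        }
        return m;
    }

    int main(void)
    {
        struct MsgPort driver = {0}, client = {0};
        struct Message io = { NULL, &client, 42 };

        put_msg(&driver, &io);            /* client "sends" an I/O request */
        struct Message *m = get_msg(&driver);
        printf("driver got command %d\n", m->payload);
        put_msg(m->reply_port, m);        /* driver replies when done */
        return 0;
    }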

- Let's take this to the next level. What did the user interface look like? If I were an Amiga programmer, what API would I program to on the original Amiga?

The original Amiga had a GUI with pull-down menus. It had multiple screens as well, and this is still a concept that the rest of the world hasn't really seen: you could run multiple resolutions at the same time on the Amiga, and pull down screens with different resolutions.

If you're playing a full-screen video game, you could also have on top of it your editor or your compiler and you could switch between these windows of different resolutions quite easily.

- What was the original resolution?

The original, maximum resolution was 720x480 pixels. That was necessary to do over-scanned video output to television sets.

- Obviously, one of the most compelling features of the Amiga was its graphical user interface. What did it look like? Was it also a descendant of what was going on at Xerox PARC and what Apple also used?

Yes, very much. We studied the Xerox work. We also studied the Lisa at that time. The Macintosh had not come out the door quite yet. We had different opinions on various things. We also had to deal with color, and most of those systems had no color. So, that was a new consideration.

We also elected to go with a two-button mouse for instance which is kind of today's standard for the PCs at least. And that was based on the ability to both point at something on the screen and also be able to pull up menus on the screen.

- I'm sure there were many design meetings leading up to that decision?

Yes.

- That's one of the holy wars of personal computing, the one-button mouse vs. the two-button mouse?

We were very good at having discussions about these things and really ironing them out. And we came up with very good consensus decisions on things such as the two-button mouse.

- Is that why the Amiga was so successful - at least technically in the early days - because of this synergy between people working in a small group?

Yes, I think that had a lot to do with it. It was a very small team of people that all sat in the same room and communicated constantly with each other on the design of the machine. There was also the spirit of what we were doing. It's just a great feeling to know you're making a new computer and we had great dreams.

- Let's go back to software development. What was it like to write software for the Amiga in the early days? Were there any development tools for programmers? Did they have to write directly to the API with a C compiler? What compilers were available?

With any new machine you have that bootstrap step of getting started with development tools. Amiga initially purchased the Green Hills C compiler and ported it over to the Amiga.

- What tools did you use for your own development?

I developed the kernel on an HP emulator workstation that allowed me to test the code even before the processor and the hardware were ready.

That was all done in assembly code. We also used 68000-based Sage computers that were available back in those days. We used the Sage compilers for a while and then we ported the Green Hills compiler.

And at one point Lattice, who had made a C compiler for the PC, got involved and made a Lattice C compiler for the Amiga. And then after that Manx also got involved and produced an even better C compiler. A leapfrog effect started, with competition in the marketplace to make a better C compiler for the Amiga.

- And before long the Amiga supported a number of different languages.

Yes, oh yes. The traditional set pretty much.

- I want to move away from the Amiga. Before we leave the topic however, I have to ask you about what's going on right now with the Amiga. Last year Gateway, I believe, purchased the Amiga technology. What's going on in the Amiga community these days?

Gateway essentially spun off an independent division to pursue the Amiga vision of that type of product - a high-performance multimedia computer, but still available at consumer-level prices. The following of the Amiga was so strong that it really pushed them in that direction.

They were primarily interested in the proprietary technology when they originally purchased the Amiga. The patent portfolio of the Amiga is quite broad, since it did a lot of these things first. But after they received a couple hundred thousand e-mails from enthusiasts all over the world who believed very strongly in the Amiga, they began to think that maybe there was a good market for all these independent thinkers out there.

- They bought more than just a set of patents. They acquired a movement. There is development effort underway as we speak to release a new Amiga operating system.

Yes. There is a development effort going on right now. It's hot and furious, moving along with high energy. And they have some great ideas. I don't participate in it on a daily basis. I've talked to them a few times and consulted with them on some decisions. But I think they're making some very good decisions, very much following in the footsteps of that Amiga vision or dream of what a computer can be.

- What are your thoughts on Be? Be's vision is in large part Amiga's original vision for a truly multimedia, high-performance, no-compromise personal computer. We talked a few times with the Be engineers down in Menlo Park and they're enthusiastic about the Amiga. They definitely see it as a forerunner of what they're doing right now. Do you know anything about their technology?

Oh yes, yes. I've gone down there and visited them and talked to Jean-Louis Gassée and seen what they're up to, and I like what they're doing. They have a lot of really good ideas. Many of their ideas are very similar to the Amiga. It's sort of the next generation in terms of the performance and structure.

- Do you follow operating systems very closely? The movement behind Linux is gaining momentum. What are your thoughts on Unix operating systems, for example?

I've studied Unix for many, many years, all the way back to the HP days prior to Amiga. I studied the Unixes that were available at that time. Back then they were more research projects - university and grass-roots efforts. Over the years they've evolved into a lot more than that.

Linux is a very interesting operating system to watch because it has this open source momentum behind it. Of course, it has a very Unix-like flavor, but it also has the ability to evolve very quickly. And it seems to be incredibly efficient. And you know, it's actually a real pleasure to work with.

- Do you think its performance is better than that of other Unix flavors?

From what I've seen, yes. We have a number of boxes here because we port REBOL to so many platforms. I think Linux is our top performer. I haven't sat down and actually done a precise measurement, but just what I've seen of it, it does very well.

- Although that may be hard to believe because it is such a large piece of software, Windows NT is built on top of a micro-kernel. Some people would say that it has "too many layers", but it is a layered operating system.

Yes, it is a layered operating system, and I think its kernel is probably pretty good. I have never seen the source code to the kernel, but the API seems pretty reasonable compared to other products from the same company. So, I think it has a basis that's pretty sound.

But like you said, on top of that there are layers and layers and layers of other APIs and other abstractions, many of which have different pursuits and visions. The overall complexity of NT is quite high. There are at least ten thousand, probably more like twenty thousand, API interfaces to that operating system. One person, or a small shop of programmers, has a hard time dealing with that kind of complexity.

- And this will probably get worse with NT 5, which is significantly larger than NT 4.0. You mentioned complexity. Complexity is one of your favorite topics, and this brings us to REBOL. One of the objectives of REBOL is to make programming easier and to hide some of the complexities of programming. Is that correct?

Yes, that's correct. We're after productivity here. I've been involved with computer languages for 20 years, and I essentially invented REBOL because I never found a language that I felt fully productive in - one that really seemed to be well suited to getting the job done.

- You know, it seems that the complexity of learning a language and using a language interferes with developing solutions. Programmers spend most of their time struggling with the language rather than struggling with the problem domain.

That's why one of the fundamental principles behind REBOL is to get away from the language - to be one step removed from the actual computer language and to write your solution in terms of the problem domain.

- How does REBOL accomplish this?

We call it dialecting. It turns out that it's not an entirely new concept. Forth uses it, although REBOL is really nothing like Forth in terms of implementation and the way it functions. One of the concepts of Forth that was very good was that if you wanted to control a telescope, you should do it in terms of astronomy - in terms of stars and their locations, azimuths, etc. You'd write a sub-language within Forth for controlling telescopes.

Or if you wanted to control a car engine, you're talking about spark plugs and cylinders and gear boxes and you'd want to write your solution in terms of those elements.

This isn't really different from English. If you're a lawyer you speak in legal terms. If you're a doctor you speak in medical terms. You have your own vocabulary and even have your own grammar, the way you rearrange words to make it more specific to that domain. And that's one of the ways we humans have evolved to deal with complexity.

If humans had the same problems communicating through language as computers have, we would just come to a grinding halt as a society. But we're able to get around those and we adapt our language to the problem domain. That adaptability is very much what REBOL is about.
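
To make dialecting concrete, here is a toy interpreter, written in C rather than REBOL, for a telescope-control vocabulary like the one described above. The grammar, verbs and function names are invented for illustration; the point is only that the user writes in domain terms ("point", "track") and never sees the host language.

    /* A toy domain "dialect": evaluate one sentence of a telescope
     * vocabulary. Entirely illustrative - not REBOL syntax. */
    #include <stdio.h>
    #include <string.h>

    /* Sentences understood:
     *   point <azimuth> <altitude>
     *   track <star-name>            */
    void eval_telescope(const char *sentence)
    {
        char verb[16], arg[32];
        double az, alt;
        if (sscanf(sentence, "%15s", verb) != 1) return;

        if (strcmp(verb, "point") == 0 &&
            sscanf(sentence, "%*s %lf %lf", &az, &alt) == 2)
            printf("slewing to azimuth %.1f, altitude %.1f\n", az, alt);
        else if (strcmp(verb, "track") == 0 &&
                 sscanf(sentence, "%*s %31s", arg) == 1)
            printf("tracking %s\n", arg);
        else
            printf("not a telescope sentence: %s\n", sentence);
    }

    int main(void)
    {
        eval_telescope("point 180.0 45.5");
        eval_telescope("track Vega");
        return 0;
    }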

- How can REBOL users build vocabularies or grammars specific to their domain? Do they need to teach the language?

Yes. There is a predefined grammar within REBOL - a functional language itself that's underneath it all. And within that functional language there is the ability to handle this dialecting. We're still working out a lot of these ideas.

In many cases, you can build a grammar that's very simple without even knowing that you're doing it - in other words, combinations of words and values, numbers, strings, that kind of thing - and you can build a little ad-hoc way of interpreting those.

But we're also working on a way for that to be more formalized, so that you can actually specify the kinds of choices and the grammar, in a way similar to regular expressions or BNF grammar notation. That will make it a lot easier for people to develop their own grammars.

I don't think that everyone will be developing their own grammars. Specialists in medicine, for instance, will develop a grammar that is very useful to certain kinds of doctors in research, and they'll provide that as a layer to all of those doctors. So those doctors won't be writing directly in REBOL; they'll be writing in this dialect of REBOL that was meant for them.

- I want to read a quote from you that I found on the web. It brings together the discussion we had on operating systems and your current endeavor, REBOL: "Once the language is completely in distribution, the second phase is to develop a small and flexible operating system which is integrated in a unique way with the language". Is it still your plan to move on and integrate REBOL into an operating system?

Well, that's a long-term thing. My opinion is that these days you do not set out to write a new operating system. What you do is you set out to make things more productive, or you set out to make things simpler.

Or you set out to build on these ideas of distributed messaging or intercommunication - one of the things REBOL is doing. And you get that ability in there; you get that all figured out. And what ends up happening is that you end up with all sorts of new applications - things that exist today, but also things that people have not imagined yet.

And you build that first - that whole base of applications. And then it becomes essentially meaningless what operating system is running underneath. At that point, when you no longer have a need for a particular operating system, you can remove it from the picture. And you know, that could happen.

REBOL applications are not bound to the operating system. All REBOL applications simply rest on top of REBOL. They're isolated from all of the OS internals, and we make REBOL machine-independent.

- REBOL would be both the language and the platform.

Well, I don't want to really make that statement now. You know people would think you're insane if you said you were going to go out and write another operating system.

- Or an abstraction of an operating system.

Yes. And I don't want people to think of REBOL in that way. I want people to think of REBOL in terms of messaging and dialecting and intercommunication, essentially easier ways of creating software.

- Platform-independence, language and platform, these issues are central to Java. What are your thoughts on Java?

Well, that's a pretty broad question. I think Java has had some very good ideas put into it. I met its implementor, James Gosling, a number of years ago, and I've always liked his work.

The thing about Java though, is I don't think it's enough. In many ways it's very traditional. Of course it is independent of the big software empire. It is also a step in the right direction in terms of object oriented style and removes some of the dangers that C and C++ brought.

- Are you big on object-orientation?

I started out with object-oriented technology in 1982. HP was one of the original alpha test sites for the Xerox Smalltalk language, and I just ate that stuff up. I got very deeply involved in object-oriented technology and followed it for many years.

I was involved in the implementation of various object-oriented languages, and I used to go to all the conferences on object-oriented programming, etc. I think objects are another tool to add to your tool belt in terms of how you solve problems. But in a lot of ways they're not a panacea. The world is not as object-oriented as everyone would like to think it is. I don't, for instance, tell the table that I'm sitting at to move across the floor. It takes a person to pick it up and carry it across the floor.

So those kinds of relationships and interactions aren't always expressed well in object-oriented programming. And what ends up happening is that things start to become brittle after a while; they start to crack. People start reusing code not by deriving from their objects, but by going out and copying the source code of the objects and then adding what they want to it - essentially reusing them in a way the object model was never set up for.

- Well, one problem with C++ is that the object model is not dynamically extensible. There's no run-time binding between objects.

That's right. We saw that early on too. When C++ came out I was at Apple Advanced Technology, and there were a lot of the Xerox guys still there at that time. We used to joke about it being C+-, because these Smalltalk people were looking at this thing that had none of the dynamics. Without dynamic objects, you can't do what you need to do with objects.

And that's what C++ was. And they've tried various tricks to get around that. But essentially all of those have been corruptions of the language and its design.

- That may be why component object models are successful right now. They bring a solution to that one problem.

Right.

- You mentioned Apple. What was your involvement with Apple?

Apple got me involved in a parallel processor project that was very secret at the time. It's not well known, but they were developing a parallel processor that ran very, very fast - at about the speed of a Cray. And they needed an object-oriented operating system for it. So they hired me to architect that object-oriented operating system.

- It seems like there were many operating system projects going on at Apple in the late '80s?

Yes, this was in the '86, '87, '88 timeframe.

- Is this the project that evolved into the Taligent deal?

Yes. The first part of that design did evolve into Taligent and what was going on there. But this was very early, this was before that.

- To this day there is no new Apple operating system although Apple very early on identified the need for a modern, high-performance OS to support the type of applications they wanted to run.

Yes, and it is ironic. And it is actually very frustrating because I'm a big fan of Apple and I think that Macintosh is also a very well done machine.

And you know my feeling - and this is probably blasphemy - is that they should have ported their whole operating system over to the Intel Architecture and let it compete directly with Microsoft. The world would be a better place today.

- What are your thoughts on the evolution of technology and software development since that period? What are some comments that you make to yourself?

I think we're still in the early days. Before computers I was into neurology. The brain is very slow. The propagation of signals through neurons is incredibly slow.

But you look at everything the brain does and you have to ask, "Why is it so much more adaptive? Where is that power coming from?" Well, it's coming from parallelism. And we haven't even begun to tap the power of parallelism in computers.

- One problem is the complexity of the algorithms involved.

It's not only that. It turns out there are also matters of bandwidth. The brain is directly connected - neurons synapse onto other neurons in direct connections. Most computers use buses, where all of the data has to be transferred through a shared channel that becomes a bottleneck to the whole process.

- Do you think neuroscience may give a pattern, a model for computer science, for ways to develop computer systems?

Yes, I think so. Back in the AI days there was a lot of that thought going into computer designs and into artificial intelligence designs. Every so many years we'll revisit that. I can't project when we'll actually be able to do this neural type of computer. But I suspect it will probably happen.

