The A-Z of Programming Languages: Ada

S. Tucker Taft, maintainer of Ada, speaks about the development of the language and its 1995 and 2005 revisions

Computerworld is undertaking a series of investigations into the most widely-used programming languages. Previously we have spoken to Alfred V. Aho of AWK fame, and Chet Ramey about his experience maintaining Bash.

In this article we chat to S. Tucker Taft, Chairman and CTO of SofCheck. Taft has been heavily involved in the Ada 1995 and 2005 revisions, and still works with the language today as both a designer and user.

Computerworld spoke to Taft to learn more about the development and maintenance of Ada, and found a man deeply committed to language design and development.

How did you first become involved with Ada?

After graduating in 1975, I worked for Harvard for four years as the "system mother" for the first Unix system outside of Bell Labs. During that time I spent a lot of time with some of the computer science researchers, and became aware of the "DOD-1" language design competition.

I had been fascinated with programming language design for several years at that point, and thought it was quite exciting that there was a competition to design a standard language for mission-critical software. I also had already developed some strong opinions about language design, so I had some complaints about *all* of the designs.

In September of 1980, a year after I left my job at Harvard, I returned to the Boston area and ended up taking a job at Intermetrics, the company responsible for the design of the "Red" language, one of the four semifinalists and one of the two finalists for DOD-1. By that time the language had been renamed Ada, in honor of Lady Ada Lovelace, daughter of Lord Byron and associate of Charles Babbage.

Although Intermetrics had shortly before lost the competition to Honeywell-Bull-Inria, they were still quite involved with the overall process of completing the Ada standard, and were in the process of bidding on one of the two major Ada compiler acquisitions, this one for the Air Force. After a 6-month design period and 12-month public evaluation, the Intermetrics design was chosen over two others and I became first the head of the "Ada Program Support Environment" part, and then ultimately of the Ada compiler itself.

One of the requirements of the Air Force "Ada Integrated Environment" contract was to write the entire compiler and environment in Ada itself, which created some interesting "bootstrap" problems. In fact, we had to build a separate "boot" compiler in Pascal, before we could even compile the "real" compiler. By the time we delivered, we had written almost a million lines of Ada code, and had seen Ada go from a preliminary standard to a Military standard (MIL-STD-1815), to an ANSI standard (Ada 83), and finally to an ISO standard (ISO 8652, Ada 87). I also had to go through the personal progression of learning the language, griping about the language, and then finally accepting the language as it was, so I could use it productively.

However, in 1988 the US Department of Defense announced that they were beginning the process to revise the Ada standard to produce "Ada 9X" (where X was some digit between 0 and 9). I quickly resurrected all my old gripes and a few new ones, and helped to write a proposal for Intermetrics to become the "Ada 9X Mapping/Revision Team" (the government's nomenclature for the language design team). This time the Intermetrics team won the competition over several other teams, including one that included Jean Ichbiah, the lead designer of the original Ada 83 standard. I was the technical lead of the Intermetrics "MRT" team, with Christine Anderson of the Air Force as the manager of the overall Ada 9X project on the government side.

Too many choices

Although I appreciate the information presented above, there is one area where I differ.

That's in the area of learning a bunch of languages.

It's what I call the "world's largest mall syndrome". One wanders the mall indefinitely without making concrete decisions about what one wants. While choices are good, too many choices are bad.

There is only so much time to experiment, and then one has to settle in and get things done. Too much cross-language development is itself, historically, part of the software problem. If the same language can be used successfully for embedded real-time and large client-GUI projects, and has the power to exploit AWK and other scripting facilities, is yet another language really of any benefit? FOCUS is something I advocate.

An element of trust must accompany any project, and I trust Ada because of its openness: consistency, massive regression testing, tweaking over decades, and use in a massively diverse range of ways. That's what builds confidence, not yet another way of doing the same thing, which is what each NEW language basically offers. We keep re-inventing 99.9% of the wheel in yet another new language.

What we need is a few solid development languages, tested and scrutinized over decades by smart people using them. Not several more languages to choose from.

Remember the problem with Ada? Too new, without a solid background tested over time. Why keep making that mistake, or succumbing to the "mall syndrome"?

Bob Trower


Closing comments

Re: Do you have any advice for up-and-coming programmers?

I emphatically agree that programmers should have familiarity with multiple languages. To some extent this is a failure of the various languages, but to some extent things such as assembly code, compiled languages and interpreted languages differ of necessity. There are many ways to combine things, and each language presents a living example of differing philosophies as to how to combine them, which things belong at the same level of abstraction, and so on.

I know a lot of programming languages, and there is not, to my knowledge, any one language that will allow a programmer to learn everything he should know.

I find it a little strange that Java, C, C++, PHP and Basic are not mentioned. According to the TIOBE Programming Community index, these account for about two thirds of language popularity. Hopefully it is just assumed that their mainstream nature makes them already known to everyone.

Re: Don't believe anyone who says that we have reached the end of the evolution of programming languages.

I also strongly agree with this, though it appears as if things have stagnated a little. New languages appear more often now than ever, it seems, but their 'newness' is not that 'new'. I feel we have wandered from the path of pursuing this properly. The hardware, the operating system kernel, operating system services (especially user I/O) and applications are all related to the language used with them. Once upon a time, C was invented as a portable language and used to code the operating system upon which it ran. The common 'hello world' application *assumes* user I/O, but an ancient form of it. Languages such as Java have left their hardware behind.

What I would like to see is a proper general purpose language that is very small, allows complete control down to emitting binary data or assembly code and whose standard libraries:

1) Do not confuse language constructs with library constructs

2) Actually deal with realistic expectations:
a) GUI interfaces
b) Network I/O
c) Resources such as threads and memory

3) Allow mandatory declaration and manual management, as well as automatic declaration and management. Low-level programming against silicon should not be confounded by nonsense such as garbage collection, and variable manipulation in shell scripts should not be confounded by mandatory declarations and manual memory management.

I don't think there is any fundamental disconnect between using a language for fine-grained control of hardware and using the same language for very high-level constructs or broad generic programming. What I see are what look like religious arguments between adherents of one type of programming over another. I see no reason why a language compiler should disallow very detailed specification of a programmer's intentions, so that it can resolve whatever is resolvable at compile time right then and there, and do whatever optimizations that could entail. I also see no reason why a compiler should disallow a programmer from leaving it up to the compiler whether to compile and optimize at build time, at load time, or at run time. None are mutually exclusive. There is no fundamental reason why the same code cannot be interpreted at run time if so desired, or compiled at build time.

Demanding run-time resolution of things like typing and garbage collection just increases the world's carbon footprint for no reason. Demanding that everything resolve at compile/build time (or even demanding an advance compile/build) just makes programming tedious for no reason.

The design of a small, elegant, general purpose language that does the above, along with the tools to design and build programs written in that language, is evidently a difficult job. However, that job, by sitting at the base of a very large pyramid of derived work, can be extraordinarily difficult and still pay off.

Though I lack the wit (thus far) to design such a language and build the tool-chains necessary to implement it, I have a feeling that in the 21st century it is not as difficult a task as it once seemed.

With respect to using your general purpose language for low-level programming, I would suggest that a powerful macro system could be used to define the emission of binary data using a custom assembly language defined in the macro system. That would allow rather rapid bootstrapping to new hardware and allow even application programmers coming late to the game to program directly against hardware if and when needed.

We really need a revolution where chip designers, motherboard manufacturers, component designers, operating systems developers, network designers and application developers speak the same language. Libraries and paradigms might differ, and most people would never be able to cover the gamut of the different types of development. However, I see no reason why the basic underlying language could not be rationalized. Arguably, at some level, it is rationalized at the level of machine code. I think we can do better than that.
