This is likely to be the first of a series of blog posts over some months.
I am planning to learn Ada. For those that do not know, Ada is a programming language.
Why? Well, why not? I like to expand my skills and knowledge, and it is a good language.
I have used many, many programming languages over the years: high level, object oriented, and even some very "special" ones like PostScript and PS-algol. I have done a lot of assembler on a range of processors. However, at the end of the day my go-to language is C, more specifically GNU C (as it has a few invaluable extensions to standard C).
I like C for several reasons. It makes it very easy to work at a low level, with the compiler generating the code you expect - or does it? Sadly, times have changed. Once upon a time I could tell you exactly what the compiler would produce, but there have been two key changes. One is some very smart optimisation in compilers, and the other is the way processors (even relatively simple ones like ARM) pipeline and cache things. The combination of the two is a compiler that knows that some things which seem obvious are not so, because of the way the pipelining and caching work. You can end up with compiler output that is quite confusing to read, but ultimately rather good. You almost have to treat RAM as a slow peripheral these days, which is a new way of working for me.
I also like C because of the way it allows me to access bits and bytes as I want - or does it? The big and little endian nature of processors matters, and so does the way they handle 32 bit or 64 bit registers. C has types whose exact size you cannot always rely on. Indeed, in the FireBrick code we have types like ui8 and ui32, and so on, so we know exactly what we are dealing with. You can do that with C if you try.
I also like C because the libraries in Linux, where I do most coding, all work with C interfaces and include files. This remains true and is actually a stumbling block for learning any new language. To be honest, the FireBrick code would probably have benefitted from being done in Ada from scratch, as we do not use an existing operating system - we have our own. Sorry to Cliff, who did suggest that when we started the latest design (FB2700, etc) - maybe it would have been a good idea.
So, I have started, and Ada is basically a simple procedural language like C, not fundamentally different in concept. Well, to start with. The syntax is different, and in some ways more verbose / wordy. So far it seems to have some very nice features. The data object definitions and very strict typing are excellent, and something we should all use (GNU C is not too bad at some of that). The explicit mapping of data to binary structures is nice too. A nice thing is the option for a shitload of runtime checks, including data types with strict ranges.
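To give a flavour of what I mean, here is a rough sketch (the type names are invented for the example, not from any real code): a range-constrained integer that gets a run time check, and a record mapped explicitly onto bits - the sort of thing I would otherwise do in C with bitfields and crossed fingers.

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Typing_Demo is
       --  A range-constrained integer: anything outside 0 .. 100
       --  raises Constraint_Error at run time.
       type Percent is range 0 .. 100;

       --  A record pinned to exact bit positions.
       type Status_Register is record
          Ready : Boolean;
          Error : Boolean;
          Count : Natural range 0 .. 63;
       end record;

       for Status_Register use record
          Ready at 0 range 0 .. 0;
          Error at 0 range 1 .. 1;
          Count at 0 range 2 .. 7;
       end record;
       for Status_Register'Size use 8;

       P : Percent := 100;
       S : Status_Register := (Ready => True, Error => False, Count => 42);
    begin
       Put_Line ("Count is" & Natural'Image (S.Count));
       P := P + 1;  --  101 is out of range, so this raises Constraint_Error
    exception
       when Constraint_Error =>
          Put_Line ("Range check caught an out-of-range value");
    end Typing_Demo;

An out of range value gets caught rather than silently wrapping, which is exactly the sort of check I mean.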
It also has good exception handling, which is useful in any system, and it has multi-threaded operation (tasking) and scheduling as part of the language - awesome!
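Again, just an illustrative sketch with made-up names rather than anything real: a single task declared directly in the language, and an exception handler catching a failed range check.

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Tasking_Demo is
       type Port is range 1 .. 65535;

       --  The task starts running once the declarations are elaborated,
       --  and the procedure does not return until it has finished.
       task Ticker;

       task body Ticker is
       begin
          for I in 1 .. 3 loop
             Put_Line ("tick" & Integer'Image (I));
             delay 0.1;
          end loop;
       end Ticker;

       P : Port;
    begin
       P := Port'Value ("99999");  --  out of range, raises Constraint_Error
       Put_Line ("Port is" & Port'Image (P));
    exception
       when Constraint_Error =>
          Put_Line ("That is not a valid port number");
    end Tasking_Demo;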
My understanding is that Ada suffered from being the recommended, or even mandated, language for military projects, and so people steered clear of it. That is a shame.
Another point that appeals is that it is not a stagnant language - Ada 2012 is the latest version (I think). With C, you have C99 and GNU C, but it is not a progressing and updating language in the same way.
So far I have read a lot, and done the obligatory "hello world" program. I need to read more, and code more. There is a lot to actually learning a language. Far more than the Computer Science degree stuff - you have to really understand it in practice and be able to use it properly in an operating system environment. I have to find code I can make in Ada, and that is where the fun starts.
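For the record, the Ada version of that obligatory program, built here with GNAT (gnatmake hello.adb), looks like this:

    with Ada.Text_IO;

    procedure Hello is
    begin
       Ada.Text_IO.Put_Line ("Hello, world!");
    end Hello;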
I work a lot in a Linux environment, with lots of standard libraries, from MySQL to TLS/SSL, and I use them. I need "bindings" to allow these standard libraries to be used from Ada rather than C. That is likely to be where the work lies.
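As a taste of what a thin binding involves, here is a minimal sketch that calls the C library's getenv() directly from Ada - the demo name and the choice of getenv are just for illustration; real bindings to the likes of MySQL or a TLS library are obviously far bigger:

    with Ada.Text_IO;          use Ada.Text_IO;
    with Interfaces.C.Strings; use Interfaces.C.Strings;

    procedure Getenv_Demo is
       --  Interfaces.C.Strings supplies C-compatible string types, and
       --  pragma Import binds this declaration to the C symbol "getenv".
       function Getenv (Name : chars_ptr) return chars_ptr;
       pragma Import (C, Getenv, "getenv");

       Name   : chars_ptr := New_String ("HOME");
       Result : chars_ptr;
    begin
       Result := Getenv (Name);
       if Result /= Null_Ptr then
          Put_Line ("HOME=" & Value (Result));
       else
          Put_Line ("HOME is not set");
       end if;
       Free (Name);  --  we allocated Name; getenv's result is not ours to free
    end Getenv_Demo;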
So, reading a book and playing with code while on holiday next week I expect, and I'll report back. Can I "embrace" Ada as a language, or will this be a small diversion and intellectual exercise, or will I just forget it and watch TV? Yet to see.
2017-08-02
Actually, C is getting more updates past C99. Not very well publicised though. I mostly wanted to ask what your opinion of C++ is. It suffers a lot from people who continue to write C-like (or Java-like, or C-with-objects-like) patterns, but if you write modern-style C++ it's its own language, which gives you both access to the low-level things you're used to and a very strongly type-safe high-level language.
Well, OOD is an area I find is not quite there yet, sorry. I still struggle with it. I have worked on projects using OOD, but was not happy with it.
You can write very modern, still type-safe, C++ without OOD at all. It's almost orthogonal to the good bits of C++.
Indeed, Stepanov disliked object orientation, so the standard library (the bits that used to be called the Standard Template Library) makes no use of inheritance at all.
One thing you should note (and probably appreciate) is that the standardization culture is far more embedded in the Ada world than in the C one. Ada programmers are expected to know and program to the language standard, while the C world half the time seems to be a throw-it-over-the-wall-and-it-just-works sort of thing.
(Aside: C99 has provided inttypes.h for, well, nearly twenty years now. Why are you defining your own types rather than just using uint32_t etc?)
You should read C.A.R. (Tony) Hoare on Ada before you go too far. More that it was designed by committee than mandated for government projects. Google Tony's Turing lecture.
Why Ada though? It is established in the military world, not because of particular qualities, but because the DoD mandates it. Ada majors on strong typing and also on separating the interface to a module from its implementation. Both of these, in theory, enforce a programming style that should produce 'better' code in the sense that it is likely to have fewer bugs in it. But not necessarily more efficient code, even with compiler optimisation.
How about Rust? It's trying to address similar issues of type and data safety, plus some newer programming paradigms borrowed from functional programming. It seems to be growing in popularity and there's even an OS - redox-os - being developed in it. I tried Rust a year or so ago but got horribly lost in its safe data handling mechanisms (moving data, borrowing data, lifetimes etc). So for now I stick with Python for quick 'n dirty stuff and C where I need something faster (usually signal processing).
They're in the process of updating the Rust book to the 2nd edition (https://doc.rust-lang.org/book/second-edition/). Although the first edition was good, the second is miles better (although still not finished).
It may be worth another look as the chapters on ownership and lifetimes are way better.
Have you considered Rust? That seems to be getting ever more popular in the C space.
As someone who has used Rust (non-professionally) and Ada (professionally _and_ the SPARK toolset) I can see absolutely no reason to choose Ada over Rust other than if Ada is tickling your inquisitive funny bone.
Really, Rust is becoming better and better and much more suited, IMHO, to the sort of work the Rev. does. I've mentioned it before but I thought I'd get another in.
You are aware that there was a C11 standard? i.e. it has continued to progress since C99?
Actually, no, I was not... Interesting. Thanks.
http://www.open-std.org/jtc1/sc22/wg14/www/standards.html
I studied Ada many, many years ago - my lecturer at the time was on the working group to standardise "Ada 9x", which became Ada 95.
The one thing that was beaten into us is "The Language Reference Manual *is* the Language - if there's a way of doing it that's in the LRM, and a different way to do it, go the LRM way even if you think your way is better".
I know there was a series of studies done back in the 90s where a beginning Ada programmer - i.e. somebody who had just about managed to memorise the Text_IO libraries - was put up against an experienced C coder. The Ada guy was working on the "LRM is God" basis; the C coder was working to the "Safety Critical C" standards as used in the aerospace and vehicle industries. The Ada code had significantly fewer (IIRC around 70% fewer) bugs than the SCC code. This was put down to such aspects as mandatory exception handling, explicit mandatory typing and type conversion, and a "One True Way" approach to relatively complicated problems.
I've been putting some *serious* thought into whether an IoT toolkit based on Ada would be feasible - as far as I can see the GCC and thus GNAT backends should be capable of targeting standard embedded platforms, and I'm thinking that an enforced, standardised approach to coding style, and actually enforcing limits on what a coder can get away with, *should* lead to a platform that's more secure by default - it's always possible to write bad code in any language if you try hard enough, but making the good code the easy way *should* help.
Of course, Ada is a 45 year old language, and thus the New Coders won't want to know, because as we all know there's nothing to be learned from history, and new is always better, isn't it?
I'm fairly sure that GNAT is the predominant Ada compiler out there for targeting most embedded platforms other than perhaps the very smallest (which will generally run raw asm, not Ada-compiled code, in any case).
For decades now, GCC's been the go-to cross-compiler for embedded work, thanks to Cygnus's focus on that field back in the 80s and early 90s.
> Of course, Ada is a 45 year old language, and thus the New Coders won't want to know, because as we all know there's nothing to be learned from history, and new is always better, isn't it?
It's true that being new doesn't necessarily mean it's better, but there is a general improvement over time; if that weren't the case we'd still be living in caves.
Please consider that when somebody shows you something new and shiny it isn't always just because it's new and shiny.
Most of my stuff these days is in Python - just because I once knew Ada doesn't make me a Luddite masochist - indeed "new and shiny" isn't always for its own sake, but there are an awful lot of "new model" wheels out there :)
When I write D code, I always use the same type names as in C99, i.e. uint32_t etc, instead of using the standard D types, even though those are fixed width anyway, unlike C/C++, because it shows intent - that I actually need the widths to be a certain exact value. Similarly I use the 'at least n bit' types, the ones whose exact width can vary, too. (All the C99 types are available in D's std.stdint iirc.) But rolling your own, e.g. ui32, for brevity is a good idea, as my fingers get tired of typing uint64_t all the time.