Jul 31, 2014

How many bits are allocated by "int i;"?

A few years ago I took the Texas certification exam you must pass if you want to teach computer science in grades 8 through 12. It was a waste of time because I wound up teaching in junior colleges where you need at least a master's degree, but no certification. (Isn't it more than a little bit odd that you need a certification to teach in grades K-12 but none is required for college level teachers? Makes no sense to me.)

The test questions were mostly very simple, some downright stupid. But one question left me completely stumped.

How many bits are allocated by "int i;"?

You are supposed to pick the correct answer from a list of four possible values. None of the values were 64, but 8, 16, and 32 were on the list. IIRC 24 bits was also on the list.

At first I thought it must be 32. But, then I realized they had not stated which programming language they were talking about. Off the top of my head I could think of four languages with an "int" data type. It could have been C, C++, Java, or C#. I'm sure there are many more possibilities.

Depending on the age of the test, the version of the standard, and the implementation of whichever language they meant, the answer could reasonably be 16 or 32. Which to pick? I picked 32. I didn't get a perfect score on the test; actually I got a high "B". Kind of embarrassing. But, I didn't feel too bad about the score. More than one question was either incomplete or had an answer that wasn't in the multiple choice list of answers. Ever been in that situation? You have to pick the correct answer from A, B, C, or D, but there is not enough information provided to pick the answer they want. Makes for a nasty day.

The point I'm trying to make is that the people, the panel of experts, chosen to write the test knew less about programming languages than I expect a second year CS student to know. Maybe standards for second year students have declined since I was in school (yeah, I know, typical old fart comment!). But, standards for experts have not declined. So, how could this happen? Even if you assume the language was C, the answer could be either 16 or 32. And, truth be told, I know of implementations where it could have been 36 bits.
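
If you want to check the point on whatever machine you are sitting at, here is a minimal C++ sketch. The hedging is the whole point: the C and C++ standards only promise that an int is at least 16 bits wide, so what this prints depends entirely on the compiler and platform you build it with.

    #include <climits>   // CHAR_BIT
    #include <iostream>

    int main() {
        // The standards guarantee an int can hold at least a 16 bit
        // range of values; the actual width is implementation defined,
        // which is exactly why the exam question has no single answer.
        std::cout << "int is " << sizeof(int) * CHAR_BIT
                  << " bits on this implementation\n";
        return 0;
    }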

The people who wrote the test did not know that there was more than one valid answer to the question. They could have checked most of the answers to the test questions by exercising their google muscles. But, they did not. The people trusted to write the certification test for CS teachers in Texas are clueless on the subject they are supposed to be experts in. So clueless they didn't even try to verify their answers. OK, what does that tell you about the other certification tests?

To make it worse... not long after I took that test CS was dropped from the required curriculum in Texas schools. That almost makes sense: if you can't even write a valid certification test, maybe you shouldn't try to teach the subject at all.


Jul 24, 2014

Does the world want or need a new programming language?


I recently reviewed notes going back to 2002 on the design of a programming language I've been calling Stonewolf. I have started to implement the language at least 4 times and have redesigned it at least a dozen times since then. Why do I continuously throw it away and start over? The main reason is that I have never convinced myself that my reasons for wanting it were sufficiently strong to be worth the effort of a project that could well take me the rest of my life.

My original reason for wanting a new programming language was the huge amount of pure crap you have to code to get anything done these days. The last commercial project I worked on was not that complex. The formal design document was only a dozen pages long. But, I can describe it in a paragraph. I had a bunch of original images stored as blobs in a database along with an image policy for various mobile devices. The job was to create versions of the images tailored for each mobile device, but to do it on the fly and to cache all the new versions of the images. If originals were deleted the system was supposed to delete the cached reformatted images.

Not too complex, right? The job was done in Java using the Java imaging API and the Java database API. What took most of the time? Writing code to read and write the image files and convert to and from the formats supported by the various APIs. Read the data in one format, convert it to another format, push it through the API, convert it to another format, write it out. Lots of work that had to be done because the different APIs were not compatible. Days spent writing code that just shuffled data from one structure to another structure... what a waste of time.

Most of the projects I've done lately are like that: pick the APIs, and spend all the project time writing code to duct tape them together into something that resembles an application. What a waste of time.

Another example is some code I wrote developing a solution to the so-called “nearest neighbor” problem. It is a hard problem but I came up with a pretty good solution. I developed it in C++. As I worked I realized I was spending more time working through C++'s extraordinarily complex type declarations than I spent working on the problem. The types I wanted were there, but to use them I had to write these obscenely complex type declarations. Of course, I was developing the thing as a template so I could adapt it to store multiple types of objects. That made it even worse. Testing templates is not easy. I thought I had written a complete test suite. Not so. The first time I tried to use the template in a program I found that a whole section of the code had never even been compiled, and in fact did not compile. Did not, because it could not. The code had never even been parsed.
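
To make the template trap concrete, here is a tiny, hypothetical sketch (the Grid class and its methods are invented for illustration, not code from the real project): member functions of a template are only fully compiled when something actually instantiates them, so a broken one can hide in a "tested" header indefinitely.

    #include <vector>

    // Hypothetical container, stripped down for illustration.
    template <typename T>
    class Grid {
    public:
        void add(const T& value) { items.push_back(value); }

        // This calls a member function std::vector does not have. It still
        // "compiles" -- the compiler never fully checks the body until some
        // caller instantiates remove().
        void remove(const T& value) { items.erase_value(value); }

    private:
        std::vector<T> items;
    };

    int main() {
        Grid<int> g;
        g.add(42);       // fine: add() is instantiated and fully checked
        // g.remove(42); // the first use of remove() is the first time it
                         // is really compiled -- and it fails to compile
        return 0;
    }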

Why does it have to be so hard? If I had never used a system that wasn't so absurdly complex I might just accept it as normal and move on. But, as a young programmer I learned LISP. Back in the early '70s Lisp gave me pretty much every feature of C++ but without the complexity. By the early '80s when Standard Lisp came along you had everything C++ has now, but without the complexity.

(Lisp only has a couple of problems: first off, nobody likes the parentheses, and it is named “Lisp”. Very few managers have the balls needed to go before their management and say they are going to use “Lisp” for a mission critical system. Three of the greatest languages in history, Lisp, Scheme, and Smalltalk, were all killed by their names. After all, what “Real Man, He Man, Programmer” wants to go to a party and say they Lisp, Scheme, or Smalltalk all day?)

My number one open source project for many years was SDL. A great project. But, what is it? A cross platform pile of duct tape that attempts to provide a consistent, integrated API using many different libraries on different platforms that all do the same thing in different ways. The reason I first got involved with SDL was that I was looking for a portability platform for Stonewolf. By the way, SDL is an excellent pile of duct tape. It saves the programmer an enormous amount of time. But, it is still duct tape being used to cover over the absurd complexity that has been created in the form of languages, libraries, and operating systems.

Every time I started to work on Stonewolf I ran into questions that I could not answer. That is a good thing really. It made me do a lot of research. Just Unicode cost me months of reading and studying. Ever thought about the problem of defining “<” on strings with three cases and two ways to represent every letter that has a modifier?
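
Here is a small C++ illustration of just one corner of that problem (the byte sequences are the standard UTF-8 encodings): the same visible word can be spelled with a precomposed character or with a base letter plus a combining accent, and the byte-wise comparison that std::string gives you treats them as different, ordered strings.

    #include <iostream>
    #include <string>

    int main() {
        // "café" with the precomposed é (U+00E9), encoded as UTF-8.
        std::string precomposed = "caf\xC3\xA9";
        // "café" as 'e' followed by U+0301 COMBINING ACUTE ACCENT.
        std::string combining = "cafe\xCC\x81";

        // A human reads the same word twice; operator== and operator<
        // on std::string only compare bytes, so they disagree.
        std::cout << std::boolalpha
                  << "equal: " << (precomposed == combining) << '\n'
                  << "precomposed < combining: " << (precomposed < combining) << '\n';

        // Layer three cases on top of this and a correct "<" needs
        // Unicode normalization and collation, not byte comparison.
        return 0;
    }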

So, finally I just started collecting reasons why I wanted a new programming language. I have quite a pile and it keeps growing. Each reason could be the subject of a very long article. Here they are, in no particular order, and certainly not finished yet.

I want a language that:
  1. Understands that Moore's law is an unstoppable force. Every 20 years Moore's law lets me buy 1024 times as much hardware for fewer dollars.
  2. Understands that my life span is more valuable than computer machine cycles. It has to get rid of all the dumb ass time wasters built into so many programming languages.
  3. Understands that multi core and 64 bits is the norm, not the exception. Every year the number of cores is going to increase.
  4. Understands that there is such a thing as a network and is happy to work with it.
  5. Does NOT try to make every I/O device look like a stream. Most things that are treated as streams are really random access devices. The rest are better thought of as message/event devices.
  6. Is more like a Swiss army knife and less like duct tape. The way to add new “libraries” is to integrate them into the language, not just provide a thin layer that exposes the API warts and all. It should be like SDL. If it has to provide duct tape, it should provide excellent duct tape.
  7. Cleans up the conflation of concepts that are built into most programming languages. For example: Variables have values. Values have types. Variables do not have types so why do we treat them like they do?
  8. Supports language extension through generic programming and an extensible set of operators without an abomination like C++ templates or the need for C++ style operator declarations. (I have some ideas... but this is a tough problem.)
  9. Is very easy to read. There can be no ambiguity about where a control structure ends, or begins for that matter. Likewise the syntax should make it easier for the language implementation to identify semantic errors. How many days of your life have been wasted looking for the place where you typed “=” instead of “==”? How much life span has been wasted looking for typos like the misplaced semicolon in “if(x < 0);”? C-like syntax is full of gotchas like that one (there is a short sketch of both right after this list).
  10. Has a rich set of control structures. Why can't I do an SQL like select on an object container and get back a vector of objects that meet my criteria? Why aren't threads built into all programming languages? At the runtime library level?
  11. Supports more than one human language. I am not talking about locales and Unicode, I am talking about supporting multiple human languages at the source code level. Maybe the hardest problem on the list.
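
To make item 9 concrete, here is a minimal sketch of the two classic gotchas. Both are legal C and C++ and both compile cleanly under the default warning settings of many compilers:

    #include <iostream>

    int main() {
        int x = 5;

        // Gotcha 1: "=" where "==" was meant. This assigns 3 to x and the
        // condition is the value 3, which is non-zero, so the branch is
        // taken every single time.
        if (x = 3) {
            std::cout << "always taken, and x is now " << x << '\n';
        }

        // Gotcha 2: a misplaced semicolon. The ';' is the entire body of
        // the if; the block below it runs unconditionally.
        if (x < 0);
        {
            std::cout << "runs even though x is not negative\n";
        }

        return 0;
    }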

Oh, by the way, I also want a great IDE for the language and there should be versions for every kind of machine there is and every operating system and even for embedded systems with no OS. And, yes, I want an egg in my beer...

I'm sure to come back to this and add more “wants” to the list. But, for now this is enough to keep me busy for a very long time.

The fact is that large numbers of people have been programming since the 1950s. And yet, most of the languages we are using perpetuate mistakes that first appeared in the 1950s and 1960s. Nowadays it is common to find people with 40+ years of programming experience. Dennis Ritchie was only 28 or 29 when he started developing C and in his early 30s when the first version was called “complete”. (If you ever programmed in K&R C you know it was nowhere near complete.) Brian Kernighan was a few months younger than Ritchie. As brilliant as they are, they had not lived long enough to acquire the mature understanding of the practice of programming that someone with 40+ years of experience has. They most certainly did not have an understanding of 21st century computing environments. (It may seem that I am picking on C, but I use C for my examples because so many people know it and so many languages perpetuate the mistakes of C. And, it is one of my favorite languages!)

Why do we still use languages that do not reflect the environment or experience of modern programmers?

Can I solve all these problems? I doubt it, but I can try and in trying I can at least hope to get people thinking about solutions. The group is often smarter than the individual. But, is it even worth trying? Does anyone but me care about these problems?

Jul 11, 2014

So what does a reliability engineer do?

Reader Smitty50 (an old and dear friend) asks "What does a reliability engineer do?"

Good question.

Depends on the kind of organization we are talking about. In an organization that operates systems, a reliability engineer uses measurements and statistical methods to ensure that the overall system stays within the required level of reliability as the system and the components from which it is built age and the system's capabilities change.
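
What do those measurements and statistical methods look like? As a deliberately simplified example (the formulas are standard textbook reliability engineering; the hour figures are invented purely for illustration), here is a small C++ sketch of steady-state availability and what happens when components are chained in series:

    #include <iostream>

    // Steady-state availability of one component: MTBF / (MTBF + MTTR).
    double availability(double mtbf_hours, double mttr_hours) {
        return mtbf_hours / (mtbf_hours + mttr_hours);
    }

    int main() {
        // A made-up system: three components in series -- if any one of
        // them is down, the whole system is down.
        double server  = availability(2000.0, 4.0);
        double db      = availability(5000.0, 8.0);
        double network = availability(1000.0, 1.0);

        // For components in series the availabilities multiply, so the
        // system is always less available than its weakest part.
        double system_avail = server * db * network;

        std::cout << "system availability: " << system_avail << '\n'
                  << "expected downtime:   "
                  << (1.0 - system_avail) * 365.25 * 24.0 << " hours per year\n";
        return 0;
    }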

These are critically important people. Our lives and our civilization depend on the engineers responsible for ensuring the reliability of the telecommunication systems, our bridges and road systems, our sewers, our water supply, the great hydroelectric dams, and so many other things that we all take for granted.

But, reliability has to be built in from the beginning. A system and every component of the system has to be designed and built to be reliable. You can not test in reliability any more than you can test in quality. So, the reliability engineer must be an important part of development too! The difference between a mature engineering discipline and typical new product development is the presence of reliability engineers in the design process and the quality of the mathematical tools they have and use.

Pretty common in things like civil engineering, so common in fact that you never see them because civil engineers are trained in reliability from the ground up. Remember the Tacoma Narrows Bridge? Never again. You can also find people in that role all through aerospace and telecommunications.

In software? Ehh... not so much. My bet is that if Microsoft had to comply with the same reliability requirements that Boeing or Airbus have to comply with they would never have shipped a single product. I was at a meeting with MS in Redmond and remember being called a liar by an MS manager when we told him the reliability of the telephone system. We were told that what we claimed (a matter of public record BTW) was not possible. But then, Windows blue screens a lot more often than a 747 crashes or your landline phone stops working.

Recently I had my cancer treatments disrupted because the version of Windows that ran the equipment blue screened and the medical staff had to wait two weeks for a technician to come out and fix it. I found a facility that uses equipment that is not controlled by Windows. Who wants to actually die, or at least be seriously burned, by a blue screen of death?

OK, I shouldn't just pick on MS; pretty much every software product is just as bad. The only software that seems to even approach a professional level of reliability is software used in aircraft, and open source software. Not all of open source is that good. But, I can't remember the last time my Ubuntu Linux box crashed.