Archive for the 'Programming' Category

Jun 18 2023

Dead programming languages and the best programming languages to learn

Published under Programming, Thoughts

After having read about a million articles, blog posts and discussion threads about the above topics, here are my own 2 cents.

In the real world, there is no such thing as a dead programming language. Somebody somewhere will always have a legacy application that’s still being used for actual business purposes that’s written in an obscure, ancient language and that still needs to be maintained and kept alive.

Businesses don’t rewrite a working software solution just because it’s not written in the hyped programming language du jour. When it comes to technology, actual businesses use the tools they invested in for as long as possible, and hopefully even longer. You don’t replace a hammer just because a new one hit the market. You don’t replace a truck that has only driven 300,000 miles so far just because a new model was introduced. You don’t replace a commercially used pizza oven just because there’s a new one. The list goes on.

And with custom software solutions, there is another elephant in the room that needs to be addressed: It almost never makes any sense whatsoever to rewrite an existing, working solution. The costs are prohibitive and the benefits approach zero.

As long as there is a tool or a business application out there that’s being used and that’s written in Algol, Fortran, COBOL, BASIC, Pascal, Ada, PL/I, mainframe Assembler, dBase, Clipper, FoxPro or whatever else might cross your mind, there will always be a job for somebody who knows how to speak those so-called dead languages. And that person will have a competitive advantage over the developers who only know the fancy new languages and software stacks. Everybody knows Python these days. (If you ask me, Python is one of the more obscure modern Basic dialects.) Only a few still have actual experience with the older technology.

That being said, there’s another thing that most people don’t want to hear or talk about: Basically, once you know how to program in one language, you can pick up the semantics of most other languages relatively quickly. Most of the time, it’s more about understanding paradigms and concepts and how things generally work than the language itself.

But since software developers love to reinvent the wheel, you’ll also need to learn all the attached frameworks and tools – which, at the end of the day, all do similar things; most of the time they’re just different for the sake of being different.

So if everything is so similar, why are there so many different languages, frameworks, APIs and what-not?

The answer is quite simple, actually: It’s always more fun and more interesting to invent a new programming language for a specific problem or task or project than it is to just do the job at hand. That is why all those different languages usually excel at one very specific feature – and in every other aspect look very similar to everything else that was already available when the language was designed.

Prolog, for example, had backtracking built into the language; that was the one (killer) feature that set it apart from the rest of the pack. You did not have to implement your own backtracking algorithm when you used Prolog; you could elegantly do backtracking in a single line of code. (Disclaimer: My last encounter with Prolog was almost 30 years or so ago – my memory might betray me here. There probably was more interesting stuff in the language; but I also remember that it was missing a lot of things that were easy to do in other language ecosystems – simple flat-file database access, for example, which was how we usually wrote to disk and indexed data back in the day.)
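For contrast, this is roughly the kind of backtracking search you had to write by hand in other languages – a minimal sketch in Java, with a made-up graph and invented names, purely for illustration. In Prolog, the equivalent is essentially a pair of rules like path(X,Y) :- edge(X,Y). and path(X,Y) :- edge(X,Z), path(Z,Y)., and the backtracking over alternatives comes for free from the language itself:

```java
import java.util.*;

public class Backtrack {
    // A tiny directed graph, standing in for Prolog's edge/2 facts.
    static final Map<String, List<String>> EDGES = Map.of(
        "a", List.of("b", "c"),
        "b", List.of("d"),
        "c", List.of("d"),
        "d", List.of()
    );

    // Try each outgoing edge in turn; when a choice leads to a dead end,
    // fall back and try the next alternative - the backtracking that
    // Prolog's resolution strategy performs for you automatically.
    static boolean path(String from, String to, Set<String> visited) {
        if (from.equals(to)) return true;
        if (!visited.add(from)) return false;        // avoid cycles
        for (String next : EDGES.getOrDefault(from, List.of())) {
            if (path(next, to, visited)) return true; // this choice succeeded
            // otherwise: implicit backtrack, loop tries the next edge
        }
        visited.remove(from);                         // undo choice on failure
        return false;
    }

    public static void main(String[] args) {
        System.out.println(path("a", "d", new HashSet<>())); // true
        System.out.println(path("d", "a", new HashSet<>())); // false
    }
}
```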

Now that I’ve made my case for programming languages not dying, what should you learn in this day and age?

Technically, it doesn’t matter, simply because in my experience you cannot properly prepare yourself for the reality of the next job that you take.

For example, for my current job, I had to learn Lua, and I had to learn it very fast. Lua had only appeared on my radar in the context of game development, which means it was only a word and an abstract idea to me. I don’t work in (modern) game development. And in the past, I have only written and finished a couple of text adventures that used my own natural-language parser as the game interface. Those parsers and games were written in Turbo/Power Basic and Turbo/Borland Pascal. Lua had not even been invented back then. Neither had things like Python, Perl, PHP, Java or JavaScript, for that matter.

If you had told me less than half a year before I even interviewed for that job that I would soon have to learn Lua, I would probably have laughed at you in total disbelief. Learn a total niche language – AGAIN – for a new job? Really?

I’ve been there before quite often. I even worked for a company that made a living writing and selling compilers and development tools for their very own niche language.

Actually, whenever programming became part of whatever job I was working at the time – I almost always ended up using niche or dead programming languages.

My simple personal reality is this: Python, Java, C, C++ – not even PHP, Perl or Ruby back when they were still a thing – ever made me any money or opened the door to an interesting job. The various dead Pascal, Basic or xBase dialects did – or the ability to quickly pick up something like Lua and become productive enough in it.

That is important, I think: You almost never need to actually master anything; you just need to know something well enough to get things done with it. The day after you’ve finished a project, you can usually kiss all the problem-specific details you’ve learned goodbye, and the next task at hand will force you to learn a huge load of other, totally unrelated things. At least that’s how it always was for me. At best, I could take some general concepts and abstractions with me. At best.

Throughout the years, it has always helped me that I had started at a time when system memory was still counted in kilobytes, when 5.25″ disks were still a thing and my first 10-megabyte(!) hard disk was still a few years in the future.

Programming a computer in a Basic dialect was the default way of using a computer at that time.

Back then, we were forced to understand some fundamental technical concepts first in order to actually do something – anything – with those machines. There were no GUIs. There were no apps for everything. There was no public Internet. And still, everything was way more fun back then than it is today. If something didn’t work, we knew we had only ourselves to blame.

Nowadays we sit on millions of lines of code others have written, and we can only pray that their stuff works and does its job. Things have become brutally complex, and the ugly truth is that nobody fully understands anything anymore. For example, a Raspberry Pi with Linux and Python on it is an astronomically more complex machine than what we had in the early 1980s: The Pi blows even mainframes of that era out of the water – it has more compute power and more memory, and it runs an operating system that is more complex than the Unix systems being developed at Bell Labs at the time; those systems did NOT have gigabytes of RAM at their disposal like the Pi does. And they had to run on a single CPU core. And no, they were not 64-bit machines either…

But I am grateful that I started back at that time and got to see and experience all the developments first hand. I learned Basic on a Casio PB100. My first “real” computer was a Sinclair ZX81 with one (!) kilobyte of RAM. I learned 6502 Assembler and UCSD Pascal on an Apple ][ clone. I witnessed the rise of DOS. I was around when IBM OS/2 and Novell Netware both lost to Microsoft Windows NT. My first online experience happened on a 300-baud acoustic coupler – and that had nothing to do with the Internet; that was just dialing into some computer somewhere far away. I saw the browser war and how Internet Explorer won over Netscape. I installed my first Linux distribution long before the Linux kernel had reached version 1.0 and before the company called Red Hat was founded. I learned COBOL, JCL and SQL on an actual IBM mainframe running MVS.

I burned a lot of time chasing the latest technological fad and buying into marketing hypes and saw many, many hyped products and technologies come and go.

The problem with all this is that I cannot advise you to only learn technologies or languages when somebody is paying you for it. That, unfortunately, is not how it works either. The problem in the IT business is that you constantly need to learn and read and study those thick “phone books”, also known as documentation and reference manuals – and nobody will ever appreciate that constant effort. What we do in IT is simply too abstract for the imagination of most people. They just don’t understand what it is we’re doing, or the huge amount of new stuff that we constantly need to learn just to be able to stay in the job.

There are no silver bullets for anything in our line of work. And to get back to the headline of this post, that also means that there is no best programming language to learn in this year or any other year, ever. In your spare time, learn what interests you, learn what right now might be fun for you. During your daytime job, you will be forced to learn enough crap that does not interest you at all, so don’t waste your unpaid, personal time on these things.

For me, playing with yet another Basic dialect has always piqued my interest and always qualified as a good time. Basic has always been designed for productivity and efficiency, and modern Basic dialects – and there are plenty of them out there – fully support OOP, networking, multithreading, multi-platform development and even 2D/3D game, desktop GUI or web development, while at the same time providing execution speeds on the same level as optimized C. You just have to get over the fact that all the so-called professionals, in their ignorance and arrogance, will laugh at you for using Basic – and not something as horrible as C++ or an “enterprise” language like Java (which does not do one single thing for you that you could not do in, let’s say, FreeBasic, PureBasic or BlitzMax-NG; those Basics even have their compilers written in their own language, and as far as I know Java still does not have a compiler that’s written in Java).

I also still like to observe the progress of FreePascal, even though I haven’t produced any new Pascal code in years. I never understood why developers ran to C instead of Pascal – just like C, Pascal can also be used as an actual system programming language and operating systems have successfully been written in Pascal. Pascal also found wide adoption in robotics, for example. Compared to Pascal, C is just an ugly, cryptic language in my eyes and ears, and C++ is an order of magnitude uglier than Object Pascal. I know that I am not alone in my dislike for those curly braces languages, and many people have filled books with reasons why, for example, C++ is actually a very bad and poorly designed programming language. But just like fast food is bad for you, people still keep eating it, no matter how well you make your case against it.

Update: You also need to consider that while there might be many job offerings for certain programming languages, you need to ask yourself the very important question of whether you want to spend your life in the specific industries or business environments in which those languages are mostly used. Do you really want to be on the team that maintains that extremely boring accounting and banking software? Or help sell insurance? For the rest of your professional life? A lot of that boring stuff is written in C++, C# and Java. Interesting things like operating systems and compilers, on the other hand, are usually written in plain old C.

Going mainstream and learning and doing what everybody else is learning is usually not the right answer to anything – you will only be competing with a lot of people who all read the same blogs or learned whatever their universities were influenced – or flat-out paid – to teach by an industry that is always looking for cheap developers. Developers who are fluent enough in a mainstream programming language are a dime a dozen. Whenever possible, only invest your time in learning mainstream things when you’re getting paid for it.

Seasoned experts with domain specific knowledge are hard to find and expensive, and what they know is never taught at any college or university – they gained their experience through actual work in the field.

We only have limited time on this planet, and life always finds a way to mess up any plan you might have had. At the end of the day, the only thing that should matter to you is to have as much fun as possible along the way. Don’t waste your time on fads or on doing what everybody else does. Spend your time on what interests you and what you enjoy the most.


Mar 09 2017

Android Programming: B4A eliminates the pain of Java

Published under Android, B4A, Programming, Software

Throughout the last one and a half decades of my career, programming got less of my time than I would have liked. I’ve spent most of my work time on networking and building server landscapes, and as much as I’ve enjoyed diving deeply into things like virtualization, operating systems, TCP/IP, routing, switching and global infrastructures, I’ve missed building my own software — and by that I mean software that goes beyond the size of the occasional script.

During my teenage years, I’ve created text adventures in Turbo Pascal and Turbo Basic with self-developed parsers that could understand rather long German sentences (in the context of the game world). In my professional career, I’ve written commercial desktop applications and server daemons and services.

Now I want to look at the world of mobile apps — and since I only own devices running versions of Google’s Android and Amazon’s FireOS (which is based upon Android), Android is the beast I will study.

This leads me straight to the biggest issue with Android: Google champions Java as the “official” programming language for Android.

Over the last twenty years, I have tried again and again to give Java a fair chance and to warm up to it. At the same time, Java has proven again and again that all the prejudices against it are true, and I never managed to understand why people – or should I rather say corporations – actually choose to use it. (I think that Paul Graham in his essay about Java raised a very valid point: It is because all the wrong people, normally corporate suits, like Java — it is not chosen by the engineers.)

Some people at Google chose Java as the official language for Android development, so, once again, I tried to warm up to it and spent some time reading introductory material. There are some very well written books on the market, but it only takes a few dozen pages for the sad realization to kick in that Java is an absolutely horrible, overly complex and complicated mess of a language. Even worse, the whole culture of Java is a bureaucratic nightmare, and it shows in every single line of code written in the language.

Just a simple example. You want to fire one of those little text notifications that appear for a brief moment on the screen of your Android device. They are called toast messages, and in the book Android Programming – The Big Nerd Ranch Guide, the sample code for firing a toast message looks like this:

Toast.makeText(QuizActivity.this,R.string.correct_toast,Toast.LENGTH_SHORT).show();

The string correct_toast needs to be defined in a file called strings.xml, and yes, that’s an XML file. Because, well, everything in Java wants to use XML – maybe because XML in its hideousness is such a great match for Java.

<resources>
    <string name="correct_toast">Correct!</string>
    <string name="incorrect_toast">Incorrect!</string>
</resources>

When you look at the complexity of the Toast.makeText(… line, it reveals almost everything that’s wrong with Java, even on such a small scale. See all those dots? Java is absolutely anal about object orientation and forces that concept down your throat wherever it can — whether it makes sense or not, whether it’s efficient or not. On the left side of a dot, you usually have an object or a class, and on the right side you have a method or a property. Now count the dots and tell me you think it’s obvious what this thing does. And then look at the Toast.makeText() call itself: There is another dot right after it: Toast.makeText(…).show(). Yes, makeText() obviously returns another object, and we call the show() method of that object.
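The chaining is easier to see with the pieces pulled apart. Here is a plain-Java sketch of the pattern behind Toast.makeText(…).show(): a static factory method returns an object, and the trailing call then operates on that returned object. FakeToast and everything in it are made-up stand-ins for illustration, not the real Android API:

```java
public class ToastDemo {
    // A made-up miniature of the Toast class, just to expose the pattern.
    static class FakeToast {
        private final String text;
        private FakeToast(String text) { this.text = text; }

        // Static factory: the class name sits left of the dot, no 'new' in sight.
        static FakeToast makeText(String text) {
            return new FakeToast(text);
        }

        // The chained .show() is called on the object that makeText() returned.
        String show() {
            return "showing: " + text;
        }
    }

    public static void main(String[] args) {
        // Same shape as Toast.makeText(context, resId, duration).show(),
        // only unrolled into two visible steps first:
        FakeToast toast = FakeToast.makeText("Correct!");
        System.out.println(toast.show());            // prints "showing: Correct!"

        // ...and then as the one-liner the Android book uses:
        System.out.println(FakeToast.makeText("Correct!").show());
    }
}
```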

Now tell me that this is friendly, simple and easy to understand – especially for people without five billion years of experience in the industry.

Java was designed for large teams of corporate programmers and their daily problems and their project sizes.

Well, I’m not a corporate programmer. I’m a one-man shop who wants to get results. Naturally, I’ve spent some time looking for alternatives that would make my life simpler.

When you look for real alternatives to Java for Android development, it won’t take long until you find a product called B4A, which was previously named Basic4Android. It also has siblings for Apple iOS (B4I) and for the desktop (B4J). You can find it on the Internet at www.b4x.com. The desktop version is free as in beer; the mobile versions need to be purchased but have time-limited trial versions.

B4A is a Visual Basic-like programming language, and the code that you write in it is eventually translated to Java and then compiled using the standard Android SDK tools. B4A comes with its own IDE, a visual designer and a set of useful helper tools. (And unlike the IntelliJ-based Android Studio, the .NET-based B4A IDE does not take five minutes from launch to reach a state of responsiveness on my old notebook – which is just another thing that speaks volumes about Java.)

But what B4A really does for you is take away the pain that is Java.

This is how you fire a toast message in B4A:

ToastMessageShow( "Here comes a toast message with long visibility.", True )

Now isn’t that MUCH simpler and easier to read?

I’ve invested roughly the same amount of time – the equivalent of 1.5 work days – in both Android Studio/Java and B4A, in each case starting from scratch with a book. In the case of Java, it was the book Android Programming – The Big Nerd Ranch Guide that I’ve mentioned before; for B4A, I used B4A: Rapid Android App Development Using Basic by Wyken Seagrave.

With the Java book, I got stuck after the first Hello World app and nothing compiled anymore — I followed the tutorial, but the compiler kept whining about some menu things it couldn’t resolve, and being the Java noob that I am, I couldn’t fix it.

After spending the same amount of time with B4A, I was already playing with a second activity (read: a second program window), had buttons that used the built-in text-to-speech engine to actually SPEAK text messages, had photos displayed in my app, and there was even a little browser box in my test app that shows the Dilbert website, just to see if I could make that work.

So I bought a license for the freshly released version 6.80 of B4A and once again failed to find any love for Java.

I have a real project for Android that I want to work on, and if things go well, you’ll find the result in the Google Play Store and in the Amazon App Store some time later this year. I will reveal the details at a later time, but you can trust me on this: It won’t be written in Java.

