Platform and programming language predictions

My totally non-scientific views and opinions, based upon decades of experience in the software industry:

  • Windows: It’s not going anywhere. It will remain the most popular desktop OS, and it will remain very relevant in the server room as well — there’s just way too much mission-critical software running on that platform and too much accumulated knowledge about the platform that nobody will ever want to waste. Besides, the reality is that the platform is stable and predictable. And Microsoft isn’t going anywhere either. Also, Microsoft and their platform cater to grown-up working-class people who need to support their families, have kids that go to school, have down payments on houses and cars, and have plenty of things beyond computers in their lives. These people just need to pragmatically get the job done – which is, let’s face it, something that just is not on the agenda of any other platform out there.
  • macOS: It had its brief moment when tech people wanted to play with it because they had heard that there is a Unix underneath a relatively user-friendly GUI. While there are still tech folks out there who might want to use a Mac for exactly that reason, the majority have since gone back to Windows. Unlike macOS, Windows actually pays the bills. And unlike Apple hardware, Windows hardware is an affordable commodity.
  • Desktop Linux: It does not have – and won’t ever have – a place on the mainstream desktop, meaning the desktop of a regular, non-technical user who is not a tech nerd and who does not have a gigantic tech support department keeping the system running. End of fantasy.
  • Server-side Linux and Linux on appliances: Linux has a place in the server room, on web servers, compute and storage clusters, and for hosting highly customized software and applications. It is a valid and great choice for anything that’s super-custom. However, if you want to build actual appliances, or if you are looking for an open foundation for something that you want to sell later as a proprietary solution, use FreeBSD or any other BSD instead. The GPL will be your legal enemy, but the BSD and MIT licenses were explicitly designed to enable businesses. There’s a reason why Apple (all of their operating systems), Sony (PlayStation) and Nintendo (Switch) built on FreeBSD code – and not Linux – as a foundation for their respective platforms.
  • Cloud computing: The “trillion dollar mistake” everybody seems to be falling for. Using “the cloud” just means you’re using someone else’s computer and entrusting your valuable business data to someone you don’t even know. The only reason why big corporations fall for the cloud hype is that they believe they might save some costs down the road because they don’t have to hire all these IT people themselves and can thus reduce at least some of their many HR issues (read: employment contracts, lawsuits launched by disgruntled ex-employees, and severance packages). The cloud exists so that companies can eliminate a bit of the unwanted human factor. And it exists so that Microsoft, Amazon and Google can become the compute infrastructure of the world – with all the power that comes along with it. Basically, the entire industry has come full circle, back to the 1950s/1960s when the only way to use a computer was to rent one from IBM. (In case you don’t know, IBM never sold those machines – you literally could only rent them from IBM.) In this day and age of open source software, offering software as a cloud-based service is also a smart way to avoid the open source trap: if you can no longer sell software licenses, make it too complicated for customers to self-host the software and instead make sure that they need to subscribe to your (cloud/hosted) services in order to successfully and productively use your open source software.
  • C – still the most widely used systems language for compilers and operating systems. It’s not an application language anymore (if it ever even was one), but it also won’t go anywhere because of its strong entrenchment in the systems space. Engineers who need to work close – really close – to the actual iron of a machine still choose this language for very valid reasons; a small sketch of what that kind of code looks like follows right below this list. (Even though, I might add, their lives would be better if they chose Free Pascal instead. Kernighan’s essay on Pascal has long lost its validity, in case somebody wants to bring that up now; it was only true for the original Niklaus Wirth implementation.)
  • C++ – as horrible as it is, it just cannot ever go away. Too much stuff that should have been written in any other language – because the software would then have been better, more maintainable and less bug-ridden – has been written in C++, and nobody will ever rewrite that software. The C++ committee will keep adding all the language features they can find, making it an even bigger and uglier monstrosity, and some poor bastards will still feel forced to learn at least a subset of the damn thing to write code in it and earn a living. May God have mercy on their souls, because the average software house doesn’t.
  • Rust – one of those super-hyped languages that was created to replace C and C++. It is not the first one to try this and it won’t be the last. Make a realistic guess how much success that language will have in the long run. It’s like daydreaming that French or German will replace English as the global business language in this century.
  • D – See my remarks on Rust. The exact same story.
  • Go – Go won’t ever replace C or C++. Thus far, it has mostly managed to replace Python in certain problem domains or for certain tasks, simply because it can be compiled, is faster than Python and is adequate for many things that system administrators do in their daily jobs: Go found a niche and became a DevOps language. Just don’t believe for a second that it will ever become mainstream enough to become a language for the next mobile or desktop GUI application or game. And it really doesn’t matter that somebody wrote Qt or Gtk or whatever-floats-your-boat bindings for it – in the real world outside of your basement, nobody uses Go for that.
  • PHP – Is anybody still using it outside of the WordPress community? And if there are such folks, how many of them are not trying to replace it?
  • JavaScript – yeah, unfortunately, it’s the lingua franca of web frontends. I heard that TypeScript made it somewhat bearable. But lucky me, I don’t have to do web development — HTML, CSS and JavaScript are already three of the reasons why I hate web development with a passion. (Think of me what you want: I liked Flash and its own version of ECMAScript, ActionScript, and I still firmly believe that while the implementation could have been significantly improved, the idea behind Flash was good and made much more sense than this whole HTML5 bullshit. Flash was killed by Steve Jobs because back at that time it threatened the dominance of the iPhone; things that were created in Flash/ActionScript could run on any platform, including Android – that was dangerous to Apple and needed to be killed. Too bad that too many “decision makers” bought into the Apple FUD at the time.)
  • Lua – A language I actually use on a daily basis in my job. It’s integrated into our product, so for me Lua is super-relevant. If you had told me this a few years ago, before I switched jobs, I either would have laughed or looked at you in total disbelief. But here we are – Lua is a bread-and-butter tool for me now. It’s also a bread-and-butter tool in certain industry niches, and many other products have embedded Lua as their scripting language of choice, too. That’s what it was designed for: an embedded scripting language for applications. And it does that job very well – a minimal sketch of what the embedding side looks like also follows below this list. But it will only be relevant for you if you happen to professionally work with a piece of software that has Lua integrated.
  • Python – For me, Python is today what BASIC used to be in the 1980s. Heck, it even looks and feels like a BASIC dialect. It has found a niche in the scientific sector (thanks to some very strong libraries that are, hm, written in C and thus fast enough for the job at hand). It won’t become a language for desktop applications anytime soon, simply because of its deployment nightmares, poor performance and its shitty support for multi-threading. Also, the world is still waiting for something like Visual Python that would actually make developing desktop applications in Python fun and productive. You know, just what Visual Basic did for the world back in the day, and what tools like GAMBAS excellently replicated (too bad that GAMBAS only exists for Linux and too bad that there is no commercial entity behind it – GAMBAS is fucking awesome).
  • BASIC, Pascal, xBase dialects, COBOL: You always hear that these languages are dead. The reality is that Delphi and its open source sibling Free Pascal are very popular in Eastern Europe even today. The truth also is that an unbelievable amount of individual business solutions have been written in various BASIC or xBase or Pascal/Object Pascal dialects and versions, and they are still being used and maintained today. Just like the COBOL applications that are still running on big iron at big banks and insurance companies. Hardly anybody is picking up those tools and languages to write some big new application from scratch. Unless, of course, we’re talking about a development team whose job it is to use those languages on a daily basis to maintain some legacy applications – why would they make their own lives worse by adding another language and toolchain to their toolbox, one in which they have nowhere near the level of experience they have with the legacy beasts that they still have to use? Depending on where they work, people will still have to learn and use those so-called dead languages for decades to come. As Lovecraft wrote: “That is not dead which can eternal lie, and with strange aeons even death may die.”
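To make the “close to the iron” point from the C entry above concrete, here is the kind of code that C makes natural. This is a minimal sketch only: the register address is completely invented, not taken from any real chip, so the snippet is meant to compile and be read, not to run on a desktop.

```c
/* Bare-metal style C: poke a (fictitious) memory-mapped GPIO register.
 * The address 0x40021000 is a made-up example, not a real chip. */
#include <stdint.h>

#define GPIO_OUT (*(volatile uint32_t *)0x40021000u) /* hypothetical register */

void pulse_pin5(void) {
    GPIO_OUT |=  (1u << 5);                        /* set bit 5: pin high */
    for (volatile int i = 0; i < 100000; i++) { }  /* crude busy-wait delay */
    GPIO_OUT &= ~(1u << 5);                        /* clear bit 5: pin low */
}
```

Nothing here is impossible in other compiled languages – Free Pascal can do the same, as mentioned above – but this style of work is exactly what keeps C entrenched in the systems space.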
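And to show what the Lua entry above means by “an embedded scripting language for applications”, here is a minimal sketch of the host side, using the standard Lua C API. The function name host_log and the inline script are invented for illustration; a real product would load user scripts from files.

```c
/* Minimal Lua embedding sketch. Build roughly like: cc host.c -llua -lm */
#include <stdio.h>
#include <lua.h>
#include <lauxlib.h>
#include <lualib.h>

/* A host function exposed to scripts ("host_log" is a made-up example). */
static int host_log(lua_State *L) {
    printf("[host] %s\n", luaL_checkstring(L, 1)); /* first script argument */
    return 0;                                      /* no return values */
}

int main(void) {
    lua_State *L = luaL_newstate();        /* one embedded interpreter */
    luaL_openlibs(L);                      /* standard Lua libraries */
    lua_register(L, "host_log", host_log); /* make host_log callable from Lua */

    /* Run an embedded script; luaL_dostring returns 0 on success. */
    if (luaL_dostring(L, "host_log('hello from ' .. _VERSION)") != 0)
        fprintf(stderr, "lua error: %s\n", lua_tostring(L, -1));

    lua_close(L);
    return 0;
}
```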


Dead programming languages and the best programming languages to learn

After having read about a million articles, blog posts and discussion threads about the above topics, here are my own 2 cents.

In the real world, there is no such thing as a dead programming language. Somebody somewhere will always have a legacy application – written in some obscure, ancient language – that’s still being used for actual business purposes and still needs to be maintained and kept alive.

Businesses don’t rewrite a working software solution just because it’s not written in the hyped programming language du jour. When it comes to technology, actual businesses use the tools they invested in for as long as possible, and hopefully even longer. You don’t replace a hammer just because a new one hit the market. You don’t replace a truck that has only driven 300,000 miles so far just because a new one was introduced. You don’t replace a commercially used pizza oven just because there’s a new one. The list goes on.

And with custom software solutions, there is another elephant in the room that needs to be addressed: it almost never makes any sense whatsoever to rewrite an existing, working solution. The costs are prohibitive and the benefits are close to zero.

As long as there is a tool or a business application out there that’s being used and that’s written in Algol, Fortran, COBOL, BASIC, Pascal, Ada, PL/1, mainframe Assembler, dBase, Clipper, FoxPro or whatever else might cross your mind, there will always be a job for somebody who knows how to speak those so-called dead languages. And that person will have a competitive advantage over the developers who only know the fancy new languages and software stacks. Everybody knows Python these days. (If you ask me, Python is one of the more obscure modern BASIC dialects.) Only a few still have actual experience with the older technology.

That being said, there’s another thing that most people don’t want to hear or talk about: Basically, once you know how to program in one language, you can pick up the semantics of most other languages relatively quickly. Most of the time, it’s more about understanding paradigms and concepts and how things generally work than the language itself.

But since software developers love to reinvent the wheel, you’ll also need to learn all the attached frameworks and tools – which, at the end of the day, all do similar things; most of the time they’re just different for the sake of being different.

So if everything is so similar, why are there so many different languages, frameworks, APIs and what-not?

The answer is quite simple, actually: The industry constantly needs to sell new buzzwords to keep the lights on. And for developers, of course, it’s always more fun and more interesting to invent a new programming language for a specific problem or task or project than it is to just do the job at hand. That is why all those different languages usually excel at one very specific feature – and in every other aspect look very similar to everything else that was already available when the language was designed.

Prolog, just to give one example, had backtracking built into the language; at that time, and in the context of what was hyped in the tech world back then, that was the one killer feature that set it apart from the rest of the pack. You did not have to implement your own backtracking algorithm when you used Prolog – you could elegantly do backtracking in a single line of code. (Disclaimer: My last encounter with Prolog was almost 30 years ago – my memory might betray me here. There probably was more interesting stuff in the language; but I also remember that it was missing a lot of other things that were easy to do in other language ecosystems – simple flatfile database access, for example, which was how we usually wrote data to disk and indexed it back in the day.)
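For readers who never touched Prolog, the point is that everywhere else you have to hand-roll that search yourself. Here is a minimal sketch in C, using the classic N-queens toy problem as a stand-in (my choice of example, not something from the Prolog days described above):

```c
/* Hand-rolled backtracking in C: place N queens so that none attack each
 * other. Prolog gives you the try/undo search machinery below for free. */
#include <stdio.h>
#include <stdlib.h>

#define N 6
static int col_of[N];   /* col_of[row] = column of that row's queen */

static int safe(int row, int col) {
    for (int r = 0; r < row; r++) {
        int c = col_of[r];
        if (c == col || abs(c - col) == row - r) /* same column or diagonal */
            return 0;
    }
    return 1;
}

static int place(int row) {
    if (row == N) return 1;                /* all queens placed: success */
    for (int col = 0; col < N; col++)
        if (safe(row, col)) {
            col_of[row] = col;             /* tentative choice */
            if (place(row + 1)) return 1;  /* descend; on failure, the */
        }                                  /* loop tries the next column */
    return 0;                              /* dead end: backtrack to caller */
}

int main(void) {
    if (place(0))
        for (int r = 0; r < N; r++)
            printf("row %d -> column %d\n", r, col_of[r]);
    return 0;
}
```

In Prolog, the failure-and-retry machinery in place() is essentially what the language runtime does for you behind every predicate.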

Now that I’ve made my case for programming languages not dying, what should you learn in this day and age?

Technically, it doesn’t matter, simply because in my experience you cannot properly prepare yourself for the reality of the next job that you take.

For example, for my current job, I had to learn Lua, and I had to learn it very fast. Before that job, Lua had only appeared on my radar in the context of game development, which means it was only a word and an abstract idea for me. I don’t work in (modern) game development. And in the past, I have only written and finished a couple of text adventures that used my own natural language parser as the game interface. Those parsers and games were written in Turbo/Power Basic and Turbo/Borland Pascal. Lua was not even invented back then. Neither were things like Python, Perl, PHP, Java or JavaScript, for that matter. C++ was still in its infancy; if anything, people were still using C and, yes, Turbo Pascal instead.

If you had told me less than half a year before I even interviewed for that job that I would soon have to learn Lua, I would probably have laughed at you in total disbelief. Learning a total niche language – AGAIN – for a new job? Really?

I’ve been there before quite often. I even worked for a company that made a living writing and selling compilers and development tools for their very own niche language: Xbase++, the designated successor to Clipper, which in turn was the designated successor to dBase.

Actually, whenever programming became a part of whatever job I was working at the time, I almost always ended up having to use niche or dead programming languages.

My simple personal reality is this: neither Python, Java, C nor C++, not even PHP, Perl or Ruby back when they were still a thing, ever made me any money or opened the door to an interesting job. The various dead Pascal, BASIC and xBase dialects did – or the ability to quickly pick up something like Lua and become productive enough in it.

That is important, I think: You almost never need to actually master anything, you just need to know something well enough to get something done in or with it. The day after you’ve finished a project, you can usually kiss all the problem-specific details that you’ve learned goodbye, and the next task at hand will force you to learn a huge load of other, totally unrelated things. At least that’s how it always was for me. At best, I could take some general concepts and abstractions with me. At best.

Throughout the years, it has always helped me that I had started at a time when system memory was still counted in kilobytes, when 5.25″ floppy disks were still a thing and my first 10 megabyte(!) hard disk was still a few years in the future.

Programming a computer in a BASIC dialect was the default way of using a computer at that time.

Back then, we were forced to understand some fundamental technical concepts first in order to actually do something – anything – with those machines. There were no GUIs. There were no apps for everything. There was no public Internet. And still, everything was way more fun back then than it is today. If something didn’t work, we knew we had only ourselves to blame.

Nowadays we sit on millions of lines of code others have written and we can only pray that their stuff works and does its job. Things have become brutally complex and the ugly truth is that nobody fully understands anything anymore. For example, a Raspberry Pi with Linux and Python on it is an astronomically more complex machine than what we had in the early 1980s: the Pi blows even mainframes of that time out of the water – it has more compute power, more memory and it runs an operating system that is more complex than the Unix systems that were being developed at Bell Labs back in the day; those systems did NOT have gigabytes of RAM at their disposal like the Pi does. And they had to run on one single CPU core. And no, those were no 64-bit machines either…

But I am grateful that I started back at that time and got to see and experience all the developments first hand. I learned BASIC on a Casio PB100. My first “real” computer was a Sinclair ZX81 with one (!) kilobyte of RAM. I learned 6502 Assembler and UCSD Pascal on an Apple ][ clone. I witnessed the rise of DOS. I was around when IBM OS/2 and Novell NetWare both lost to Microsoft Windows NT. My first online experience happened on a 300 baud acoustic coupler – and that had nothing to do with the Internet, that was just dialing into some computer somewhere far away. I saw the browser war and how Internet Explorer won over Netscape. I installed my first Linux distribution long before the Linux kernel had reached version 1.0 and before the company called Red Hat was founded. I learned COBOL, JCL and SQL on an actual IBM mainframe running MVS.

I burned a lot of time chasing the latest technological fad and buying into marketing hypes and saw many, many hyped products and technologies come and go.

The problem with it all is that I cannot simply advise you to only learn technologies or languages when somebody is paying you for it. This, unfortunately, is not how it works either. The problem in the IT business is that you constantly need to learn and read and study these thick “phone books”, also known as documentation and reference manuals – and nobody will ever appreciate that constant effort. What we do in IT simply is too abstract for the imagination of most people. They just don’t understand what it is we’re doing, or the huge amount of new stuff that we constantly need to learn just to be able to stay in the job.

There are no silver bullets for anything in our line of work. And to get back to the headline of this post, that also means that there is no best programming language to learn in this year or any other year, ever. In your spare time, learn what interests you, learn what might be fun for you right now. During your day job, you will be forced to learn enough crap that does not interest you at all, so don’t waste your unpaid, personal time on these things.

For me, playing with yet another BASIC dialect has always piqued my interest and always qualified as a good time. BASIC has always been designed for productivity and efficiency, and modern BASIC dialects – and there are plenty of them out there – fully support OOP, networking, multi-threading, multi-platform development and even 2D/3D game, desktop GUI or web development, while at the same time providing execution times on the same level as optimized C. You just have to get over the fact that all the so-called professionals, in their ignorance and arrogance, will laugh at you for using some BASIC dialect – and not something as horrible as C++ or an “enterprise” language like Java (none of which do one single thing for you that you could not do in, let’s say, FreeBasic, PureBasic or BlitzMax-NG; those BASICs even have their compilers written in their own language, while Java’s own virtual machine, as far as I know, is still written in C++ and not in Java).

I also still like to observe the progress of Free Pascal, even though I haven’t produced any new Pascal code in years. I never understood why developers ran to C instead of Pascal – just like C, Pascal can be used as an actual systems programming language, and operating systems have successfully been written in Pascal. Pascal also found adoption in robotics, for example. Compared to Pascal, C is just an ugly, cryptic language to my eyes and ears, and C++ is an order of magnitude uglier than Object Pascal. I know that I am not alone in my dislike for those curly-brace languages, and many people have filled books with reasons why, for example, C++ is actually a very bad and poorly designed programming language. But just like fast food is bad for you, people still keep eating it, no matter how well you make your case against it.

Update: You also need to consider that while there might be many job offerings for certain programming languages, you need to ask yourself the very important question whether you want to spend your life in the specific industries or business environments in which those languages are mostly being used. Do you really want to be on the team that maintains that extremely boring accounting and banking software? Or help sell insurance? For the rest of your professional life? A lot of that boring stuff is written in C++, C# and Java. Interesting things like operating systems and compilers, on the other hand, are usually written in plain old C.

Going mainstream and learning and doing what everybody else is learning is usually not the right answer to anything – you will only be competing with a lot of people who all read the same blogs, or who learned whatever their universities were influenced – or flat-out bought – to teach by an industry that is always looking for cheap developers. Developers who are fluent enough in a mainstream programming language are a dime a dozen. Whenever possible, only invest your time in learning mainstream things when you’re getting paid for it.

Seasoned experts with domain specific knowledge are hard to find and expensive, and what they know is never taught at any college or university – they gained their experience through actual work in the field.

We only have limited time on this planet and life always finds a way to mess up any plan you might have had. At the end of the day, the only thing that should matter to you is to have as much fun as possible on your way. Don’t waste your time on fads or doing what everybody else does. Spend your time with what interests you and what you enjoy the most.