Nov 12 2023

Improvise. Adapt. Overcome.

Published under Thoughts

This post contains some random but valuable job tips that people I respect have given me over the years.

A former boss of mine used to say this: “I have completely given up on planning my day because it was too frustrating. Whenever I enter the office, somebody comes with something from somewhere and that always throws my plans for the day out of the window. But once I had accepted that fact and switched to exclusively working in ‘push-mode’, my job life had become much more comfortable.”

I fully agree with this. Later down the road, when I became a manager myself, I could confirm that this is the only successful way to get through a day when your job is to manage people and no longer to be a techie yourself.

Here’s some golden advice from another colleague of mine, valid when you’re new to a job but the situation suddenly calls for you to be “the expert”: “Fake it till you make it.”

The reality, which you need to accept, is that things you know nothing about will constantly be thrown at you, and you somehow must make it through. You might take some comfort in the truth that nobody else in any company you will ever work at actually “knows it all”. At first, some people might appear like frightening gurus with superior knowledge you will never attain – and they will play that role very convincingly and enjoy playing it for you a bit too much. But once you have worked with those folks for a while – and once you have had some time to build up your own domain knowledge – that impression will diminish and you will see that everybody only cooks with water, as the saying goes. Then you will start seeing the cracks in the knowledge of the people who seemed so scary in the beginning, and when that happens, rest assured that you have become a master of that specific niche of knowledge yourself.

It is said that becoming a master at something always takes at least 10,000 hours. It also requires that you have “the bite” for it and the willingness to learn, of course. Having “the bite” is important if you really want to become good in any job or industry. But in IT, having “the bite” is not just the key to becoming good, it actually is the key to survival. You will constantly need to read those “phone books” filled with stuff you need to get the job at hand done – and the next day, once the job is done, you will never need that knowledge again but instead you will have to move on to the next set of “phone books” you need to read and understand. This part of working in IT severely sucks and it -will- drain you over the years. Especially because nobody will ever appreciate that you need to do this to do the job they’re asking you to do. IT is an ungrateful business.

Another CEO I’ve worked with offered a simpler approach to this problem: “If you’ve read one page more of the documentation than the customer has read, then that’s usually good enough.”

Still, I’m pretty sure that you will feel discomfort, stress and insecurity in situations where things you do not yet master are thrown at you. I know that I still do – almost always, even after all those years. The feeling that the expectations are even higher than the pile of obstacles has also never left me, regardless of where I work, and it usually comes along with self-doubt and the nagging question: “am I good enough to pull this off?”

In a moment like this, step away from the keyboard, take a deep breath and relax. Remember how many challenges you already had to master before you even got here. You are only in this position because you have pulled off impossible missions before – and because whoever interviewed you for the job you’re in now believed you could pull it off again.

Another former boss of mine, back in the late 1990s, gave me this piece of advice: “When I go into a room with people who I know will scrutinize me and who will intentionally try to tear apart everything I am going to say, I also get nervous upfront. But then I just remember all that I know and all that I am capable of doing, and then I’m okay again.” (Note: the people he was referring to were usually venture capitalists.)

Pulling this one off requires quite a bit of self-confidence and security – and faith in your abilities. As for that faith, I think the US Marine Corps has got it right, and if you follow one of their mottos, you will always succeed:

Improvise. Adapt. Overcome.





Oct 07 2023

Platform and programming language predictions

Published under Programming, Thoughts

My totally non-scientific view and opinions, based upon decades of experience in the software industry:

  • Windows: It’s not going anywhere. It will remain the most popular desktop OS, and it will remain very relevant in the server room as well — there’s just way too much mission-critical software running on that platform and too much accumulated knowledge about the platform that nobody will ever want to waste. Besides, the reality also is that the platform is stable and predictable. And Microsoft isn’t going anywhere either. Also, Microsoft and their platform cater to grown-up working-class people who need to support their families, have kids that go to school, have down payments on houses and cars, and have plenty of things beyond computers in their lives. These people just need to pragmatically get the job done – which is, let’s face it, something that just is not on the agenda of any other platform out there.
  • macOS: It had its brief moment when tech people wanted to play with it, because they had heard that there is a Unix underneath a relatively user-friendly GUI. While there are still tech folks out there that might want to use a Mac for exactly that reason, the majority has gone back to Windows in the meantime. Unlike macOS, Windows actually pays the bills. And unlike Apple hardware, Windows hardware is an affordable commodity.
  • Desktop Linux: It does not have – and won’t ever have – a place on the mainstream desktop, meaning the desktop of a regular, non-technical user who does not have a gigantic tech support department behind him or her to keep the system running. End of fantasy.
  • Server-side Linux and Linux on appliances: Linux has a place in the server room, on web servers, compute and storage clusters and for hosting highly customized software and applications. It is a valid and great choice for anything that’s super-custom. However, if you want to build actual appliances, or if you are looking for an open foundation for something that you want to sell later as a proprietary solution, use FreeBSD or any other BSD instead. The GPL will be your legal enemy, but the BSD and MIT licenses were explicitly designed to enable businesses. There’s a reason why Apple (all of their operating systems), Sony (PlayStation) and Nintendo (Switch) chose FreeBSD – and not Linux – as the foundation of their respective platforms.
  • Cloud computing: The “trillion dollar mistake” everybody seems to be falling for. Using “the cloud” just means you’re using someone else’s computer and entrusting your valuable business data to someone you don’t even know. The only reason big corporations fall for the cloud hype is that they believe they might save some costs down the road because they don’t have to hire all these IT people themselves and can thus reduce at least some of their many HR issues (read: employment contracts, lawsuits launched by disgruntled ex-employees and severance packages). The cloud exists so that companies can eliminate a bit of the unwanted human factor. And it exists so that Microsoft, Amazon and Google can become the compute infrastructure of the world – with all the power that comes along with that. Basically, the entire industry has come full circle back to the 1950s/1960s, when the only way to use a computer was to rent one from IBM. (In case you don’t know, IBM never sold those machines – you literally could only rent them from IBM.) In this day and age of open source software, offering software as a cloud-based service is also a smart way to avoid the open source trap: if you can no longer sell software licenses, then make it too complicated for customers to self-host the software and instead make sure they need to subscribe to your (cloud/hosted) services in order to successfully and productively use your open source software.
  • C – still the most used system language for compilers and operating systems. It’s not an application language anymore (if it ever even was one), but it also won’t go anywhere because of its strong entrenchment in the systems space. Engineers who need to work close – really close – to the actual iron of a machine still choose this language for very valid reasons. (Even though, I might add, their life would be better if they chose Free Pascal instead. Kernighan’s essay on Pascal has long lost its validity, in case somebody wants to bring that up now; it was only true for the original Niklaus Wirth implementation.)
  • C++ – as horrible as it is, it just cannot ever go away. Too much stuff that should have been written in any other language – because the software would then have been better, more maintainable and less bug-ridden – has been written in C++, and nobody will ever rewrite that software. The C++ committee will keep adding every language feature they can find, making it an even bigger and uglier monstrosity, and some poor bastards will still be forced to learn at least a subset of the damn thing to write code in it and earn a living. May God have mercy on their souls, because the average software house doesn’t.
  • Rust – one of those super-hyped languages that was created to replace C and C++. It is not the first one to try this and it won’t be the last. Make a realistic guess at how much success that language will have in the long run. It’s like day-dreaming that French or German will replace English as the global business language in this century.
  • D – See my remarks on Rust. The exact same story.
  • Go – Go won’t ever replace C or C++. Thus far, it has mostly managed to replace Python in certain problem domains or for certain tasks, simply because it can be compiled, is faster than Python and is adequate for many things that system administrators do in their daily jobs: Go found a niche and became a DevOps language. Just don’t believe for a second that it will ever become mainstream enough to be the language for the next mobile or desktop GUI application or game. And it really doesn’t matter that somebody wrote Qt or Gtk or whatever-floats-your-boat bindings for it – in the real world outside of your basement, nobody uses Go for that.
  • PHP – Is anybody still using it outside of the WordPress community? And if there are such folks, how many of them are not trying to replace it?
  • JavaScript – yeah, unfortunately, it’s the lingua franca of web frontends. I heard that TypeScript made it somewhat bearable. But lucky me, I don’t have to do web development — HTML, CSS and JavaScript are already three of the reasons why I hate web development with a passion. (Think of me what you want: I liked Flash and its own version of ECMAScript, ActionScript, and I still firmly believe that, while the implementation could have been significantly improved, the idea behind Flash was good and made much more sense than this whole HTML 5 bullshit. Flash was killed by Steve Jobs because at the time it threatened the dominance of the iPhone; things created in Flash/ActionScript could run on any platform, including Android – so it was dangerous to Apple and needed to be killed. Too bad that too many “decision makers” bought into the Apple FUD at the time.)
  • Lua – A language I actually use on a daily basis in my job. It’s integrated into our product, so for me Lua is super-relevant. If you had told me this a few years ago before I switched jobs, I either would have laughed or looked at you in total disbelief. But here we go – Lua is a bread and butter tool for me now. It’s also a bread and butter tool in certain industry niches and many other products have embedded Lua as their scripting language of choice, too. That’s what it was designed for: An embedded scripting language for applications. And it does that job very well. But it only will be relevant for you if you happen to professionally work with a piece of software that has Lua integrated.
  • Python – For me, Python is today what BASIC used to be in the 1980s. Heck, it even looks and feels like a BASIC dialect. It has found a niche in the scientific sector (thanks to some very strong libraries that are, hm, written in C and thus fast enough for the job at hand). It won’t become a language for desktop applications anytime soon, simply because of its deployment nightmares, poor performance and its shitty support for multi-threading. Also, the world is still waiting for something like Visual Python that would actually make developing desktop applications in Python fun and productive. You know, just what Visual Basic did for the world back in the day, and what tools like GAMBAS excellently replicated (too bad that GAMBAS only exists for Linux and too bad that there is no commercial entity behind it – GAMBAS is fucking awesome).
  • BASIC, Pascal, xBase dialects, COBOL: You always hear that these languages are dead. The reality is that Delphi and its open source sibling Free Pascal are still very popular in Eastern Europe even today. The truth also is that an unbelievable number of individual business solutions have been written in various BASIC, xBase or Pascal/Object Pascal dialects and versions, and they are still being used and maintained today – just like the COBOL applications running on big iron at big banks and insurance companies. Hardly anybody will pick up those tools and languages to write some big new application from scratch. Unless, of course, we’re talking about a development team whose job it is to use those languages on a daily basis to maintain some legacy application – why would they make their own lives worse by adding another language and toolchain to their toolbox, one in which they have nowhere near the level of experience they have with the legacy beasts they still have to use? Depending on where they work, people will still have to learn and use those so-called dead languages for decades to come. As Lovecraft wrote: “That is not dead which can eternal lie, and with strange aeons even Death may die.”


