Not-learning is a skill too

To be successful in tech, it’s well known that you must keep your skills up to date. The onus is on each individual to do this; no one will do it for you, and companies that provide ongoing personal development are few and far between. Many companies would rather “remix our skills”, which means laying off workers with one skill (on statutory minimum terms) and hiring people with the new skill. This is short-termist in the extreme: the new workers are no better than the old, they just happened to enter the workforce later, and the churn means there is no accumulation of institutional knowledge. If you were one of the newer workers, why would you voluntarily step onto this treadmill? And if you were a client, why would you hire such a firm when it provides no value-add over simply hiring the staff you need yourself? Anyway, I digress.

It is clear that C++11 was an enormous improvement over C++98. The list of new features is vast and all-encompassing, yet at the same time backwards compatibility is preserved: you can have all the benefits of the new while preserving investment in the old (“legacy”). Upgrading your skills to C++11 was a very obvious thing to do, and because of the smooth transition you could make quick wins as you brought yourself up to speed. That is just one example of the sort of thing I am talking about. You still need to put the effort in to learn it and seek out opportunities to use it, but the path from the old to the new is straightforward, there are early and frequent rewards along the way, and from there the path continues to C++14, 17, 20…

But I look around the current technology landscape and I see things that are only incremental improvements on existing programming languages or technologies, and yet require a clean break with the past. In practice that means not only learning the new thing, but also rebuilding the ecosystem and tooling around it, porting or rewriting all the code, encountering all-new bugs and edge cases, and rediscovering the design patterns and idioms of the language. The extent to which the new technology is “better” is dwarfed by the effort taken to use it, so where is the improved productivity coming from? Every project consists of either learning the language as you go, or maintaining and extending something written by someone who was learning the language as they went, perhaps gambling on getting in on the ground floor of the next big thing. The paradox is that things only get big if people stick with them!

So I am pretty comfortable with my decision to mostly ignore lots of new things, including but not limited to Go, Rust, Julia, Node.js and Perl6, in favour of deepening my skills in C++, R and Python and pushing into new problem domains (e.g. ML/AI) with my tried and trusted tools. When something comes along that is a big enough leap forward over any of them, of course I’ll jump – just like I did when I learnt Java in 1995 and was getting paid for it the same year! I had a lot of fun with OCaml and Haskell too, but neither gained significant traction in the end; the same goes for Scala. I don’t see anything on the horizon: all the cutting-edge stuff is appearing as libraries or features for my “big 3”, while the newer ecosystems are scrambling to backfill their capabilities and will probably never match their breadth and depth before falling out of fashion and fading away. I’ll be interested in any comments arguing why I’m wrong to discount them, or any pointers to things that are sufficiently advanced to be worth a closer look.

About Gaius

Jus' a good ol' boy, never meanin' no harm
This entry was posted in C++, data science, Haskell, Ocaml, Python, R. Bookmark the permalink.

3 Responses to Not-learning is a skill too

  1. Yawar says:

    I think for people who made the change, the desire for change came from within. Maybe they got tired of the AbstractProxyBeanFactory in Java; or the neverending array of language features in modern and legacy C++; or the total mutable free-for-all in Python. They got burned out on the imperative and OO messes they saw in codebase after codebase. And they decided to try something different. Some went to functional programming, some went deeper in compositional programming with category theory, some went to type-driven development, some went to design by contract.

    And over time pieces of all these alternative paradigms bled back into the mainstream of C/C++/Java/etc. That’s good in the sense that things are slowly improving, but bad in the sense that it’s always an iterative improvement, bringing just enough of the benefits from the new paradigms to keep justifying their use, and not enough to bring the radical benefits.

    • Gaius says:

      I think the iterative improvement is the only practical way to make progress as a field. It takes perhaps 3-5 years for an engineer to develop real maturity with a particular language (or technology or platform or ecosystem), several substantial projects taken from concept/prototype all the way through to the maintenance phase (“full SDLC”, × n). It’s good to maintain an awareness of what’s going on in the new-language scene and maybe dabble here and there, but any cycles shorter than that are one step forwards, two steps back in terms of concrete productivity. Maybe it is different in webdev or the startup scene where projects (and even companies) tend to be short-lived anyway but I’d argue even there, if you are trying to do something challenging then you need to control as many variables as you can, and learning a new language at the same time as trying to build a new product could be biting off more than you can chew!

  2. gentzen says:

    Since Julia covers the application domains you plan to dive into with Python and R, features both raw speed and good foreign-language interfaces (including to both Python and R), and has some unique libraries, it could be a nice complement to your current mix of languages (C++, Python, and R). Since even the JuliaPro distribution is now at version 1.0.1 (I waited with this comment until it was out), enough of the initial dust has probably settled that diving in now is sufficiently low risk. However, I agree that diving into it without a specific library or application in mind, just to learn the base language, would most likely be a waste of time.

    For me, the unique library was one that implements many algorithms discussed in the linked papers. I integrated Julia into a C/C++ project, and was positively surprised by how well Julia handled the communication between the two. I later ported the important part of the library to C++, and this is so far the most competitive general solver (by factors > 2) for the stuff I need it for, at least with respect to the other solvers I have benchmarked.

    Working through and verifying the performance comparisons for myself was also a nice exercise, both in numerics and in learning the Julia language.
