« Don’t force it… get a bigger hammer. » Arthur Bloch
There are many technologies available, and most are updated on a typical six-month cycle. There is an almost equal number of licensing models: subscription-based, cloud-based, and so on.
With 20+ years of experience, I’ve learned how to avoid being tied to a particular vendor, sidestepping the risk of service deprecation, fee increases, or breaking changes whose costs are passed on to the customer.
The major platforms work very hard to optimize their business models with various cloud solutions that provide huge convenience. This is a strategy to tie up customers’ data so that migrating to another platform, should the current one no longer be adequate, becomes increasingly difficult and costly.
On the contrary, I believe that the data belong to my customers and that the code must never be used to hold them captive.
Data science
- R and RStudio are my tools of choice for everything related to data science, with the latest and greatest from the Tidyverse.
- Heavy computational tasks are accelerated using parallelization techniques.
- C, C++, or C# is also used when maximum performance is critical.
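The parallelization mentioned above splits an expensive, independent computation across CPU cores. The author works in R for this; purely as an illustration of the technique, here is a minimal sketch in Python using only the standard library (the function `heavy_task` and its inputs are made up for the example):

```python
# Minimal sketch of fanning a heavy computation out across CPU cores.
# `heavy_task` is a hypothetical stand-in for an expensive, independent job.
from multiprocessing import Pool

def heavy_task(n: int) -> int:
    # Stand-in for a CPU-bound computation with no shared state.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [100_000, 200_000, 300_000, 400_000]
    with Pool() as pool:  # defaults to one worker per CPU core
        results = pool.map(heavy_task, inputs)
    print(results)
```

Because each task is independent, the speedup scales roughly with the number of cores; the same pattern underlies R’s parallel back ends.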
Machine learning & AI
- Python and SciPy form the basic toolset with PyCharm.
- ML.NET with .NET 5.0.
- Keras is my favorite to build neural networks.
- When the data is large, GPU computing is leveraged.
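To illustrate the building block that Keras wraps as `keras.layers.Dense`, here is the forward pass of one fully connected layer with a ReLU activation, sketched in plain Python. The weights and inputs are hypothetical, chosen only for the example; in practice Keras handles this (and GPU execution) internally:

```python
# Sketch of one dense (fully connected) layer with a ReLU activation,
# the unit Keras exposes as keras.layers.Dense(units, activation="relu").
# Weights, biases, and inputs below are hypothetical illustration values.

def relu(x: float) -> float:
    # Rectified linear unit: pass positives through, clamp negatives to 0.
    return x if x > 0.0 else 0.0

def dense(inputs, weights, biases):
    # weights: one row of input weights per output unit.
    return [
        relu(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

# A layer mapping 3 inputs to 2 units.
W = [[0.5, -0.2, 0.1],
     [0.3, 0.8, -0.5]]
b = [0.0, 0.1]
print(dense([1.0, 2.0, 3.0], W, b))
```

A neural network is essentially a stack of such layers; training (backpropagation) adjusts `W` and `b`, which is exactly the part Keras automates.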
Desktop applications & special tasks
- C# and Avalonia are used to program cross-platform applications with JetBrains Rider.
- For special cases, a compiled, managed language like C# can deliver much better performance than an interpreted one.
- macOS and Linux as operating systems.
- PostgreSQL as database server.
- GitHub for source code management.
- Hugo for static websites.