Dino Esposito: AI is software, period.
With a career spanning more than 25 years, Dino Esposito has become synonymous with technology education. He has published 20 books and over 1,000 articles, so it is hard to overstate his legacy in the technology world.
He skillfully navigated the transition from MFC to .NET, promoting innovation even when cloud was just a meteorological term. His visionary approach led to early versions of what we recognize today as cloud photo storage, long before the age of Google Photos.
Dino is currently the technical director at Crionet and the brain behind its platforms for professional tennis and padel tournaments. In addition, he is a software development consultant at KBMS Data Force, a data-oriented company in the healthcare sector with a portfolio of patient-monitoring and digital-therapy solutions installed, among other places, at the Gemelli Hospital in Rome.
We had the privilege of talking with Dino about the tech industry in his youth and today, about whether fundamentals or specific technologies matter more, and about his advice for developers regarding AI.
“30 years ago we coded in digital caves”
Dino: It was a much more closed environment with a few big players to choose from, whether Microsoft, IBM, or Sun. Fascinated by Bill Gates, I always wanted to go with Microsoft. Thirty years ago, in the early 1990s, it was like the Stone Age. My first company was a real startup by today’s standards, and we were doing digital imaging in what would later be called the “cloud,” back when the JPEG format was just emerging.
Today, it is a jungle. And that could be an interesting parallel. Like early humans, we used to code in digital caves, relying on hunting and gathering information for sustenance. We crafted tools and weapons from C, Assembly, and C++, with convoluted, language-specific SDKs just to run a simple database query.
Our existence as developers was closely tied to our passion and attitude, with limited technology and few community structures. Today, it’s more like a jungle: a lush ecosystem rich in diverse forms of technology. Towering companies with broad cloud environments form a continuous, shaded canopy. Beneath it, various products and platforms thrive, giving birth to a wild array of software, from exotic technologies to large frameworks. The tech industry today is a genuine biodiversity hotspot. What comes after the jungle? Hopefully, urbanization.
Fundamentals or specific technologies?
Dino: I graduated in 1990 from what, at the time, was one of the few places in Europe to study computer science. The degree course was not even set up in its own Computer Science faculty; it was a branch of the more classical faculty of Mathematics, Physics, and Natural Sciences. Right or wrong, our education was centered on the mathematical aspects of computer science: the logic behind programming languages and grammars, engineering principles, and algorithms.
I graduated without ever touching a computer, except for typing a few words into an IBM terminal. So my obvious first answer can only be “fundamentals.” After my degree, though, I never went back to fundamentals, with very few occasional exceptions when I switched to a significantly new branch of software.
About his title, “A Cloud Visionary”
Dino: I never believed in the cloud as a salvific technology. Remember, the cloud started gaining ground around 2008 (Azure was announced at that time), right after the Lehman Brothers crash, when the entire IT world (and not just IT) was about to sink as money literally dried up.
People wanted to believe in the new “cloud” as a supernatural entity capable of bringing back jobs and money. I never embraced this salvific vision of the cloud. With a small team of five people, we used to save/download pictures from the cloud back in 1995, except we didn’t call it “cloud.”
Today, the cloud is everywhere, hovering over applications as an effective support, much like real clouds. I don’t live in the clouds; I live on the ground and love clouds. There was never a moment when I suddenly saw the power of the cloud; instead, I naturally started leveraging it. Today, I don’t believe at all in what many call cloud-native apps, also because, names aside, I don’t see a substantial difference from cloud-based apps.
Advice for developers on AI
Dino: I have two favorite quotes about AI that emerged naturally out of my presentations. One is “AI is just software.” The other is “in the beginning there is only randomness; in the end it is only a matter of error.”
AI is software, period. As software, it takes numbers in and returns numbers. All the concerns the media emphasize are the outcome of software pipelines that may be set up to make decisions. AI is a black box that returns a number; what is done with that number is the same “other” problem we still face today.
What if we wrote an algorithm that returns 1 if a button is pressed on an even millisecond? What if we link getting a 1 to launching a nuclear weapon? Would it all happen just because the algorithm returned 1 on an even millisecond? That said, in the past year we have witnessed a “real” giant leap. Machine learning alone is not sufficient to change the face of the world; in fact, when I graduated in 1990, some of my colleagues defended their theses on neural networks!
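Dino’s thought experiment is easy to make concrete. Here is a minimal sketch (in Python, with names of our own choosing) of the point he is making: the algorithm merely returns a number, and attaching any consequence to that number is a separate, human decision.

```python
import time

def even_millisecond_signal() -> int:
    """Return 1 if the current millisecond is even, 0 otherwise.

    The 'algorithm' only produces a number; it knows nothing about
    what that number will later be used for.
    """
    millis = int(time.time() * 1000)
    return 1 if millis % 2 == 0 else 0

# What is done with the number is a separate decision, made outside the algorithm.
if even_millisecond_signal() == 1:
    print("Some consequential action is triggered here -- by us, not by the algorithm.")
```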
The real breakthrough is LLMs, for one reason: they shorten the communication between humans and software by bringing software closer to the human way of communicating. It’s phenomenal statistics, but it cuts short the distance between human wording and the formal reasoning of software. Developers still sit in the middle, doing the invisible work of connecting humans and software, though at a higher level of abstraction.
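As a loose illustration of that “developer in the middle” role, here is a hedged sketch. The `call_llm` function is a placeholder, not any real API, and the validation rule is our own assumption; the point is only to show where human wording ends and the formal, executable side of software begins.

```python
def call_llm(prompt: str) -> str:
    """Placeholder for any LLM completion endpoint (hypothetical, not a real API)."""
    raise NotImplementedError("wire this to an actual provider")

def natural_language_to_sql(question: str) -> str:
    # Human wording goes in...
    prompt = f"Translate this request into a SQL query: {question}"
    candidate_sql = call_llm(prompt)
    # ...but the developer still sits in the middle: validating, constraining,
    # and deciding whether the formal artifact is safe to execute.
    if not candidate_sql.lstrip().lower().startswith("select"):
        raise ValueError("Only read-only queries are accepted")
    return candidate_sql
```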
LLMs open a whole new world of business opportunities for developers, but wait: these opportunities are only apparently easy to seize. Study and trial and error are FUNDAMENTAL. Beware of those (mostly marketers and influencers) who make it all sound too easy.
About his new book
Dino: Unfortunately, it’s only a bit more than a project. I have a few chapters written, but no publisher is interested. My idea is to go through the history of software from the days when formal reasoning was devised (back in the Middle Ages and Renaissance), then turned into mathematical reality (in the 1930s), and finally engineered (in the 1950s). All in all, software as we know it today is the waste product of a much more ambitious line of research: automating human reasoning.
The AI we experience today is a fake in this regard. It is phenomenal statistics, but it is far from a thinking machine. Readily available computers today run at clock speeds on the order of gigahertz. The human brain is slow in terms of operations per second (around 100 per second), but it has a massive set of connections that operate in parallel. In the end, it’s about nine orders of magnitude of difference.
The day computers can reason at the speed of the human brain, it will be a different world. And maybe computers will fall in love. Why not?