Developer Tools 2.0
Embracing Probabilistic Systems and the Future of Software Architecture
Greetings, readers.
Recently, I've observed the term "Developer Tools 2.0" making the rounds on social media. The conversation focuses primarily on code generation and on enhancing existing developer tools with Large Language Models (LLMs). While both of these opportunities are extremely exciting, I believe the debate is prone to a "law of the hammer" fallacy: it assumes that the nature of software will remain the same at its core and that only the tools will change. In this post I propose that LLMs will not only change the way we write software (Copilot etc.) but, more importantly, will become an integral part of the stack of the software we actually deliver. I argue that if we integrate machine learning models as active agents into our existing software stacks (for example as intent-understanding API routers, or even as full backend replacements), our software will evolve from deterministic to probabilistic architectures. This drastic shift requires new tools that can effectively address the significant challenges these probabilistic architectures introduce. Let's delve deeper into the subject!
The Emergence of Probabilistic Software Architectures and the Role of LLMs
The ever-changing landscape of software development demands that our tools evolve accordingly. Before the recent rise of machine learning, traditional developer tools focused mainly on deterministic software, which yields predictable outcomes. However, software architectures increasingly incorporate probabilistic elements, a trend driven primarily by the growing integration of machine learning into software stacks and our increasing reliance on it. We will need a new generation of developer tools—Developer Tools 2.0—that can effectively tackle the challenges arising from these systems.
LLMs provide a good example of how probabilistic systems are being woven into software development, as they have several exciting applications beyond code generation:
1. API Routing: LLMs can serve as routers that select the appropriate API endpoint based on user input (e.g. detecting the intent and selecting the right action; imagine a button that always does what the user wants), streamlining the user experience and reducing development complexity. A minimal sketch of this pattern follows this list.
2. Full Backend Replacement: LLMs can function as an entire backend for an application, processing requests and returning relevant information to the frontend. This approach allows developers to concentrate on frontend development while reducing the need for complex backend systems.
3. Adaptive Interface Generation: LLMs can generate adaptive user interfaces based on user preferences or behaviour, enhancing user experience.
4. Schema Mapping: LLMs can map external data sources to internal SQL structures, enabling developers to integrate external data sources into their applications without manual data wrangling (see the second sketch below).
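To make the routing idea (item 1) more concrete, here is a minimal sketch in Python. It assumes a hypothetical `call_llm()` helper standing in for whichever model provider you use, and the intent names and endpoints are invented for illustration; treat it as the shape of the pattern rather than a finished implementation.

```python
# Minimal sketch of LLM-based API routing. `call_llm` is a placeholder for
# whichever model client you use; the intent names and endpoints are invented
# for illustration.

ROUTES = {
    "create_invoice": "/api/invoices",
    "cancel_subscription": "/api/subscriptions/cancel",
    "update_profile": "/api/profile",
}


def call_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to your LLM provider and return its reply."""
    raise NotImplementedError


def route_request(user_input: str) -> str:
    """Ask the model to classify the user's intent, then map it to an endpoint."""
    prompt = (
        "Classify the user's intent as one of: "
        + ", ".join(ROUTES)
        + f"\nUser input: {user_input}\nRespond with the intent name only."
    )
    intent = call_llm(prompt).strip()
    # Fall back to a safe default if the model returns something unexpected.
    return ROUTES.get(intent, "/api/fallback")
```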
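Schema mapping (item 4) can be sketched in a similar way: ask the model to pair the fields of an external payload with the columns of an internal table, then keep only the mappings that point at columns you actually have. Again, `call_llm()` and all field and column names are illustrative assumptions, not a real API.

```python
# Sketch of LLM-assisted schema mapping: the model pairs external field names
# with internal SQL columns, and we keep only mappings that point at columns
# we actually have. All names are illustrative.

import json

INTERNAL_COLUMNS = ["customer_id", "full_name", "email_address", "signup_date"]


def call_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to your LLM provider and return its reply."""
    raise NotImplementedError


def map_external_schema(external_record: dict) -> dict:
    """Return a {external_field: internal_column} mapping proposed by the model."""
    prompt = (
        "Map each external field to one of these SQL columns: "
        + ", ".join(INTERNAL_COLUMNS)
        + f"\nExternal fields: {list(external_record)}"
        + "\nAnswer with a JSON object mapping external fields to internal columns."
    )
    mapping = json.loads(call_llm(prompt))
    # Discard anything the model invents that is not a real column.
    return {field: column for field, column in mapping.items() if column in INTERNAL_COLUMNS}
```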
These early examples illustrate the great potential of LLMs in software architectures, and we are only just beginning. In the coming years, I expect numerous use cases to emerge, solidifying probabilistic systems as a vital component of modern software architectures.
The Essentials of Developer Tools 2.0
Given the opportunities promised by LLMs, we need a new generation of developer tools (and paradigms) capable of handling the challenges posed by probabilistic systems. I believe that these Developer Tools 2.0 should address the following key aspects:
1. Variability Management: Developer tools must provide ways to analyse, monitor, and optimise the performance of probabilistic systems, which introduce an element of randomness and non-determinism.
2. (Un)certainty Visualisation: Developer Tools 2.0 should include capabilities for visualising uncertainty and providing explainability for probabilistic system behaviour.
3. Probabilistic Debugging: Traditional debugging techniques may not suffice for probabilistic systems. Developer Tools 2.0 should offer new approaches to debugging, such as generating multiple potential execution paths and accounting for likelihood and impact (a path may be unlikely to run, but if it does, it carries significant risk for the system).
4. Seamless Integration: Developer Tools 2.0 must facilitate seamless integration between deterministic and probabilistic systems.
5. Ethical Considerations and Bias Mitigation: Developer Tools 2.0 should provide mechanisms to identify, quantify, and mitigate biases in probabilistic systems, promoting the creation of more inclusive, fair, and ethical software.
6. Performance Optimisation: Developer Tools 2.0 should include tools for optimising resource-intensive LLMs, such as model compression or efficient deployment strategies that optimise for hardware, platform, and other resource constraints.
7. Testing and Validation: New testing methods are needed for probabilistic systems, such as incorporating statistical methods or simulations that allow tests to pass based on confidence thresholds rather than exact expected values or booleans (a sketch follows this list).
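To illustrate item 7, here is a small sketch of what a statistical test might look like: instead of asserting a single exact output, the test samples the probabilistic component many times and passes only if the observed success rate clears a threshold. The `classify_intent` function below is a simulated stand-in for an LLM-backed classifier, and the sample size and threshold are arbitrary assumptions.

```python
# Sketch of a statistical test for a probabilistic component: sample it many
# times and pass if the observed success rate clears a threshold, rather than
# asserting a single exact output. `classify_intent` is a simulated stand-in
# for an LLM-backed classifier; sample size and threshold are arbitrary.

import random


def classify_intent(text: str) -> str:
    """Simulated LLM-backed classifier that is right roughly 95% of the time."""
    return "cancel_subscription" if random.random() > 0.05 else "unknown"


def test_intent_accuracy(samples: int = 200, threshold: float = 0.9) -> None:
    hits = sum(
        classify_intent("please cancel my plan") == "cancel_subscription"
        for _ in range(samples)
    )
    pass_rate = hits / samples
    # The test passes on an observed rate, not on a deterministic expected value.
    assert pass_rate >= threshold, f"pass rate {pass_rate:.2f} below {threshold}"


if __name__ == "__main__":
    test_intent_accuracy()
    print("statistical test passed")
```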
As we embrace the shift towards probabilistic systems driven by advances in machine learning and LLMs, it's vital that we provide developers with the tools they need to tackle the unique challenges posed by probabilistic software architectures. In this regard, I believe the future of software development will be shaped by probabilistic elements and by efficient human-machine collaboration that goes beyond code generation.
If you're working on a startup in this space and are seeking funding, I'd love to hear from you! Don't hesitate to reach out to me (morris@lunar.vc), and let's explore the possibilities together.