Qt AI Assistant 0.8.8 Experimental Released
January 23, 2025 by Peter Schneider
We have now released the Qt AI Assistant to help you in cross-platform software development. The Qt AI Assistant is an AI-powered development assistant that runs in Qt Creator and works with multiple Large Language Models (LLMs). This blog post gives a little "behind-the-scenes" view of its making.
Why The Qt AI Assistant Is What It Is
When we started to scope the Qt AI Assistant, we wanted to focus on what we believe matters most to developers: being creative and writing great code.
In many conversations with developers, three areas repeatedly came up where Gen AI should help them: getting expert advice, creating unit test cases, and writing code documentation. Furthermore, we often hear that coding assistants should give good examples of the latest QML and Qt Quick functionality. So, we made it our mission to focus our coding assistant on automating "boring" repetitive developer tasks and providing the latest Qt technology insights, while still delivering generic C++ and Python programming capabilities. By automating complementary tasks such as unit test cases and code documentation, we return a bit more time for the thing that made developers choose the profession in the first place: programming.
Another stakeholder group that influenced the software architecture of the Qt AI Assistant was product managers of products built with Qt. Being able to protect a product's competitive advantage is crucial to them. Product managers are concerned about leaking code in prompt context or in LLM outputs. Maintaining control by running a private cloud deployment of a Large Language Model was considered essential to protect the company from accidental or intentional intellectual property leaks. By providing the option to connect to privately hosted Large Language Models, we give our customers control over their data. In fact, we are one of the few coding assistants that allow self-hosted cloud and local LLMs.
Implementing the MVP of our AI Assistant
We started the Qt AI Assistant by developing two things: Firstly, we built the end-to-end pipeline to an LLM that creates good QML code without fine-tuning. Secondly, we implemented the basic code completion functionality that the GitHub Copilot plug-in already offers in the Qt Creator IDE.
The end-to-end pipeline consists of the Qt Creator plug-in, which we built on top of Qt Creator 15, an open-source Language Server, which is already part of the Qt software stack, and Anthropic’s API to Claude 3.5 Sonnet. We decided to call the “Language Server” the “AI Assistant Server” because it consists of much more than a standard language server, including the LLM APIs, prompt routing, prompt engineering, context filtering, and, later on, the multi-agent capabilities.
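To give a flavor of what "language server" means here: servers of this kind typically exchange JSON-RPC messages framed with a `Content-Length` header. The sketch below builds such a frame. The method name and parameters are purely hypothetical illustrations; the actual protocol of the AI Assistant Server is not documented in this post.

```python
import json

def frame_request(method: str, params: dict, msg_id: int = 1) -> bytes:
    """Frame a JSON-RPC request the way language servers expect:
    a Content-Length header, a blank line, then the JSON body."""
    body = json.dumps({"jsonrpc": "2.0", "id": msg_id,
                       "method": method, "params": params}).encode("utf-8")
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body

# Hypothetical method and parameters, for illustration only.
msg = frame_request("assistant/completion",
                    {"uri": "file:///main.qml",
                     "position": {"line": 4, "character": 0}})
```

The same framing carries requests from the Qt Creator plug-in to the server, which then handles routing, prompt engineering, and the call out to the chosen LLM API.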
For the code completion work, we started by implementing suggestions triggered by a keyboard shortcut. We chose this starting point because there was increasing noise in online forums that some developers find automatic code completion disruptive to their usual workflow. We had already implemented the UI elements for the code completion tab bar for the GitHub Copilot plug-in, and therefore, adding one for the Qt AI Assistant was straightforward.
Nevertheless, no coding assistant is complete without automatic code suggestions. So, automatic code completion was the next feature to be developed. Because automatic code completion can quickly burn through many tokens and disrupt the workflow of some developers, adding the assistant settings was next on the agenda. Initially, we kept it simple, allowing users to turn the assistant and automatic code completion on and off across all projects.
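As an illustration of the kind of gating these settings enable, here is a minimal sketch; the field names and defaults are assumptions for illustration, not the actual implementation:

```python
from dataclasses import dataclass

@dataclass
class AssistantSettings:
    # Both toggles apply across all projects, as described above.
    assistant_enabled: bool = True
    auto_completion_enabled: bool = False  # off to conserve tokens

def should_request_completion(settings: AssistantSettings,
                              triggered_by_shortcut: bool) -> bool:
    """Decide whether a completion request may be sent to the LLM."""
    if not settings.assistant_enabled:
        return False
    # Shortcut-triggered suggestions still work when auto-completion is off.
    return triggered_by_shortcut or settings.auto_completion_enabled
```

Keeping the gate in one place makes it cheap to add finer-grained controls (per-project, per-language) later without touching the request pipeline.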
Next was the development of the Inline Prompt Window. This required the UI design team to create a functional model in Figma for some usability tests. Once the design was settled, we started implementing a basic human language prompt interface for expert advice and prepared a drop-down menu for smart commands.
For the upcoming smart commands, we started to implement the logic in the AI Assistant Server to route prompts based on the type of the smart command and the programming language to different LLMs. Ultimately, we want to enable developers to choose their favorite LLMs by purpose, so we allow different LLMs for code completion for QML, code completion for other programming languages, prompts regarding QML, and prompts regarding other languages.
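A minimal sketch of such purpose-based routing might look like the following; the table keys, model names, and fallback behavior are illustrative assumptions, not the assistant's actual configuration:

```python
# Hypothetical routing table: (task, language class) -> model name.
# In the real assistant these choices are user-configurable.
ROUTES = {
    ("completion", "qml"): "claude-3-5-sonnet",
    ("completion", "other"): "gpt-4o",
    ("prompt", "qml"): "llama-3.1-70b",
    ("prompt", "other"): "claude-3-5-sonnet",
}

def route(task: str, language: str) -> str:
    """Map a smart command to the LLM configured for that purpose.
    QML gets its own routes; all other languages share one class."""
    key = (task, "qml" if language == "qml" else "other")
    return ROUTES[key]
```

The point of the indirection is that a developer can swap the model behind any one purpose without affecting the others.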
With the R&D team growing at this stage, we started implementing the connection to Llama 3.1 70B for prompts, i.e., expert advice. The Llama "herd" of models is currently the best-performing "royalty-free" language model family for QML programming (I will write more about QML coding quality benchmarking in another blog post). Using a Llama 3.1 model is more complex than entering an authentication token into Anthropic's API. Still, our AI engineer got it running quickly on Microsoft Azure.
Optimizing the Smart Commands
Another AI engineer optimized the prompts for the different smart commands, including creating unit test cases in Qt Test syntax. This required setting up a dataset of Qt Test cases for benchmarking the LLM responses and developing the pipeline from the Qt AI Assistant Server to the LLMs for test cases. We focused the prompt engineering on Claude 3.5 Sonnet because it performed the best without fine-tuning. We still need to optimize OpenAI's GPT-4o and Meta's Llama 3.3 70B for this use case at a later stage.
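To illustrate how generated test cases can be benchmarked against a reference dataset, here is a deliberately simple heuristic scorer. The marker-matching approach is an assumption for illustration only, not our actual evaluation pipeline:

```python
def score_generated_tests(generated: str, required_markers: list[str]) -> float:
    """Crude benchmark heuristic: the fraction of expected Qt Test
    markers (e.g. QCOMPARE, QVERIFY, an expected test slot name)
    that appear in the LLM's output."""
    hits = sum(1 for marker in required_markers if marker in generated)
    return hits / len(required_markers)

# Toy LLM output: contains QCOMPARE and the slot name, but no QVERIFY.
sample = "void TestCalc::testAdd() { QCOMPARE(add(1, 2), 3); }"
score = score_generated_tests(sample, ["QCOMPARE", "testAdd", "QVERIFY"])
```

A real benchmark would compile and run the generated tests; a cheap textual score like this is only useful for ranking prompt variants quickly.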
How to Try Out the Qt AI Assistant
Trying out the Qt AI Assistant is straightforward:
- Ensure that your Qt Creator version is 15.0.1 or newer; the Qt AI Assistant won't work with older versions
- In Qt Creator's Extensions settings, ensure that "Use external repository" is checked
- In Qt Creator, go to the Extensions view and select the AI Assistant
- Install and activate the Qt AI Assistant extension
- Accept Internet access and the installation of the AI Assistant Server
- Select the installation scope (individual user or all users) and enable loading always at start-up
- Connect to at least one Large Language Model
Note: You need to have a valid premium Qt Development license (Qt for AD Enterprise or better). If you don't have one, sign up for a Qt Development evaluation license. That will work, too.
What’s Next for the Qt AI Assistant
Obviously, there is still plenty to do, and the Qt AI Assistant is evolving rapidly.
There is a good amount of work left to optimize the capabilities of the various LLMs that are currently marked as Experimental. We need to optimize the prompts for LLMs such as GPT-4o. We are furthest along with optimizing the prompts for Claude 3.5 Sonnet, and I suggest trying out Sonnet for the best AI Assistant experience.
One of the big questions is the requirement for an in-built chat. Many of you will say that it would be a great feature to have, and I agree. But it is a major development effort, both in terms of UI development and back-end development (managing a memory of the conversation). It would come at a trade-off with other features such as a DiffView or collecting context from files other than the current one. There are great chat solutions for general-purpose content generation, and we do not have much to add except the convenience of running within the IDE.
We Want Your Feedback
But most of all, we want to hear your feedback on what you are missing or what we should do differently. Launching the MVP as an experimental product allows us to consider your input and evolve rapidly while setting up the future backlog. Let us know what you want in the Qt AI Assistant in the comments below or through your Qt contact.
More About The Product?
Check out the product pages.