Category Archives: Gen AI
We’re no mind readers, so from time to time, we like to do polls. Polls are quantitative in nature, so coming up with the right question is not enough – we need to do a bit of mind reading in coming up with the alternatives.
Quick development of text based RAG apps
Our hypothesis was that RAG is the cool thing to do with vector-based databases, and specifically text-based RAG. The conference talks we’ve given on MariaDB Vector (such as at the 24th SFSCON in Bozen, Südtirol, Italy on 8 Nov 2024) have stressed the value of easily being able to develop AI applications that answer user prompts based on knowledge in a specific body of text, not on the overall training data of an LLM.
…
Continue reading “What do you expect from vector storage in databases?”
The day has come that you have been waiting for since the ChatGPT hype began: You can now build creative AI apps using your own data in MariaDB Server! By creating embeddings of your own data and storing them in your own MariaDB Server, you can develop RAG solutions where LLMs can efficiently answer prompts with your own specific data as context.
Why RAG?
Retrieval-Augmented Generation (RAG) creates more accurate, fact-based GenAI answers grounded in data of your own choice, such as your own manuals, articles or other text corpora. RAG answers are more accurate and fact-based than those of a general Large Language Model (LLM), without having to train or fine-tune a model.
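To make this concrete, here is a minimal sketch of the retrieval side of RAG in SQL. The table, column names and four-dimensional vectors are invented for illustration (real embeddings typically have hundreds or thousands of dimensions), and the function names shown (VEC_FromText, VEC_DISTANCE_EUCLIDEAN) follow the later GA syntax, so they may differ slightly in early preview releases:

```sql
-- Hypothetical table holding chunks of your own documents plus their embeddings.
CREATE TABLE doc_chunks (
  id        INT AUTO_INCREMENT PRIMARY KEY,
  content   TEXT NOT NULL,
  embedding VECTOR(4) NOT NULL,   -- dimension must match your embedding model
  VECTOR INDEX (embedding)
);

-- Store each chunk together with the embedding your model produced for it.
INSERT INTO doc_chunks (content, embedding) VALUES
  ('How to set up replication',    VEC_FromText('[0.12, 0.80, 0.05, 0.33]')),
  ('How to create a vector index', VEC_FromText('[0.70, 0.10, 0.64, 0.02]'));

-- At query time, embed the user question the same way, fetch the closest
-- chunks, and pass them to the LLM as context for its answer.
SELECT content
FROM doc_chunks
ORDER BY VEC_DISTANCE_EUCLIDEAN(embedding, VEC_FromText('[0.68, 0.12, 0.60, 0.05]'))
LIMIT 3;
```

The nearest-neighbour query is what turns a plain table into the retrieval step of RAG: the rows it returns become the context the LLM answers from.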
…
Continue reading “Try RAG with MariaDB Vector on your own MariaDB data!”
We’re here, we’re open source, and we have RDBMS-based Vector Search for you! With the release of the MariaDB 11.6 Vector Preview, the MariaDB Server ecosystem can finally check out how the long-awaited Vector Search functionality of MariaDB Server works. The effort is the result of collaborative work by employees of MariaDB plc, the MariaDB Foundation and contributors, particularly from Amazon AWS.
Previously on “MariaDB Vector”
If you’re new to Vector, this is what’s happened so far:
- We blogged a number of times about our view of where Gen AI belongs in MariaDB Server
- We showed a first demo in February at our FOSDEM Fringe Event
- We launched a project page on mariadb.org/projects/mariadb-vector/, containing a number of videos
- We went on stage at Intel Vision in London, with AI everywhere
- We blogged about Amazon’s take on Vectors and MariaDB, in “MariaDB is soon a vector database, too”
The main point: MariaDB Vector is ready for experimentation
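If you want to experiment right away, a quick smoke test along these lines should confirm that vector search is available. The table and values are invented for illustration, and the distance function name may differ between the preview and later releases:

```sql
-- Create a tiny table with a vector column and an index on it.
CREATE TABLE vec_test (
  id INT PRIMARY KEY,
  v  VECTOR(3) NOT NULL,
  VECTOR INDEX (v)
);

INSERT INTO vec_test VALUES
  (1, VEC_FromText('[1.0, 0.0, 0.0]')),
  (2, VEC_FromText('[0.0, 1.0, 0.0]'));

-- Nearest neighbour to a query vector; row 1 should come back first.
SELECT id
FROM vec_test
ORDER BY VEC_DISTANCE_EUCLIDEAN(v, VEC_FromText('[0.9, 0.1, 0.0]'))
LIMIT 1;
```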
…
AI was everywhere at Intel Vision this week in London. Nearly every keynote and breakout presentation was centred around AI. I had the honour of being interviewed by Intel’s jovial Chief Commercial Officer Christoph Schell, who is just about as stereotypically German as his former neighbour from Stuttgart Jürgen Klopp (whom he referenced on-stage), namely: not at all.
Staying German but perhaps a tad less Klopp-like, Thomas Bach was one of many interviewed on-stage by Christoph. The president of the International Olympic Committee nevertheless impressed me with his quick-witted reply to Christoph’s question about how AI would have made an impact had it been in place during his fencing career.
…
Continue reading “MariaDB Vector at Intel Vision – AI Everywhere”
We say: Put your AI vectors into your RDBMS …
Relational databases are where AI data belongs. Users need their vectors along with the rest of their data, in a standard database which offers performance, scalability, and all the other traditional virtues, such as ACID compliance.
This is why we are developing MariaDB Vector. Expect to see a first preview release later this month.
… but don’t take our word for it – ask Amazon!
Now, we’re not alone in advocating the above logic. That’s probably because the logic makes sense. The best articulation I’ve heard of the logic that “you want your Gen AI integrated in your relational database” comes from MariaDB Foundation Board Member Sirish Chandrasekharan, General Manager of Amazon Relational Database Services.
…