

Thanks. My general strategy for GenAI and reducing hallucinations is to not give it the task of making things up, but to have it work only on existing text - that's why I don't allow users to create content without source material.
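Concretely, every generation call pins the source text into the prompt and the system prompt forbids going beyond it. A rough sketch of that shape with the Vercel AI SDK (model ID and prompt wording are illustrative, not my exact setup):

```ts
// Grounded generation: the model only ever sees user-provided source material.
import { google } from "@ai-sdk/google";
import { generateText } from "ai";

export async function generateQuestions(sourceMaterial: string) {
  const { text } = await generateText({
    model: google("gemini-1.5-flash"), // placeholder model ID
    system:
      "You write quiz questions strictly from the provided source material. " +
      "If something is not stated in the source, do not use it.",
    prompt:
      `Source material:\n\n${sourceMaterial}\n\n` +
      `Write 5 quiz questions with answers, citing the relevant passage for each.`,
  });
  return text;
}
```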
However, LLMs will be LLMs: I've been testing it a lot and have already found multiple hallucinations. I built a reporting system, although only submitting reports works right now, not viewing reported questions.
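The report flow itself is just a row insert through supabase-js, roughly like this (table and column names are made up for illustration):

```ts
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_ANON_KEY!
);

// Hypothetical table/columns; the real schema may differ.
export async function reportQuestion(questionId: string, reason: string) {
  const { error } = await supabase
    .from("question_reports")
    .insert({ question_id: questionId, reason });
  if (error) throw error;
}
```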
That's my short-term plan for getting decent content quality, at least. I also want to move away from Vercel AI & Gemini to a LangChain agent setup, or maybe a graph, which should improve output quality.
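The rough idea for that graph is to split generation and verification into separate nodes, so a second pass checks the draft against the source before anything gets stored. A minimal LangGraph-style sketch of that shape (model choice, prompts and node names are illustrative, not the current code):

```ts
import { StateGraph, Annotation, START, END } from "@langchain/langgraph";
import { ChatMistralAI } from "@langchain/mistralai";

const llm = new ChatMistralAI({ model: "mistral-small-latest" }); // example model

const State = Annotation.Root({
  source: Annotation<string>(),
  draft: Annotation<string>(),
  review: Annotation<string>(),
});

const graph = new StateGraph(State)
  // Node 1: draft questions strictly from the source material.
  .addNode("generate", async (s) => {
    const res = await llm.invoke(
      `Write quiz questions only from this source:\n\n${s.source}`
    );
    return { draft: String(res.content) };
  })
  // Node 2: check the draft against the source and flag anything unsupported.
  .addNode("verify", async (s) => {
    const res = await llm.invoke(
      `Source:\n${s.source}\n\nDraft questions:\n${s.draft}\n\n` +
        `List any question that is not supported by the source.`
    );
    return { review: String(res.content) };
  })
  .addEdge(START, "generate")
  .addEdge("generate", "verify")
  .addEdge("verify", END)
  .compile();

// const result = await graph.invoke({ source: "…chapter text…" });
```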
Maybe in some parallel universe this really takes off and many people work on high-quality courses together…
In theory that's not an issue. I use Supabase, which you can self-host as well.
You can also self-host Mistral, but not Gemini. However, I'm planning to move away from Gemini towards a more open solution that would also support self-hosting, or in-browser AI.
Since I use Supabase, it runs on PgSQL and Supabase Storage, which is just an adapter on top of AWS S3 - or any S3-compatible storage, really. For auth I use Supabase Auth, which uses OAuth 2.0 - that's basically the same as OIDC, right?
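For context, the app talks to all of that through supabase-js, so swapping the hosted parts for self-hosted ones is mostly a URL and key change. Roughly (provider, bucket and paths are just examples):

```ts
import { createClient } from "@supabase/supabase-js";

// The same client works against hosted or self-hosted Supabase;
// only the URL and anon key change.
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_ANON_KEY!
);

// Auth: OAuth sign-in via Supabase Auth (provider is just an example).
await supabase.auth.signInWithOAuth({ provider: "github" });

// Storage: S3-backed bucket behind the Supabase Storage API
// (bucket and path are illustrative).
await supabase.storage
  .from("course-material")
  .upload("uploads/chapter-1.pdf", new Blob(["…"]));
```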