🔥 Get ready for the next era of documentation tooling...
We’re launching a set of new features, all designed to help you write better technical content faster and answer questions instantly for your users, developers, and team.
It's all made possible by LLMs — and we're using some of the best ones out there: OpenAI's GPT-3.5 hosted on Microsoft Azure for data safety and privacy, augmented by Llama 2 for extra reliability.
It's all opt-in, so you decide at your own pace when to activate it and use it on your organization's documentation.
→ Write better technical content with your team and... LLMs
Deep integration with our document editor lets us provide LLMs with the context of your current document (and related documents), so you can generate accurate content about your product, API, or process.
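For the curious, here's the general shape of context-aware generation — a hypothetical sketch, not Archbee's actual implementation (the function name, parameters, and sample documents are all illustrative):

```python
# Hypothetical sketch of context-aware generation: the current document and
# related documents are packed into the prompt, so the model writes from
# your actual content instead of guessing.

def build_prompt(instruction: str, current_doc: str, related_docs: list[str],
                 max_chars: int = 4000) -> str:
    """Assemble an LLM prompt from the editor's document context."""
    # Join the open document with its related documents, separated by rules,
    # and truncate so the context fits the model's input budget.
    context = "\n---\n".join([current_doc, *related_docs])[:max_chars]
    return (
        "You are a technical writer. Use ONLY the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Task: {instruction}"
    )

# Illustrative documents (not real Archbee content):
prompt = build_prompt(
    "Draft a 'Getting started' section.",
    current_doc="# Payments API\nOur API lets you charge cards...",
    related_docs=["# Authentication\nUse a bearer token..."],
)
```

The assembled prompt would then be sent to the hosted model; the key idea is that generation is grounded in documents you already wrote.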
→ Answer user questions instantly
Wouldn't it be cool if your documentation search gave your users an answer, instead of a list of documents they need to drill into to find the actual answer?
It would save a lot of time and frustration... and we've built it!
On top of that, you can analyze users' question-and-answer sessions and adjust your documentation to give the LLMs even better context — and your users even better answers.
→ Answer API & dev questions instantly
LLMs understand code, SDKs, and even OpenAPI spec files, so developers trying to integrate your product get their questions answered instantly instead of wandering through reference pages. Never lose developers on complex integrations again!
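Because OpenAPI specs are structured data, an assistant can index every operation and hand the relevant snippets to the LLM when a developer asks a question. A minimal sketch of that indexing step (the spec below is a made-up example, not a real API):

```python
# Hypothetical sketch: flatten an OpenAPI spec into one searchable line per
# operation, ready to be matched against a developer's question and fed to
# an LLM as context.
import json

SPEC = json.loads("""
{
  "openapi": "3.0.0",
  "paths": {
    "/charges": {
      "post": {"summary": "Create a charge"},
      "get":  {"summary": "List charges"}
    }
  }
}
""")

def index_endpoints(spec: dict) -> list[str]:
    """Turn each OpenAPI operation into a 'METHOD /path: summary' line."""
    lines = []
    for path, operations in spec.get("paths", {}).items():
        for method, op in operations.items():
            lines.append(f"{method.upper()} {path}: {op.get('summary', '')}")
    return lines

snippets = index_endpoints(SPEC)
```

A question like "how do I create a charge?" would then be matched against these snippets, and the winning ones passed to the model as context.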
→ Answer billing questions...
Questions like these usually end up in your customer support channels. Not anymore — your users can get instant answers straight from your documentation.
→ How do you activate it?
Because we know some data is sensitive, only admins can activate this feature (and they can exclude specific spaces from AI indexing).
Ask an admin in your workspace to activate Archbee LLM here 👇