
Celebrating Zeta Alpha's Five-Year Anniversary with Transformers at Work 2024 from the Bay Area and New Product Releases.

20 September, Amsterdam & San Francisco.


Dear Friends,

Last week marked our fifth anniversary as a company, and it still feels like we are only getting started. We are celebrating with a big splash: Transformers at Work 2024 is kicking off today in the San Francisco Bay Area, and we have a bucketful of new features, upgrades, and product releases for you!


The Zeta Alpha Crew at the fifth anniversary "Trends in AI" edition last week.

Transformers at Work 2024 in Berkeley, California.

We are proud to present a truly stellar lineup of speakers on the present and future of LLMs, RAG, and Agents for the Enterprise. We will also be launching the new RAG Agents SDK (more below) and announcing our recent Microsoft partnership, with MS 365 and Copilot integration.


You can still join us for the online edition, which we will be live streaming via Zoom and our usual social media channels. We kick off at 1pm PT (22:00 CEST). Can't make it live? Don't worry, we've got you covered: if you sign up, we'll let you know when the recordings become available. The only thing you will be missing is the party with live music from Huney Knuckles.


Introducing the Zeta Alpha RAG Agents SDK + Enhanced Chat Features

🔧 Brand new RAG Agents SDK

Bring your own LLM agents powered by the Zeta Alpha search engine!

We are now releasing our new, open-source RAG Agents SDK. This project is designed to provide a flexible, scalable, and efficient framework for building, testing, and deploying LLM agents that are seamlessly backed by our search API. The good thing is, you can keep using your favorite RAG frameworks, such as LangChain, and leave the retrieval to a solid, enterprise-ready foundational search layer like Zeta Alpha.
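
To give you a feel for the pattern before you open the docs, here is a minimal sketch of a custom LangChain retriever that delegates retrieval to an enterprise search endpoint; the endpoint URL, payload fields, and environment variable below are illustrative placeholders, not the actual RAG Agents SDK interface, so follow the quickstart guide for the real thing.

```python
# Minimal sketch: a custom LangChain retriever that delegates retrieval to a
# hosted search API, so the rest of your RAG chain or agent stays unchanged.
# NOTE: endpoint, payload fields, and env var are illustrative assumptions,
# not the actual RAG Agents SDK interface.
import os
from typing import List

import requests
from langchain_core.callbacks import CallbackManagerForRetrieverRun
from langchain_core.documents import Document
from langchain_core.retrievers import BaseRetriever


class EnterpriseSearchRetriever(BaseRetriever):
    """Fetches passages from a hosted search API (hypothetical endpoint)."""

    endpoint: str = "https://api.example.com/v1/search"  # placeholder URL
    top_k: int = 5

    def _get_relevant_documents(
        self, query: str, *, run_manager: CallbackManagerForRetrieverRun
    ) -> List[Document]:
        resp = requests.get(
            self.endpoint,
            params={"query": query, "size": self.top_k},
            headers={"Authorization": f"Bearer {os.environ['SEARCH_API_KEY']}"},
            timeout=30,
        )
        resp.raise_for_status()
        hits = resp.json().get("hits", [])
        return [
            Document(page_content=h.get("text", ""), metadata={"id": h.get("id")})
            for h in hits
        ]


# Any LangChain chain or agent can now use this retriever as its search layer.
retriever = EnterpriseSearchRetriever()
for doc in retriever.invoke("What changed in the latest Search API release?"):
    print(doc.metadata["id"], doc.page_content[:80])
```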


Follow our quickstart guide to implement your first customized RAG Agent, and reach out to us to deploy your agent to your own Zeta Alpha tenant.

🤝 Microsoft partnership, MS 365 and Copilot plugin

Find our solution on the Azure Marketplace, as we are now officially partnered with Microsoft to deliver Generative AI use cases powered by fine-tuned, domain-specific search.


Users can now interact with the Zeta Alpha LLM agents and search platform from MS Teams via the MS 365 and Copilot plugins, bringing our solution into the Microsoft ecosystem.

🚀 Ingestion API improvements

We can now use agentic LLM processors in our ingestion pipeline to extract tenant-specific metadata that further enhances your search experience.
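
To make the idea concrete, here is a rough sketch of what an LLM-based metadata extraction step can look like during ingestion; the prompt, field schema, and model name are illustrative assumptions, not the actual configuration of our ingestion processors.

```python
# Rough sketch of an LLM-based metadata extractor in an ingestion pipeline.
# The field schema, prompt, and model name are illustrative placeholders.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

EXTRACTION_PROMPT = (
    "Extract the following metadata from the document as JSON with keys "
    "'department', 'product', and 'document_type'. Use null when unknown.\n\n"
    "Document:\n{text}"
)

def extract_metadata(text: str) -> dict:
    """Ask an LLM for tenant-specific metadata to attach before indexing."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any capable chat model works here
        response_format={"type": "json_object"},
        messages=[{"role": "user", "content": EXTRACTION_PROMPT.format(text=text[:8000])}],
    )
    return json.loads(response.choices[0].message.content)

doc = {"id": "doc-42", "text": "Quarterly maintenance report for the valve terminal line..."}
doc["metadata"] = extract_metadata(doc["text"])  # attached to the document at index time
print(doc["metadata"])
```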

🔍 Search API improvements

We keep enhancing the customizability of our Search API (a rough request sketch follows the list) with:

  • custom index schemas, also allowing for nested documents, so you can keep using the structured data of your domain.

  • facet filters and counts, supporting different aggregations on various field types, including date ranges.

  • tenant-specific boosting configurations, tuned to improve search quality.
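
As an illustration of how these pieces fit together, a faceted request with a date-range filter might look roughly like the sketch below; the endpoint, field names, and response shape are placeholders rather than the real Search API schema, so check the documentation site for the actual format.

```python
# Illustrative only: a faceted search request with a date-range filter.
# Endpoint, field names, and response shape are placeholders, not the real schema.
import os
import requests

payload = {
    "query": "predictive maintenance",
    "filters": {
        "document_type": ["report", "manual"],                      # facet filter on a keyword field
        "created_at": {"gte": "2024-01-01", "lte": "2024-09-20"},   # date-range filter
    },
    "facets": ["department", "document_type"],                      # request facet counts
    "size": 10,
}

resp = requests.post(
    "https://api.example.com/v1/search",                            # placeholder URL
    json=payload,
    headers={"Authorization": f"Bearer {os.environ['SEARCH_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
body = resp.json()

for hit in body.get("hits", []):
    print(hit.get("title"), hit.get("score"))
for facet, counts in body.get("facets", {}).items():
    print(facet, counts)  # e.g. {"engineering": 12, "sales": 3}
```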


📘 New documentation site

If you are searching for either a quickstart guide or a deep dive into the Zeta Alpha ingestion, retrieval, and chat APIs, head over to our brand new documentation page.


🧠 RAGElo improvements

RAGElo is our open-source, Elo-tournament-style LLM-as-a-judge RAG evaluation toolkit, and we have updated it so that:

  • it now goes beyond Q&A evaluation, allowing you to evaluate RAG chatbots on longer conversations.

  • the LLM-as-a-judge system can also run on local models deployed via Ollama (see the sketch below for the general mechanism).
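
We won't reproduce RAGElo's own configuration here, but the general mechanism is simple: any OpenAI-compatible client, which most LLM-as-a-judge tooling builds on, can point at Ollama's local OpenAI-compatible endpoint instead of a hosted API. A minimal sketch, where the model name is just an example of something you have pulled locally:

```python
# Sketch of the general mechanism: an OpenAI-compatible client targeting a
# local Ollama server. This is not RAGElo's own configuration interface.
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API at this address by default;
# the api_key value is ignored but must be non-empty.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

judgement = client.chat.completions.create(
    model="llama3.1",  # any model previously fetched with `ollama pull`
    messages=[
        {"role": "system", "content": "You are a strict judge of answer quality."},
        {"role": "user", "content": "Question: ...\nAnswer A: ...\nAnswer B: ...\nWhich is better and why?"},
    ],
)
print(judgement.choices[0].message.content)
```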


🌐 Open Zeta-Alpha-E5-Mistral model

We are always pushing the quality of our neural search capabilities. Now, we have released Zeta-Alpha-E5-Mistral, our first public and open embedding model. Based on Mistral-7B, an open LLM, Zeta-Alpha-E5-Mistral is one of the best open models available, at the moment of release ranking 8th on the competitive MTEB leaderboard for Retrieval tasks and 11th overall. Going one step further in our commitment to openness, our research team also made the entire training recipe available, making it not only one of the most capable but also one of the most transparent open embedding models available.
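
Assuming the checkpoint is published on the Hugging Face Hub with sentence-transformers support (the repo id below is indicative, so check the model card for the exact id and usage notes), a minimal retrieval-style usage looks roughly like this; E5-Mistral-style models typically expect an instruction prefix on queries only, and a 7B-parameter encoder is best run on a GPU.

```python
# Minimal sketch using sentence-transformers; the Hub repo id and the query
# instruction format are assumptions -- check the model card for specifics.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("zeta-alpha-ai/Zeta-Alpha-E5-Mistral")  # assumed repo id

# Instruction prefix on the query side, plain text on the passage side.
query = "Instruct: Retrieve passages relevant to the question.\nQuery: What is dense retrieval?"
passages = [
    "Dense retrieval encodes queries and documents into vectors and ranks by similarity.",
    "BM25 is a classic sparse lexical ranking function.",
]

q_emb = model.encode(query, normalize_embeddings=True)
p_emb = model.encode(passages, normalize_embeddings=True)
print(util.cos_sim(q_emb, p_emb))  # higher score = more relevant passage
```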

🗨️💬 User interface improvements to the conversational AI experience in Zeta Alpha

  • Chat with Search Results

    After an initial answer based on your search results, you can expand the chat in a sidebar on the screen to interact further with the insights from indexed documents and notes, making it easier to dive deep into the information you need. You can do this with your own private documents that only you have access to, the documents shared within your organization's teams, all of the documents in your tenant, or external research and web content through our Federated Search.

  • Chat in the context of My Documents

    Simplify your workflow with chat specific to your private documents. You can now engage with a collection of your private documents directly, obtaining insights and summaries without leaving your My Documents page.

  • Dynamic Context: Stay informed with relevant document content automatically loaded into your conversation when needed by a new agent type.

  • Real-Time Streaming: Experience faster interaction with real-time streaming of chat messages, making your conversations more dynamic and responsive.

Upcoming events

Next week you can meet us at OpenSearchCon San Francisco (24-26 September) and Festo IO in Esslingen. And on 4 October you can catch the next edition of Trends in AI.

Enjoy Discovery!

-- Jakub & The Zeta Alpha Crew



