Nvidia has introduced Chat with RTX, its own AI-powered chatbot that, unlike ChatGPT and its ilk, runs locally on the user's PC.
The chatbot is built on TensorRT-LLM and Retrieval-Augmented Generation (RAG) and requires Tensor Cores to function. Oddly, GeForce RTX 20-series graphics cards are not supported, so the novelty is available only to owners of RTX 30- and RTX 40-series adapters. The card must also have at least 8 GB of video memory, which rules out some mobile models and, among desktop cards, seemingly only the recent RTX 3050 6GB. In addition, at least 16 GB of RAM and Windows 11 are required.
Importantly, Chat with RTX is not just another chatbot, even a local one. It can be given access to various files on the PC (.txt, .pdf, .doc, .docx, .xml), it can analyze YouTube videos, and it can be connected to the Mistral and Llama 2 large language models.