At the heart of ChatGPT’s capabilities are large language models (LLMs), and more specifically the so-called “generative” LLMs. LLMs are not new, but their scope, capabilities and features are evolving by leaps and bounds. They may even revolutionize search, albeit in a somewhat roundabout and surprising way.
A look behind the scenes of artificial intelligence
Are ChatGPT and generative artificial intelligence doomed to kill search engines, starting with Google? This is the question many asked themselves once ChatGPT was made available to the general public. It is true that ChatGPT resembles a search engine: you interact with it through successive questions formulated in natural language, and it answers in confident, well-written prose, for the most part with remarkable, even impressive, precision. But unlike a search engine, ChatGPT does not fetch content, because it has no real memory of its training data. Its role is to produce a reflection of the knowledge contained in the subset of the web it was trained on. Surprising as it may seem, ChatGPT finds nothing: it generates. Given this observation, can a generative LLM really replace or supplement search engines? Can a summary generated by a giant neural network replace a search engine, whether for research on the internet or for information retrieval within a company?
A testing ground for AI and LLMs
Let us accept from the outset that the role of a search engine is to accurately locate information, whether it is published online or held in an internal corporate context, while backing it up with very real links to the sources that contain it. These explicit links are the guarantor of the trust placed in the search engine, because the reader is then free to decide whether the information can be considered correct, according to the cited source. This traceability is essential. However clever it may be, this denies the generative LLM its supposed ability to act as a search engine. What a generative LLM-based model accomplishes above all is writing text, say a synthesis, which likely constitutes not a replacement for, but an evolution of, search as we know it. From very large LLMs such as the one behind ChatGPT to LLMs specialized in specific tasks (such as machine translation or text summarization in certain domains), LLMs will genuinely transform and personalize pre-existing search engines as well. The future will show how engines like Google or Bing offer new user experiences while maintaining their search-engine role as defined above. In business, information retrieval will also be a real testing ground for AI and LLMs. Indeed, unlike the web, business applications have higher requirements regarding the accuracy of the information provided and the protection of non-public information. The adoption of generative AI in enterprise information retrieval will require specific approaches to address these specific challenges. Here are some of the themes that will shape this area this year:
- LLMs improve the user experience
Lists of search results have never been particularly user-friendly, even though earlier technologies already made it possible to extract the essentials in the form of visualization or navigation tools. Last year, when the first companies began integrating their first LLMs into their enterprise search foundations, the technology took a huge leap forward. As better LLMs became available and existing LLMs were specialized to perform certain tasks, enterprise search engines began to reveal the potential for new functionality. Today, and even more so in the future, generative LLMs offer new ways of presenting relevant content, because they deliver more digestible results while providing the user with a real level of reading comfort.
- Enterprise search engines help combat the loss of business knowledge
In a recent survey of 1,000 IT managers at large companies, 67% expressed concern about the loss of business knowledge and experience, particularly due to staff departures. For a Fortune 500 company with 4,000 employees, improving search and information retrieval solutions can save around 2 million dollars a month, based on lost productivity alone. An intelligent enterprise search solution helps prevent this loss. An advanced enterprise search platform connects every employee, at scale, with all the knowledge and experience they are entitled to access in the course of their duties. Massively connecting disparate silos of information makes that information easier to discover, in favor of immediate gains in innovation and productivity.
- Tackling application sprawl and digital friction
Today, employees are overwhelmed by the number and complexity of IT tools. According to a recent Forrester study, large organizations use an average of 367 different applications, creating data silos and disrupting the flow of exchanges between teams. As a result, employees spend more than a quarter of their time searching for information instead of concentrating on their own work. Intelligent search saves organizations time while standardizing the employee experience and access to knowledge. Add to this that easily measurable gains in productivity and efficiency are often complemented by benefits of a very different nature, such as better risk management, customer loyalty, improved decision-making processes, and helping businesses comply with increasingly complex regulations. We then speak of very high value-added business applications built on search engines, or Search-Based Applications (SBA).
- Search becomes more relevant
Despite the efforts of the designers of enterprise search tools, a third of employees still say they “never find” the information they’re looking for. It is customary to blame the lack of intelligent engines, or to hide behind the incredible complexity of the digital environment in which companies operate. Beyond that, and to address this issue, LLM technology is very promising. These models, based on large-scale neural networks, make it possible to incorporate context and thereby increase relevance compared to previous generations of tools. Even better, combining semantic search approaches (vector-based, using so-called embeddings) with statistical and linguistic keyword search capabilities provides unparalleled relevance across a wide range of usage scenarios. Search has become “neural” and is enjoying its most significant improvement in relevance in decades, as modern benchmarks show.
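As a rough illustration of the hybrid approach described above, the sketch below blends a keyword-match score with a vector (embedding) similarity score. Everything here is made up for the example: the toy documents, the hand-written embeddings, and the `alpha` weighting; a real engine would use BM25 for the keyword side and a neural encoder for the embeddings.

```python
import math

# Toy corpus: in a real engine these would be indexed enterprise documents.
DOCS = {
    "doc1": "quarterly sales report for the retail division",
    "doc2": "employee onboarding and training handbook",
    "doc3": "sales forecast and revenue projections",
}

# Hypothetical pre-computed embeddings (real systems use a neural encoder).
EMBEDDINGS = {
    "doc1": [0.9, 0.1, 0.2],
    "doc2": [0.1, 0.9, 0.1],
    "doc3": [0.8, 0.2, 0.3],
}

def keyword_score(query, text):
    """Fraction of query terms present in the document (a stand-in for BM25)."""
    terms = query.lower().split()
    words = set(text.lower().split())
    return sum(t in words for t in terms) / len(terms)

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def hybrid_search(query, query_embedding, alpha=0.5):
    """Blend keyword and semantic relevance; alpha weights the semantic part."""
    results = []
    for doc_id, text in DOCS.items():
        kw = keyword_score(query, text)
        sem = cosine(query_embedding, EMBEDDINGS[doc_id])
        results.append((doc_id, alpha * sem + (1 - alpha) * kw))
    return sorted(results, key=lambda r: r[1], reverse=True)

ranked = hybrid_search("sales report", [0.85, 0.15, 0.25])
print(ranked[0][0])  # → doc1
```

The design point is simply that the two signals are complementary: the keyword score rewards exact wording, while the embedding similarity rewards meaning, so near-synonymous documents still rank well.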
- Search becomes a conversation
Have you ever wished your company had a search engine that works like Google? Remember that language models like BERT and its derivatives are just one of the steps that have improved the user experience of internet search over time. Answers become accurate, extracted, and provided directly. We are far from the mere lists of answers filtered by facets and advanced search criteria. Their contribution to the enterprise is now a reality. Moreover, the conversational characteristics of large generative LLMs now provide greater convenience of use, giving the user a glimpse of the possibility of satisfying even the most unusual requests.
Conversational interfaces applied to the business world are still in their infancy, but the technology is developing rapidly. Increasingly, the adoption of various AI technologies makes it possible to respond accurately to the questions posed, combining the expressive quality of ChatGPT with the traceability, data security and business knowledge of the best enterprise search engines.
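Combining generative answers with traceability typically follows a retrieval-augmented pattern: retrieve passages from the indexed (and access-controlled) corpus first, generate an answer from them second, and keep the source links attached to the answer. The sketch below is a deliberately naive illustration of that flow; the corpus, the `retrieve` helper and the placeholder `generate_answer` function are all hypothetical, with the real LLM call stubbed out.

```python
# Minimal retrieval-augmented sketch: retrieve, then generate, then cite.

CORPUS = {
    "hr/leave-policy.txt": "Employees accrue 2.5 days of paid leave per month.",
    "it/vpn-guide.txt": "Connect to the VPN before accessing internal tools.",
}

def retrieve(query, k=1):
    """Naive keyword overlap; a real engine also applies security filtering."""
    scored = []
    for source, text in CORPUS.items():
        overlap = len(set(query.lower().split()) & set(text.lower().split()))
        scored.append((overlap, source, text))
    scored.sort(reverse=True)
    return scored[:k]

def generate_answer(query, passages):
    """Placeholder for the LLM call; here we simply echo the best passage."""
    return passages[0][2]

def answer_with_sources(query):
    """Return a generated answer together with the documents that back it."""
    passages = retrieve(query)
    answer = generate_answer(query, passages)
    sources = [source for _, source, _ in passages]
    return {"answer": answer, "sources": sources}

result = answer_with_sources("how many days of paid leave")
print(result["sources"])  # → ['hr/leave-policy.txt']
```

Because generation only sees retrieved passages, every answer can carry the “very real links” to its sources that the article identifies as the basis of trust in a search engine.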
Innovation in knowledge access means giving employees the ability to interact with every silo of company content, in a secure manner, while connecting those silos to each other. Thanks to advances in artificial intelligence, enterprise search solutions are entering a whole new era of convenience and accuracy. The world of intelligent search engines and the world of text-generating models are now coming together, like two distant galaxies that may finally form one. Will it be a matter of deploying a reasonably sized, specialized LLM within an enterprise search engine, or of calling on online LLM services that compete in gigantism and performance? And why not both? The future will tell, but solutions are already on the market and already deserve very special attention.
Op-ed by Luke Manigot, Vice President, Center of Excellence at Cinqua