Quora this week launched Poe, a platform that lets users ask questions, get instant answers, and hold back-and-forth conversations with AI chatbots, signaling the company's interest in text-generating AI technologies like ChatGPT.
A Quora spokesperson told TechCrunch via text message that Poe, which stands for "Platform for Open Exploration," is an invite-only, iOS-only app that is "designed to be a site where folks can easily interact with a range of different AI agents."
Poe, then, isn't an attempt to build an AI model from scratch comparable to ChatGPT. ChatGPT is adept at answering questions on topics ranging from poetry to code, but it has drawn criticism because it can occasionally give answers that sound persuasive yet are factually incorrect. Earlier this month, the coding-focused Q&A site Stack Overflow temporarily banned content created by ChatGPT, on the grounds that the AI made it too easy for users to submit answers and flooded the site with dubious ones.
Quora might have run into trouble, for instance, if it had trained a chatbot on the vast collection of user-submitted questions and answers on its website. Because some AI systems have been shown to regurgitate specific pieces of the material they were trained on (e.g., code), users might have objected to their contributions being used that way. Some creators have taken issue with generative art systems like Stable Diffusion and DALL-E 2, as well as code-generating systems like GitHub's Copilot, which they claim plagiarize and profit from their work.
For instance, a class action lawsuit alleges that Microsoft, GitHub, and OpenAI violated copyright law by allowing Copilot to reproduce sections of licensed code without proper attribution. And after the online art community ArtStation began allowing AI-generated art earlier this year, users protested loudly by uploading "No AI Art" images to their portfolios.
At launch, Poe gives users access to a variety of text-generating AI models, including ChatGPT. Users chat privately with each model in an interface that resembles a text messaging app. Within the chat interface, Poe offers a range of conversation starters and use cases, such as "writing help," "cooking," "problem solving," and "nature."
Poe launches with only a handful of models, but Quora hopes to soon offer a way for model providers, such as companies, to submit their models for inclusion.
AI chatbots like ChatGPT are known to generate offensive, biased, or otherwise harmful content, as well as malicious code. Rather than taking steps to moderate that content itself, Quora is relying on the companies that supply Poe's models to manage and restrict it.
"The model providers have put a lot of work into preventing the bots from giving harmful responses," the spokesperson said.
The spokesperson was clear that Poe is not currently part of Quora, nor will it necessarily become part of it. Quora sees it as a separate, standalone project, much like Google's AI Test Kitchen, that it aims to iterate on and improve over time.
Asked about Poe's business objectives, the spokesperson declined to comment, saying it is still early days. But it isn't hard to imagine how Quora, which makes most of its money from advertising and paywalls, might add paid features to Poe if it grows.
For the time being, Quora says its focus is on working out scalability, gathering feedback from beta testers, and fixing any problems that arise.
"The space is evolving pretty quickly right now," the spokesperson said, "and we're most interested in figuring out what problems we can solve for people with Poe."