Snap Introduces AI Tools for a New Era of Augmented Reality

Snap introduces AI tools for Next-Gen AR technology

Supraja

On Tuesday, Snap released the latest version of its generative AI technology, which lets users see more realistic special effects when filming themselves with their phone cameras, as the company works to maintain its competitive edge over rival social media platforms.

Snap has been at the forefront of introducing the market to augmented reality, or AR, a technology that overlays computer-generated imagery on a camera's view of the real world. Snap is pinning its hopes on newer, more imaginative opt-in features called lenses to draw more users, and more advertisers, to Snapchat.

With the new AI tools, AR developers will be able to build smarter lenses, and Snapchat users will be able to incorporate those lenses into their content, according to the company's announcement.

Snapchat’s parent company, Snap Inc., headquartered in Santa Monica, California, unveiled an enhanced version of its developer program, Lens Studio.

Introducing the new AI tools, Snap CTO Bobby Murphy said the upgraded Lens Studio will cut the time needed to create AR effects from a few weeks to a few hours and allow developers to produce more complex work.

“What’s fun for us is that these tools both stretch the creative space in which people can work, but they’re also easy to use, so newcomers can build something unique very quickly,” Murphy said in an interview.

Lens Studio now includes a suite of generative AI tools, among them an AI assistant that can answer developers’ questions when they need help. Another tool lets artists enter a textual description of an object and produces a 3D model they can use in an AR lens right away, without having to draft the model from scratch.

Earlier versions of the AR technology could only produce simple effects, such as placing a hat on a person’s head in a video. The new tools will let AR developers at Snap design far more convincing lenses, for instance, having the hat move with the wearer’s head and match the lighting of the video, Murphy said.

Snap also plans future use cases that go beyond simple facial mapping to mapping the entire body, so a lens could, for example, generate a full outfit in a particular style, something that is difficult to do today, Murphy said.
