Facebook Debuted Its Overhauled AR Studio for Facebook Camera at Its F8 Conference

Visual programming capabilities and Sketchfab integration were among the highlights

Examples of creations via Facebook AR Studio (image: Facebook)

Facebook introduced a revamped version of AR Studio for Facebook Camera at its annual F8 developer conference in San Jose, Calif., on Tuesday.

The social network debuted AR Studio at F8 2017 last April with the goal of accelerating its augmented reality efforts.

Engineering director Ficus Kirkpatrick detailed the changes to AR Studio in a blog post.

Visual programming capabilities were added to AR Studio, enabling creators to drag and drop elements such as custom animations, interactions and logos into their scenes without the need for coding in JavaScript.

Facebook also revealed a partnership with Sketchfab, saying that once its integration is complete, creators will be able to search a library of downloadable 3-D models directly within AR Studio and add those models to their projects. Kirkpatrick added that “in the near future,” creators will be able to add their own custom libraries to AR Studio via a simple JavaScript library application-programming interface.

The social network’s camera effects are also expanding beyond its flagship mobile application to Instagram, Messenger and Facebook Lite. Kirkpatrick wrote that the experience is launching in closed beta on Instagram and Messenger, with Facebook Lite to follow.

Kirkpatrick also detailed the other new tools being added to AR Studio:

  • Tracking: Movements can be followed and tied into experiences via AR Target Tracker, body tracking, hand tracking and high-fidelity face tracking.
  • Patch Editor: No code whatsoever will be required for actions such as controlling audio, manipulating materials, adding interactions and creating shaders.
  • Free-to-use assets: Creators can add ready-made sound files from Facebook’s free library and, soon, 3-D models via the Sketchfab partnership detailed above.
  • Background segmentation: People can be separated from their backgrounds, and experiences can be created to transport people to different places.
  • Location AR: AR effects can be tied to real-world locations, and predetermined experiences can be made available when users are in those locations.
  • Semantic scene understanding: Creators can build experiences that are “contextually aware,” with Kirkpatrick offering the example of heat waves rising up when a coffee cup is recognized within a scene.
  • Analytics for AR effects published by Facebook pages: Page administrators will soon be able to track impressions, captures and shares.
  • 3-D posts in Camera: Users will be able to take 3-D posts from News Feed and experience them in their AR worlds via Facebook Camera on their mobile devices.