How To Prototype AR Experiences In One Hour Using Reality Composer

For the past year at Augmentop, we have built dozens of Augmented Reality prototypes for clients and for fun, and each one typically took a week or two to finish: laying out 3D scenes, creating user interactions, and adding basic animations to 3D objects.

But during this year’s WWDC, Apple announced a new tool that completely changed the way we prototype AR experiences, and that we believe will radically change the way AR creators work.

And that’s Reality Composer.

We have been testing the tool for the past month, and we found that what normally required a week or more of prototyping work ended up taking less than a day to finish and test!

The following prototype was created in less than 1 hour in Reality Composer, including user interaction and animations, and without writing any code!

An augmented reality prototype for book covers created in 1 hour in Reality Composer

Bringing a familiar face to Augmented Reality

In a nutshell, Reality Composer is like Apple Keynote for Augmented Reality.

With Reality Composer, Apple is enabling anyone with no experience in 3D design or coding to create interactive, animated AR experiences in a few hours.

And just as Apple Keynote is used to prototype web and mobile apps, Reality Composer will help product designers who want to get into AR, product managers who need to write requirements and specs for AR apps, and entrepreneurs who want to prototype new AR ideas without having to hire outside help to do it.

The tool brings the familiarity, simplicity and ease of use of Keynote into Augmented Reality, enabling you to focus on the basic tasks of prototyping without getting lost in the details of modeling, animation, and writing code.


Reality Composer Interface



Apple Keynote Interface


At a high level, prototyping AR apps in Reality Composer is very similar to prototyping mobile apps in Keynote: create scenes (instead of slides), place shapes, add interactions, and then create animations.


In addition to the macOS version, Reality Composer comes in an iOS version, which makes it convenient to test AR experiences directly on the device.

How to prototype an AR experience in Reality Composer

Here is a step-by-step guide to our augmented reality prototyping process, which can be used to prototype most AR experiences in a couple of hours.

Keep in mind that Reality Composer is currently in Beta, which means it’s still slow, and some features might change by the time the final version is released.

This tutorial is for the macOS version, as we haven’t received our invite to the iOS version yet, but from what we’ve seen online, the differences between the two versions are minimal.

We also included a couple of feature requests for Apple that we believe would make the tool more awesome for us and other AR creators.

Want to watch our detailed step-by-step AR prototyping process? We made a 1-hour video for you to follow along.


Step 1: Sketching

Every good design starts on paper, and AR is no exception.

Before jumping into Reality Composer, create a couple of sketches where you capture the high-level layout and design of your experience, storyboard the user interactions, and annotate animations.

For instance, here are the sketches we used for our book cover prototype:


Sketch of AR UI

Useful hack: Even though paper is an excellent medium to sketch your 3D/AR experiences, we found that sketching some AR experiences in VR using TiltBrush and Google Blocks was more fun, more productive, and more accurate. If you have a VR headset, we recommend using it to sketch AR experiences in 3D space.

Step 2: Scene Layouts

Once you have your sketches and storyboards, you need to transform them into basic 3D scenes and layouts.

Create a new project in Reality Composer, and select the type of anchor you want the scene attached to (horizontal surface, vertical surface, image, face or scanned 3D object). Each scene in your Reality Composer project can be attached to a different anchor, but you should generally keep it consistent across all scenes. You may use different anchor images for different scenes to show how the prototype works for various examples.

For our prototype, we created a scene with an image anchor, since the 3D UI will be attached to a book cover, which we added as an asset as shown below.


Anchor Image in Reality Composer
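
Reality Composer handles anchoring for you, but if you are curious how the same image anchor looks once a project moves into code, here is a minimal RealityKit sketch. The resource group and image names ("AR Resources", "BookCover") and the text label are placeholders, not the actual names from our project:

```swift
import RealityKit

// A rough code-side equivalent of an image anchor (placeholder names, for illustration).
// "AR Resources" is an AR Resource Group in Xcode and "BookCover" the reference image in it.
func addBookCoverUI(to arView: ARView) {
    let bookAnchor = AnchorEntity(.image(group: "AR Resources", name: "BookCover"))

    // Anything added as a child appears once the book cover is detected.
    let label = ModelEntity(mesh: .generateText("Reviews",
                                                extrusionDepth: 0.002,
                                                font: .systemFont(ofSize: 0.05)))
    bookAnchor.addChild(label)
    arView.scene.anchors.append(bookAnchor)
}
```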

If you already have 3D assets that your artists have created, do not start by importing those assets into Reality Composer. Instead, start by creating basic shapes to define where everything will be in the scene and their sizes relative to each other, then use the “replace” menu command to swap those basic shapes for the high-fidelity models.

While it’s possible to create your entire AR experience in a single scene, and hide/show different shapes using behaviors, we found it a lot more manageable to create a different scene for each “state” of the app, and to transition between those scenes using behaviors, as shown in the next step.

For our AR prototype, we have an empty startup scene, a main scene showing the initial UI with the prices, reviews and button, and a third scene for the shopping cart.

Once you have the basic layout of all your scenes, you can start modifying colors and materials, and importing your own 3D models to replace existing “grayscale” versions with new models.

We highly recommend naming all your objects by selecting them in the scene and editing their names in the Properties panel. This makes selecting them faster from the right-click menu when the scene gets more complex and crowded.

You can also group 3D shapes together using CMD+G, which makes it easier to move them and animate them together.

We also found that creating 2D assets in Sketch or Figma, exporting them as PNGs with transparent backgrounds, and then dragging-and-dropping them into Reality Composer, saves a lot of time that would be otherwise required to create their 3D counterparts.


First scene in Reality Composer


Second scene in Reality Composer

Useful hack: You can actually prototype an entire AR experience in Reality Composer by dragging in a bunch of PNGs from the web, or by creating them in Photoshop or Sketch, importing them into Reality Composer, adjusting their location, size and orientation, and adding interactivity and animations to simulate the full experience. This is similar to creating 2D cutouts of 3D models to pre-visualize a game or a movie. Once everything looks good, replace those images with 3D assets, and fine-tune them.

Feature requests:

In case anyone from Apple is reading this, here are some features we would love to see in RC that would make creating 3D scenes much easier:

  • List of objects and groups in a side panel (to select, hide/show, lock/unlock, etc.)
  • Rotate view to X, Y, Z perspectives with a single click
  • Lasso select 3D objects
  • Reusable symbols with corresponding actions (similar to prefabs in Unity)
  • Master scenes (similar to master slides in Keynote)
  • Copy/Paste object properties
  • Distribute objects along one or more axes
  • Particle generator
  • Grid layout generator across X, Y and Z axes

Step 3: Adding Interactivity

In Reality Composer, adding interactivity to AR experiences is accomplished through behaviors.

Each behavior consists of a trigger (when to execute the behavior) and a sequence of actions (what to do when the trigger fires). For our prototype, the triggers we used most were Scene Start and Tap, and the action we used most was Change Scene.

To show the Behaviors panel, click the Behaviors button on the right side of the toolbar.

We use Scene Start triggers to hide some shapes once the scene is displayed, or to animate them to their initial positions. We use Tap triggers to transition from one scene to another using the Change Scene action. This is similar to creating hyperlinks in Keynote to transition from one UI screen to another.
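
For comparison, here is roughly what a Tap trigger corresponds to if you wire it up yourself in RealityKit code instead of using behaviors. The entity name "addToCartButton" is a placeholder, and the tapped entity needs collision shapes to be hit-testable:

```swift
import RealityKit
import UIKit

// Rough code-side equivalent of a Tap trigger (placeholder names, for comparison only).
final class TapHandler: NSObject {
    let arView: ARView

    init(arView: ARView) {
        self.arView = arView
        super.init()
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: arView)
        // entity(at:) only hits entities with a CollisionComponent,
        // e.g. created via entity.generateCollisionShapes(recursive: true).
        if let tapped = arView.entity(at: point), tapped.name == "addToCartButton" {
            print("Add to Cart tapped")  // e.g. swap to the shopping cart scene
        }
    }
}
```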

Here are the behaviors that we created for our prototype.


Transition behavior from main scene to shopping cart when tapping the Add to Cart button


Transition behavior from cart back to main scene when tapping the Return to Book button

Useful hack: When previewing an AR project in AR Quick Look, you often see a ghost version of the scene before the anchor is detected, and we didn’t particularly like that experience. To avoid showing that ghost 3D model before the anchor is detected, we create a blank startup scene, and add a Scene Start trigger to transition to the main scene, which is executed once the anchor is detected.

Use a blank startup scene to hide your AR experience from the screen until the image is detected in real life

Feature requests:

  • Highlight behaviors for a selected object
  • Animate object color
  • More interaction triggers like swipe, pinch, spread, etc.


Step 4: Creating Animations

You can prototype most AR experiences without needing to add any animations.

However, some experiences, like games and simulations, require animation in order to be tested with users. Other experiences can benefit from animations to keep users in context while transitioning from one scene to another.

In Reality Composer, you can add one or more of the following animations to your scene:

  • Show/Hide one or more shapes
  • Spin a shape
  • Orbit a shape around another
  • Move/Scale/Rotate a shape
  • Emphasize a shape
  • Make a shape constantly face the camera
  • Play an existing animation on an imported model
  • Play sounds

Animations are added the same way interactions are: by creating behaviors, triggers and action sequences in the Behaviors panel.

Action sequences for animations are executed left to right, so each action will wait for the previous one to finish before it starts. If you want an action to execute before another, drag and drop it before that one in the same sequence. And if you want two actions to execute simultaneously, drag one on top of the other, creating a group of actions.
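
For comparison, the code-side counterpart of a simple Move action is a one-liner in RealityKit. The "priceLabel" entity is a placeholder; this sketch slides it 10 cm up over half a second:

```swift
import RealityKit

// Rough code-side equivalent of a Move action (placeholder entity name).
func slideUp(_ priceLabel: Entity) {
    var target = priceLabel.transform
    target.translation.y += 0.1  // 10 cm up, in the entity's parent space
    priceLabel.move(to: target, relativeTo: priceLabel.parent, duration: 0.5)
}
```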

For our sample prototype, we have a Scene Start behavior, which executes once the anchor image is detected, and animates each shape from its initial position to its final one, as shown below.


Behaviors and triggers for the first scene in Reality Composer

If you need to add two animations on Scene Start, one that repeats forever (e.g. spin an Earth globe object), and another that happens only once (e.g. pop up labels), you can separate these two action sequences into two different behaviors with Scene Start triggers.


Behaviors and triggers for second scene

Useful Hack: You cannot hide your shapes below the anchor image to make it look like they are sliding out from underneath it, because the real-world image does not occlude virtual objects. To accomplish that effect, create a duplicate of the anchor image and place it right on top of the anchor image in your scene, with a Y position of 0.1 cm. This allows it to cover the other shapes below the anchor image. Make sure you set the Y position of those shapes to less than 0.1 cm so that the newly added image shows on top of them.

An overlay image on top of tracking image to hide UI beneath it

Feature requests:

  • A Keynote-like Magic Move: duplicate a scene, move, scale and rotate objects in the second scene, and Reality Composer automagically animates the transition between both scenes. This would save a lot of time when creating AR interfaces.
  • Occlusion material to hide shapes behind other transparent shapes (see the code sketch below)
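
As it happens, RealityKit already ships an occlusion material you can use once a project moves into code. Here is a minimal sketch of that code-side workaround; the plane size and Y offset are placeholders that mirror our anchor-image overlay hack:

```swift
import RealityKit

// Code-side workaround for the occlusion request: OcclusionMaterial hides whatever
// renders behind it and shows the camera feed instead.
let occluder = ModelEntity(mesh: .generatePlane(width: 0.2, depth: 0.3),
                           materials: [OcclusionMaterial()])
occluder.position.y = 0.001  // sit just above the tracked image, like the PNG overlay hack
```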

Step 5: Exporting and Testing

If you’re working with the iOS version of Reality Composer, all you need to do is hit the Play button on the toolbar and start testing the prototype instantly on your device.

And if you are using the macOS version of the app, you can play your experience on the screen to test it and tweak it until it feels right, and then export it to Xcode to build it and deploy it to your device, or embed it in an HTML page to preview it using AR Quick Look in iOS Safari.
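
If you go the Xcode route, getting an exported .reality file on screen only takes a few lines. A minimal sketch, assuming the export is called BookCover.reality (a placeholder name) and has been added to the app bundle:

```swift
import Foundation
import RealityKit

// Load the first anchor from the exported .reality file and attach it to an ARView.
// "BookCover" is a placeholder for whatever you named your export.
func loadPrototype(into arView: ARView) {
    guard let url = Bundle.main.url(forResource: "BookCover", withExtension: "reality"),
          let anchor = try? Entity.loadAnchor(contentsOf: url) else { return }
    arView.scene.anchors.append(anchor)
}
```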

We found a third option that was much faster: export the project into a .reality file, save it to iCloud or Dropbox, and then use the iOS Files app to open it with the integrated AR Quick Look (this requires iOS 13 or iPadOS 13).

AR Quick Look currently supports all types of anchors, interactions, animations, and multiple scenes, so nothing will be lost in the export.


Opening .reality files directly from Files app

Once your prototype looks good and works well on the device, you can import it into Xcode to add code logic and interaction using notifications, which we will cover in an upcoming post.
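
As a quick taste of that upcoming post: when you add the Reality Composer project itself (the .rcproject) to Xcode, it generates Swift accessors for your scenes, so behaviors can be triggered from code and Notify actions can call back into your code. Every name below (Experience, Main, addToCartTapped, showCart) is a placeholder that depends entirely on how you named things in your own project:

```swift
import RealityKit

// Sketch of code <-> Reality Composer interaction via the Xcode-generated API.
// All names are placeholders for this hypothetical project.
func attachPrototype(to arView: ARView) throws {
    let mainScene = try Experience.loadMain()   // generated from the "Main" scene
    arView.scene.anchors.append(mainScene)

    // React in code when a behavior fires a Notify action named "addToCartTapped".
    mainScene.actions.addToCartTapped.onAction = { entity in
        print("Add to Cart tapped on \(entity?.name ?? "unknown entity")")
    }

    // Kick off a behavior that waits on a Notification trigger named "showCart".
    mainScene.notifications.showCart.post()
}
```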

One of the best things about working with .reality files is that you can still open them in Reality Composer after they have been imported into Xcode, make changes, and have those changes reflected instantly in Xcode.

This makes collaboration between artists, designers and developers very efficient, which we will also cover in an upcoming post.

Our favorite gestures and shortcuts

  • Move two fingers on trackpad to pan around the scene
  • Pinch/spread on trackpad to zoom the scene in and out
  • Rotate two fingers on trackpad to rotate scene around Y axis
  • Click-and-drag on trackpad to rotate scene around other axes, depending on the current viewing perspective
  • Hit space bar to play/stop scene
  • Disable/enable a behavior from its right click menu, instead of deleting it
  • Drag and drop actions on top of each other to make them run simultaneously
  • Turn on Snap on the toolbar to easily lay out and arrange shapes in 3D
  • CMD+E to export a project to iCloud/DropBox to open from iOS Files App

What’s next

This was a quick overview of our AR prototyping series. Over the next few weeks, we will be posting articles about the following topics, and we will update the topic links below accordingly:

  • How to run AR Design Sprints to go from idea to prototype in 5 days
  • How to sketch AR experiences in VR
  • How to create complex AR scenes for apps and games using basic shapes, imported 3D models, and procedural shapes in Xcode
  • How to create complex animations and interactivity using Reality Composer and RealityKit
  • How to manage collaboration between artists, designers and developers in an AR project
  • How to do user testing for AR games and apps


Closing Thoughts

There are several ways to measure the success of a creative tool: simplicity, ease of use, low error rate, the time it requires to get something up and running, etc.

For us, we measure the success of a tool by how long we stay in the FLOW while using it. This is often accomplished when the tool disappears into the background, letting us create content and interact with it directly. By that measure, Reality Composer is a success!

We believe that Apple made a great tool for prototyping augmented reality experiences for people with no 3D modeling, animation or coding experience. A tool that bridges the gap between artists, designers, and developers in a seamless workflow. We are confident that Apple will continue to make it better, and we hope that our feature requests will help the Reality Composer team in that regard.

We also believe that the best tools to prototype and create Augmented Reality experiences need to work in 3D space, rather than on a screen. We have experimented with sketching AR experiences in VR using TiltBrush and Blocks, and it was pretty intuitive and fun. One of the most time-consuming tasks in Reality Composer was constantly rotating and scaling the scene to select and manipulate shapes. In VR, you can accomplish that in a fraction of the time by moving your body and hands around in 3D space.

We have been experimenting with a very early tool to prototype AR experiences in VR and AR, and we will be sharing our insights and early bits from those experiments in our weekly AR newsletter.
