Unity launched its Editor VR service in an alpha form late last year, and the engine-maker appears to be pretty happy with the results so far.
Editor VR is an in-VR section of the larger Unity engine that lets you edit virtual scenes in games and more. Taking to the stage at the company’s GDC 2017 press conference this morning, Principal Designer on the Labs team, Timoni West, revealed that the platform has had over 6,000 downloads since its launch. That might not sound like a huge amount, but given the platform’s alpha phase it’s a good start. Some of those users have already contributed code back into the platform’s code base too.
West then showcased some of the Tools that are now included in the alpha version. These are creative apps and middleware imported into the Unity engine, giving developers more tools to make VR experiences. They include animation apps like Tvori, and ProBuilder, which lets you build, texture, and edit meshes inside VR.
The company wants yet more Tools, though, and to encourage developers to contribute their work, Unity is hosting a competition with an undisclosed cash prize and a chance to showcase at May’s VR and AR-focused Vision Summit.
Unity isn’t just leaving the development of Editor VR up to the community, though. West’s biggest announcement was its own Tool, the Cross Reality Foundation Toolkit (XRFT). This was described as a “framework” for just about anyone interested in working in VR, AR and MR, allowing them to “get up to speed” without starting from scratch.
“We want to give you the building blocks for interaction and locomotion,” West said, “and everything else you need.”
Included in the XRFT will be elements like cross-platform controller input, customizable physics systems, AR/VR-specific shaders and cameras, object snapping and building systems, debugging and profiling tools, and support for all major VR and AR platforms. The toolkit will be released as an open-source beta “over the next few months.”
It sounds like this could be the next big step in Unity’s plan to make VR development accessible to everyone and anyone.
Jamie has been covering the VR industry since 2014, having come from a gaming and technology background. While he loves games, he's most interested in experiential VR that explores narrative, human connection, and other such themes. He's also the host of Upload's VR Showcases, which you should definitely watch.
Once the Editor VR window opens, pop on your VR headset and get started!
Activate the ProBuilder tools
Using the hand you prefer to build with, trigger-click the Unity logo at the base of the other hand’s controller.
When the “Rotating Box Menu” appears, rotate it until you see the face labeled “ProBuilder”. Then use the ray to trigger-click the “Create Shape” button.
The ProBuilder VR “bracelet” GUI should now appear on your hand. Congrats, you are ready to build!
The ProBuilder VR GUI
Currently, there are only two buttons:
The furthest button enables Shape Creation, and is cleverly disguised as a 3D shape.
The closest button displays a triangle with edges, and enables Face Editing controls.
Point the ray at a surface; the Shape Drawing Grid will appear on that surface, at a grid size relative to its distance from you.
Trigger-Click and hold to begin drawing a new Shape. Drag to size, then release the trigger to commit the base. Now, move/point the ray to set the Shape depth, and trigger-click to commit the final Shape. Done!
Point the ray at a surface; it will highlight blue to indicate selection.
Trigger-click and hold, then move the selected Face by pointing your ray where you’d like it to go. This sounds strange, but try it and you’ll get it right away 🙂
Alternatively, you can dip the Unity “control cone” into the surface; it will highlight purple this time. Now use trigger-click as before, but move your hand, and the selected Face will move with it. Neat!
That’s it, folks…
…for now! Lots more coming soon, give it a try and let us know your thoughts and suggestions, thanks very much!
Virtual reality (VR) has seen many recent gains in popularity, and as headset makers work to keep up with hardware demand, developers are working to keep up with users’ need for engaging content. VR isn’t the only technology that’s seen an increase in popularity: in today’s professional world, everyone is using live video streaming to connect and collaborate. This creates an interesting opportunity for developers to build applications that combine virtual reality with video streaming, removing the barriers of distance and creating an immersive telepresence experience.
VR developers face two unique problems:
- How do you make VR more inclusive and allow users to share their POV with people who aren’t using VR headsets?
- How do you bring non-VR participants into a VR environment?
Most VR headsets allow the user to mirror their POV to a nearby screen using a casting technology such as Miracast or Chromecast. This creates a limitation: one physical screen is required per VR headset, and the headset and screen must be in the same room. This type of streaming feels decidedly old-school, given that most users today expect the freedom to stream video to others who are remotely located.
In this guide, I’m going to walk through building a Virtual Reality application that allows users to live stream their VR perspective. We’ll also add the ability to have non-VR users live stream themselves into the virtual environment using their web browser.
We’ll build this entire project from within the Unity Editor without writing any code.
For this project, I will use an HTC Vive Focus Plus because it allows me to build using Unity’s XR framework, which makes it relatively easy to set up a VR environment. And Vive builds to an Android target, so I can use the Agora Video for Unity SDK to add live video streaming to the experience.
This project will consist of three parts. The first part will walk through how to set up the project, implementing the Vive packages along with the Agora SDK and Virtual Camera prefab.
The second part will walk through creating the scene, including setting up the XR Rig with controllers, adding the 3D environment, creating the UI, and implementing the Agora Virtual Camera prefab.
The third section will show how to use a live streaming web app to test the video streaming between VR and non-VR users.
Part 1: Build a Unity XR app
Part 2: Create the Scene
Part 3: Testing VR to Web Streams
Note: While no Unity or web development knowledge is needed to follow along, certain basic concepts from the prerequisites won’t be explained in detail.
Part 1: Build a Unity XR app
The first part of the project will build a Unity app using the XR Framework, walking through how to add the Vive registry to the project, download and install the Vive plug-ins, install the Agora Video for Unity Plug-in, and implement the Agora plug-in using a drag-and-drop prefab.
Set Up the Unity Project
Start by creating a Unity project using the 3D Template. For this demo, I’m using Unity 2019.4.18f1 LTS. If you wish to use Unity 2020, then you will need to use the Wave 4.0 SDK, which is currently in beta.
Note: At the time of writing, I had access to the beta SDK for the new HTC headset. I chose to use the Wave 3.0 SDK. This project can be set up in exactly the same way using the Wave 4.0 SDK.
Enable the XR Framework and HTC Modules
Once the new project has loaded in Unity, open the Project Settings and navigate to the Package Manager tab. In the Scoped Registries list, click the plus sign and add the Vive registry details.
Once the Vive “scopedRegistries” object and its keys have been added, you’ll see the various loaders importing the files. Next, open Window > Package Manager and select Packages: My Registries. You will see the VIVE Wave packages. If no package is shown, click Refresh in the bottom-left corner.
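For reference, a scoped-registry entry in the project’s Packages/manifest.json takes roughly the following shape. The name, URL, and scope shown here are illustrative placeholders; use the exact values from HTC’s documentation:

```json
{
  "scopedRegistries": [
    {
      "name": "VIVE",
      "url": "https://npm-registry.vive.com",
      "scopes": [ "com.htc.upm" ]
    }
  ]
}
```

Adding the entry through the Project Settings UI, as described above, edits this same file for you.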
Click through and install each of the Vive packages. Once the packages finish installing, import the PureUnity and XR samples from the Vive Wave XR Plug-in, and the Samples from the Essence Wave XR Plug-in, into the project.
After the packages have finished installing and the samples have finished importing, the WaveXRPlayerSettingsConfigDialog window will appear. HTC recommends choosing Accept All to apply the recommended Player Settings.
Next, open the Project Settings, click in the XR Plug-in Management section, and make sure Wave XR is selected.
Now that we have the Wave SDK configured, we need to add Unity’s XR Interaction Toolkit package. This will let us use Unity’s XR components to interact with UI inputs and other elements in the scene. In the Package Manager, click the Advanced button (to the left of the search input) and enable the option to “Show preview packages”. Once the preview packages are visible, scroll down to the XR Interaction Toolkit and click Install.
Note: If you are using Unity 2018, you will need to configure the Input Manager using the presets provided by HTC. The inputs can be defined manually, or you can download HTC’s preset. Once you’ve downloaded the preset, drag it into your Unity Assets, navigate to the Input Manager tab in the Project Settings, and apply the presets.
When working with the Wave XR Plug-in, you can change the quality level using quality-settings presets. HTC recommends applying its recommended anti-aliasing level across all quality levels; you can download the presets from HTC’s Samples documentation page.
For more information about Input and Quality Settings, see the HTC Wave Documentation.
Download the Agora Video SDK and Prefab
Open the Unity Asset store, navigate to the Agora Video SDK for Unity page, and download the plug-in. Once the plug-in is downloaded, import it into your project.
Note: If you are using Unity 2020, the Asset store is accessible through the web browser. Once you import the asset into Unity, it will import through the Package Manager UI.
The last step in the setup process is to download the Agora Virtual Camera Prefab package and import it into the project.
When you import the Agora Virtual Camera Prefab package, you’ll see that it contains a few scripts, a prefab, and a render texture. The two main scripts to note are the base Agora interface script, which contains a basic implementation of the Agora Video SDK, and the virtual camera script, which extends that interface specifically to handle the virtual camera stream. A tools folder contains a few helper scripts: one for logging, one to request camera/mic permissions, and one to handle token requests.
Also included in the package is the prefab itself, an empty GameObject with the virtual camera script attached. This makes it easy for us to configure the Agora settings directly from the Unity Editor without having to write any code. The last file in the list is the render texture, which we’ll use for rendering the virtual camera stream.
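To give a sense of what the virtual camera script is doing under the hood, here is a rough sketch of the general pattern for feeding a RenderTexture into Agora as an external video source. This is an illustration only, not the prefab’s actual code; the class and method names come from the Agora Video for Unity SDK of this era, and exact signatures may vary by SDK version:

```csharp
using UnityEngine;
using agora_gaming_rtc;

// Illustrative sketch: read a camera's RenderTexture back to the CPU each
// frame and push it into the Agora channel as an external video source.
public class VirtualCamPushSketch : MonoBehaviour
{
    public RenderTexture virtualCamRT;  // e.g. the AgoraVirtualCamRT asset
    private IRtcEngine rtcEngine;
    private Texture2D readbackTex;

    void Start()
    {
        rtcEngine = IRtcEngine.GetEngine("YOUR_APP_ID");
        // Tell the engine we'll push frames ourselves instead of using a webcam.
        rtcEngine.SetExternalVideoSource(true, false);
        readbackTex = new Texture2D(virtualCamRT.width, virtualCamRT.height,
                                    TextureFormat.RGBA32, false);
    }

    void Update()
    {
        // Copy the render texture into a readable Texture2D...
        RenderTexture.active = virtualCamRT;
        readbackTex.ReadPixels(
            new Rect(0, 0, virtualCamRT.width, virtualCamRT.height), 0, 0);
        readbackTex.Apply();
        RenderTexture.active = null;

        // ...and push it to the channel as an external video frame.
        var frame = new ExternalVideoFrame
        {
            type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA,
            format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA,
            buffer = readbackTex.GetRawTextureData(),
            stride = virtualCamRT.width,
            height = virtualCamRT.height
        };
        rtcEngine.PushVideoFrame(frame);
    }
}
```

The prefab handles all of this for you; the sketch is only meant to demystify what “virtual camera stream” means in practice.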
We are now done with installing all of our dependencies and can move on to building our XR video streaming app.
Part 2: Create the Scene
In this section, we will walk through how to set up our 3D environment using a 3D model, create an XR Camera Rig Game Object, create the buttons and other UI elements, and implement the Agora VR Prefab.
Set Up the 3D Environment and the XR Camera Rig
Create a scene or open the sample scene and import the environment model. You can download the model below from Sketchfab.
Once the model is imported into the project, drag it into the Scene Hierarchy. This will scale and place the model with the correct orientation.
You may have noticed that the scene looks really dark. Let’s fix that by generating lighting. Select Window > Rendering > Lighting Settings, enable the Auto Generate option at the bottom of the window, and watch as Unity adjusts and updates the lighting. The lit scene should look like this:
Next we’ll add a room-scale XR rig to the scene. Start by deleting the Main Camera Game Object from the scene. Then right-click in the Scene Hierarchy, navigate to the XR options, and click Device-based > Room Scale XR Rig.
Add the XR Controllers
The last step for setting up our XR Rig is to set the HTC prefabs for the LeftController and RightController Game Objects. To do this, we’ll need to import the HTC Essence Controller package. Open the Project Settings, navigate to the Wave XR > Essence tab, and import the “Controller” package.
Once Unity has finished importing the “Controller” package, in the Assets panel navigate to the Wave > Essence > Prefabs folder. The Prefabs folder contains prefabs for the Left and Right Controllers.
Select the LeftController Game Object from the Scene Hierarchy, and drag the WaveLeftController prefab into the Model Prefab input for the “XR Controller (Device Based)” script that is attached to the Game Object.
Next, navigate to the Wave > Essence > Scripts folder and drag the CustomizeWaveController script onto the LeftController Game Object.
Repeat these steps for the RightController Game Object, adding the WaveRightController prefab and the CustomizeWaveController script.
That’s it. The controllers are now set up to automatically display the 3D model appropriate to the HMD and controllers the end user is running.
One thing to keep in mind: since this is VR, you’ll probably want to add some 3D models for the user to interact with. If you plan to add models to the scene, add 3D cubes with box colliders just below the floor and any other stationary or solid Game Objects in your virtual environment, to keep the models from falling through these environment objects when the scene loads. Later in this guide, we’ll add a floor with a collider and a teleportation system.
Note: Interactables are beyond the scope of this guide, so I won’t go into detail, but for more information take a look at the Unity guide on using interactables with the XR Interaction Toolkit.
Add the Perspective Camera and the Render Texture
The final modification we will make to our XR Rig is to add a Camera Game Object with a Render Texture, as a child of the Main Camera. This will allow us to stream the VR user’s perspective into the channel. For this demo, I named the Camera Game Object PerspectiveCamera and added the AgoraVirtualCamRT (from the Agora Virtual Camera package) as the camera’s Render Texture.
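The same assignment can be done in code. The sketch below mirrors the manual setup described above; the script name is hypothetical, and it assumes it is attached to the PerspectiveCamera object with the render texture assigned in the Inspector:

```csharp
using UnityEngine;

// Hypothetical helper: assigns the Agora render texture to this camera
// so its output can be read as a video source instead of the HMD display.
[RequireComponent(typeof(Camera))]
public class PerspectiveCameraSetup : MonoBehaviour
{
    // Assign AgoraVirtualCamRT here in the Inspector.
    public RenderTexture virtualCamRT;

    void Awake()
    {
        // A camera with a targetTexture renders into that texture
        // rather than to the screen.
        GetComponent<Camera>().targetTexture = virtualCamRT;
    }
}
```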
Add the UI Buttons and Remote Video
Now that the XR Rig is set up and the scene is properly lit, we are ready to start adding our UI elements.
We need to add a few different UI game objects: a join button, a leave button, a button to toggle the mic, a button to quit the app, and a text box to display our console logs in the Virtual Reality scene.
Unity’s XR Interaction framework has XR-enabled UI Canvas Game Objects that we can use to quickly place our buttons. Right-click in the Scene Hierarchy and, under XR, select UI Canvas.
We’ll repeat this process two more times so at the end there will be three canvas Game Objects in our scene.
The first canvas will be used for our console logs, so rename the Game Object accordingly. Then set its width, height, and scale so it fits within our 3D environment. Add a Panel Game Object as a child of the canvas, then add a Text Game Object as a child of the Panel; this text element will display the output from the logger. We want the logs to be legible, so size its width and height generously. Last, position the canvas in the scene. I’ve chosen to place it on the wall with the four window panes.
The next UI Canvas will contain the Join, Leave, and Microphone buttons. Right-click in the Scene Hierarchy, add another UI Canvas to the scene, rename it, and set its width, height, and scale to fit the environment. Add three Button Game Objects as children of the canvas. Scale and position the buttons in the canvas however you want. Below is how I positioned the buttons on the shelves:
The third UI Canvas will contain the Quit button, which allows users to exit the application. Right-click in the Scene Hierarchy, add another UI Canvas to the scene, rename it, and set its width, height, and scale as before. Add a Button Game Object as a child of the canvas. Scale and position the button however you want. Below is how I positioned the button:
This is a pretty big space, so let’s add a teleportation system so the user can move around in the virtual space. To do this, select the XR Rig in the Scene Hierarchy and add the Teleportation Provider and Locomotion System scripts to the Game Object. Make sure to set the script references to the XR Rig; while it’s not required, setting the reference helps reduce the load time. The last piece is to add a Teleportation Area and scale it to cover the floor. The teleportation system lets the user point and click to move anywhere on the floor plane.
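If you prefer, the same components can be added and wired in code. This is a sketch under the assumption that the script lives on the XR Rig object; the type names come from the device-based XR Interaction Toolkit of this era and may differ in later versions:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical setup script: adds and links the locomotion components
// described above instead of configuring them in the Inspector.
public class TeleportSetup : MonoBehaviour
{
    void Start()
    {
        var locomotion = gameObject.AddComponent<LocomotionSystem>();
        var teleport = gameObject.AddComponent<TeleportationProvider>();

        // Setting the references explicitly avoids a scene lookup at load time,
        // matching the advice above.
        locomotion.xrRig = GetComponent<XRRig>();
        teleport.system = locomotion;
    }
}
```

A Teleportation Area component would still need to be added to the floor object for the user to have somewhere to teleport to.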
The last two Game Objects we need to add to the scene are the Remote Video Root and the Screen Video Root. Add an empty Game Object and rename it RemoteVideoRoot. This will be the root node under which new video planes spawn as remote users connect to the channel. I’ve chosen to place the RemoteVideoRoot on the all-window wall.
Last, add an empty Game Object and rename it ScreenVideoRoot. This will be the root node under which a video plane spawns whenever a screen is shared into the channel. I’ve chosen to place the ScreenVideoRoot on the glass wall opposite the RemoteVideoRoot.
Implement the Agora Prefab
Now that we have all of our environment and UI elements added to the scene, we are ready to drag the Agora Virtual Camera Prefab onto the Scene Hierarchy.
First, we’ll add our Agora AppID and either a temp token or the URL for a token server. The Agora prefab assumes the token server returns the same JSON response as the Agora Token Server in Golang guide.
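If you use a token server, the prefab expects a JSON response along these lines. The field name follows the convention of the Golang guide referenced above and may differ in your own implementation; the token value shown is a placeholder:

```json
{
  "rtcToken": "006-example-token-value"
}
```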
We also need to select a channel name. Make a note of the name you choose; we’ll need it later when we connect from the web.
Note: For users to be correctly matched, the AppID and Channel name must match exactly because channel names within Agora’s platform are case-sensitive.
The next steps are as simple as they sound. Drag the various Game Objects that we just created into the Prefab input fields.
There is one last step before we can test what we’ve built. Add an On Click handler to the Join, Leave, Microphone, and Quit buttons. For each of these, we’ll use the Agora Virtual Camera prefab as the On Click reference and select the corresponding function for each button. For example, the JoinBtn On Click would call AgoraVirtualCamera > JoinChannel.
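The same wiring can be expressed in code as an alternative to the Inspector. This is a sketch: the field names are assumptions, and it presumes the prefab’s AgoraVirtualCamera script exposes a parameterless JoinChannel method, as the Inspector setup above implies:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical wiring script: hooks the Join button's On Click event
// to the Agora prefab's JoinChannel method at startup.
public class JoinButtonWiring : MonoBehaviour
{
    public Button joinBtn;                      // the Join button in the scene
    public AgoraVirtualCamera agoraVirtualCam;  // script on the Agora prefab

    void Start()
    {
        // Equivalent to selecting AgoraVirtualCamera > JoinChannel
        // in the button's On Click list.
        joinBtn.onClick.AddListener(agoraVirtualCam.JoinChannel);
    }
}
```

The Leave, Microphone, and Quit buttons would be wired the same way to their corresponding functions.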
At this point, we can test the work we’ve done by building the project to our VR HMD. If you are using a Vive headset, the build settings should look like this:
After you initiate the build, the project will take a few minutes to build and deploy to the device. Once the app is deployed, you’ll see a loader screen before you are dropped into the virtual environment.
Click the Join button to join the channel. Once you have joined the channel, the microphone button will turn green and some logs will appear in the debug panel. When you click the Mic button it will turn red to indicate that the mic is muted.
Part 3: Testing VR to Web Streams
In the last part of this guide we’ll use a live streaming web app to test the video streaming between VR and non-VR users.
Set Up a Basic Live Streaming Web App
The Agora platform makes it easy for developers to build a live video streaming application on many different platforms. Using the Agora Web SDK, we can build a group video streaming web app that allows users to join as either active participants or passive audience members. Active participants broadcast their camera stream into the channel and are visible in the VR scene. Users can also join as passive audience members and view the video streams from the VR headset along with the streams of the active participants.
In the interest of brevity, I won’t dive into the minutiae of building a live streaming web app in this guide, but you can check out my guide How To: Build a Live Broadcasting Web App.
For this example, we’ll use a prebuilt example similar to the web app built in my guide How To: Build a Live Broadcasting Web App. If you are interested in running the web client locally or customizing the code, you can download the source code.
Test the Streams Between VR and the Web
Use the links below to join as either an active or a passive participant.
Once you’ve launched the web view and connected to the channel, launch the VR app and click the Join button. Once the VR headset joins the channel, you will be able to see the VR user’s perspective in the web browser.
Wow, that was intense! Thanks for following and coding along with me. Below is a link to the completed project. Feel free to fork and make pull requests with any feature enhancements. Now it’s your turn to take this knowledge and go build something cool.
For more information about the Agora Video SDK, see the Agora Video Web SDK API Reference or Agora Unity SDK API Reference. For more information about the Agora RTM SDK, see the Agora RTM Web SDK API Reference.
For more information about the Vive Wave SDK, see the Vive Wave Unity Reference.
I also invite you to join the Agora Developer Slack community.
Start building your VR collaboration app today