
Virtual Reality News



    We’ve seen some pretty crazy VR-focused motion controllers at Road to VR over the years, but this latest offering from UK-based VRGO probably makes the top 5: a wireless motion controller, using inertial input, that you sit on. Their £20,000 Kickstarter campaign has just gone live; we take a closer look.

    The question of how best to control virtual reality experiences is still very much unanswered at this time. Solutions from Oculus Touch to the HTC Vive SteamVR controllers have their own take on how best to tackle the issue, but there’s still room for innovation in the field. One company has taken on the challenge of yaw-control induced sickness (the right analogue stick for most of us) which plagues traditional games presented in VR, with an entirely different approach. VRGO offers a seat, loaded with inertial sensors, that lets you control your VR avatar’s locomotion and rotation entirely by moving the seat under your own weight.

    VRGO is a faintly bizarre looking, egg-shaped ‘controller’ which attaches to the device powering your VR experience, be that a PC or mobile phone, via Bluetooth, and delivers what the Kickstarter campaign page calls an “intuitive and consumer friendly hands free movement controller for virtual reality.” Tilting forward and back moves your character in the corresponding direction, whilst tilting side to side moves you laterally. Because the chair can also be freely rotated, a la a swivel chair without the backrest obstruction, in theory this solution offers a less obstructive and more naturalistic way to move around virtual worlds. Connectivity with your host device is initiated via a ‘one touch’ control panel on the front of the device. Somewhat interestingly, the chair also breaks apart, revealing a central compartment for you to store your VR headset – or anything else that fits in the cavity, I guess.

    The videos of the device in use certainly emphasise its uniqueness, although there’s not much in the way of fast, 180/360 degree swivels, and no mention of Gear VR compatibility (merely ‘mobile phones’ are mentioned) – an obvious candidate for the device. It certainly seems to work, although I can’t help but feel precise control (including sharp starts and stops) might be tricky to pull off when input is based entirely on your own body weight. The device is by default set to deliver input at greater than 1:1, purportedly to avoid wire-tangling accidents on PC-based headsets. Having said that, we haven’t tried it, and it might well offer a useful solution for those irritated or indeed affected by traditional yaw-control / rotational gamepad input.

    The Kickstarter pledge tiers for backers to get their hands on a VRGO chair start at £175, for which you’ll receive one VRGO controller chair at a claimed 30% discount. As far as exploring the undiscovered country that is VR input goes, we’re all for innovative ‘out of the box’ thinking, and VRGO is nothing if not that. We’ll reserve judgement on its efficacy for now, but would love to hear from readers who have backed the campaign and/or have tried it for themselves.
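    The tilt-to-locomotion scheme described above could be sketched roughly as below. This is purely an illustrative guess at how such a seat controller might map inertial tilt readings to avatar movement; the function name, dead zone, and gain values are assumptions, not VRGO's actual implementation.

    ```python
    import math

    DEAD_ZONE_DEG = 3.0   # ignore tiny tilts so the seat can rest level (assumed)
    GAIN = 1.5            # a greater-than-1:1 scaling, as the article describes

    def tilt_to_velocity(pitch_deg, roll_deg, max_speed=2.0):
        """Map forward/back (pitch) and side-to-side (roll) seat tilt
        to a 2D avatar velocity in metres per second."""
        def axis(angle):
            if abs(angle) < DEAD_ZONE_DEG:
                return 0.0
            # scale the tilt beyond the dead zone, then clamp to max_speed
            v = GAIN * (abs(angle) - DEAD_ZONE_DEG) / 30.0 * max_speed
            return math.copysign(min(v, max_speed), angle)
        return axis(pitch_deg), axis(roll_deg)
    ```

    A small tilt produces no movement, a moderate lean moves the avatar proportionally faster than 1:1, and extreme leans are clamped so the avatar never exceeds a walking pace.
    
    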

    The post VRGO is a Wireless VR Controller You Sit On, Kickstarter Campaign Live appeared first on Road to VR.



    VRGO, the wireless motion controller using inertial input that you sit on, has passed its £20,000 Kickstarter goal and has announced a stretch goal to provide a customised version of the chair. VRGO offers a seat, loaded with inertial sensors, that lets you control your VR avatar’s locomotion and rotation entirely by moving the seat under your own weight. The company has now passed its £20,000 Kickstarter goal and has introduced a stretch goal to produce a unique version of the chair with a very specific look. “Since reaching their target, VRGO have introduced a stretch goal of £23,000 and if reached, backers who have pledge for a chair will have the chance to choose from two new colour designs; Kiwi and Mars.” See Also: VRGO is a Wireless VR Controller You Sit On, Kickstarter Campaign Live

    Additionally, VRGO will introduce a new referral scheme to encourage take-up of the unusual controller device. “…backers who have pledged for a chair can become a ‘Super Backer’ and receive a ‘Super Backer Kit’ for getting someone else to do the same.” You can find more information on the new stretch goal and the referral program over at the VRGO Kickstarter page.

    The post VRGO Passes Kickstarter Goal, Announces Stretch Goal and Referral Program appeared first on Road to VR.



    Sixense’s wireless motion controller system STEM, and its protracted development process, finally seems to be nearing an end. To mark the occasion, the company has released three new videos of demo games early adopters will receive along with their hardware.

    STEM’s troubled development history is by now well documented: an early crowdfunding success followed by prototype design and certification issues, and a reminder of the drawbacks of this form of product funding and development. However, the project looks to be nearing its end, with multiple recent updates from the firm seeming to indicate the wireless motion control system, which offers a unique approach to VR input, may soon be in backers’ hands. Now, Sixense has released videos of three demos that will ship alongside the hardware: a shooting range, golf, and the now-famous lightsaber demo (christened ‘Slash’ for obvious reasons).

    We’ve seen the shooting range offering first-hand before, when it was used to demonstrate Sixense’s use of sensor fusion, using IMUs to counteract the magnetic distortion which plagued earlier STEM units. Both Ben Lang and Reverend Kyle came away suitably impressed with the system’s newfound accuracy. See Also: Hands-on: IMU Added to Sixense STEM VR Motion Controller Underscores Impressive Performance

    The new video below demonstrates just why Sixense’s STEM tech still offers the potential for some incredibly cool VR experiences. Quick, responsive and above all accurate shooting (something tells me this guy may have played it before) is joined by fun gun-juggling and two-handed clip reloading. It’s like Duck Hunt from the future! Precisely when backers will receive their STEM systems, however, is still a mystery, although it does seem imminent. Either way, once we receive our units, we’ll be sure to report our findings.
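    The sensor fusion mentioned above, using IMUs to counteract magnetic distortion, is commonly done with a complementary filter. The sketch below is a generic illustration of that idea, not Sixense's actual algorithm: a fast-but-drifting gyro yaw is blended with a slow-but-absolute magnetic heading, and the magnetometer is trusted less when the measured field strength deviates from the expected earth field (all constants are assumed).

    ```python
    def fuse_yaw(yaw, gyro_rate, dt, mag_heading, mag_strength,
                 expected_strength=50.0, base_alpha=0.98):
        """One complementary-filter step blending gyro and magnetometer yaw."""
        # integrate the gyro for a fast, low-latency estimate (drifts over time)
        gyro_yaw = yaw + gyro_rate * dt
        # distortion metric: deviation of field magnitude from nominal earth field
        distortion = abs(mag_strength - expected_strength) / expected_strength
        # trust the magnetometer less as distortion grows (alpha -> 1.0)
        alpha = min(1.0, base_alpha + distortion)
        return alpha * gyro_yaw + (1.0 - alpha) * mag_heading
    ```

    In an undistorted field the magnetometer slowly pulls the yaw estimate back toward the true heading; near a strong distortion source the filter falls back to pure gyro integration.
    
    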

    The post Sixense Reveals 3 New STEM VR Controller Demo Videos appeared first on Road to VR.



    On the path to eventually releasing the STEM VR motion controller, Sixense has released the Sixense Core API and SixenseVR SDK with support for the still-kicking Razer Hydra. Included in the release are five STEM demos which are playable with the Hydra.

    Eager for some motion input action in your VR ahead of the release of next-gen motion controllers? Sixense has included Razer Hydra support in the release of their Sixense Core API and SixenseVR SDK, meaning you can fire up your Rift and Hydra to try out the five included demos:

    SixenseVR Archery – There is a bow floating to your left and you can grab it by reaching for it with either hand and pulling the trigger. After you have the bow, you can grab an arrow by reaching your free hand over your shoulder, then squeezing and holding the trigger. With the arrow in your hand, bring it to your bow, pull your arrow hand back, then release the trigger to shoot.

    SixenseVR Home Run Derby – In Home Run Derby the player is allowed 10 practice hits before the derby begins. Once the derby begins, the player tries to hit as many home runs as possible. Each hit that is not a home run counts as an out. The game is over when the player has 10 outs. Tap home plate with the bat to start the game again. Difficulty levels include Arcade, Normal and Simulation. When the difficulty is set to Simulation mode, keys 0-9 adjust pitching speed from 70 to 90 mph.

    SixenseVR Shooting Range – The user has a holstered gun on their hip and magazines on the opposite hip. To grab the gun, place the hand near it and pull the trigger. The gun can be fired by pulling the trigger again. Eject the magazine by pressing any face button on the controller holding the gun, then grab and place a new magazine near the bottom of the gun to load it. Magazines are reloaded with bullets when they are placed on the hip or dropped to the floor.

    SixenseVR Slash – There are several swords floating in front of the user. The user can reach out with each hand and grab a sword by pulling the trigger. The sword can then be activated by pressing any face button on the controller. Use the swords to deflect the laser blasts and strike the drone when it gets too close.

    SixenseVR Golf: Putting Green – To change the height of the club, press any face button to extend the putter head to the ground, or use the joystick up and down for fine adjustments. Pull the trigger to drop a ball. The user has 10 balls to get the highest score possible. Each hole is worth a unique amount of points; the farther the hole, the higher the points.

    Although the demos are included, this SDK release is primarily aimed at developers, and Sixense is requesting that those interested email for access.

    Back in the early days of this new era of VR, people quickly realized that motion input was going to be a major boon to virtual reality interaction. The Razer Hydra became the de facto standard for early VR motion input development as one of the only commercially available 1:1 motion input controllers at the time, having launched a year before Oculus’ 2012 Kickstarter. Though produced and sold by Razer, the Hydra was built with technology from Sixense, who went on to fund the creation of a next-generation VR motion input controller. STEM, which successfully raised over $600,000 on Kickstarter in 2013, promises more range and accuracy than the Hydra in a wireless package, as well as support for additional trackers. STEM has hit a number of unfortunate snags along the road to release, with the latest estimates from Sixense putting the launch in April 2016.

    The release of the Sixense Core API and SixenseVR SDK with Hydra support offers a platform on which those with the controllers can begin experimentation and development for STEM and other next-gen VR motion input controllers before any of them become commercially available.
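    The shooting range's grab/fire/eject/reload loop amounts to a small state machine. The toy sketch below illustrates that structure only; it is a hypothetical model, not Sixense's demo code.

    ```python
    class Gun:
        """Toy model of the shooting range demo's magazine logic."""

        def __init__(self):
            self.magazine = None  # None = no magazine inserted

        def load(self, rounds):
            # placing a fresh magazine near the gun's base loads it
            self.magazine = rounds

        def fire(self):
            # a trigger pull only fires if rounds remain
            if not self.magazine:
                return False  # dry trigger pull
            self.magazine -= 1
            return True

        def eject(self):
            # a face button press drops the current magazine
            self.magazine = None
    ```

    Tracking the magazine as explicit state is what lets the demo distinguish a dry pull from a live shot and require a real reload gesture between magazines.
    
    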

    The post Sixense Releases 5 STEM Demos and SDK Compatible with Razer Hydra appeared first on Road to VR.



    This is what the retail version of 3DRudder, a unique motion controller that aims to “free the hands and lets the user do more”, looks like – and this is how much it’ll cost when it finally goes on sale at retail in March 2016.

    We first encountered 3DRudder, a peripheral from France which aims to free up your hands in VR for … other things, back at CES in January. Going ‘feet-on’ with the device was an interesting experience, but the core idea behind the peripheral certainly seemed to work. Now, the company has announced that not only will the latest version of 3DRudder be on show at CES 2016 in January, the device will also be available at retail in March 2016 at $175.

    The 3DRudder “VR Edition” controller is a feet-based VR motion controller, used while seated. It enables users to move in virtual worlds for work or gaming with their feet while seated in their favorite chair or couch – at least that’s the theory. I tried the device at CES in January this year and, despite a somewhat steep initial learning curve, there was definitely something unique about controlling your view of a virtual world with only your feet. The example of use I was given then was of a 3D artist working on a modelling project, using the controller to control his or her view into the world whilst using more familiar controls (keyboard, mouse) simultaneously, in theory giving you more scope for control over your workflow. My time with the device was too short to prove whether this theory holds up in practice, but it seems clear 3DRudder are confident enough to have pushed through the prototype stage to what you can now see as a near-finished product.

    Road to VR will be on the ground at CES 2016 in January to track down a demo with the latest iteration of 3DRudder.

    The post 3DRudder VR Controller Priced at $175, Ships in March 2016 appeared first on Road to VR.



    Oculus has announced that their ‘Touch’ VR controllers will be released in the second half of 2016. Previously the company had expected the Oculus Touch release date to come in the first half of the New Year.

    After recently announcing a slight delay of Oculus Rift headset pre-orders, the company today said in a blog post that the accompanying Touch controllers will see a release date in the second half of 2016 rather than the first half, as stated earlier in 2015. See Also: Oculus Rift Pre-order Slips to Early 2016, Q1 Launch Window Still on Track

    The company had also said earlier this year that Touch pre-orders would open alongside the Oculus Rift pre-orders (now coming “soon after the New Year”); with the new announcement saying that pre-orders for Touch will open “a few months prior to launch,” it would seem that this will no longer be the case. “The feedback on Touch has been incredibly positive, and we know this new timeline will produce an even better product, one that will set the bar for VR input. We appreciate your patience and promise Touch will be worth the wait,” reads the update. Oculus further says that they’re creating “larger numbers of pre-production runs” to get early Touch hardware out to developers to begin crafting content for the VR controller.

    The post Oculus Touch Release Date Pushed to 2nd Half of 2016 appeared first on Road to VR.


    After revealing the Touch ‘Half Moon’ prototype earlier in 2015, Oculus has continued to hone the already impressively designed VR controller. Alongside an update stating that Touch would ship in the second half of 2016, the company also released a new photo of the latest design.

    Throughout the course of the young company, Oculus has revealed a number of named prototype devices and development kits, each representing a culmination of its latest hardware developments. For the company’s headsets, we saw the progression from the original development kit, the ‘DK1’, to the ‘HD Prototype’, then on to the ‘Crystal Cove’ prototype and the ‘DK2’, eventually seeing the Crescent Bay prototype followed by the so-called ‘CV1’, the consumer version of the Oculus Rift. Development of the company’s Touch VR controller seems to be following suit, with the ‘Half Moon’ prototype being the first to be shown earlier in 2015, sharing a lineage with some 300 prior internal permutations.

    Since the reveal of Half Moon, Oculus has continued refining Touch. The company claims that the latest version, teased in a recently released photo, brings “significant advances in ergonomics,” and further improves upon the VR controller’s ability to recognize hand gestures. The most notable difference between Half Moon and the new Touch prototype is that the IR tracking LEDs have been covered over with an IR-transparent plastic, much like we saw starting with the Rift DK2. The cutlass shape is now also sporting a white Oculus logo. More subtle changes in the shape of the controller can be seen most easily in the thumbsticks, which now appear with a tilt in the direction of the thumb. The handle part of the controller appears to be even smaller than on prior prototypes.

    When I had my first chance to try the Touch Half Moon prototype back at E3 2015, I was thoroughly impressed with the intuitive and ergonomic design. Oculus’ insights go beyond mere ergonomics; proprioceptive parity was a clear priority of the controller’s design prompt, making Touch the most comfortable and natural VR controller I’ve ever used. The company’s positional tracking prowess means the controllers are tracked nearly flawlessly. Unknown price aside, Touch is poised to be the best VR motion controller out there. Read More: Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds

    We’d later learn that not only can Touch track the position of the controllers and a limited number of hand gestures, but it can also sense when buttons on the controller are touched. In the update which included the new Touch design, Oculus says “[we] promise Touch will be worth the wait.”

    The post Oculus Teases Improved Touch Controller with “Significant Advances in Ergonomics” appeared first on Road to VR.



    After teasing us last week, Oculus has released more revealing photos of the latest design of their Touch VR controllers. Not seen on the previous ‘Half Moon’ prototype is a mysterious pattern to the left of the buttons, as well as a new button.

    Moving on from the Touch ‘Half Moon’ prototype which the company revealed in mid-2015, Oculus is now showing what appears to be a near-final version of the Touch VR controller. Since the Half Moon prototype we can see that the IR tracking LEDs have been hidden underneath what’s most likely IR-transparent plastic (just like the consumer Rift). There’s also now a new button with an Oculus icon on it, which will likely be used as a ‘Home’ button to return to the Rift’s pre-app VR environment.

    In addition to a more mature-looking industrial design and a modified thumbstick, we see a mysterious new feature added to the top surface of the controller next to the buttons. At first glance it looks to be a speaker of sorts, but with the Rift most likely to be used with headphones (and Oculus spending so much time working on 3D audio through software), a speaker makes little sense on the controller, unless for some non-VR applications (Nintendo’s Wiimote actually did this, albeit poorly). A microphone is also plausible, but with the Rift already including a built-in mic, placing one on each controller doesn’t seem likely. With that in mind, the new area may simply be a tactile hint, giving users an obvious place to rest their thumb when not using the Rift’s thumbstick or buttons. Although many game console controllers expect users to keep their thumbs on the sticks at all times, Oculus made Touch as a hybrid device, expecting that many users won’t need the sticks or buttons at all, but wanting to keep them there for the experiences that would benefit from them. If the newly seen feature on Touch is indeed a tactile surface, it also seems likely that it is capacitive (able to sense the touch of a user’s finger). The majority of the buttons and triggers on Touch are also capacitive, allowing the system to understand where you are touching the controller even if you aren’t actually pressing any buttons. This can help with understanding the user’s actual hand position, allowing for gestures like pointing and thumbs-up.

    While Oculus Rift pre-orders will open this week, the company says that Touch won’t ship until the second half of 2016, with pre-orders coming a few months prior.
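    Inferring gestures like pointing and thumbs-up from capacitive touch states can be reduced to a simple lookup: which zones the thumb and index finger are (or aren't) contacting. The sketch below is a hypothetical illustration of that idea; the zone names and rules are assumptions, not Oculus' actual logic.

    ```python
    # Assumed capacitive zones: "stick", "a", "b", "thumb_rest" (thumb),
    # and "trigger" (index finger).
    THUMB_ZONES = {"stick", "a", "b", "thumb_rest"}

    def infer_gesture(touching):
        """Infer a hand gesture from the set of capacitive zones in contact."""
        thumb_down = touching & THUMB_ZONES
        index_down = "trigger" in touching
        if index_down and not thumb_down:
            return "thumbs_up"   # thumb lifted off everything, index still curled
        if thumb_down and not index_down:
            return "pointing"    # index lifted off the trigger, thumb resting
        return "neutral"
    ```

    The appeal of this approach is that no camera or extra sensor is needed: lifting a digit off its capacitive zone is itself the gesture signal.
    
    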

    The post New Oculus Touch Photos Show Unidentified Feature & Matured Design appeared first on Road to VR.



    Leap Motion continues to refine their hand-tracking tech for intuitive controller-free interactivity. The company’s latest focus has been an ‘Interaction Engine’ which usurps standard physics engines when it comes to defining interactions between user input and virtual objects.

    Leap Motion‘s Caleb Kruse demonstrated the company’s work on the Interaction Engine, which is said to form the foundation of intuitive and accurate interactions with a variety of objects. With that foundation in place, developers can focus on creating useful experiences rather than having to work out the best way to program interactions from the ground up. The Interaction Engine is still internal at Leap Motion, but the company tells us that they plan to release it widely to developers in the near future.

    The Interaction Engine is a sort of intermediary between the user’s input and the physics engine, Kruse told us. Left to physics alone, grabbing an object too tightly might cause it to fly out of your hand as your fingers phase through it. The Interaction Engine, on the other hand, tries to establish your intent (like grabbing, throwing, or pushing) based on what the Leap Motion tracker knows about your hand movements, rather than treating your hand in VR like any other object in the physics simulation. The result is more intuitive and consistent control when interacting with objects in VR—something that’s been a major hurdle for Leap Motion’s computer-vision-based input. Now it’s easier and more predictable to grab, throw, and push objects.

    While developing the Interaction Engine, Leap wanted to be able to quantify the efficacy of their hand input, so they created a simple demo task in VR where users reach out to grab a highlighted ball and place it in a randomly indicated position. Through testing hundreds of users, Kruse said the company found people to be around 96% accurate at this task when using the Interaction Engine. Another demo which utilizes the Interaction Engine allows you to create cubes of varying sizes by pinching your thumb and index finger together to form a recognizable gesture. Then, when moving your hands close together, the outline of a cube forms and you can move your hands back and forth (like a pinch-zoom) to set your desired scale.

    When I tried these demos myself, I noted how the system was impressively able to understand that I was still holding objects even when I occluded my fingers with the back of my hand. The cube demo was fun and easy to use (especially with gravity turned off), and while I wasn’t quite as adept as Kruse at manipulating objects, his skills are a demonstration that it’s possible to get better at using the system over time (which means, by necessity, there’s a vital aspect of consistency to the system). Grasping virtual objects which have no physical representation is still a strange affair, but the Interaction Engine definitely enhances predictability and consistency in object interactions, which is incredibly important for the practicability of any input method.

    The post Leap Motion’s ‘Interaction Engine’ Aims for Effortless VR Input appeared first on Road to VR.



    Leap Motion was formed prior to the VR craze, but it turns out that their goal of allowing you to use your own hands to interact with computers jibed quite nicely with virtual reality. In recent years the company has made a major pivot toward VR, and it culminates today with ‘Orion’, an overhauled hand tracking engine built from the ground up for VR.

    When Leap Motion first dreamed up their gesture tracking device, the intended use case was that the unit would sit on a desk facing upward and detect a user’s hands as they held them above the device, using it to control activity on a computer monitor. Fast forward a few years and the company says they are “100% focused on VR”, moving the major use-case to a head-mounted gesture tracking device which of course sees hands from a different angle entirely (and has a tracking space that’s dependent on where the user is looking). The company says that Orion, their new made-for-VR hand tracking engine, has been in the works for at least a year and that the engine is a major improvement, calling it “radically smoother, faster, more reliable, and far more capable than even the best of what’s existed before.” The company summarizes some of the improvements:

    Orion starts tracking faster, and keeps tracking farther, with lower latency, and in situations where no previous software could keep up. Unprecedented sensitivity allows us to maintain reliable hand tracking even in high angle, high occlusion scenarios. Incredible progress has also been made in separating the hand out from cluttered backgrounds, allowing you to bring your hand close to or even in contact with other surfaces. This advance has also improved performance in all lighting conditions, making it extremely difficult to find an environment where you won’t have high precision hand tracking.

    The Orion engine can be experienced on the company’s existing hand tracking camera (which they call ‘The Peripheral’) through a beta release of the software made available today at the Leap Motion developer portal. Orion also sets the stage for a new sensor that the company says will be embedded in VR headsets this year. The tiny commodity chip is easy to build into such devices, though the company maintains, “99% of the value comes in the [hand-tracking] software.”

    Last month at CES, we got to preview the company’s new ‘Interaction Engine’, which was designed to enhance user interaction within the virtual world. The Interaction Engine sought to improve understanding of user intent, leading to more consistent actions like grabbing, releasing, and throwing objects. We were impressed by the improvements it brought to the table, and fortunately Leap Motion says that Orion will have the same capabilities, along with a number of other performance enhancements.

    The post Leap Motion Launches Overhauled Hand Tracking Engine That’s Made for VR appeared first on Road to VR.



    This week Leap Motion released ‘Orion’, a brand-new made-for-VR hand tracking engine which works with their existing devices. Time and again we’ve seen promo videos from the company showing great tracking performance, but then found it to be less than ideal across the broad range of real-world conditions. Now, an independent video showing the Orion engine in action demonstrates Leap Motion’s impressive mastery of computer vision.

    Leap Motion has always been neat, and it’s always worked fairly well. Unfortunately, in the world of input, “fairly well” doesn’t cut it. Even at 99% consistency, input becomes frustrating (imagine your mouse failing to click 1 out of 100 times). And this has presented a major challenge to Leap Motion, who have taken on the admittedly difficult computer vision task of extracting a useful 3D model of your hands from a series of images.

    The company has been working for nearly a year on a total revamp of their hand tracking engine, dubbed ‘Orion’, which they released this week in beta form for developers. The new engine works on existing Leap Motion controllers. A video from YouTube user haru1688 shows Orion running on the Leap Motion controller mounted to an Oculus Rift DK2. Importantly, this video was not staged for ideal conditions (as official videos surely are). Even so, we see impressively robust performance. Hands are detected with incredible speed and seem immune to rapid movement. We see almost no ‘popping’ of fingers, where the system misunderstands what it is seeing, often showing the 3D representation of the fingers in impossible positions. There’s also now significantly less loss of tracking in general, even when fingers are occluded or the hands are seen edge-on. Clapping still seems to be one area where the system gets confused, but Leap Motion is aware of this and presumably continuing to refine the interaction before moving Orion out of beta. Compare the above video to earlier footage by the same YouTube user using a much earlier version of Leap’s hand tracking (v2.1.1). Here we see lots of popping and loss of the hands entirely.

    The original Leap Motion hand tracking engine was designed to look up at a user’s hands from below (with the camera sitting on a desk). But as the company has realized the opportunity for their tech in the VR and AR sectors, they’ve become fully committed to a new tracking paradigm, wherein the tracking camera is mounted on the user’s head. Because the tracking engine wasn’t made specifically for this use-case, there are “four or five major failure cases attributed to [our original tracking] not being built from the ground up for VR,” Leap CEO Michael Buckwald told me. “For Orion we had a huge focus on general tracking and robustness improvements. We want to let people do those sorts of complicated but fundamental actions like grabbing which are at the core of how we interact with the physical world around us but are hard to do in virtual reality,” he said.
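    The "99% consistency isn't enough" point above can be made concrete with a one-line probability calculation: the chance that at least one of N independent interactions fails.

    ```python
    def failure_chance(per_action_success, n_actions):
        """Probability that at least one of n independent actions fails."""
        return 1.0 - per_action_success ** n_actions
    ```

    At 99% per-action reliability, 100 clicks carry roughly a 63% chance of at least one failure, which is why input systems need several more nines than "fairly well" delivers.
    
    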

    The post Independent ‘Orion’ Video Shows Massive Improvements in Leap Motion Hand Tracking appeared first on Road to VR.



    Natural motion input is the clear future of VR. And while the Rift would seem to lead on price over the Vive, factoring the controllers into the equation will create a very narrow price gap, if any, between the two systems.

    While Oculus has its Touch motion controllers in development, they won’t be ready by the time the Rift begins shipping at the end of March, so the company opted to include an Xbox One gamepad with each Rift as a common input device between headsets. Touch isn’t expected to ship until the second half of 2016, and that means the Rift’s $600 price doesn’t factor in the cost of motion controllers. The HTC Vive, on the other hand, will ship with no gamepad but will instead include motion controllers out of the box. Comparing the Vive’s $800 price to the Rift doesn’t make much sense unless you consider Oculus Touch as part of the Rift package. Without knowing the price of Touch, this comparison is difficult, but unless you think Touch will be exceptionally cheap, the total cost difference between the two systems could turn out to be negligible, if not zero.

    There’s a $200 price gap between the Rift’s standalone $600 price and the Vive’s $800 price (which includes motion controllers). We conducted a straw poll back in January to gauge our readership’s price estimates for Touch. The majority guessed it would land between $100-$199 (the majority also correctly guessed that the Vive would cost more than the Rift). Assuming the intuition of our audience is correct, even at the low end of that spectrum ($100), that would leave a mere $100 gap between the total Rift package and the total Vive package. At the high end of the spectrum ($199), it would eliminate the gap altogether.

    Given that there aren’t many analogs, it’s tough to guess how much the price of Touch will close that gap. The biggest hint we have may be the Vive itself. When considering the total package of each system, both include one headset, two sensors, and two motion controllers. As an outside observer, the components of each headset seem quite similar: two lenses, two custom OLED displays of the same resolution, embedded hardware for tracking, a series of cables, and the body/shell of the headset itself (each headset has one notable wildcard that the other lacks: integrated headphones on the Rift and an integrated camera on the Vive). Unless either company has some significant price advantage among these components or in the manufacturing process, the headsets are likely to be quite evenly matched in how much they contribute to the overall system cost.

    That would leave the remaining cost in the hands of the two motion controllers and two tracking sensors. Again, unless there’s some significant cost advantage in components or manufacturing between Oculus’ IR LED-based ‘Constellation’ tracking and the Vive’s laser-based ‘Lighthouse’ tracking (which we haven’t heard suggested by either company), the trackers themselves are likely to contribute equally to the cost. And that leaves the controllers. Assuming the above is roughly true, simply comparing the $600 cost of the Rift (with no controllers) to the $800 cost of the Vive (with controllers) might just give us a solid estimate of how much they contribute to the package: $200.

    One thing I think we can be sure of is that Touch can’t cost more than $200, even if Oculus has to eat a loss at that price; launching their motion controllers after the Vive’s while creating a higher total package cost would put them at a major disadvantage in attracting new users to their hardware platform. Look for both companies to try to out-bundle and out-discount each other as we approach holiday 2016.
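    The back-of-envelope comparison above reduces to simple arithmetic on the article's published prices (Touch's price was unknown at publication, so it is left as the variable):

    ```python
    RIFT = 600   # Rift headset + Xbox One gamepad, no motion controllers
    VIVE = 800   # Vive headset + motion controllers included

    def total_gap(touch_price):
        """Remaining price gap once Touch is added to the Rift package."""
        return VIVE - (RIFT + touch_price)
    ```

    At the straw poll's low end ($100) the gap shrinks to $100; at the high end ($199) only $1 remains, effectively eliminating it.
    
    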

    The post Including Controllers, Vive and Rift Could be Evenly Matched on Price appeared first on Road to VR.



    Oculus Touch won’t launch until the second half of 2016, but the company continues to iterate on the impressively ergonomic VR controller. We went hands-on with their latest prototype at GDC 2016 last week. This article discusses the changes between the Touch Half Moon Prototype (mid-2015) and the latest Touch 2016 prototype. For a more substantial overview of my thoughts on Touch as a whole and how it stacks up to the competition, be sure to read our initial hands-on: Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds. Although Oculus focused the spotlight on their gamepad-only launch titles at GDC, some of the most impressive Rift games we saw were built exclusively for Touch. When Oculus first revealed Touch back in mid-2015 they showed us the ‘Half Moon Prototype’. What I’m calling the ‘2016 Prototype’ is the version of their VR controller which they first teased on the last day of 2015 (and which wasn’t shown in the flesh until after the New Year); the 2016 Prototype is visually distinct and no longer holds the ‘Half Moon’ designation. Technically, the version we tried was Touch ‘Engineering Sample CO6AC’. First and foremost, we can see that the Touch 2016 Prototype has had its IR-LEDs (part of the tracking system) covered over with IR-transparent plastic for a sleeker look. The rest of the shape has been tweaked slightly, most noticeably on the triggers and handles, which are more rounded. The biggest changes come to the thumbsticks and button layout. Although not included on the initial Half Moon prototypes, later Half Moon variants would see an ‘Oculus’ button included on the controller, and this has carried over to the 2016 Prototype (it will be used to access the Oculus Home menu). The Oculus button, along with the A/B/X/Y buttons (A/B on the right controller, X/Y on the left), have been scooted aside to make way for a small patch of tactile bumps.
The bumps seem to serve as an indicator of the intended default position of your thumb, whereas the thumb’s resting position on Half Moon was actually on the buttons themselves. My guess is that Oculus opted to move the thumb’s resting position away from the buttons to prevent people from accidentally pressing them when using the controller’s ‘hand-trigger’ to grip objects (as the thumb is a natural part of the gripping gesture). While the buttons, triggers, and thumbsticks on Touch are capacitive (touch-sensitive) to aid in posing the user’s in-game hand, it isn’t clear to me yet if the tactile area will be capacitive as well. The resting angle of the thumbstick on the Touch 2016 Prototype has been tweaked slightly compared to Half Moon, as has its height. This seems to have made it somewhat easier to achieve a thumbstick ‘click’ when the stick is tilted. See Also: Preview – ‘Dead & Buried’ Action Packed Multiplayer Could be the Killer App Oculus Touch Needs When touching the 2016 Prototypes to each other, I could feel a magnetic attraction between the two controllers at the inside point where the tracking ring connects to the controller’s face. My best guess is that the magnetism has to do with the controller’s haptics, which may lend further support to the idea that Touch uses a linear actuator for haptics rather than the usual ERM motor that produces the rumble in many gamepads. The haptics themselves seemed perhaps more powerful than before, possibly due to a change in position of the haptic components (which might explain why I quickly noticed the magnetism). Alternatively, it could be that haptics were simply better utilized compared to when Half Moon first went out the door and developers were still learning the best ways to use the feature. See Also: Including Controllers, Vive and Rift Could be Evenly Matched on Price At GDC 2016 we also got a peek at a box in which Oculus appears to be distributing Touch Half Moon Prototypes to developers. 
With styling akin to that of the consumer Rift case, we imagine Touch will eventually ship in something similar. Although the company is distributing the device to select developers, they don’t intend to launch an open dev kit. To my hands the Touch 2016 Prototype still has class-leading ergonomics (even compared to the Vive’s newest VR controllers) and the tracking works as well as ever. The ‘hand-trigger’ in particular is an exceptionally well executed idea, affording users an intuitive ‘grab’ function while leaving their trigger finger and thumb free for further interaction. The HTC Vive controllers of course have a similar grab button along their length but its placement doesn’t lend itself to being continuously held as a virtual ‘grab’ while still allowing natural use of the controller’s remaining buttons. Oculus plans to release their Touch VR controllers in the second half of 2016. The cost of the controllers (and additional tracking sensor) is still unknown.
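As an aside on how capacitive posing works in principle: each touch-sensitive surface only reports whether a finger is resting on it, and software infers a plausible hand pose from those states. The sketch below is purely hypothetical (invented names and logic), not Oculus' actual SDK behavior:

```python
# Hypothetical illustration of capacitive hand posing. Each sensor
# reports only touch/no-touch; the app maps those states to a pose.
# This is NOT the Oculus SDK; names and logic are invented for clarity.
def infer_hand_pose(thumb_on_rest, index_on_trigger, hand_trigger_held):
    return {
        # thumb lifted off its resting area reads as a "thumbs up"
        "thumb": "down" if thumb_on_rest else "up",
        # index lifted off the trigger reads as "pointing"
        "index": "curled" if index_on_trigger else "pointing",
        # the hand-trigger drives the grip, leaving thumb and index free
        "grip": "closed" if hand_trigger_held else "open",
    }

# gripping an object while pointing with the index finger
print(infer_hand_pose(thumb_on_rest=True, index_on_trigger=False,
                      hand_trigger_held=True))
```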

    The post Hands-on: Oculus Touch 2016 Prototype Brings Refinements to an Already Elegant Design appeared first on Road to VR.



    At GDC 2016 we got to take a look at the latest STEM VR controllers from Sixense. With units purportedly on track to begin shipping next month, the company is also committing resources to create a promising cross-platform multiplayer experience called Siege VR which could turn into something much bigger than a mere demo. Throughout STEM’s unfortunately troubled development timeline, one thing has surprised us: Sixense has always had highly functioning and fairly well polished demo experiences for their motion controllers. That probably comes with the territory; after all, the company designed the tech behind the Razer Hydra, which hit the market years before motion controllers like Oculus Touch and the Vive controllers had even been announced. They also created the Portal 2: In Motion DLC, which brought 20 new levels specifically built for motion controllers to the game. So I suppose it shouldn’t be surprising after all that the many little tech demos they’ve made over the years to show off motion input mechanics have felt ahead of their time (see this one from 2014). With that in mind, it was great news to hear at GDC 2016 last week that the company not only plans to finally ship the first STEM units to backers of their 2013 Kickstarter campaign, but is also developing a new game called Siege VR that’s going to have cross-platform support among headsets and motion controllers. I Don’t Care What Platform It’s on, I Just Want to Play Siege VR Unlike many of the short STEM experiences we’ve seen from Sixense over the years, Siege VR is more than a demo. What we saw at GDC was a prototype of what the company says will become a full game, which will be free to all backers of the STEM Kickstarter and will also be made available for the Oculus Rift, HTC Vive, and PlayStation VR (whether that’s using each platform’s own VR controllers or STEM).
Siege VR is a first-person multiplayer castle defense game which (in prototype form) had me and an ally wielding bows side-by-side, trying to stop hordes of enemies from reaching the castle gates. In addition to shooting regular arrows, there are two special arrow types: an explosive arrow (from the quiver seen on the left wall), which when ignited by a nearby torch explodes on impact; and an artillery arrow, which fires a smoke round that designates a target for your friendly catapult to fire upon. The former is great for taking out dangerous groups (including enemy archers who will aim for you directly) and the latter works against enemy catapults. The special arrow types regenerate over time, but you’ll want to use them sparingly, especially as the artillery arrow stockpile is shared between both players. The game is still very early, but the creators say they’re considering having many more than two players all working together to defend the castle. Perhaps—they conjectured—there would be teammates at more forward towers who could aim back at the castle to prevent enemies from scaling the walls, while the players on the wall itself would be focused on enemies in the field. Maybe—it was suggested—it could be a multi-stage experience where, if the enemies break through the main gate, you and your team fall back to using melee weapons. Some earlier prototypes included the ability to pour buckets of boiling oil onto would-be castle crashers, though that and some other features were cut for the time being to add a bit of simplicity and polish for the GDC demo. Amid all of us excitedly chattering about ‘what about [insert super cool gameplay]? Or how about [more super cool gameplay]?’ it was clear that Siege VR could have legs well beyond a simple demo, and that’s where Sixense says they plan to take it.
Forgetting It’s There is a Good Thing As I played Siege VR using STEM with a Rift DK2, I got that wonderful feeling of forgetting about the technology and simply having fun playing the game. That means that everything was working together to make a fun and intuitive experience which kept me immersed. When I came out, the top of my mind was filled not with questions about STEM, but about the scope and potential of Siege VR. STEM itself was integral to getting us to the stage of talking not about limitations, but about possibilities for Siege VR. I’ve used the system at many points along its oft-delayed development, and while it’s always felt good, this time around it felt better than at any point in the past, even after using Touch and Vive controllers all week throughout the rest of GDC. For me, the thing that moved the needle most significantly was the headtracking performance. STEM has additional tracking modules which can be affixed to head and feet (or elsewhere, up to 10 tracked points). For their demos, Sixense often eschews the Rift’s own headtracking in favor of using a STEM tracking module. Having used the Rift plenty, I always felt like there was something a little ‘off’ about the STEM-based headtracking—whether it was latency or positional accuracy, I’m not quite sure. But this time around I actually had to ask whether they were using the Rift’s tracking camera or STEM: it was 100% STEM. I point to headtracking because it’s easier to tell when something isn’t right with your head than with your hands; slight drift of a few millimeters or inaccuracy on your hands can be very hard to spot. When the placement of your virtual eyes depends entirely on the tracking, though, it’s easy to feel when things aren’t working right. So if the headtracking was solid, the rest of STEM’s tracking must be solid too, as there’s no difference between tracking a module on your head and one on your foot.
Particularly in an electromagnetically dense setting—like, say, the middle of the GDC expo floor—which can mess with the magnetically-based tracking, it was impressive that the headtracking felt that good. In fact, Sixense’s booth had a number of STEM basestations scattered about; there was one just a few feet away from the one I was using, and I didn’t spot any interference-based tracking issues despite the competing magnetic fields. Sixense isn’t trying to quarantine itself from the competition either. The company had both the Oculus Rift and HTC Vive (with Vive controllers) in action at its booth, and says its SixenseVR SDK will allow developers to create games that are ready for any motion controllers, not just STEM. The SDK allows for “code-free setup” for humanoid characters in Unity and Unreal Engine, and provides a full body skeletal pose based on sensor data.

    The post Multiplayer ‘Siege VR’ Prototype Highlights Solid STEM Tracking Performance appeared first on Road to VR.



    Both Oculus and Valve have gone on the record to say that they’d be opening up their respective tracking systems for third-parties to make use of, but after a year, neither company is ready to talk specifics. So called ‘6DOF’ (degrees of freedom) tracking is critical to virtual reality. VR systems need to know where your head is and precisely how it’s moving through space in order to render a virtual world which moves around you as you would expect to see in real life. 6DOF tracking is also important for adding motion input to allow users to effortlessly interact with the virtual world. Between the top desktop platforms, two leading systems have emerged. Oculus’ ‘Constellation’ tracking uses an array of IR-LEDs tracked by an external camera while Valve/HTC’s ‘Lighthouse’ system uses an array of photodiodes to track lasers emitted from two base stations. But extending those tracking systems beyond the head and hands has a wide range of uses; tracked third-party peripherals could open up a world of new opportunities for VR interactivity. One major use-case is simply mirroring the virtual item—be it a bat, golf club, sword, etc.—to the real object that the player is holding. This enhances immersion because not only is the object shaped and held just as it would be in real life, but the user benefits from all the expected forces like weight, leverage, and momentum from the object’s mass. StrikerVR, for instance, is creating a VR gun peripheral which includes a powerful force feedback module so that it doesn’t just feel like the player is holding a gun, but it feels like the gun is actually firing when the trigger is pulled. Third-party controllers with features that go beyond the first-party offering would also be viable with access to an established tracking system. Tactical Haptics, for instance, is creating a VR controller with ‘Reactive Grip’, a unique haptic solution which can create feedback not possible with rumble alone. 
Not to mention other VR headsets: FOVE, a Japanese firm making a headset with integrated eye tracking, had to renege on a Kickstarter promise to use Lighthouse tracking, instead opting to build their own tracking system for the time being. But companies like these—which would prefer to tie their devices into the existing tracking systems rather than create a redundant tracking system of their own—have been stymied by a lack of communication from Oculus and Valve regarding their plans for third-party access to Constellation and Lighthouse. Both companies have gone on record to say that they plan to open up their tracking systems, but have been extremely tight-lipped about timelines and specifics. I checked in with both companies recently and was told that there were ‘no details to share at this time’.
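For reference, the '6DOF' data a tracking system hands off boils down to a position plus an orientation per tracked device: three positional and three rotational degrees of freedom. A minimal illustrative sketch (not any vendor's actual API; the class and field names are invented):

```python
# Illustrative only: the six degrees of freedom a tracking system
# reports -- 3 for position, 3 for rotation, with the rotation stored
# here as a unit quaternion (the usual representation). Not a real API.
import math
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    x: float = 0.0   # position in meters
    y: float = 0.0
    z: float = 0.0
    qw: float = 1.0  # orientation as a unit quaternion
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0

    def quat_norm(self):
        """A valid orientation quaternion should have norm 1.0."""
        return math.sqrt(self.qw**2 + self.qx**2 + self.qy**2 + self.qz**2)

# identity pose: at the origin, no rotation
p = Pose6DOF()
print(p.quat_norm())
```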

    The post One Year Later, Oculus and Valve Still Mum on Timeline to Open Tracking to Third-parties appeared first on Road to VR.



    Announced today at Google I/O 2016, VR team lead Clay Bavor revealed on stage that the company will be partnering with manufacturers to build Google-approved VR headsets based on an internally generated reference design. To boot, Google is also providing their new VR ecosystem with a handy Wiimote-style motion controller. Unlike Samsung Gear VR, which features a separate IMU from the phone itself, Daydream-ready phones will supposedly already be suitable for low-latency VR by natively providing sub-20 millisecond motion-to-photon latency. It’s unclear whether the Google-approved headsets will provide any benefit beyond good ergonomics. “There’s so many things you need to get right. It has to have great optics. It has to be comfortable. The materials need to feel good, and it needs to be really easy to put on and take off,” Bavor told the crowd. See Also: Samsung, HTC, LG, and More Bringing ‘Daydream Ready’ VR Phones to Android The newly revealed controller, which looks eerily similar to the Oculus Rift remote, provides a touchpad and clickable buttons that allow you to scroll and swipe through apps, and is presumably for light gaming applications. Google maintains that the new Daydream controller houses ‘orientation sensors’, but doesn’t specify if the controller is entirely IMU-based. The first headsets will be available starting fall 2016 from a number of as-yet-unspecified manufacturing partners. Daydream-ready smartphones will be produced by Samsung, HTC, LG, Mi, Huawei, ZTE, Asus and Alcatel. This story is breaking. Check back for more Google VR news shortly.
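For context on the sub-20 millisecond figure: motion-to-photon latency is the sum of every stage between a head movement and the updated pixels reaching your eyes. The stage timings below are hypothetical round numbers for illustration, not a breakdown published by Google:

```python
# Hypothetical motion-to-photon budget. Stage timings are illustrative
# round numbers chosen to sum under the 20 ms target; they are not
# figures published by Google.
budget_ms = {
    "IMU sampling & sensor fusion": 2.0,
    "app simulation": 2.0,
    "render & reprojection": 11.0,
    "display scanout & persistence": 4.0,
}
total_ms = sum(budget_ms.values())
print(f"total: {total_ms} ms (meets sub-20 ms target: {total_ms < 20.0})")
```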

    The post Google-Approved VR Headsets With Motion Controller Coming This Fall appeared first on Road to VR.



    Oculus Touch isn’t here for consumers just yet, as the company just narrowed their launch window down to Q4 of this year. But if you’re a developer, you might just get your hands on a Touch dev kit in the very near future. In an apparent bid to win back consumer confidence in the face of the Rift’s initial “unexpected component shortage” back during the March launch, which saw several months of delays as a result, Oculus today said in a blog post that the company is “on track to launch Touch and introduce true hand presence along with an amazing lineup of games and experiences later this year.” Though the Touch dev kits won’t be open for sale like the company’s first two headset dev kits, Oculus CEO Brendan Iribe tweeted this week that “thousands of Touch dev kits are going to developers now.” Thousands of Touch dev kits are going to developers now. — Brendan Iribe (@brendaniribe) July 12, 2016 With a healthy dose of skepticism, one Twitter user asked if Iribe was certain about the announcement, and if Oculus really was sending the company’s natural input device to so many developers. Iribe’s response? Over 5,000 are headed to devs. @deepnoizer well over 5,000 will be sent out to developers before the consumer version ships. — Brendan Iribe (@brendaniribe) July 12, 2016 And with Oculus Connect, the company’s annual developer conference, coming at the beginning of Q4 (October 5-7th), we’re sure to hear more about Touch then. We’re crossing our fingers for Rift/Touch bundles, a definite price & release date, and the full list of 30+ Touch launch titles.

    The post Oculus CEO: “Well over 5,000” Touch Dev Kits Shipping to Developers Before Launch appeared first on Road to VR.



    Tactical Haptics, creators of the ‘Reactive Grip’ haptic technology, have adapted their latest prototype for use with the Oculus Touch VR controllers. Reactive Grip is a novel haptic feedback technology that’s unlike anything you’ll find in modern-day controllers, which by and large rely on ERM motors or linear actuators to provide a rumbling sensation. Rather than rumble, Reactive Grip uses sliding bars positioned around the controller’s handle which put pressure on your hands to simulate an object moving within them. For certain use-cases, the effect can be very convincing. This (now quite dated) video does a good job of showing how it works: To show off their latest prototype, Tactical Haptics adapted it for the Oculus Touch controllers and took them to yesterday’s SVVR Meetup at the NVIDIA campus in Santa Clara, CA. There they let attendees try the controller and the latest demos. While the makeshift mounting makes the controller quite tall, Tactical Haptics founder Will Provancher says that it works quite well because of the Touch controller’s light weight. When I last got my hands on the Reactive Grip feedback, the company had adapted their prototype for the HTC Vive controllers. It was then that I concluded, “Forced to choose between the two, I’d easily pick Reactive Grip over rumble, but ultimately the two complement one another, especially if the rumble comes from the more modern linear actuator approach which is great for subtle clicking and tapping effects as well as the usual ‘dumb’ rumble that we associate with the ERM motor rumble common in modern-day gamepads.” The end goal for Tactical Haptics is of course to have a single, sleek integrated controller that does both tracking and haptics in one, rather than having to mount a VR controller to the haptic controller.
That should be possible once Oculus and Valve finally open up their tracking solutions to third parties, but at last check it seems both companies are content to take their time.

    The post Tactical Haptics Adapts Prototype Haptic Controller for Oculus Touch appeared first on Road to VR.



    Striker VR’s ARENA Infinity is a haptic VR gun which can simulate various weapon fire modes and other haptic effects. After revealing the design of the accessory back in April, the company is now showing off a working prototype. Based on the awesome retro-futuristic design by Edon Guraziu, Striker VR is showing off the first working prototype of the Arena Infinity. Aimed at the Digital Out-of-Home VR sector, the wireless peripheral has on-board haptics based on a linear actuator. The gun is capable of an impressively powerful kick, especially for an electronic system, adding a convincing recoil to firing a virtual weapon. The haptic engine in the gun can give feedback for the usual single, burst, and full-auto firing modes, but can also be used for other effects, like a sci-fi railgun that needs to be charged before firing (shown in the video at 0:28), or a chainsaw for hacking zombies apart (0:10). The Arena Infinity prototype is currently using a temporarily affixed tracker, but the company plans to provide formal support for several tracking systems, giving location-based VR firms a choice of which tracking system is best suited to their use. The company says the Arena Infinity currently supports PhaseSpace and Sixense STEM tracking, and is also aiming to integrate Valve’s Lighthouse, Oculus’ Constellation, and PlayStation’s Move tracking systems. Striker VR says that the first Arena Infinity development kits will be delivered to select partners in Q4 2016, and will include the haptic gun, SDK, and a haptic sandbox range as an SDK sample. The company says broader delivery of the development kit will come “soon after” the initial rollout. While Striker VR hasn’t announced a consumer-facing version of their haptic VR accessory, they tease, “the Arena Infinity is a first step to a broad solution aimed at peripherals that are easily attached to the virtual environment and afford users an infinite array of possibilities.”

    The post Striker VR Shows off Working Prototype of ARENA Infinity Haptic VR Gun appeared first on Road to VR.



    Oculus revealed Touch, their VR motion controllers, all the way back in June of 2015, but didn’t commit to shipping the controllers until the second half of 2016. Since the announcement, the controller has seen several design permutations, but Oculus says the controller is nearly complete, and better than ever. At Gamescom 2016 this week in Germany, Jason Rubin, Oculus Head of Content, told us that the company’s much anticipated Touch VR controllers have improved over time and are nearly complete. He confirmed that the version we saw at E3 was the near-final iteration. “…at E3 we shipped a new version of Touch that has better tracking, greater distance from the sensors,” he said. “It’s pretty much the final iteration. We’re doing little tweaks always, but we’re pretty much there.” Exactly how much the controller has improved since the first ‘Half Moon’ prototype was revealed back in 2015 is unclear, but Oculus has insisted from the beginning that the controllers are capable of ‘room-scale’ tracking like that of the HTC Vive, even if the company isn’t focusing on that type of usage out of the gate. As for the release date of Touch (which we know will come before the end of the year) and its price, Rubin is still quiet, but teased, “Oculus Connect 3… that would be a great place to make some sort of announcement.” (Don’t worry, we’ll be there to find out). When we tried the latest version of Oculus Touch at E3 2016, we found subtle changes to the controller’s button placements and design. The controller’s ergonomics, slightly refined from earlier prototypes, are still impressive and class-leading by most accounts. This seems to be due not only to the shape of the controller, but also its size and close center of gravity.
Earlier this year, Oculus confirmed that Touch would be bundled with Medium—the company’s virtual sculpting tool—and see more than 30 games at launch, with “hundreds of additional Touch titles in development…” Additional reporting by Scott Hayden.

    The post Latest Version of Touch has Better Tracking & Longer Range, Says Oculus appeared first on Road to VR.

