
Nimble Sense Nears End of Kickstarter Campaign, Reaches First $120,000 Stretch Goal


Nimble Sense, the hand tracking controller designed to work with head mounted displays, has surpassed its first stretch goal of $120,000, nearly doubling its base funding goal of $62,500. With only a few days left until the crowdfunding campaign closes, the team behind the mountable sensor are undoubtedly rubbing their virtual hands together in anticipation of what comes next after such a successful Kickstarter — and for backers, that means getting a few more dials to play with that could substantially widen the device’s functionality.

See Also: Nimble Sense Kickstarter Aims to Bring Time-of-Flight Depth-sensing Tech to VR on the Cheap

Nimble Sense, a device not much larger than a pack of gum, houses an IR (infrared) laser and IR depth camera originally designed to sense object movement at a default range of 2.3 ft (70 cm), or about the distance from the DK2’s onboard sensor array to the hand of a fully extended arm.

Now that the Kickstarter campaign has eked past its stretch goal, Nimble VR is expanding the device’s functionality (without overclocking it) and opening up the camera’s settings to make it “a hacker/developer/computer vision-friendly camera.”


To keep within nominal operating levels however (i.e. not lighting your head on fire), they’ve opted to keep the same power settings. To accommodate this limiting factor, they’re trading a lower frame rate for a longer camera exposure time, which effectively gives the device a longer viewing range. The team plans on supporting three specific ranges: 2.3 ft/70 cm, 3.25 ft/100 cm, and 4.92 ft/150 cm — most likely pushing the little USB device’s capability to the very edge of usability.


Along with increased range, they will be supporting low-level camera controls with frame rates ranging from 1Hz to 45Hz, and shutter speed controls ranging from 0.1ms to 4ms exposures, potentially allowing developers to use the depth-sensing device for a wide variety of applications not limited to its main feature of VR-centric skeletal hand tracking. It’s an effort by Nimble VR to coax potential developers into using Nimble Sense to do things like guide wheeled robots, track props, or even the user’s feet (because … feet… that’s why).
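To make the exposure-versus-frame-rate tradeoff concrete, here is a minimal sketch of how such settings might interact. The setting names, the illumination budget, and the distance-squared assumption are all illustrative, not taken from any published Nimble VR interface:

```python
# Illustrative sketch only -- these setting names and numbers are assumptions,
# not the actual Nimble Sense firmware interface. With laser power held constant,
# a longer exposure gathers more returned IR light (extending range) but caps the
# achievable frame rate.

MAX_FRAME_RATE_HZ = 45.0      # advertised upper limit of the low-level controls
MIN_FRAME_RATE_HZ = 1.0
ILLUMINATION_BUDGET = 60.0    # hypothetical exposure_ms * frame_rate_hz the laser can sustain

def settings_for_range(range_cm):
    """Pick an exposure long enough for the requested range, then cap the frame rate."""
    # Assume the required exposure grows with the square of distance, since returned
    # IR light falls off roughly with distance squared (1 ms at the default 70 cm).
    exposure_ms = 1.0 * (range_cm / 70.0) ** 2
    exposure_ms = min(max(exposure_ms, 0.1), 4.0)             # clamp to the 0.1-4 ms window
    frame_rate = min(MAX_FRAME_RATE_HZ, ILLUMINATION_BUDGET / exposure_ms)
    frame_rate = max(frame_rate, MIN_FRAME_RATE_HZ)
    return {"range_cm": range_cm, "exposure_ms": round(exposure_ms, 2),
            "frame_rate_hz": round(frame_rate, 1)}

for r in (70, 100, 150):      # the three ranges Nimble VR plans to support
    print(settings_for_range(r))
```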

To send the Kickstarter even further into high gear, Nimble VR has added 250 more spots to their level 1 pledge tier which starts at $99 — getting you a Nimble Sense, a 7ft USB cable, and a DK2 mount that cleverly utilizes the DK2’s native cable cover.

It does seem that, despite being a relative newcomer to the scene, Nimble VR is all set to give Leap Motion a run for its money. All they need now is some sort of 3D Jam to find some killer content.

The post Nimble Sense Nears End of Kickstarter Campaign, Reaches First $120,000 Stretch Goal appeared first on Road to VR.


CES 2015: Hands-On with ‘3D Rudder’ the Motion Controller for your Feet


CES is officially underway, and at the press preview event ‘CES Unveiled’ last night I got the chance to get my hands (and feet) on the new 3D Rudder motion controller.

It always amazes me that a place as big as Las Vegas can host events that feel so short of space. CES Unveiled is a special press event designed to introduce up and coming technologies to the hungry journalist horde. Occupying one of the Mandalay Bay hotel’s ballrooms, it was standing room only in places. Nevertheless, we’re entirely fearless here at Road to VR and I managed to track down two intriguing devices which our readers may find interesting.

3D Rudder – The 3DoF Motion Controller for your Feet

I might well have missed 3D Rudder had their founder and CEO Stanislas Chesnais not overheard me talking and tapped me on the shoulder. The Marseilles-based startup were tucked away but had managed to secure enough space to demonstrate their new motion controller – designed to be used with your feet.

See Also: Inside Look at 3DRudder, Feet-Controlled Navigation Device Headed to CES 2015

The device is essentially a disc with a soft semi-sphere underneath which allows it to pivot. It has two pressure-sensitive areas for your feet that can detect whether your feet are resting on the device or lifted off.

You spin the device for yaw, tilt the device forwards and back for pitch, and tilt left to right for roll. The device can map to keyboard presses and/or joystick axes using software provided by 3D Rudder. Additional input can be attained by actions such as lifting one foot off the device. I got the chance to sit down with the controller and (almost literally) give it a whirl. It took a little getting used to, but after the system was calibrated I found it to be surprisingly precise.

Rotating 3D Rudder rotates your view, with tilting forwards and backwards moving you in and out of the scene. As mentioned, pressure sensors under your feet detect lift-off, so resting your left foot on its heel and your right foot on its ball moves the camera up and down on the Y axis. Remembering all of those combinations means there is definitely a learning curve involved, but I was beginning to get the hang of things towards the end of my short play-test.
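As a rough illustration of how such a control scheme might be wired up (the 3D Rudder SDK wasn’t available to us, so the function names, dead-zone, and tilt limits below are assumptions):

```python
# Hypothetical mapping of 3D Rudder-style readings to normalized game axes.
# Names, dead-zone, and tilt limits are assumptions, not the real 3D Rudder SDK.

def axis_from_angle(angle_deg, dead_zone_deg=3.0, max_tilt_deg=15.0):
    """Map a tilt angle to a -1..1 axis value, with a resting dead-zone in the middle."""
    if abs(angle_deg) < dead_zone_deg:
        return 0.0                                   # feet at rest -> no movement
    sign = 1.0 if angle_deg > 0 else -1.0
    return sign * min((abs(angle_deg) - dead_zone_deg) / (max_tilt_deg - dead_zone_deg), 1.0)

def map_rudder(pitch_deg, roll_deg, yaw_deg, heel_toe_offset):
    return {
        "forward": axis_from_angle(pitch_deg),       # tilt forward/back -> move in and out
        "strafe":  axis_from_angle(roll_deg),        # tilt left/right   -> roll or strafe
        "turn":    axis_from_angle(yaw_deg),         # spin the disc     -> rotate the view
        "vertical": heel_toe_offset,                 # heel/ball pressure difference -> Y axis
    }

print(map_rudder(pitch_deg=8.0, roll_deg=-2.0, yaw_deg=12.0, heel_toe_offset=0.5))
```

A generous dead-zone like the one sketched here is exactly the kind of software tweak that would address the resting-position issue mentioned below.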

Chesnais told me the device sports 3DoF capabilities and is fitted with a sensor array comprising gyros and a compass/magnetometer.

One of the issues I had, one which I suspect would be easily tweak-able in software, is finding the dead-zone, a resting position for your feet; it wasn’t always easy for me to find during the demo. The demo used at the event was a CAD or 3D modelling package, but the 3D Rudder will have an SDK available for native integration with the device.

Nick from UploadVR gives 3D Rudder a spin

As with most controllers aimed at offering the ability to orient and control 3D space, I think this native integration will be key to 3D Rudder’s success as a compelling controller. As ever, it’s all about the software, a view Chesnais concurred with. He was keen to emphasise the effort the company is putting into this area.

If you’re interested in supporting the 3D Rudder you can snag yourself an early bird unit via their Indiegogo campaign for $110. The campaign runs until January 18th and has so far raised around $16k.

3D Rudder Indiegogo Campaign

The post CES 2015: Hands-On with ‘3D Rudder’ the Motion Controller for your Feet appeared first on Road to VR.

CES 2015: Sixense Shows off Distortion Correction with STEM Motion Controller (video)



Sixense, creators of the tech behind the Razer Hydra, and the upcoming STEM motion controller, gave us a demo showing how STEM’s new distortion correction method can make a huge difference in environments with magnetic distortion.

In the early days of the first Oculus Rift development kit, the DK1, it was easy to find a fun VR demo that would make use of the Razer Hydra, an accurate motion input controller that uses magnetic tracking technology. And while the Hydra gets less use these days as developers wait excitedly for the next-gen STEM system, it seemed like the Hydra was either loved or hated depending upon who you talked to.

See Also: VRClay Beta Hands-on, the 3D Sculpting Software for VR

For every person who, like me, was in the ‘love’ camp on account of the unit’s impressively accurate, drift-free tracking, it seemed there would be someone who complained of inaccuracies. The issue, in many cases it seems, was blamed on magnetic interference from surrounding electrical and metallic objects. Such objects can distort the magnetic field employed by magnetic tracking devices like the Hydra.

With the forthcoming STEM system, Sixense has made an upgrade that appears to cut through distortion like a hot knife through butter. Sixense’s Danny Woodall showed us the difference in tracking with and without distortion correction.

From what I understand, thanks to the addition of an on-board IMU, the controller consistently knows the gravity vector (which direction is down). This gives it a reference point to compare against. When the magnetic tracking suggests that the unit is tilting up or down (in this case, due to magnetic distortion), the IMU functions as a double check. When the two agree, tilting gets the green light. When they disagree, the known downward direction from the IMU is used as a reference to compensate.
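Sixense hasn’t published its exact algorithm, but the cross-check described above can be sketched roughly like this; the disagreement threshold and blending scheme here are assumptions for illustration only:

```python
import numpy as np

# Assumed illustration of an IMU gravity cross-check -- not Sixense's actual algorithm.

def corrected_down(magnetic_down, imu_down, agree_threshold_deg=5.0):
    """Trust the magnetic 'down' estimate unless it disagrees with the IMU's gravity vector."""
    m = magnetic_down / np.linalg.norm(magnetic_down)
    g = imu_down / np.linalg.norm(imu_down)
    angle_deg = np.degrees(np.arccos(np.clip(np.dot(m, g), -1.0, 1.0)))
    if angle_deg <= agree_threshold_deg:
        return m                                     # the two agree: tilting gets the green light
    # Disagreement suggests field distortion: blend toward the IMU's known downward direction.
    blend = min((angle_deg - agree_threshold_deg) / 20.0, 1.0)
    corrected = (1.0 - blend) * m + blend * g
    return corrected / np.linalg.norm(corrected)

# Magnetic tracking claims a 15-degree tilt while the IMU says the controller is level.
magnetic = np.array([np.sin(np.radians(15.0)), 0.0, np.cos(np.radians(15.0))])
imu = np.array([0.0, 0.0, 1.0])
print(corrected_down(magnetic, imu))
```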


Regarding shipping, Sixense’s Danny Woodall says that “by the end of the month we should be shipping out all of the prototype units. And then right after that we’ll be starting to push out all the Kickstarter units to everyone else that’s pledged. I think probably early March hopefully most people will start having systems show up at their doorsteps.”

We’ll be looking forward to seeing how STEM’s distortion correction works across a number of varying environments as Kickstarter backers start to get their hands on the system.

The post CES 2015: Sixense Shows off Distortion Correction with STEM Motion Controller (video) appeared first on Road to VR.

Oculus Rift VR Experiences Take Top Spots in Leap Motion ‘3D Jam’ Contest



Leap Motion’s ‘3D Jam’ dev contest, presented by the independent games festival IndieCade, is now officially over and the results are in. Among the 20 semi-finalists, who were awarded over $75,000 in prizes by a jury composed of Leap Motion staff and members of the IndieCade community, only two teams have received a spot at this year’s IndieCade East festival taking place in New York City on February 13-15.

Drum roll please…

1st Place: Aboard the Lookinglass

In first place is Aboard the Lookinglass, chosen for its “sci-fi atmosphere, mind-expanding puzzles, and unique game mechanics.” Developer Henry Hoffman has nabbed $5,000 in cash and up to $2,500 applicable to travel costs so he can make his way to the IndieCade East event in New York City where he will be showing off his game.

Aboard the Lookinglass is a science fiction game that applies the hand-tracking capabilities of the Leap Motion Controller in a surprisingly novel way. Instead of requiring the pin-point accuracy needed for manipulating in-game objects, Hoffman’s demo turns your hands into a pair of semi-transparent portals: the left looks into the past, and the right looks into a disconcerting future—with you stuck somewhere in the middle with the abstract objective of transporting items through the two.

2nd Place: Weightless

In second place is Weightless, a zen-like experience based in the observation deck of an orbiting space station. The demo was created by Martin Schubert, who took home $3,000 (and up to $2,500 in travel costs to IndieCade East), earning the silver medal for “its beautiful design aesthetic and overall experience.”

The piano stylings of Chris Zabriskie add to the serenity of the game, which puts you in a microgravity environment along with a scattered jumble of everyday items that float gently across the backdrop of planet Earth. You can lazily swat the lifelike cloud of bits and bobs around the cabin, and toggle different gravitational powers. The lack of traditional game objectives aside, Schubert’s Weightless is a welcome repose following the bustle of daily life—or in our case—the hectic goings-on of CES 2015.

3rd Place: World of Comenius

Third place goes to World of Comenius, giving us a taste of what VR-assisted education may look like in the near future. Game developer Tomáš “Frooxius” Mariančík, also known for his game SightLine: The Chair, has netted himself a cool $2,000 with the game that translates complex concepts into visual and interactive experiences.

See Also: Why ‘Sightline: The Chair’ on the DK2 is My New VR Reference Demo

In an interview with Mariančík, he told us that World of Comenius “aims to utilize VR to show people things that weren’t possible before: play around with atoms and get intuitive ‘feel’ of their behavior on the quantum level, swim in the cell or meet with people from history and explore the environment they lived in, while having [the] feeling that they’re actually in there.”

Although not a requirement, all three winning entries have built their games to support Oculus Rift DK2—a testament to the transformative power of VR, and perhaps where Leap Motion sees their technology heading.

Each of the top 20 semi-finalists, along with three Honorable Mentions from the pool of 156 contestants, will be receiving cash prizes.

The Top 20 Semi-Finalists (with download links)

  1. Aboard the Lookinglass
  2. Weightless
  3. World of Comenius
  4. Otherworld
  5. Magicraft
  6. ElementL: Ghost Story
  7. Tran;section
  8. Hollow
  9. Soundscape VR
  10. Press Bird To Play
  11. How Does That Move?
  12. Gooze
  13. Paper Plane
  14. Q
  15. The Crow
  16. Hauhet
  17. Let’s Make Fried Rice
  18. Corridor 17
  19. Deify
  20. Observatorium

Honorable Mentions: $500 Prize Tier

  • Coolest (and Most Complex) Hardware Setup – Dualcyon

All finalists will be included in the Leap Motion App Store and will also receive a free Leap Motion Controller for each team member.

The post Oculus Rift VR Experiences Take Top Spots in Leap Motion ‘3D Jam’ Contest appeared first on Road to VR.

CES 2015: Leap Motion Co-Founders Talk ‘Dragonfly’ Made-for-VR Motion Input Camera



At CES 2015 last week, we met up with Leap Motion co-founders Michael Buckwald and David Holz to try their latest prototype motion input camera, codenamed ‘Dragonfly’, and chat with the duo to learn more about their plans for making gesture input part of every VR experience.

Leap Motion VR Developer Mount on HMD
The current Leap Motion (‘Peripheral’ as its creators call it) has a dedicated mount for VR headsets. Going forward, the company hopes to build their motion input camera directly into VR headsets.

Leap Motion is a low-cost motion input camera that was announced in May 2012, a few months before the original Oculus Rift Kickstarter, with an initial aim toward desktop applications. In August 2014, the company ‘Set Course‘ for virtual reality, releasing a ‘VR Mount’ which conveniently attaches the sensor to the front of any VR headset. At the same time, the company teased a prototype version of their gesture camera called ‘Dragonfly’, which aims to be “embedded by VR OEMs,” with “greater-than-HD image resolution, color and infrared imagery, and a significantly larger field of view.”

See Also: Leap Motion’s Next-gen ‘Dragonfly’ Sensor is Designed for VR Headsets

At CES 2015 last week, we got to try Dragonfly for ourselves and learn more about the prototype sensor and its aspirations from the company’s co-founders, Michael Buckwald and David Holz.

The Dragonfly sensor prototype we saw at CES was a naked circuit board sporting two cameras, attached to an Oculus Rift DK2. Buckwald and Holz told us the made-for-VR Dragonfly differs from the existing Leap Motion device (which they call the “Peripheral”) in several ways: higher resolution (they said “better than native” on a 2.5k display, but wouldn’t specify further), a camera separation set to approximately the human average IPD of 64mm (compared to the Peripheral’s 40mm), a wider field of view and higher framerate, and both IR and RGB (color) sensors (whereas the Peripheral only has IR).

See Also: Phase Between the Real and Virtual World With Leap Motion and a Swipe of Your Hand

I tried the Peripheral and the new Dragonfly sensor back-to-back mounted on a DK2. The increase in resolution was noticeable in both IR and RGB modes. The IR mode on Dragonfly actually looked better in terms of resolution than the RGB mode due to quite a bit of graininess on the latter; Buckwald and Holz told me this was because they haven’t yet implemented any image processing, as is typical with digital cameras, to clean up the noise and correct other artifacts. They said such filtering is planned to be added soon.

Having used the low field of view monoscopic passthrough camera on Gear VR, which felt like looking through a small 2D window into the real world, I found that Dragonfly’s passthrough view filled the entire Rift screen and provided depth perception thanks to the dual cameras, making it easy to reach out and grab objects.

Holz walked me through a series of alternative views, first removing one of the two cameras (effectively removing my sense of depth), then bringing the field of view down to that of a standard cell phone camera (which brought the outside world back to that small window I was used to with Gear VR). From there he re-enabled the second camera, giving me a small 3D window, then jumped back up to a vision-filling field of view. The whole process was very interesting and showed how much more usable a passthrough camera can be with a wide field of view and stereoscopic imagery.

Holz also demonstrated some augmented reality functionality with Dragonfly by loading floating spheres in the environment surrounding me; I could swat my hands through the air to disrupt and move the spheres, but more interestingly, I could move my real hand behind the virtual sphere, and the sphere would occlude it. This may not sound like a big deal, but it’s actually a rather challenging problem for augmented reality, and one that’s really neat to see done well in practice.
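Occlusion like this generally comes down to a per-pixel depth comparison between the real scene (from the depth sensor) and the rendered virtual content. A simplified sketch of that idea, using made-up buffers rather than anything from Leap Motion’s SDK:

```python
import numpy as np

# Simplified per-pixel occlusion sketch; the buffers and layout are assumptions,
# not Leap Motion's actual compositing pipeline.

def composite(camera_rgb, camera_depth_m, virtual_rgb, virtual_depth_m):
    """Show a virtual pixel only where it is closer to the viewer than the real scene."""
    virtual_wins = virtual_depth_m < camera_depth_m
    out = camera_rgb.copy()
    out[virtual_wins] = virtual_rgb[virtual_wins]
    return out

# Toy 2x2 frame: a virtual sphere at 0.5 m, with the real hand at 0.3 m in one pixel.
camera_rgb    = np.zeros((2, 2, 3), dtype=np.uint8)        # passthrough image (black here)
camera_depth  = np.array([[0.3, 2.0], [2.0, 2.0]])         # metres to the nearest real surface
virtual_rgb   = np.full((2, 2, 3), 255, dtype=np.uint8)    # the rendered sphere (white)
virtual_depth = np.full((2, 2), 0.5)
print(composite(camera_rgb, camera_depth, virtual_rgb, virtual_depth)[:, :, 0])
# Top-left stays black: the real hand (0.3 m) occludes the virtual sphere (0.5 m) at that pixel.
```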

As we talked with Buckwald and Holz, it became clear that the company’s ultimate goal is to have Dragonfly embedded in every VR headset. They seemed less interested in packaging up Dragonfly into a gen 2 Peripheral, but said they’d consider it if necessary. The duo also reinforced that the software is a huge component of Leap Motion’s functionality and strategy; even if the hardware stays the same, the software continues to improve, they said.

At the end of our interview, Buckwald said that Dragonfly is “not an ending, it’s a beginning,” and teased that in the next few weeks we’d hear announcements from partners that will be incorporating new Leap sensors into their devices.

The post CES 2015: Leap Motion Co-Founders Talk ‘Dragonfly’ Made-for-VR Motion Input Camera appeared first on Road to VR.

Tactical Haptics’ Will Provancher Talks Updated ‘Reactive Grip’ and We Go Hands-on


As Ben Lang says in his interview introduction, Will Provancher is one of the OGs of virtual reality. Tactical Haptics’ ‘Reactive Grip’ controller was one of the few peripherals shown back at GDC 2013 that aimed to enhance immersion compared with standard input devices. The Road to VR team caught up with Tactical Haptics again at GDC 2015 to chart the system’s progress over the last two years.

Tactical Haptics haven’t had the smoothest of rides since their GDC debut in 2013. Despite the company’s ‘Reactive Grip’ technology providing one of the biggest advancements to force feedback controller technology for some time, their Kickstarter faltered and they struggled to find a solid commercial footing with which to develop and market the device.

At GDC 2015 however, Will Provancher was once again optimistic about the company’s fortunes. Having won grants from NASA and the NSF (National Science Foundation), they’re currently in residence at StartX, a Stanford-affiliated non-profit organization set up to accelerate entrepreneurs “through experiential education.”

Tactical Haptics has also shipped controllers to developers as part of its closed beta programme, something that Provancher alludes to in the interview; “While we have been working on some further technology developments, what we’re really trying to do is work shoulder-to-shoulder with some content developers so that we have really great content, for when we go back out and do pre-sales.”

The company is also investigating different development paths, one of which is to enhance the technology into a more commercially attractive product that can break into the consumer space. “The NSF one is actually looking at adapting what we’re doing to try and find our minimal device, or our so-called MVP, Minimal Viable Product, for gaming.” The primary aim here is to streamline the Reactive Grip tech to be more attractive for inclusion in existing input devices or for licensing to companies like Sony, Microsoft etc.

The NASA grant is for R&D into how Reactive Grip can enhance tele-operation, the practice of controlling apparatus remotely at a distance – in NASA’s case, operating robots in space, for example. “Let’s say you’re an Astronaut. Every time you go out into space there’s risk … if I can have a tele-robot have the same level of dexterity, or at least approaching your level of dexterity, one: I can take advantage of your human experience … and two: by having those forces, I can not damage what’s in the environment…”

Road to VR’s Scott Hayden went hands-on with the latest prototype shown at GDC, here are his impressions.


I was blindly handed the latest prototype of Reactive Grip, a 3D printed affair resembling an un-hinged flight stick with three sliding plates on the grip section of the device. It was wired up to a brick-sized pack, which Provancher said would eventually disappear into the handle of the device in future iterations. I had heard the grip plates were supposed to do… something. Slide up and down? Just how that was going to translate into anything resembling virtual movement in the real world, I didn’t know for sure.
I cranked down on the grip, fingered the trigger and released a volley at a walking tin can of a robot in the DK2 demo I was currently playing. The device jerked in my hand, giving me a small but satisfying kick back. I used a Portal-esque gravity gun in the next demo and picked up four differently sized boxes, feeling the twist and a sensation that bordered on weight… that wasn’t exactly weight, but it wasn’t exactly not weight either. You might need to be a professor of mechanical engineering to explain it, which Provancher is. The third demo gave me the opportunity to set a fixed point and, like a rubber band, stretch away from the point farther and farther, feeling increasing force in my hand.
 
Reactive Grip is the first device to give me something, anything, in a landscape of devices that have so far proven to be nothing more than glorified rumble packs, and when it finally gets the spatial tracking it deserves, this is a device to watch out for in the new field of consumer haptics.

You can find more information on Tactical Haptics and their Reactive Grip technology at their website.

The post Tactical Haptics’ Will Provancher Talks Updated ‘Reactive Grip’ and We Go Hands-on appeared first on Road to VR.

‘Ground Control’ and ‘Stompz’, Two Natural VR Controllers for Your Feet


The VR community has been rattling its cage for intuitive input devices for quite a while now, and because nobody really knows exactly what that means yet, we’re entertaining anything and everything until we do. Here we take a look at two Kickstarter campaigns currently in progress addressing just that: ‘Stompz’ and ‘Ground Control’, two small form factor input solutions designed to be used with those dangly useless things hiding under your desk right now (your feet).

In the wake of Valve’s recently announced Lighthouse positional tracking system for use with the HTC Vive, the VR community has been set ablaze with the thought of ‘room scale’ interactions, a term describing SteamVR’s 15×15 ft area intended for positional tracking of headsets and controllers alike. But for many apartment dwellers, the dream of creating a ‘room scale’ holodeck is likely to end up closer to a holodesk in size. This is where clever peripherals come in, devices that you can store in a cabinet, or hide away in a shoebox for later use.

Ground Control


Ground Control wants to pry you from the WASD keyboard control scheme; its pedals act like a pair of 4-axis analog joysticks for your feet that you use to traverse a game. The experience is still seated, but if there isn’t a cordoned off VR room in your near future, a compact pair of devices like Ground Control might just fit the bill.

‘Ground Control’ Kickstarter Campaign

If Ground Control manages to reach its funding goal of $250,000 CAD ($198,000 USD), the number of VR and non-VR games that can be used with these foot controllers will be nearly limitless, as the device is recognized by Windows as a standard gamepad. Early bird backer tiers are still available at $210 CAD ($170 USD), with estimated shipping in January 2016, a time frame that will supposedly also see the release of the consumer version of HTC Vive headset.

Stompz


‘Stompz’ is another solution that’s telling us to “forget the keyboard,” billing itself as a universally connective foot tracker for both games and low-intensity fitness—and by that they mean jogging in a stationary position to simulate forward movement.

The makers of the device originally wanted to build an omni-directional treadmill like the Virtuix Omni or Cyberith Virtualizer, but then decided to ditch the heavy hardware for something more lightweight. The device in its present form is essentially a pair of wearable, programmable 9-axis IMUs that you strap to your feet, which communicate wirelessly over the ubiquitous 2.4 GHz radio band.
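Stompz hasn’t published its firmware or SDK, but one plausible way foot-worn IMUs can turn jogging in place into forward movement is simple footfall detection on the vertical accelerometer channel. The thresholds and names in this sketch are assumptions, purely for illustration:

```python
# Hedged sketch: one way a foot-worn IMU could turn jogging in place into forward motion.
# Stompz's actual approach is not public; thresholds and names below are assumptions.

class StepLocomotion:
    def __init__(self, impact_threshold_g=1.6, step_distance_m=0.7):
        self.impact_threshold_g = impact_threshold_g   # acceleration spike that counts as a footfall
        self.step_distance_m = step_distance_m         # forward distance granted per detected step
        self.in_impact = False

    def update(self, vertical_accel_g):
        """Feed one accelerometer sample; returns forward metres to move this frame."""
        if not self.in_impact and vertical_accel_g > self.impact_threshold_g:
            self.in_impact = True
            return self.step_distance_m                # footfall detected -> advance the player
        if self.in_impact and vertical_accel_g < 1.1:  # back near 1 g -> ready for the next step
            self.in_impact = False
        return 0.0

loco = StepLocomotion()
samples = [1.0, 1.2, 1.9, 2.3, 1.4, 1.0, 1.8, 1.0]     # two simulated footfalls
print(sum(loco.update(a) for a in samples), "metres forward")
```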

‘Stompz’ Kickstarter Campaign

Stompz’ basestation plugs into any VR headset with a USB port that accepts input devices, which includes the Oculus DK2, Razer OSVR, and Android or jailbroken iOS phones via a micro USB dongle. Since there’s no obvious reason you would actually want to plug the KitKat-sized wireless USB receiver into a tethered VR headset, the basestation also works by plugging directly into Windows, Mac, and Linux-based computers.

Stompz is asking for $100,000 for the project, with the early bird tier at $115 USD for two trackers, the required straps, and the USB-stick basestation.

Even in the face of Valve’s ‘room scale experience’, pint-sized peripherals that let you stay seated, or require minimal floor space for operation, are coming, motorized chairs included. And until the major VR headset providers support a standard device for near-stationary locomotion, city folk, dorm room tenants, and anyone else living with limited space will be on the lookout for the device that hits that sweet spot.

The post ‘Ground Control’ and ‘Stompz’, Two Natural VR Controllers for Your Feet appeared first on Road to VR.

Yes, You Can Juggle in VR with the HTC Vive


When Alex Schwartz told people he could juggle in VR with the HTC Vive and SteamVR controllers, they didn’t believe him. But now he has video proof: not only can he juggle quite effectively with the system, he thinks that it would be an excellent way to teach the skill to beginners.

Alex Schwartz, Chief Scientist at Owlchemy Labs, told me that juggling is something he uses to zone out and clear his head. Having juggled now for 18 years, he must be able to achieve a zen-like state.

“Sometimes when my code is compiling I’ll grab my juggling equipment and use that as a little timewaster,” he told me. And now he can do it in VR. Although Owlchemy’s in-development Vive game, Job Simulator, wasn’t created specifically for juggling, Schwartz demonstrates his proficiency even despite the system’s restrictive cables. In the video above you can see the virtual action happening on the small monitor to the left.

The SteamVR controllers have wowed users with their accuracy.

“The controllers are so accurate and low latency that you can represent your physical dexterity in VR one-to-one,” Schwartz said. “So if I was able to throw a ping pong ball from 10 feet away into a small cup in real life, I could map that one-to-one in VR, as long as you have your physics engine correct.”

Not only can Schwartz juggle in VR, he thinks that it might be the very best place to teach and learn the ability. He recalls a time from his childhood; his elementary school actually had juggling as part of the curriculum.

“…they used to teach kids to juggle in gym, but they used scarves so they fell very slowly so that you can understand the pattern. You could actually kind of drag them through the air in a slow motion pattern and not have to deal with the weight and speed that gravity implies,” he said. “Juggling in VR is a great stopgap to learning juggling in real life because it’s really hard to slow down gravity in an elementary school gym, but in VR it’s trivial.”

In fact, a Juggling Simulator game (I’m starting to notice a trend here) has been on Schwartz’s mind. He thinks that the fidelity of the SteamVR controllers is so good that juggling skills learned in VR would easily translate to real life—and let’s not forget that physics are under our control in the virtual realm, making it easy to slow things down for beginners.

Alex Schwartz, Chief Scientist at Owlchemy Labs. Photo courtesy Lauren Ellis

“I have this theory that humans like to predict the trajectory of parabolas, as proven by the success of Angry Birds, which is really just about matching the ideal trajectory,” Schwartz told me. “That’s all juggling is—when it comes to down to each throw, you’re trying to match the exact height, width, and depth of a throw. When you describe it, it sounds really complicated… but if you just show a dotted line in VR, it would be simple to understand.”
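The ‘dotted line’ Schwartz describes is just sampled projectile motion, which also makes the ‘slow gravity for beginners’ idea easy to picture. A quick sketch, illustrative only and not from any Owlchemy project:

```python
# Illustrative only -- not from Job Simulator or any Owlchemy code.
# Sample a throw's parabola, p(t) = p0 + v0*t + 0.5*g*t^2, under adjustable gravity.

def throw_arc(pos, vel, gravity=(0.0, -9.81, 0.0), points=20, duration_s=1.0):
    arc = []
    for i in range(points + 1):
        t = duration_s * i / points
        arc.append(tuple(p + v * t + 0.5 * g * t * t for p, v, g in zip(pos, vel, gravity)))
    return arc

# Full-strength gravity vs. beginner gravity slowed to a quarter: same throw, gentler arc.
real_arc = throw_arc(pos=(0.0, 1.2, 0.0), vel=(0.3, 3.0, 0.0))
slow_arc = throw_arc(pos=(0.0, 1.2, 0.0), vel=(0.3, 3.0, 0.0), gravity=(0.0, -9.81 * 0.25, 0.0))
print(real_arc[-1][1], slow_arc[-1][1])   # height after one second differs with the gravity setting
```

Rendering the sampled points as a dotted guide line is the sort of thing Schwartz suggests would make each throw simple to understand.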

Schwartz has even thought about ways to curtail some common beginner’s mistakes, like not throwing quite perpendicular to the ground, causing novices to walk forward to make the catch. “You could actually train in VR for people to throw closer to their body by changing the simulation and trying to get them to compensate,” he said.

It’s clear Schwartz has thought a lot about making a juggling game, but whether or not he’ll embark on that quest is still up in the air. “The problem is that when you use the Vive and come out of it… I now have 20 or 30 games I want to make… it’s almost choice paralysis.”

The time for Juggling Simulator may be approaching however; the HTC Vive Developer Edition that’s soon to be in the hands of devs will come with wireless controllers, making it even easier to juggle. If Schwartz does end up working on such a title, I think the marketing path is clear: demonstrate the efficacy of the game by using it to learn flaming chainsaw juggling, then do it in real life for the first time in front of a live audience!

Lead photo courtesy Lauren Ellis

The post Yes, You Can Juggle in VR with the HTC Vive appeared first on Road to VR.


Video Preview: Manus Machina’s Wireless VR Glove Looks Promising, Headed to E3 2015


Manus Machina, a tech startup based in the Netherlands, have released a video preview of what they’re calling ‘the first consumer VR glove’ ahead of its appearance at E3 2015 where the device will quite literally be on hand for demonstrations.

The Dutch VR Meetup at Gamescom 2014 with an early prototype glove.

I first came across Manus Machina back in August last year. Bob Vlemmix, PR Officer at Manus Machina, introduced himself to me clutching a Tupperware box containing the beginnings of a new input device, a VR glove. It was a very early prototype, wires everywhere, reminiscent of a low-budget 80s sci-fi prop. We were attending an impromptu meetup organised by Daan Kip of the Dutch VR Meetup; the venue was Gamescom 2014 in Cologne, Germany – a month after Manus Machina was founded. Bob proceeded to demo the glove to all in attendance, and despite the crudeness of the demo itself, reactions were positive.

An early glove prototype

The company launched a Kickstarter campaign in August 2014 which was unfortunately unsuccessful. Undeterred, the team has made real progress, since growing to 15 staff with their prototype hardware evolving rapidly.

Fast forward to now, and the folks at Manus Machina are poised to unleash their latest prototype at the E3 Expo in Los Angeles next month. To whet appetites ahead of the show, the team have released a video demonstrating what we can expect to see there.

The gloves are wireless, and the team’s press shots (one including a user wearing a Gear VR) seem to suggest Bluetooth connectivity. The video is quite intriguing, with some latency on show (although this could be the panel used in the video) but also what looks to be a good range of movement and finger motion tracking. The video features a prototype tech demo application allowing the user to morph their hands into rocket launchers – what’s not to like there? The system looks to use onboard IMUs to gauge hand movement and rotation in real space in lieu of any obvious optical tracking system.

As we enter the second era of virtual reality (redux), attention is fast switching away from visualisation and display issues to the problems of VR user input. Despite its troubled past, Manus Machina’s timing may well dovetail perfectly as VR heads rapidly towards the consumer market. It’ll be interesting to see how the unit stacks up against competition from the likes of Control VR and ultimately Valve’s Lighthouse. And, lest we forget, Oculus is rumoured to be announcing details of their input solution very soon too. The next 12 months are shaping up to be the busiest ever for VR input.

We’ll be at E3 2015 next month to see the gloves for ourselves. In the meantime, check out the company’s website and their Twitter feed.

The post Video Preview: Manus Machina’s Wireless VR Glove Looks Promising, Headed to E3 2015 appeared first on Road to VR.

Closeup with the HTC Vive Developer Edition Headset for SteamVR


The first Vive Developer Edition headsets and controllers for SteamVR are arriving and Owlchemy Labs have been kind enough to provide us with high resolution shots of the system, including the new wireless Lighthouse tracked controllers.

The road to consumer virtual reality just took another turn as Valve hits a major milestone for its SteamVR platform. As promised back in March, developers lucky (or worthy) enough to be selected to receive a Developer Edition system can rejoice: they’ve started shipping! The package includes two Lighthouse base stations, one HTC Vive headset, cables, and two wireless SteamVR controllers. The controllers are of particular interest as, up until now, units used in public demos have been wired only.

We don’t know how closely the HTC Vive Developer Edition system will resemble consumer hardware, but Chet Faliszek indicates it’s designed to be close enough; “This will allow developers to target the same system consumers will have in their homes later this year.”

Owlchemy Labs, who’ve been working with Valve and HTC’s system for some time now (currently developing Job Simulator) are one of the earliest recipients of the new, more polished hardware. Up to now, developers have had to make do with advanced prototypes.

‘Chief Scientist’ Alex Schwartz was kind enough to contain his excitement for long enough to model his Developer Edition for photographer Lauren Ellis of Masonry, comparing the former prototype with the finalized Developer Edition.

The HTC Vive VR Headset

The HTC Vive Developer Edition (right) next to the older dev kit prototype

Ever since its famous appearance at Valve’s Steam Dev Days event early last year, Valve’s hardware has impressed. Back then, the unit’s dual panels in portrait orientation and the system’s inside-out, room-scale positional head-tracking (using wall-mounted fiducial markers) stood in stark contrast to Oculus’ DK1. At the time, ‘The Room’ VR demo was the first VR experience people exited claiming to have achieved sustained periods of ‘presence’ (a high level of psychological immersion). The guys from Owlchemy themselves were suitably impressed and told us so in a guest post just after the event.

In the images above, we can see the older dev kit (left) versus the new Developer Edition (right). At first glance, the design is very similar to the previous revision, except that the front-facing cameras appear to be either missing or blocked off in this latest version. As the placeholders for them still seem to be there, it may well be that they return in later revision kits and for the consumer edition.

See Also: Valve ‘Vive’ Developer Edition Kits Have Arrived – First Unboxing Images Appear

Note in the final shot, the view inside the headset clearly shows the tell-tale rings and ridges of fresnel lenses. It seems this, along with the dual display panels and resolution, indicates the similarities between Oculus’ most recent prototype, Crescent Bay, and Valve’s Vive headset—technical heritage that’s perhaps indicative of the close relationship the two companies enjoyed up until relatively recently.

As far as we know, the new Developer Edition Vive VR headset doesn’t differ greatly in terms of technical specifications. So that’s sub-millimeter, room-scale tracking for the headset and controllers, and dual 1080×1200 OLED panels (portrait orientation).

Valve’s SteamVR Wireless Controllers

The Developer Edition wireless controllers; a wireless dongle receives information from the controllers; the prototype controller (left) next to the Developer Edition controller (right)

Although the VR headset itself was excellent, it was the laser-based tracking solution that set Valve’s VR demos apart at GDC 2015. That tracking system, known simply as ‘Lighthouse’, is supposedly scalable, meaning the system can track more objects in a 15ft x 15ft space than just the headset and controllers. In fact, Valve reckons the computational overhead associated with tracking devices with Lighthouse is so low that multiple headsets and input controllers could be tracked within a single space with little trouble (barring obvious issues of occlusion).

See Also: Yes, You Can Juggle in VR with the HTC Vive

Alex Schwartz told us he was thrilled to have his hands on the now-wireless controllers.

“I was overwhelmed with joy [when I first got the controllers] purely because I finally didn’t have to deal with the terrible wire-hell I was in for the past months. It simplifies so much of the put-on / take-off experience,” he said.

The Base Stations

The new, smaller Lighthouse base station (right) has a different LED layout from the prior unit (left)

A huge thank you to Owlchemy Labs and Lauren Ellis for grabbing these images for us.

The post Closeup with the HTC Vive Developer Edition Headset for SteamVR appeared first on Road to VR.

Oculus Reveals ‘Oculus Touch’ Half Moon Prototype VR Input Controller


Today at the company’s ‘Step into the Rift’ pre-E3 event in San Francisco, Oculus revealed what they’re calling ‘Oculus Touch’, the company’s VR-specific input controller.


The prototype revealed at the event was dubbed ‘Half Moon’ by Oculus and uses the same “constellation” IR LED tracking technology as used by the Rift headset. The unit also includes an inward facing “sensor matrix” which can detect common hand gestures like waving or giving the thumbs up.

Oculus founder Palmer Luckey took to the stage at today’s event to reveal the prototype controllers which Oculus developers have been eagerly waiting for. Not relying on motion tracking or gesture input alone, the Oculus Touch Half Moon prototype controller has an analogue stick, two buttons, an index-finger trigger, and what Luckey called a ‘hand trigger’ that rests near the middle finger.


Luckey suggested that this hand-trigger would become the common method for grabbing virtual items, leaving the trigger finger free for other actions like shooting or activating whatever was held.

The Oculus Touch controllers are mirrors of each other, Luckey said, as well as wireless. 6DOF motion tracking is achieved with both the IR LEDs on the ‘crossguard’-style ring that envelops the user’s hand and an IMU. The Oculus Touch controllers will presumably be tracked using the same positional tracking camera that senses the headset.
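Oculus hasn’t detailed its fusion pipeline, but combining a low-rate optical solve with a high-rate IMU is commonly handled with something like a complementary filter. The sketch below is a generic illustration of that idea, not the Constellation algorithm:

```python
# Generic optical + IMU fusion sketch -- not Oculus' actual Constellation pipeline.

class FusedTracker:
    def __init__(self, correction_gain=0.9):
        self.pos = (0.0, 0.0, 0.0)
        self.correction_gain = correction_gain   # how strongly an optical fix corrects IMU drift

    def imu_step(self, velocity_mps, dt_s):
        """High-rate update: dead-reckon between camera frames using the IMU."""
        self.pos = tuple(p + v * dt_s for p, v in zip(self.pos, velocity_mps))

    def optical_fix(self, led_solve_pos):
        """Low-rate update: the LED constellation solve is drift-free, so pull toward it."""
        self.pos = tuple((1.0 - self.correction_gain) * p + self.correction_gain * o
                         for p, o in zip(self.pos, led_solve_pos))

tracker = FusedTracker()
for _ in range(16):                                      # ~16 ms of IMU samples at 1 kHz
    tracker.imu_step(velocity_mps=(0.5, 0.0, 0.0), dt_s=0.001)
tracker.optical_fix(led_solve_pos=(0.0079, 0.0, 0.0))    # next camera frame arrives
print(tracker.pos)
```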


Oculus didn’t mention whether or not the controllers would be included with the Oculus Rift consumer device, despite confirming that an Xbox One controller would accompany each headset.

The company didn’t demonstrate the new controller or show it in use. Neither price nor Oculus Touch release date has been confirmed, but Luckey said that the new controllers will be shown off at next week’s E3 conference.

The post Oculus Reveals ‘Oculus Touch’ Half Moon Prototype VR Input Controller appeared first on Road to VR.

Oculus Touch VR Controller Launches 1st Half of 2016, Pre-orders Open Alongside Rift


The Oculus Touch VR controller will ship in the first half of 2016, says founder Palmer Luckey. The units will be available for pre-order at the same time as the Oculus Rift.

After revealing the Oculus Touch ‘Half Moon’ prototype on stage at the company’s ‘Step into the Rift’ event earlier today, founder Palmer Luckey tweeted that the controllers will launch in the 1st half of 2016. Pre-orders for Oculus Touch will open alongside the Oculus Rift, which itself will ship in Q1 2016, according to the company.

Although not direct confirmation, this seems to make it clear that the Rift headset and Touch controllers will be separate products, not sold together. Instead, Oculus has opted to include an Xbox One controller with each Rift headset.

The choice is likely a combination of product development timelines being somewhat mismatched between the headset and the controller, and also a desire to keep the Rift headset at a reasonable starting cost, while the Oculus Touch controllers may add a few hundred dollars to the price (though the cost for either remains unconfirmed).

While there are pros and cons either way, developers may not be entirely happy about the decision to not bundle the headset and VR-specific controllers together. Devs working with the Samsung Gear VR headset have expressed concern over the fact that they can’t rely on all users having a gamepad, as the headset is only optionally bundled with one. That means developers can only be sure that Gear VR users have access to the more limited touchpad on the side of the headset. The choice to build a game targeting just the controller therefore becomes difficult as developers of course would like to see their game or experience sold as widely as possible.


The issue is not one unique to Oculus, it’s the same for any company introducing a new peripheral, and it poses a difficult ‘chicken and egg’ problem—developers need to make content for the devices to get users to buy, but users need compelling content for the devices before they will be compelled to do so.

However, the company is on the right track tackling the very same problem for the Rift headset itself. Although taken for granted at this point by many that the headset will find a substantial user base, it’s still an unanswered question, and one that was frequently raised, especially in the early days of the headset.

The post Oculus Touch VR Controller Launches 1st Half of 2016, Pre-orders Open Alongside Rift appeared first on Road to VR.

Hands-on: Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds


From the beginning, Oculus said they didn’t want to reveal an official VR input solution until they could do it right. Despite pressure from their developers, competitors, and the VR community at large, the company bided their time until just one week ago, when they showed the world the ‘Oculus Touch Half Moon’ prototype. After trying the system for myself, I can confidently say that it was worth the wait.

Anyone that’s used the Oculus Rift DK2 and beyond knows the company’s IR LED based positional tracking tech is some of the best in the business. It’s incredibly precise, low latency, and robust. So it’s no surprise that the company chose to use that tech as the foundation of their VR controller. But challenges remained: beyond just tracking position and orientation, how do you actually interact with objects in the virtual world; how do you prevent occlusion; how do you support existing game input modalities which might still be necessary?

Oculus has sufficiently answered all of these questions, and more, with an elegant VR controller which is poised to be the best on the market at the outset of consumer VR. For me, there are three major things that make the Oculus Touch VR controller a brilliant solution to the question of input.

Natural Resting Grip

Reach your arm out like you’re going to shake someone’s hand, with your fingers outstretched like you’re going to give a high five. With your arm still out in front of you, make a fist. Now switch back and forth between the high-five hand and the fist. Watch your arm carefully as you do and you’ll see muscles moving within it to achieve these different grips.

Muscles in your arm play a big role in determining the position of your hand

Now let your hand go limp, into its natural resting state. This is the ‘default’ position of your arm and hand muscles when your hand isn’t doing anything, and this is what it feels like to grip the Oculus Touch controller.

Oculus has formed the controller to fit in your resting hand beautifully, and this is one factor that contributes to what Oculus called “hand presence” when they first revealed the controller. Initially I thought this was just some buzzword play, but now I realize they really meant it.

Center of Gravity

Having the core of the controller in the midst of your resting grip means that your hand feels much like there’s nothing in it. But how to get the controller ‘outside’ of the hand’s center of gravity in order to prevent occlusion? That’s where the ‘Half Moon’ ring, which reminds me of a basket crossguard on a sword, comes into play.

The ring orbits the hand at a steady altitude ensuring that the hand’s center of gravity doesn’t shift. Others, like the SteamVR Lighthouse controller, have significant appendages that move the hand’s center of gravity, causing it to feel more like a tool in your hand than the hand by itself (as Synthesis Universe’s OlivierJT eloquently put it).

The Half Moon ring achieves this natural center of gravity while at the same time offering ample surface area for IR LED tracking points for robust occlusion avoidance.

Hand Trigger

The hand trigger is the most natural way to grab objects in the virtual world that I’ve felt with any VR controller, including ‘controllerless’ solutions like Leap Motion (yes it turns out that mocking a grab gesture in mid air isn’t that natural). The hand trigger rests on the inside of the controller’s hilt and is just a squeeze away from your hand’s natural resting position. The trigger is big and easy to depress with your middle and lower fingers, an action which takes your hand from that natural resting state to more of a gripping state.

Mimicking this natural gripping motion is important for aligning the brain’s notion of ‘gripping’ with the gesture intended to do so—something we might call proprioceptive parity. The hand trigger leaves the index finger free for pointing or use of the index trigger, which is more like pulling the trigger of a gun, just like you would be used to on a gamepad.

Literally within seconds of first picking up the controller I was grabbing objects in the virtual world naturally with the hand trigger. It just felt right, and there was no confusion between grabbing with the hand trigger and my index finger—just like it’s easy to grip a pistol while keeping your index finger free for the trigger.
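To illustrate why that split feels so natural, here is a hypothetical input-handling sketch separating ‘grab’ from ‘use’; the function names and thresholds are made up for illustration, not Oculus SDK calls:

```python
# Hypothetical controller-input sketch -- the real Oculus Touch SDK calls will differ.

GRAB_THRESHOLD = 0.5   # hand trigger (middle/lower fingers) past this -> grip
USE_THRESHOLD  = 0.5   # index trigger past this -> activate whatever is held

def update_hand(hand_trigger, index_trigger, held_object, nearby_object):
    """Hand trigger grabs and releases; index trigger fires or uses the held object."""
    if held_object is None and hand_trigger > GRAB_THRESHOLD and nearby_object is not None:
        held_object = nearby_object                 # squeeze into a grip -> pick up
    elif held_object is not None and hand_trigger <= GRAB_THRESHOLD:
        held_object = None                          # relax to the resting grip -> let go
    if held_object is not None and index_trigger > USE_THRESHOLD:
        print(f"activating {held_object}")          # e.g. fire the slingshot or shrink ray
    return held_object

held = update_hand(hand_trigger=0.9, index_trigger=0.0, held_object=None, nearby_object="slingshot")
held = update_hand(hand_trigger=0.9, index_trigger=1.0, held_object=held, nearby_object=None)
```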

Gestures

The Oculus Touch controller also supports hand gestures, such that your finger position can be represented in the virtual world for social cues like pointing and giving a thumbs up (didn’t have a chance to check for that all-important middle finger gesture).

The inside of the Half Moon ring houses sensors which detect your finger positions. It isn’t clear how precisely or which fingers can be detected as the demo that I tried was using canned animations to ‘snap’ to pointing and thumbs up gestures. I was told that the system can detect analogue finger positions, but it seems to be something they’re still refining.

Whatever the case, using these gestures for specific cues is a great way to combine the precision of button input with the ability to naturally gesture within the virtual world, rather than abstracting those motions to a ‘point’ or ‘thumbs up’ button.

“Low Mental Load”

Founder Palmer Luckey reveals Oculus Touch for the first time. Photo courtesy Oculus.

When Oculus founder Palmer Luckey brought the Oculus Touch controllers out on stage for the first time, he noted that they were made for “low mental load”, and it’s easy to tell that this was core to the controller’s design. The three factors outlined above contribute to this in a big way. Rather than manipulating a tool, the controller feels as close to having my own hand in the virtual space as any I’ve tried, and it makes a big difference.

When Oculus took me into their Toy Box demo, a sort of motion control playground, I was naturally grabbing, throwing, and passing objects back and forth with another player in mere moments; it felt like there was no learning to do. In fact, during the demo I never used the controller’s buttons or joysticks, there was simply no need.

It wasn’t long in the demo before the controllers began to feel invisible; there wasn’t this sense of needing to look carefully at the tracking performance; it was so good that it hardly warranted attention. Occasionally I would see some of my real arm out the bottom of the Rift, and as I followed it up onto the Rift’s display, my virtual hands appeared to be aligned perfectly. It was surreal.

With this “hand Presence”, it was just me and another user in a virtual world, interacting like people might—handing objects to one another, throwing things around, shooting slingshots, and of course shrinking each other with shrink rays… but more on that soon.

Oculus’ insights go beyond mere ergonomics; proprioceptive parity was a clear priority of the controller’s design prompt, making Touch the most comfortable and natural VR controller I’ve ever used. The company’s positional tracking prowess means the controllers are tracked nearly flawlessly. Unknown price aside, Touch is poised to be the best VR motion controller out there. Haters gonna hate, but Oculus has pulled it off again.

The post Hands-on: Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds appeared first on Road to VR.

Oculus to Open ‘Constellation’ Positional Tracking API to Third-parties


Oculus says that they’ll be opening up their ‘Constellation’ tracking API so that third-parties can use the tech to build positionally tracked peripherals beyond what the company will make themselves.

At E3 2015, Oculus founder Palmer Luckey told us that he believes the company’s Touch controller is “making the right set of compromises and tradeoffs to make a pretty universal VR input.” He does note however that “it’s never going to be better than truly optimized VR input for every game. For example, racing games: it’s always going to be a steering wheel. For a sword fighting game, you’re going to have some type of sword controller. For things that are really about fine-grain finger interactions, it’s probably going to be maybe even some type of glove or computer-vision based hand tracking solution.”

See Also: Reverse Engineering the Oculus Rift DK2 Provides Brilliant Insight into Inner Workings

Luckey said the company won’t be making Wiimote-style docking peripherals for those purposes, but there will be a way for people to get their hands on niche VR controllers.

“…we’re going to be opening up our tracking API for people so then they’ll be able to make peripherals that are tracked using our tracking system,” he said. “I think you’re going to see people making peripherals that are specifically made for particular types of games, like whether they’re steering wheels, flight sticks, or swords, or gun controllers in VR.”

oculus-touch-half-moon-vr-input-controller-rift-headset

It may have been the plan for Oculus all along, but we heard little about third-party access to the company’s IR-LED based ‘Constellation’ tracking system prior to mounting pressure from competitor Valve. At GDC 2015 in March, Valve revealed its SteamVR virtual reality system including the laser-based ‘Lighthouse’ tracking solution, which afforded what they called a ‘room-scale’ tracking volume of about 15×12 feet. The company also said “we’re gonna just give [Lighthouse technology] away,” with the hopes of promoting a universal tracking solution for VR.

See Also: Oculus Demonstrates Their Own ‘Room-scale’ Tracking Capability at E3 2015

Details about how Oculus will open their Constellation tracking API are thin on the ground (and same with Lighthouse for that matter). It isn’t clear for either company’s solution if third-parties will need to purchase tracking markers, license the tech for use, or be charged an integration fee, among a number of potential business models. Nor is it clear if any sort of certification framework will exist to give consumers confidence that a third-party peripheral will work adequately.

The tracking tech question could play a major role in which headset leads adoption; if one of the two companies has a bigger and better lineup of tracked third-party accessories, it could tip the scales of the headset purchase for some consumers. We expect to learn more about both companies’ plans as their respective headsets approach launch beginning at the end of 2015 with Valve’s HTC Vive.

The post Oculus to Open ‘Constellation’ Positional Tracking API to Third-parties appeared first on Road to VR.

Sixense STEM Edges Closer to Release


Sixense, the company behind the much beleaguered motion control system STEM, have released what they term ‘positive’ results from their latest round of EMC (Electromagnetic Compatibility) testing.

The STEM system from Razer Hydra creators Sixense, having been successfully Kickstarted back in 2013, is still yet to ship to original backers. However, hot on the heels of the company’s last update, Sixense have released their latest EMC test results which seem to indicate the company is edging closer to full FCC compliance and therefore able to sell the motion control system on the open market.

All in all, except for some moderately expected, and easily corrected, issues, the results turned out to be quite positive. We fully expect that we will be able to fix the failures on the new layout of the boards and meet the timeline of the next milestone, per the schedule presented in the previous update.

The company included a summary of their latest test results:

[Table: Sixense’s summary of its latest EMC test results]

The post Sixense STEM Edges Closer to Release appeared first on Road to VR.


Control VR Halts Kickstarter Pending “Significant” Restructuring and “Additional Financial Investment”


Control VR, a motion input solution for VR that raised nearly $450,000 on Kickstarter, appears to be in critical condition, unable to deliver on its promises without significant help.

A June 2014 Kickstarter for Control VR, a motion input device capable of tracking arms, hands, and fingers, nearly doubled its goal, raising $442,227. The capable-seeming team from the world of professional motion capture promised Kickstarter backers it would deliver the system in December of that year. 1,161 backers supported the project, with the bulk purchasing the $600 tier for the ‘Body and Double Arm Control VR Dev Kit’.

After sparsely communicating with its backers following the successful Kickstarter, a new update shared by Control VR suggests that the project is currently dead in the water:

Dear Kickstarter Community, we are grateful for your trust and patience in our project. We understand that many, if not everyone, are frustrated at the pace of development and delivery schedule. Unfortunately, after many months of doing everything in our powers to develop, refine, and deliver on our Kickstarter promise – the project cannot be fulfilled without significant organizational restructure and additional financial investment. To ensure success, we are actively recruiting a new management team and are in ongoing talks with potential investors. We hope to put into place a new management team and raise the necessary capital in the coming months.

The blunt update mentions nothing about a potential refund should the necessary restructuring and investments not come through.

Ben Lang trying Control VR

Since interviewing the company’s CEO & CTO, and testing the product at E3 2014, Road to VR tried several times throughout 2014 and 2015 to get in touch with the company but received no response.

Finally, in June of 2015, a former member of the Control VR team told me that “I have had nothing to do with its management since late September 2014,” just three months after the Kickstarter ended. “They wanted me out and they got me out. They don’t tell me anything and they don’t ask anything, even though I am here at their service if they wanted anything.”

Reddit user 'Tekorc96' purports to be a backer of the Kickstarter campaign and emailed the company demanding a refund. The response, alleged to be from an unidentified member of the company, elaborates on the plan from the beginning, which seemed to involve repurposing motion capture hardware from another company.

Maybe it’s somehow unclear and sorry for the confusion but we were selling Synertial’s very expensive motion capture suits and rebranding them Control VR because Synertial owned 1/3 of Control VR and the idea was that the high-end motion capture industry was slowly dying and we saw an opportunity in VR to push our chips into the middle of the table and go for consumer adoption asap.

“Synertial” was not mentioned anywhere on the Kickstarter page.

The alleged response goes on to defend the company against accusations of fraud:

Nothing we did at Control VR was fraud. We thought we were making history by offering this incredible tech at $600 (basically giving it away), and we believed the tech was good enough for consumer adoption. However, as the tech was put through more and more paces, the performance started unravelling quickly and what we believed to once be a good enough benchmark for consumer adoption, the hardware and overall system was coming up significantly short of.

We realized we had to go back to the drawing board and significantly improve the system on both the hardware and software side of things, and that’s how everything went upside down very quickly.

It further suggests that the company is still trying to keep the project alive, despite needing to hand over the reins to someone else.

So what are next steps then? We believe there is an opportunity for a fresh start, with a new team altogether. We believe money is still interested in a particular restructuring and we are trying to make that happen.

This whole process has been so frustrating, painful, and a nightmare for us at this company. And all we care about now is hopefully being able to pass the torch off to another group of people who can take it from here. Our hope now is to be able to say that we were apart of what eventually became a great thing.

Looking at the official Control VR website, you'd never know about the behind-the-scenes breakdown. Pre-orders for the $600 dev kit system are marked as "Sold Out", with an estimated shipping date of "Q4 2015". Meanwhile, the company's blog hasn't been updated since October 2014, with its social media channels going dark around the same time, just a few months after the Kickstarter ended.

The post Control VR Halts Kickstarter Pending “Significant” Restructuring and “Additional Financial Investment” appeared first on Road to VR.

New Oculus Touch Documentation Reveals Capacitive Buttons and Recognizable Gestures


While the Oculus Touch controller is largely designed to tell the computer where a user’s hands are, the controller also has an idea of what a user’s fingers are doing. New documentation adds to our understanding of the controller’s hand-tracking capabilities, and reinforces the device’s deliberate name.

Released alongside the latest version of the Oculus SDK (v0.7), the updated Oculus Rift Developer Guide describes how devs can access data provided by the Touch controllers. Orientation and position of the controllers are provided in the same coordinate frame as the Rift headset itself, separate from the input state (button presses). “Having both hand and headset data reported together provides a consistent snapshot of the system state,” the document explains.

See Also: Oculus PC SDK 0.7 is a Major Overhaul, New Direct Driver Mode Made with NVIDIA & AMD

Touch Sensitive Buttons

The input state of the controller, as expected, tells the developer when buttons are pressed, triggers are pulled, and joysticks are tilted. But it also reports something that most other controllers don't: when a user's fingers are touching (but not pressing) certain buttons.

This data isn’t particularly important for non-VR controllers but, inside VR, giving the controllers a way to sense finger position means the user’s hand/finger position can be matched closely, leading to a greater sense of Presence. It also provides important feedback for users; when you can’t see your hands on the real controllers, it’s hard to tell which button your finger is on. But with capacitive buttons that can sense touch, the game world can show users where their fingers area located on a virtual representation of the controller. I imagine this will be especially helpful for in-game tutorials explaining the controls to first-time players.

See Also: Hands-on – Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds

Calling ovr_GetInputState checks the controller’s Button State, which includes Button Touch State, indicating which buttons are being touched (but not pressed). Every input on the controller, with the exception of the side ‘hand trigger’, can sense a user’s touch, including the index trigger and the joystick.

While both the index trigger and hand trigger report analogue values (to report partly-pressed states), the controller’s two face buttons and joystick button (clicking down on the joystick) are binary (not pressure sensitive), according to the documentation.
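To make the polling flow concrete, here's a minimal C++ sketch against the Oculus PC SDK's C API. Only ovr_GetInputState is named in the documentation discussed here; the session handle type, the Buttons/Touches fields, and constants such as ovrTouch_A and ovrButton_A are assumptions based on public SDK headers, and exact signatures have changed between SDK versions.

```cpp
// Minimal sketch, not taken from the Developer Guide itself. Field and enum
// names (Touches, Buttons, ovrTouch_A, ovrButton_A, ovrHand_Right) are
// assumptions; only ovr_GetInputState is named in the documentation.
#include <OVR_CAPI.h>

void PollTouchState(ovrSession session)
{
    ovrInputState state;
    if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &state)))
    {
        // 'Touches' reports capacitive contact; 'Buttons' reports actual presses.
        bool restingOnA    = (state.Touches & ovrTouch_A) != 0;
        bool pressingA     = (state.Buttons & ovrButton_A) != 0;
        bool fingerOnStick = (state.Touches & ovrTouch_RThumb) != 0;

        // Triggers are analogue (partly-pressed states); face buttons are binary.
        float rightIndexPull = state.IndexTrigger[ovrHand_Right];
        float rightGrip      = state.HandTrigger[ovrHand_Right];

        // A game could use these to pose a virtual hand: hover the thumb over A
        // when touched, depress it when pressed, curl the index finger by the
        // trigger value, and so on.
        (void)restingOnA; (void)pressingA; (void)fingerOnStick;
        (void)rightIndexPull; (void)rightGrip;
    }
}
```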

Hand Gestures


Hand gesture recognition is something we’ve known about since seeing Oculus Touch for ourselves at E3 2015, but the new documentation explains what data developers will have to work with.

Oculus told us that the sensing of hand positions is indeed analogue, but as far as the SDK documentation reads, the company has opted to bake in pre-defined gestures. It doesn’t appear that developers will have raw access to the hand gesture data for now.

There are only two supported gestures at this time: pointing with the index finger and thumbs up. These can be checked using ovrTouch_RIndexPointing and ovrTouch_RThumbUp respectively, switching out the R for an L to check the left hand's state.
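As a rough illustration, checking those flags might look like the sketch below; ovrTouch_RIndexPointing and ovrTouch_RThumbUp are the flags named in the documentation, while the left-hand variants and the rest of the state structure are assumed from the R-to-L naming convention described above.

```cpp
// Minimal sketch; left-hand constants and the Touches field are assumptions.
#include <OVR_CAPI.h>

void PollGestures(ovrSession session)
{
    ovrInputState state;
    if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &state)))
    {
        bool rightPointing = (state.Touches & ovrTouch_RIndexPointing) != 0;
        bool rightThumbUp  = (state.Touches & ovrTouch_RThumbUp) != 0;
        bool leftPointing  = (state.Touches & ovrTouch_LIndexPointing) != 0;
        bool leftThumbUp   = (state.Touches & ovrTouch_LThumbUp) != 0;

        // An app might swap the avatar's hand model to a pointing or thumbs-up
        // pose whenever the corresponding flag is set.
        (void)rightPointing; (void)rightThumbUp;
        (void)leftPointing;  (void)leftThumbUp;
    }
}
```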

Although no middle finger gesture is documented, Oculus Founder Palmer Luckey confirmed to Road to VR that the only fingers the controller doesn’t detect are the pinky and ring finger, leaving the possibility of more gesture states being added to the SDK down the line. The current set of two is likely the result of testing which gestures could be identified with high consistency, so as not to have players see their fingers jumping around unnaturally.

See Also: 24 Minutes with Oculus Founder Palmer Luckey on Rift CV1, Touch Controllers, Fresnel Lenses and More

Here’s to hoping for a ‘peace sign’ gesture so I can start work on Hippie-Sim 2016.

Haptic Feedback & Two Trackers


Haptic feedback is perhaps the area we know the least about on the Oculus Touch controller. The Oculus Rift Developer Guide describes the controller’s haptic feedback simply as “vibration”, which makes it sound like the same sort of ‘rumble’ you’d find from a gamepad.

But looking closer at the documentation, it’s possible that the Touch controller uses a linear actuator, rather than the usual ERM motor that produces the rumble in many gamepads. Linear actuators are capable of producing more fine-grain haptic feedback events like clicks, and seem to be the haptic basis for the HTC Vive controller as well.

See Also: HapTech is Bringing Realistic Weapon Recoil to a VR Controller for $60

The hint comes in the way that developers are able to specify how the vibration should function. Using ovr_SetControllerVibration, devs set which controller to vibrate and independently set the vibration frequency and amplitude; that last part is the hint which may indicate a linear actuator over an ERM motor, which generally has just one variable for control.
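A hedged sketch of how that call might be used follows; the parameter ranges and the exact signature of ovr_SetControllerVibration are assumptions (they have shifted between SDK releases), so treat this as illustrative rather than definitive.

```cpp
// Illustrative only: parameter ranges and the exact signature are assumptions.
#include <OVR_CAPI.h>

void SetRightControllerRumble(ovrSession session, bool on)
{
    // Frequency and amplitude are set independently (roughly 0.0-1.0 each),
    // which is the hint suggesting a linear actuator rather than a single-speed
    // ERM motor. Zeroing both stops the vibration.
    // Per Oculus' guidance (quoted below), keep vibration bursts short so they
    // don't degrade positional tracking.
    float frequency = on ? 1.0f : 0.0f;
    float amplitude = on ? 0.6f : 0.0f;
    ovr_SetControllerVibration(session, ovrControllerType_RTouch, frequency, amplitude);
}
```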

Interestingly, Oculus warns against extended durations of vibration:

Prolonged high levels of vibration may reduce positional tracking quality. Right now, we recommend turning on vibration only for short periods of time.

The documentation also specifies that “at least” two positional cameras will be used with the controllers.

“For installations that have the Oculus Rift and Oculus Touch controllers, there will be at least two constellation trackers to improve tracking accuracy and help with occlusion issues.”


Oculus Touch controllers are expected to go on pre-order at the same time as the Oculus Rift, sometime in 2015, with an expected release date of H1 2016. Pricing has not yet been announced.

The post New Oculus Touch Documentation Reveals Capacitive Buttons and Recognizable Gestures appeared first on Road to VR.

‘Oculus Touch’ Connect Demo Leak Shows Cross Platform VR Development


Oculus Connect day one is over, but it was only a warm-up act for the remaining two days, kicking off today with a series of keynotes at 10am PST. In the meantime, UploadVR noticed that information has ‘leaked‘ via the Oculus Connect app, which means we know which demos will be available to try on Oculus’ proprietary VR input device, ‘Touch’.

Firstly, it’s great to see so many titles playable with the device Ben Lang labelled as “poised to be the best VR motion controller out there” after spending time with the system at E3 in June.

See Also: Hands-on: Oculus Touch is an Elegant Extension of your Hand for Touching Virtual Worlds

But few people outside of press and selected developers have had a chance to try the input system Oculus is banking on to complete its virtual reality system, due to ship in Q1 2016 (note: Touch won't ship until Q2, according to current information).

Oculus CEO Debuting ‘Touch’ at a special Pre-E3 press event in June

So Oculus Connect 2, the developer conference dedicated to virtual reality and Oculus technologies, will be the first place many get their hands on Touch, quite literally. The Oculus Connect 2 app, launched a little while ago, gives us a sneak preview of the titles in development for Touch and on display at Connect, and the signs are that an encouraging amount of cross-platform development is underway, spanning both Oculus' Touch and Valve's SteamVR controller platforms, with the latter due to ship in numbers with HTC's Vive VR headset in Q1 next year.

The list is as follows:

[Image: the list of Oculus Touch demo titles from the Oculus Connect 2 app]

The last three we've yet to hear anything about, meaning we may see some new IP demonstrated at Connect today. Of the remainder, five have been confirmed, demonstrated, or rumoured to support Valve's SteamVR platform, specifically the HTC Vive.

This is good news for three reasons:

One: It means that the competing VR platforms from Valve and Oculus are not aggressively locking developers into artificial exclusivity, at least not yet.

Two: It suggests that developing for both platforms simultaneously is not so arduous or painful as to be out of reach for the small development teams which make up the vast majority here, and may even be desirable.

Three: As the vast majority of consumers looking to purchase virtual reality hardware will likely be able to afford or play host to only one of them, it means the consumer isn't punished for his or her choice of platform.

Basically, it's great news that the industry in general is allowing this healthy cross-pollination of talent to occur. Without it, the launch velocity required to lift VR clear of its past reputation and prejudices in the consumer's eyes won't be reached.

Either way, we’ll be learning a lot more about these titles over the next two days as we settle in for the main body of the Oculus Connect 2 conference and people begin to get their hands on the software. For now though, it’s all looking mighty positive for everyone involved.

Road to VR are at Oculus Connect, reporting back on the latest news throughout the conference.

The post ‘Oculus Touch’ Connect Demo Leak Shows Cross Platform VR Development appeared first on Road to VR.

Tactical Haptics to Debut Vive ‘Reactive Grip’ Prototype at VR Launchpad


Tactical Haptics, the company behind the unique tactile feedback technology ‘Reactive Grip’, will show their latest HTC Vive-compatible prototype for the first time at VR Launchpad, the VR startup showcase event kicking off tomorrow at the Computer History Museum in Mountain View, CA.

Road to VR has a long history with Tactical Haptics. We first stumbled across the company as it demonstrated an early version of its ‘Reactive Grip’ technology at GDC way back in 2013. The technology uses sliding ‘contactor plates’ to manipulate the skin on your hand to convey the illusion of translational motion and forces. So, for example, swing a virtual medieval flail around and you’ll feel the weight of the handle shifting and rotating in your hand.

Register for VR Launchpad

We last came across the TH team earlier this year, again at GDC, where we learned the company had won grants from NASA and the NSF (National Science Foundation). They're currently in residence at StartX, a Stanford-affiliated non-profit organization set up to accelerate entrepreneurs “through experiential education.”

Their latest prototype is designed to interface with Valve's HTC-manufactured Vive and Steam Controllers, adding naturalistic tactile feedback to the laser-tracked VR input devices. This latest version will be shown for the first time at VR Launchpad.

Presented by SVVR and Road to VR, VR Launchpad will see 24 diverse startups pitch VR businesses focused on analytics, content distribution platforms, hardware, healthcare, entertainment, education, enterprise, and more.

Will Provancher, CEO of Tactical Haptics, explains that the prototype you'll see at the event tomorrow is designed “in anticipation of working with developers that are already working with Vive.” The device, even at this stage, is designed to be plug and play: “This sleeve interface will make it easy for them to pop the Vive on or off our controller.” With the HTC Vive, Valve's first SteamVR-compatible headset, due to ship in small quantities later this year and in numbers in Q1 2016, Provancher has his eye on attracting developer talent to the system: “Our plan is to take applications from developers wanting to work with Reactive Grip later this year.”

The post Tactical Haptics to Debut Vive ‘Reactive Grip’ Prototype at VR Launchpad appeared first on Road to VR.

Hands-on: Tactical Haptics’ Vive Demo is Further Proof That VR Needs More Than Rumble


While the latest generation of haptic feedback in controllers has come a long way since the days of the N64's ‘Rumble Pak’ add-on, it's fundamentally based on the same principle: rumble. Whether it's the ERM motors of days past, or the linear actuators thought to be used in next-gen VR devices like the Oculus Touch and Vive controllers, rumble will only take you so far, and not far enough.

Tactical Haptics has been developing their ‘Reactive Grip’ haptic feedback system since 2013. I was impressed from the very first time I tried the system back at GDC 2013. The novel approach, which uses sliding bars to mimic the pressure felt against your palms while gripping an object, is a huge step up over mere rumble, and one of only a small number of practical haptic enhancements that’s more than just a concept.

This (old) video gives a solid glimpse of how the system works for those who have never had the fortune of trying it:

The system can create the sensation of holding virtual objects in a surprisingly convincing way by simulating the forces of the object against your hand as if you were really holding it.

Think about gripping a baseball bat in your hands. When the end of the bat feels a force against it, the force will travel along the length of the bat and exert against the hands, corresponding to the direction of the force. Imagine wielding a sword and stabbing it into a dummy; your palms will feel the force of the sword pushing downward in your grip as it experiences friction while piercing the dummy. Now think about spinning a flail above your head; your palms will feel forces from the handle moving circularly as the mass swings above you. Reactive Grip portrays forces surprisingly well… so well in fact that you often feel like there really is something in your hands beyond a controller.

At VR Launchpad last week (an event presented by SVVR and Road to VR), Tactical Haptics showed off the latest wireless version of their Reactive Grip system, for the first time hooked up to the HTC Vive's controllers. This was the first time I was able to enjoy the feedback atop an input system of this fidelity, the accuracy and low latency of which further highlight how much the extra haptic information adds to immersion.

As a quick hack (until the doors open to building Lighthouse tracking directly into third-party peripherals), Tactical Haptics created a simple 3D-printed attachment to mount the Vive controller to the wireless Reactive Grip controller.

During demos of the system at VR Launchpad, the company strapped users into the HTC Vive headset and handed them one of the Reactive Grip controllers along with one of the normal Vive controllers in the other hand. Giving users a point of reference for what the experience was like with and without Reactive Grip clearly drove home how powerful the added feedback can be.

I ran through the company’s latest set of demos, including one of my favorites, the flail, which still does an impressive job of telling your brain that there’s a large mass swinging over your head as you swing it about. The gravity gun demo, which I had tried previously, though not in conjunction with the Vive, garnered new appreciation because of how it utilized the haptics for communication of abstract forces.

In the gravity gun demo you can reach out and remotely grab any box in front of you. Once tethered to the gravity gun, you can swing boxes back and forth and fling them in any direction if you build up the right momentum for your throw. The haptic feedback in this case doesn’t necessarily make you feel like you’re gripping a gravity gun that’s controlling mass (after all, who knows what that feels like?), but it effectively describes the momentum forces felt on the boxes as you whip them about the scene. Rumble, in this context, wouldn’t help to describe those forces at all.


Larger boxes have more mass and thus more momentum, causing a greater amplitude of force felt on your hand as you move the boxes about the scene. The feedback is essential for knowing how much force you’re exerting on the box, and is therefore really important to getting a solid toss. The demo scene I was standing in had a simple flat floor that extended in all directions around me but eventually stopped abruptly a few hundred yards away. I spent most of that demo trying to fling a box to the very edge. After getting a feel for the momentum, thanks to the feedback from the haptics, I nailed a perfect toss and just managed to get a box over the edge. This simple little goal I made for myself wasn’t even the intended goal of the demo and yet left me with a satisfied sense of control and accomplishment; it’s easy to see how this extra layer of feedback could be explored for new game mechanics.
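As a purely hypothetical illustration of the mapping described above (this is not Tactical Haptics' code, and every name and constant here is invented), a feedback amplitude scaled by a tethered object's momentum might look something like this:

```cpp
// Hypothetical sketch only: names, units, and the tuning constant are invented
// to illustrate scaling feedback with momentum (p = m * v).
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

float FeedbackAmplitude(float boxMassKg, const Vec3& boxVelocity)
{
    float speed = std::sqrt(boxVelocity.x * boxVelocity.x +
                            boxVelocity.y * boxVelocity.y +
                            boxVelocity.z * boxVelocity.z);
    float momentum   = boxMassKg * speed;   // heavier or faster boxes -> more momentum
    const float kMax = 50.0f;               // hypothetical normalisation constant
    return std::min(momentum / kMax, 1.0f); // clamp to the 0..1 amplitude range
}
```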


While rumble effectively tells us when something is happening, and (with variable amplitude) how much it’s happening, it has no capacity to indicate the direction of forces, which is essential to feeling like you’re actually holding something. This additional layer of haptic feedback means that more information can be sent to the user about the forces felt on a virtual object.

A perfect haptic feedback system would be able to simulate all possible forces, including causing your hands to stop in mid-air when pressed up against something you can't move (like a wall). And while Reactive Grip may not be perfect in that sense, it is a practical approach to adding another layer of haptics and immersion to the VR experience, and one that seems like a logical next step for deeper immersion.

Forced to choose between the two, I'd easily pick Reactive Grip over rumble, but ultimately the two complement one another, especially if the rumble comes from the more modern linear actuator approach, which is great for subtle clicking and tapping effects as well as the ‘dumb’ rumble we associate with the ERM motors common in modern-day gamepads.

The post Hands-on: Tactical Haptics’ Vive Demo is Further Proof That VR Needs More Than Rumble appeared first on Road to VR.
