What games are you playing?

Ceros_X

King of Cable Management
Mar 8, 2016
748
660
Stardew Valley looks like a very close copy of the old Harvest Moon games on the SNES/Game Boy Advance.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,856
4,914
Doom Open Beta is out on Steam and consoles.
- just multiplayer
- 2 maps
- 2 game modes (TDM and Warpath)
- 16-player rounds
- 7 or so weapons (excluding pickups)
- Revenant, Quad Damage and Haste power-ups
- no graphics quality settings
- Motion Blur and Chromatic Aberration can be DISABLED
- 15GB download

Loads of fun, runs awesomely well on my setup.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,856
4,914
It's what I love most about racing games: the level of immersion you can get on a PC is very high, and with the right equipment, like in the video above, it becomes even more engaging. The farthest I've come is building a base for my pedals from a Logitech Driving Force Pro, which isn't pro at all. Basically, a simulator setup like this requires about 300-500€ in input devices, about double that for screens, even more for GPUs, and then you still need a frame and a chair, all of which is useless for anything but racing sims. That's why I'm hoping VR headsets will fix most of these one-trick-pony costs.
 

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
It's what I love most about racing games: the level of immersion you can get on a PC is very high, and with the right equipment, like in the video above, it becomes even more engaging. The farthest I've come is building a base for my pedals from a Logitech Driving Force Pro, which isn't pro at all. Basically, a simulator setup like this requires about 300-500€ in input devices, about double that for screens, even more for GPUs, and then you still need a frame and a chair, all of which is useless for anything but racing sims. That's why I'm hoping VR headsets will fix most of these one-trick-pony costs.
VR will replace the screens, but physical props are still the way to go for VR, and will be for, I'd wager, the better part of a decade at least. Nobody has cracked the problem of providing high-fidelity haptic and tactile feedback even in a lab environment, let alone in a way that could be sold to end users at costs-less-than-a-good-car prices (even a limited setup like the CyberGrasp Haptic Workstation is "if you have to ask, you can't afford it"). We'll see simpits being modelled for VR use (placing virtual controls in the same place as physical ones) long before we can simulate the sensation of holding a wheel, pushing a pedal or flipping a switch.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,856
4,914
Of course, but if motion feedback needs to be satisfied as well, you're crossing into real-racing territory price-wise, which is way beyond what @GuilleAcoustic's clip was about.
 

jeshikat

Jessica. Wayward SFF.n Founder
Original poster
Silver Supporter
Feb 22, 2015
4,969
4,781
They don't sell it anymore but I find this to be an interesting alternative to motion platforms: http://www.virtualr.net/simxperience-gs-4-g-seat-now-available

Rather than a moving platform, which can only give you a brief feeling of acceleration before it runs out of travel, this has pivoting plates under the seat that move back and forth. So if you stomp on the gas pedal in the game the plates on the back would move forward and push against your back, giving that "pressed into your seat" feeling.

It's a much simpler setup so it's cheaper than a full motion platform. I'm not sure why they quit developing it since it seems perfect for VR.
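
In rough terms the control idea is something like the sketch below; the gains and travel limits are completely made up just to illustrate the concept, not SimXperience's actual control law.

Code:
# Minimal sketch of the G-seat idea: map the sim's reported acceleration to
# how far the back plates travel. All gains/limits are invented for illustration.
def plate_travel_mm(accel_long_g, accel_lat_g, gain_mm_per_g=15.0, max_travel_mm=20.0):
    """Return (left_plate, right_plate) forward travel in mm.

    Positive longitudinal g (accelerating) pushes both plates into your back;
    lateral g biases the pressure toward one side to mimic cornering load.
    """
    clamp = lambda x: max(-max_travel_mm, min(max_travel_mm, x))
    base = accel_long_g * gain_mm_per_g
    bias = accel_lat_g * gain_mm_per_g * 0.5
    return clamp(base + bias), clamp(base - bias)

# Hard acceleration (0.8 g) while cornering: one plate presses noticeably harder.
print(plate_travel_mm(0.8, 0.4))  # (15.0, 9.0)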
 

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
They don't sell it anymore but I find this to be an interesting alternative to motion platforms: http://www.virtualr.net/simxperience-gs-4-g-seat-now-available

Rather than a moving platform, which can only give you a brief feeling of acceleration before it runs out of travel, this has pivoting plates under the seat that move back and forth. So if you stomp on the gas pedal in the game the plates on the back would move forward and push against your back, giving that "pressed into your seat" feeling.

It's a much simpler setup so it's cheaper than a full motion platform. I'm not sure why they quit developing it since it seems perfect for VR.
I've seen one or two DIY versions of the concept, and it's a really great idea for simulating G-loads without needing to tilt an entire rig.
It's even more desirable for VR, as a moving platform has issues with the IMU sensor fusion that VR tracking systems use (Constellation, Lighthouse and STEM all use IMUs for fine-scale, low-latency position tracking). If you mount the camera/basestation on the moving platform, any movement of the platform is picked up by the IMUs but not by the absolute positioning system, leading to a mismatch and an inability to fuse the data. If the camera/basestation is fixed relative to the world, the IMU data and absolute position data match up normally, but any platform movement results in your virtual head position moving about in the game world (highway to nausea).

Adding position tracking to the platform and performing an offset correction is tricky, because it needs to be done with the same latency as the head tracking (which means it needs to be a function implemented by Oculus/Valve/Sixense), and it compounds any error in EITHER system into head-position error. That means just using the desired platform position is no good (no control loop is that good), and encoders on the moving elements are likely no good either (mechanical slop, limited encoder accuracy at reasonable cost).
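
To make the mismatch concrete, here's a one-dimensional toy example; the numbers are invented, and real trackers fuse full 6-DoF poses rather than a single displacement.

Code:
# The platform surges forward 5 cm; the user's head stays still relative to the seat.
platform_motion_m = 0.05

# The headset IMU senses acceleration in the world frame, so after integration it
# reports the full 5 cm of movement.
imu_displacement_m = platform_motion_m

# Case A: camera/basestation bolted to the platform. It moves with the head, so the
# optical system reports zero displacement: the two sources disagree and can't be fused.
optical_platform_mounted_m = 0.0
print("mismatch:", imu_displacement_m - optical_platform_mounted_m)   # 0.05 m

# Case B: camera/basestation fixed to the room. Both sources agree, but the fused pose
# says your head moved 5 cm in the virtual cockpit even though it didn't move relative
# to the seat: spurious motion, i.e. the highway to nausea.
optical_world_fixed_m = platform_motion_m
print("mismatch:", imu_displacement_m - optical_world_fixed_m)        # 0.0 m, but the head 'moves'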
 

iFreilicht

FlexATX Authority
Feb 28, 2015
3,243
2,361
freilite.com
A moving platform would be possible if it could report the exact angle it's tilting by and the VR headset's API allowed you to modify the tilt of the virtual floor.
So if it were tilting backwards by 10°, you could set that to be the new 0°, and if the headset only tracked relative to that, your virtual head would still be level. It's probably much more complicated than that, but it seems like a way of making this work.
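
Something like the sketch below is what I mean, reduced to a single pitch axis; a real implementation would have to work on full orientation quaternions inside the tracking runtime, and the function here is made up for illustration.

Code:
# Sketch of the "new 0°" idea: subtract the platform's reported tilt from the
# tracked head pitch so the virtual horizon stays level.
def corrected_head_pitch(tracked_pitch_deg, platform_pitch_deg):
    return tracked_pitch_deg - platform_pitch_deg

# Platform tilts 10° back to fake acceleration; the headset sees that 10° plus
# whatever the user does with their neck (here, looking 5° down at the dash).
print(corrected_head_pitch(tracked_pitch_deg=10.0 - 5.0, platform_pitch_deg=10.0))  # -5.0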
 

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
The problem comes from the fact that you need to be doing that correction as quickly (i.e. samples synchronised, latency identical) and as accurately (or rather, more accurately, to allow for error) as you are performing the head tracking, which means it needs to be implemented within Oculus/Valve/Sixense's software rather than being something the game engine can apply. And with the inherent response variation in MEMS IMUs, they would pretty much have to either add multiple IMUs to both the HMD and the platform (to average out batch variation and minimise error) or individually pre-calibrate both the headset IMU and the 'offset' IMU (expensive and time-consuming).
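
A quick back-of-the-envelope shows why the timing matters; all the figures below are assumptions, not measured values for any particular platform or headset.

Code:
import math

platform_rate_deg_s = 30.0    # assumed peak tilt rate of the platform
timing_mismatch_s   = 0.010   # platform angle sampled 10 ms out of step with the IMU
head_to_pivot_m     = 1.0     # assumed lever arm from platform pivot to the head

# Angle the platform moves through during the mismatch window, and the resulting
# head-position error at the end of the lever arm:
stale_angle_deg = platform_rate_deg_s * timing_mismatch_s
error_mm = math.radians(stale_angle_deg) * head_to_pivot_m * 1000.0
print(f"{stale_angle_deg:.2f} deg stale -> ~{error_mm:.1f} mm of head-position error")
# ~5 mm: an order of magnitude worse than the sub-mm accuracy head tracking needs.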
 

iFreilicht

FlexATX Authority
Feb 28, 2015
3,243
2,361
freilite.com
But the system itself wouldn't need an IMU. It already knows what angles it's going to tilt to in order to simulate the acceleration vector you should be experiencing, so neither the speed nor the accuracy of those samples is an issue. I agree that it wouldn't be a good idea to let the game engine implement this; it would have to be done through interoperability between the drivers for the hardware you're using.
 

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
But the system itself wouldn't need an IMU. It already knows what angles it's going to tilt to in order to simulate the acceleration vector you should be experiencing, so neither the speed nor the accuracy of those samples is an issue.
The platform's response time is going to be far slower than the IMU's sample interval, so you can't use the desired endpoint as the offset. Even with shaft encoders on all the actuators, structural flex and linkage slop mean the platform's actual position will be pretty significantly (head tracking demands sub-mm accuracy) different from its desired position. If you try to minimise mechanical error, you end up with a monstrosity made of cast concrete with multi-kW motors and industrial anti-backlash ballscrews.
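
Rough numbers on the encoder idea; these are plausible guesses for illustration, not measurements of any particular platform.

Code:
import math

encoder_counts_per_rev = 2048    # typical incremental encoder
gear_reduction         = 20      # assumed actuator gearbox ratio
backlash_deg           = 0.2     # assumed slop from linkages/gearbox/flex at the output
lever_arm_m            = 1.0     # platform pivot to head

# Best-case resolution at the head, and the error from slop the encoder can't see:
resolution_deg = 360.0 / (encoder_counts_per_rev * gear_reduction)
resolution_mm  = math.radians(resolution_deg) * lever_arm_m * 1000.0
backlash_mm    = math.radians(backlash_deg) * lever_arm_m * 1000.0
print(f"encoder resolution at the head: ~{resolution_mm:.2f} mm")  # ~0.15 mm, fine
print(f"unsensed slop at the head:      ~{backlash_mm:.1f} mm")    # ~3.5 mm, not fine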
 

iFreilicht

FlexATX Authority
Feb 28, 2015
3,243
2,361
freilite.com
The platform's response time is going to be far slower than the IMU's sample interval, so you can't use the desired endpoint as the offset. Even with shaft encoders on all the actuators, structural flex and linkage slop mean the platform's actual position will be pretty significantly (head tracking demands sub-mm accuracy) different from its desired position. If you try to minimise mechanical error, you end up with a monstrosity made of cast concrete with multi-kW motors and industrial anti-backlash ballscrews.

Couldn't you calibrate that sort of thing? If you do the same movement at the same speed, it should always work out the same way, so if you just check how each unit performs in production, you could bake that information into it. It would make the thing a whole lot more expensive, though, and it would probably lose calibration after a few years.

Also, you don't have to use just the endpoint; you should be able to interpolate between setpoints with reasonable accuracy, and the response time of that is practically zero.
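
Something like the sketch below is what I mean, with made-up timings; it still assumes the platform actually follows the command, which is the disputed part.

Code:
def commanded_angle_at(t, t_prev, angle_prev, t_next, angle_next):
    """Linearly interpolate the commanded tilt between two setpoints."""
    if t_next == t_prev:
        return angle_next
    frac = (t - t_prev) / (t_next - t_prev)
    return angle_prev + frac * (angle_next - angle_prev)

# Setpoints sent at 100 Hz, IMU sampled at 1000 Hz: estimate the commanded angle
# 3 ms into a 10 ms move from 4.0° to 5.0°.
print(commanded_angle_at(t=0.003, t_prev=0.0, angle_prev=4.0, t_next=0.010, angle_next=5.0))  # 4.3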
Talking about linear encoders, though: couldn't you use spring-loaded steel wires at key points of the structure to measure their movement more accurately?
 

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
That works at the accuracy and timescales needed for a motion platform, but not for head tracking. Your aim is to compensate for the IMU offset, so you need to sample in the same timebase (1000 Hz, or 1 ms latency between sampling and having the correct value, so no averaging to account for spring oscillation or beam flex) and to the same accuracy (sub-mm). To get that high a quality reading from a linear or shaft encoder, and have that reading not be affected by the mechanics of the platform itself, requires precision machining and very stiff structures. Your typical racing seat, for example, is far too flexible and will bounce and wobble around atop the platform.
 

iFreilicht

FlexATX Authority
Feb 28, 2015
3,243
2,361
freilite.com
To get that high a quality reading from a linear or shaft encoder, and have that reading not be affected by the mechanics of the platform itself

That's why I would've attached some of those wires to parts of the structure that can flex (especially the back of the chair) in order to compensate.
What about lasers, then? They can measure distance, aren't mechanical at all and respond pretty quickly from what I know. Their sample rate might be lower than the IMU's, though.
 

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
That's why I would've attached some of those wires to parts of the structure that can flex (especially the back of the chair) in order to compensate.
What about lasers, then? They can measure distance, aren't mechanical at all and respond pretty quickly from what I know. Their sample rate might be lower than the IMU's, though.
Sure, direct endpoint measurement would work with a sufficient sample rate. But then you're just back to optically tracking the platform in the first place, and you may as well use the same tracking system the HMD is using.
But neither solves the problem that, in order to do headset-offset calculations acceptably, you need the function to be implemented in the same bit of software that does the HMD tracking (i.e. Oculus' SDK, Valve's SteamVR, or Sixense's STEM). None of them currently implements such a function, and they likely will not for the near future, since it's a very niche use and more difficult in practice than it first appears.