Story Highlights
- Google has published a new patent that describes how VR games could let players use everyday real-life objects or mock weapons as in-game items. This would be made possible by a newly proposed radar technology.
- The radar tech would allow for more precise and quicker tracking without being hindered by clothing and other physical obstructions, such as the objects held by the player.
- It can keep up with even the smallest motions, even in the lowest lighting. The new tech would also enable features like fine resolution and real-time control.
- It is a huge gain over the tech currently used for VR tracking in games. Hand-held controllers and visual-tracking systems fall short of the level of immersion that this radar tech could soon make possible.
Google is among the leading tech giants globally and is also driving new innovations in the VR and AR ecosystems. The company has recently published a captivating patent that seeks to wholly upgrade the current VR tracking system. The upgrade would allow VR games to offer many new features deemed impossible today. A significant one is the ability to use everyday real-life objects or toys to stand in for in-game weapons.
“The player can now pick up actual mock tools and weapons and use them—even if those weapons or tools have no control capabilities. If, in the game world, a weapon is sitting on the ground, a player can pick up in the real world a plastic version of this weapon or even a toy from around the house, like a nerf gun, and use it in the game,” says Google.
The legal document, dubbed “ADVANCED GAMING AND VIRTUAL REALITY CONTROL USING RADAR,” describes a form of radar tech that allows for more precise and quicker tracking. The discussed techniques would allow tracking without the caveats of current tech: the system can keep up with even the smallest motions despite physical obstructions, even in the lowest lighting.
“These techniques enable small motions and displacements to be tracked, even in the millimeter or submillimeter scale, for user control actions even when those actions are optically occluded or obscured, such when a user’s own clothes, game tool, or fingers occlude an action or an action is obscured due to darkness or varying light.”
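The patent excerpt does not say how the claimed sub-millimeter sensitivity is achieved, but standard radar physics makes it plausible: a continuous-wave radar can infer radial displacement from the phase shift of the returned signal, since a round-trip path change of one wavelength shifts the echo phase by 4π radians. The sketch below assumes a hypothetical 60 GHz radar (a common short-range sensing band, not a figure from the patent):

```python
import math

C = 3.0e8  # speed of light, m/s

def displacement_from_phase(delta_phase_rad: float, freq_hz: float) -> float:
    """Radial displacement implied by a phase change in a CW radar echo.

    A round-trip path change of one wavelength shifts the returned phase
    by 4*pi radians, so d = delta_phi * lambda / (4*pi).
    """
    wavelength = C / freq_hz
    return delta_phase_rad * wavelength / (4.0 * math.pi)

# A 0.5 rad phase shift at an assumed 60 GHz carrier corresponds to
# roughly 0.2 mm of motion -- well inside the submillimeter range:
d = displacement_from_phase(0.5, 60e9)
print(f"{d * 1000:.3f} mm")  # ~0.199 mm
```

Because the measurement rides on the carrier phase rather than on an optical image, it is unaffected by darkness and can pass through thin materials like clothing, which matches the obstruction-tolerance the patent describes.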
The fine motion tracking and the ability to ignore obstructions would let the system track the users’ fingers to automatically form controls for the mock weapon. The tracking would also let users see their own movements mirrored in the VR game in real time rather than through a blocky avatar. The new tech would also enable higher resolution and real-time control than the tracking methods devs usually rely on.
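The article does not describe how tracked finger poses become controls, so the following is a minimal hypothetical sketch of the idea: the system watches how far the index finger curls and maps that onto a virtual trigger on the mock weapon. The `FingerSample` type, the flex measurement, and the threshold value are all illustrative assumptions, not details from the patent:

```python
from dataclasses import dataclass

# Hypothetical flex distance (mm) at which a curl counts as a trigger pull.
TRIGGER_THRESHOLD_MM = 3.0

@dataclass
class FingerSample:
    """One radar-tracked reading of the index finger (fields are illustrative)."""
    flex_mm: float  # how far the fingertip has curled toward the palm

def to_trigger_state(sample: FingerSample) -> bool:
    """Map a tracked finger pose onto a virtual trigger on the mock weapon."""
    return sample.flex_mm >= TRIGGER_THRESHOLD_MM

print(to_trigger_state(FingerSample(flex_mm=1.0)))  # False: finger at rest
print(to_trigger_state(FingerSample(flex_mm=4.5)))  # True: trigger "pulled"
```

The point of the sketch is that the physical object needs no electronics at all: every control signal comes from the radar's view of the hand, which is what lets an inert toy stand in for an in-game weapon.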
Related Content:
- Google And Amazon Cannot Compete In Gaming, Says Microsoft.
- Universal Mod Makes Over 90% Of Unreal Engine Games Playable In VR.
- PSVR 2 On PC Now Supports Positional Tracking; Half-Life: Alyx Already Runs With Limitations.
Google argues most VR devices rely on limiting hand-held controllers or visual tracking. Visual tracking uses optical or infrared cameras to follow body motions and control a user’s in-game avatar or VR environment, but it lacks spatial resolution and is quite sensitive to lighting conditions and physical obstructions.
Hand-held controllers, on the other hand, cannot offer the breadth of controls most games need, as they depend on the number of buttons and their combinations. A lack of motion sensors and other necessities can further limit the user experience. Additionally, modern VR games usually need to know the user’s body and hand orientation within the game world.
Currently, VR games can only use weapon-shaped controllers or objects with pre-determined controls. Google’s tech would let VR gamers use any real-life mock weapon or tool to represent in-game items, and much more.