VR Series: Oculus Rift Developer Guide, Part 1: LibOVR Integration

Source: Internet | Editor: 程序博客网 | Date: 2024/05/06

LibOVR Integration

The Oculus SDK is designed to be as easy to integrate as possible. This guide outlines a basic Oculus integration with a C/C++ game engine or application.

We will discuss initializing LibOVR, HMD (head-mounted display) device enumeration, head tracking, frame timing, and rendering for the Rift.

Many of the code samples below are taken directly from the OculusRoomTiny demo source code (available in Oculus/LibOVR/Samples/OculusRoomTiny). When in doubt about a particular system or feature, OculusRoomTiny and OculusWorldDemo are great places to view sample integration code.

Overview of the SDK

There are three major phases when using the SDK: setup, the game loop, and shutdown.

To add Oculus support to a new application, do the following:

  1. Initialize LibOVR through ovr_Initialize.
  2. Call ovr_Create and check the return value to see whether it succeeded. You can periodically poll for the presence of an HMD with ovr_GetHmdDesc(nullptr).
  3. Integrate head tracking into your application's view and movement code. This involves:
    a. Obtaining the predicted headset orientation for the frame through a combination of the GetPredictedDisplayTime and ovr_GetTrackingState calls.
    b. Applying the Rift's orientation and position to the camera view, while combining them with the application's other controls.
    c. Modifying movement and gameplay to take head orientation into account.
  4. Initialize rendering for the HMD.
    a. Select rendering parameters, such as resolution and field of view, based on HMD capabilities.
    • See: ovr_GetFovTextureSize and ovr_GetRenderDesc.
    b. Configure rendering by creating D3D/OpenGL-specific swap texture sets used to present data to the headset.
    • See: ovr_CreateSwapTextureSetD3D11 and ovr_CreateSwapTextureSetGL.
  5. Modify application frame rendering to integrate HMD support and proper frame timing:
    a. Make sure your engine supports rendering stereo views.
    b. Add frame-timing logic to the render loop to obtain correctly predicted eye render poses.
    c. Render each eye's view to intermediate render targets.
    d. Submit the rendered frame to the headset by calling ovr_SubmitFrame.
  6. Customize UI screens so they display well inside the headset.
  7. Destroy the created resources during shutdown.
    • See: ovr_DestroySwapTextureSet, ovr_Destroy, and ovr_Shutdown.
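The setup and shutdown phases (steps 1, 2, and 7) can be sketched as follows. This targets the 0.7-era LibOVR C API the guide references; exact type and function signatures vary between SDK versions, so treat this as an outline rather than a drop-in implementation.

```cpp
#include <OVR_CAPI.h>
#include <cstdio>

int main()
{
    // Step 1: initialize LibOVR before any other API call.
    if (OVR_FAILURE(ovr_Initialize(nullptr)))
        return -1;

    // Step 2: create a session and check the result. ovr_Create fails when
    // no HMD is connected; an application that wants to keep running can
    // instead poll periodically with ovr_GetHmdDesc(nullptr).
    ovrSession session;
    ovrGraphicsLuid luid;
    ovrResult result = ovr_Create(&session, &luid);
    if (OVR_FAILURE(result))
    {
        std::printf("No HMD detected.\n");
        ovr_Shutdown();
        return -1;
    }

    ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
    std::printf("Found: %s\n", hmdDesc.ProductName);

    // ... game loop (steps 3-5) goes here ...

    // Step 7: destroy resources in reverse order of creation.
    ovr_Destroy(session);
    ovr_Shutdown();
    return 0;
}
```

Note that ovr_Shutdown is still called on the failure path: once ovr_Initialize has succeeded, it must be balanced by a shutdown regardless of whether session creation succeeded.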
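Step 3's combination of frame timing and tracking-state queries might look like this. ApplyToCamera stands in for the application's own camera code and is hypothetical; the LibOVR calls and struct fields follow the 0.7-era API named above.

```cpp
#include <OVR_CAPI.h>

void UpdateCameraFromHmd(ovrSession session, long long frameIndex)
{
    // Ask the SDK when this frame will actually reach the display, then
    // query the tracking state predicted for that moment. The ovrTrue
    // latency marker lets the SDK correlate this query with the frame
    // later passed to ovr_SubmitFrame.
    double displayTime = ovr_GetPredictedDisplayTime(session, frameIndex);
    ovrTrackingState ts = ovr_GetTrackingState(session, displayTime, ovrTrue);

    if (ts.StatusFlags & (ovrStatus_OrientationTracked |
                          ovrStatus_PositionTracked))
    {
        ovrPosef headPose = ts.HeadPose.ThePose;
        // headPose.Orientation (ovrQuatf) and headPose.Position
        // (ovrVector3f) are combined with the application's own camera
        // controls here. ApplyToCamera is a hypothetical app-side helper:
        // ApplyToCamera(headPose.Orientation, headPose.Position);
    }
}
```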

A more complete summary of rendering details is provided in the Rendering Setup Outline section.
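For the OpenGL path, the render setup and frame loop (steps 4 and 5) can be sketched roughly as below. This again assumes the 0.7-era API the guide names (ovr_CreateSwapTextureSetGL, ovr_SubmitFrame); later SDKs replace swap texture sets with ovrTextureSwapChain, so the exact types are version-dependent. appRunning is a hypothetical application callback, and the actual GL framebuffer work is elided.

```cpp
#include <GL/gl.h>        // for GL_SRGB8_ALPHA8 (glext.h may be needed on Windows)
#include <OVR_CAPI_GL.h>

void RunRenderLoop(ovrSession session, bool (*appRunning)())
{
    ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);

    // Step 4a: SDK-recommended per-eye texture size for the default FOV.
    ovrSizei leftSize  = ovr_GetFovTextureSize(session, ovrEye_Left,
                                               hmdDesc.DefaultEyeFov[0], 1.0f);
    ovrSizei rightSize = ovr_GetFovTextureSize(session, ovrEye_Right,
                                               hmdDesc.DefaultEyeFov[1], 1.0f);

    // Step 4b: one shared swap texture set, both eyes rendered side by side.
    ovrSwapTextureSet* textureSet = nullptr;
    ovr_CreateSwapTextureSetGL(session, GL_SRGB8_ALPHA8,
                               leftSize.w + rightSize.w, leftSize.h,
                               &textureSet);

    // Describe the stereo view to the compositor as one ovrLayerEyeFov layer.
    ovrLayerEyeFov layer = {};
    layer.Header.Type     = ovrLayerType_EyeFov;
    layer.ColorTexture[0] = textureSet;
    layer.ColorTexture[1] = textureSet;  // shared texture, split by viewport
    layer.Fov[0]      = hmdDesc.DefaultEyeFov[0];
    layer.Fov[1]      = hmdDesc.DefaultEyeFov[1];
    layer.Viewport[0] = { { 0, 0 },          leftSize  };
    layer.Viewport[1] = { { leftSize.w, 0 }, rightSize };

    for (long long frameIndex = 0; appRunning(); ++frameIndex)
    {
        // Steps 5b/5c: advance the swap set, then render each eye's view
        // into its half of the current texture using the predicted eye
        // poses (which also fill layer.RenderPose[0..1]).
        textureSet->CurrentIndex =
            (textureSet->CurrentIndex + 1) % textureSet->TextureCount;
        // ... bind textureSet->Textures[textureSet->CurrentIndex] to a
        //     framebuffer and draw both eye views ...

        // Step 5d: hand the frame to the compositor for distortion/display.
        ovrLayerHeader* layers = &layer.Header;
        ovr_SubmitFrame(session, frameIndex, nullptr, &layers, 1);
    }

    ovr_DestroySwapTextureSet(session, textureSet);
}
```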

