It is relatively easy to argue that there should be a gaming advantage if, like in real life, our peripheral vision is engaged. Not only does this give us a heightened sense of "being there", but it also engages our strong motion cues in the peripheral region. In the same way as we evolved a wide peripheral vision to help us detect predators, peripheral vision can assist in the early detection of dangers in an FPS game. The iDome allows one to play with our full peripheral vision engaged, both horizontally and vertically.

In order to demonstrate the use of Unity within the iDome, the standard demo that comes with Unity Pro is modified to support the warped fisheye images required for the iDome. The technique can of course be applied to any Unity based application. The standard first person controller is replaced with a 4 camera rig. The left and right cameras are at +-45 degrees to the original main camera, that is, each looks into a corner of the cube. The two other views are the top and bottom views; together these give enough of the visual field to reconstruct the field of view required for the iDome.

Each render texture is 1Kx1K, in order to match the resolution of the final projection system. The images resulting from the 90 degree field of view perspective views through the 4 cube faces are applied as textures to 4 specially designed meshes, which are abutted together to form a 180 degree fisheye projection. The 4 meshes that form the fisheye from the 4 cubic map images are given here as obj files. In the example here the fisheye render texture is 2Kx2K. If one were using a fisheye lens to illuminate the dome then this would be the end of the story.

In order to create the correct projection using a spherical mirror, the fisheye needs to be warped. This is achieved by applying the fisheye image to another mesh; an orthographic camera viewing this mesh yields the final image sent to the data projector. The mesh to perform this is also given here as an obj file: warp.zip. Note however that it is only useful for the exact installation shown here; this final mesh is normally site specific and, for example, looks very different for a planetarium arrangement. So in all it is a 6 pass rendering process: the 4 perspective views, one orthographic rendering mapping them to a fisheye, and finally an orthographic projection of the fisheye image mapped onto the warping mesh.

An example Unity project is given here: domedemos.zip. Of course these meshes are not visible in the game play; along with their cameras they are all on a layer by themselves.

Example 2: an environment based upon a Unity project that allows people to explore the ASKAP (Australian Square Kilometre Array Pathfinder) project site. The exact mapping of the warped version is for a particular iDome configuration, but different warpings, including various truncated fisheye configurations, require no more than an appropriate warp map file.
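The fisheye composition is carried entirely by the geometry of the 4 meshes: each vertex has a position in the fisheye image plane and a texture coordinate into one of the 4 render textures. The following is a minimal sketch of that mapping; the function names and axis conventions are my own illustration, not taken from the project files.

```python
import math

def fisheye_position(d):
    """Map a unit view direction to normalised fisheye image coordinates.

    d = (x, y, z) with x right, y up, z forward along the dome axis.
    Returns (u, v) in [0, 1]^2 for a 180 degree angular fisheye: the
    forward direction lands at the image centre, and directions 90
    degrees off axis land on the rim (radius 0.5).
    """
    x, y, z = d
    theta = math.acos(max(-1.0, min(1.0, z)))  # angle from the forward axis
    r = 0.5 * theta / (math.pi / 2)            # radius linear in angle
    phi = math.atan2(y, x)                     # azimuth in the image plane
    return 0.5 + r * math.cos(phi), 0.5 + r * math.sin(phi)

def face_uv(d_cam):
    """UV into a 90 degree FOV render texture, for a direction already
    expressed in that camera's frame (z forward). With a 45 degree
    half-angle, tan(45) = 1, so x/z and y/z span [-1, 1] across the face.
    """
    x, y, z = d_cam
    return 0.5 + 0.5 * x / z, 0.5 + 0.5 * y / z
```

A mesh vertex for a given view direction pairs `fisheye_position` of the direction (its position in the fisheye plane) with `face_uv` of the same direction rotated into the owning camera's frame (its texture coordinate).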
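The warp mesh stores the same kind of information for the mirror correction: each vertex carries a position on the projector raster and a texture coordinate into the fisheye image. As a rough offline emulation of that final pass, assuming one warp node per output pixel and nearest-neighbour sampling (on the GPU the mesh is sparse and the rasteriser interpolates between nodes); the function name and grid layout here are hypothetical, not the project's obj format:

```python
def apply_warp(fisheye, nodes, out_w, out_h):
    """Resample a fisheye image through a warp grid.

    fisheye: 2D list (rows of pixel values).
    nodes: list of (x, y, u, v) tuples, with x, y in [0, 1) giving the
           output (projector) position and u, v in [0, 1) giving the
           fisheye texture coordinate sampled there.
    Returns an out_h x out_w image, nearest-neighbour sampled.
    """
    h, w = len(fisheye), len(fisheye[0])
    out = [[0] * out_w for _ in range(out_h)]
    for x, y, u, v in nodes:
        px = min(out_w - 1, int(x * out_w))  # destination pixel
        py = min(out_h - 1, int(y * out_h))
        sx = min(w - 1, int(u * w))          # source pixel in the fisheye
        sy = min(h - 1, int(v * h))
        out[py][px] = fisheye[sy][sx]
    return out
```

With an identity grid (u = x, v = y) the output reproduces the input; a warp mesh for a particular spherical mirror installation simply displaces the (u, v) of each node, which is why swapping installations is no more than swapping the mesh.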