Panta Rhei (named after the Greek phrase panta rhei, "everything flows") is a video game engine developed by Capcom for use with eighth-generation consoles (PlayStation 4 and Xbox One) as a replacement for its previous MT Framework engine.

Following years of using its proprietary MT Framework engine for video game development during the seventh generation of hardware, Capcom decided that maximizing productivity in the following hardware generation would require a new development engine, titled "Panta Rhei". The new engine was intended for home consoles, while games for smartphones, the PlayStation Vita and the Nintendo 3DS continued to use the MT Framework Mobile and Lite engines.

Specific design aims for the new engine included increased workflow efficiency, achieved by reducing iteration time for modifications to gameplay and game design. The engine also introduced improved physical modeling of fluids and an emphasis on global illumination rendering. Development of the new engine began in summer 2011. Its features included in-engine management of shader (GPU) programs and an engine virtual machine allowing game scripting to be written, initially in C#. Changes to the organization of workflow and content meant that backwards compatibility with the MT Framework engine was lost. The engine targets a DirectX 11 level of technology.

The first game to be developed with Panta Rhei was Deep Down, whose team provided feedback on the engine's development; development of the game and the engine was carried out in parallel. A trailer for Deep Down and the Panta Rhei engine was publicly demonstrated by Yoshinori Ono at the PlayStation 4 unveiling event in February 2013.[1][2] The Deep Down technology demo used approximately 3 GB of textures and 30 shaders, running at approximately 30 frames per second. Graphics techniques used in the demo included tessellation (for an actor's cloak) and deferred rendering with dynamic light sources. Surfaces were rendered with diffuse and specular light reflections, with surface roughness implemented by the Oren–Nayar reflectance model. Global illumination (such as light from a dragon's fiery breath) was estimated using the 'voxel cone tracing' method, with one specular 'ray' and an approximation of twelve dodecahedrally situated 'rays', sampled at a lower resolution, for diffuse reflectance. Moving light sources, including flames, were modeled using a 64×64×64 voxel grid (voxel size ~0.5 m) implemented as 3D textures stored in a mipmap-like structure.
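The Oren–Nayar reflectance model mentioned above treats a rough surface as a collection of microscopic facets, so that apparent brightness depends on viewing direction as well as illumination direction. The following sketch implements the widely published simplified ("qualitative") form of the model; it is illustrative only and is not Capcom's implementation.

```python
import math

def oren_nayar(theta_i, theta_r, phi_diff, sigma, albedo=1.0):
    """Simplified Oren-Nayar diffuse reflectance (Oren & Nayar, 1994).

    theta_i, theta_r : incident / reflected polar angles (radians)
    phi_diff         : azimuthal angle between incident and reflected dirs
    sigma            : roughness (std. dev. of facet slope angles, radians)
    Returns reflected radiance per unit source intensity (scalar factor).
    """
    s2 = sigma * sigma
    A = 1.0 - 0.5 * s2 / (s2 + 0.33)
    B = 0.45 * s2 / (s2 + 0.09)
    alpha = max(theta_i, theta_r)   # larger of the two polar angles
    beta = min(theta_i, theta_r)    # smaller of the two polar angles
    return (albedo / math.pi) * math.cos(theta_i) * (
        A + B * max(0.0, math.cos(phi_diff)) * math.sin(alpha) * math.tan(beta)
    )
```

With `sigma = 0` the correction terms vanish and the model reduces to ideal Lambertian reflectance, `(albedo / pi) * cos(theta_i)`, which is why it is a popular drop-in upgrade over a plain Lambertian diffuse term.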

Further technology demos showcasing fluid simulations of fire and smoke in the Panta Rhei engine were released in August 2013.[3] The demos showed the engine's volume-based simulation of fire (also used in the February 2013 Deep Down video), as opposed to simpler 2D "billboarded" (see Sprite) depictions. They used a volume (voxel) based physical simulation of the fluid with a fixed voxel size. The simulation of fluid flow used a semi-Lagrangian method to approximate solutions of the advection equation, with vorticity confinement and the MacCormack method applied to improve the solutions. The voxel representation of the fluid required a 'ray marching' rendering process (see Volume ray casting); self-shadowing of the fluids and light scattering were also implemented in the engine demos.
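The semi-Lagrangian and MacCormack steps described above can be sketched in one dimension. A semi-Lagrangian step traces each grid sample backwards along the velocity and interpolates the old field at the departure point (unconditionally stable but diffusive); the MacCormack scheme then runs the step forward and backward and uses the difference to cancel much of that error. This is a minimal illustrative sketch on a periodic 1D grid, not the engine's actual solver.

```python
import math

def advect_sl(q, u, dt, dx):
    """One semi-Lagrangian advection step on a periodic 1D grid.

    q : list of scalar samples, q[i] located at x = i*dx
    u : constant advection velocity
    """
    n = len(q)
    out = []
    for i in range(n):
        x = (i - u * dt / dx) % n          # departure point, in grid units
        i0 = int(math.floor(x)) % n
        i1 = (i0 + 1) % n
        f = x - math.floor(x)
        out.append((1.0 - f) * q[i0] + f * q[i1])  # linear interpolation
    return out

def advect_maccormack(q, u, dt, dx):
    """MacCormack-corrected advection: forward step, reverse step, then
    add half the reconstruction error back in, clamped for stability."""
    fwd = advect_sl(q, u, dt, dx)          # forward in time
    bwd = advect_sl(fwd, -u, dt, dx)       # reverse step should recover q
    corrected = [f + 0.5 * (a - b) for f, a, b in zip(fwd, q, bwd)]
    # Clamp each value to the range of the samples it interpolated from,
    # so the correction cannot introduce new extrema (oscillations).
    n = len(q)
    out = []
    for i in range(n):
        x = (i - u * dt / dx) % n
        i0 = int(math.floor(x)) % n
        i1 = (i0 + 1) % n
        lo, hi = min(q[i0], q[i1]), max(q[i0], q[i1])
        out.append(min(max(corrected[i], lo), hi))
    return out
```

A production fluid solver extends the same idea to a 3D velocity field and combines it with pressure projection and, as noted above, vorticity confinement to restore small-scale swirling detail lost to interpolation.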

Further details of the game engine were discussed in a talk at CEDEC (CESA Developers Conference) 2014 given by Hitoshi Mishima (三嶋仁) and Haruna Akuzawa (阿久澤陽菜). In common with other PS4/Xbox One generation rendering engines, Panta Rhei used physically based rendering methods to calculate lighting reflectance. Demonstrations based around Deep Down used an Oren–Nayar model for diffuse reflectance and a Cook–Torrance model for specular highlights, replacing the Blinn–Phong shading model used in earlier demonstrations. The demonstrations generally used tile-based deferred rendering, with forward rendering also applied for translucent skin effects and other transparent objects. Indirect lighting was again demonstrated using a 128×128×128 voxel grid representing local light intensity, with voxel cone tracing along twelve dodecahedral directions. Specific demonstrations at CEDEC included a pre-integrated skin shader simulating light transmission through (human or animal) skin, and a 'liquid' shader; surface reflections were modeled using a screen space reflection technique originally developed by Crytek, using parallax-corrected environment maps; and a hair modeling effect used runtime compute-shader generation of hair positions in conjunction with tessellation.

In November 2014 at AMD's "Future of Compute" Singapore conference, Masaru Ijuin of Capcom announced that AMD's Mantle API technology was being incorporated into the game engine.[4]


