GITAI Telepresence "THETA" Startup Gets Funding


  • VR headset
  • Tactile touch
  • Short delay of 0.08 seconds, due to unique technology they developed

A previous article indicated that the following technologies are used:

  • Language: HTML/CSS, JavaScript, Java, C, C++, C#, PHP
  • Database: MySQL
  • Library: Three.js, A-Frame
  • Tools: Unity
  • Network: Photon, Serial, UDP
  • PaaS: AWS, Heroku, milkcocoa
  • Others: Node.js, GitHub, Arduino

Telepresence Using THETA V and Segway by TwinCam Go

Coverage of GITAI from the European Space Agency’s Technology Transfer Programme Office. It covers some details of GITAI’s impressive data compression technologies that make live streaming from RICOH THETA over public networks and long distances (like up to the space station and back) possible.

The article touches on GITAI’s progress in data reduction and compression, low-latency communication, workload reduction technology, and their NAT traversal system.

From the article:

Especially visual data is extremely heavy and causes delayed information transfer, so GITAI also has developed independent P2P communication based on independent protocol, data volume reduction algorithm and OS for synchronization. As a result, GITAI can minimize 2.7K resolution data of 360-degree camera from 800Mbps to 2.5Mbps and data latency to 100ms via standard WIFI networks.


I don’t think that 2.7K resolution video would take up 800Mbps.

The normal bitrate range for 2K video is around 3Mbps to 6Mbps with a common codec like H.264. One simple way to get more compression is to use H.265 (HEVC), which might cut the bitrate by roughly 50%, putting it around 1.5 to 3Mbps with standard compression technologies. I wonder if he’s primarily relying on H.265 to achieve the reduced bitrate?

There are numerous HEVC solutions out there.

I’m curious to learn if his architecture is a straightforward cloud-based encoding system or if he’s doing something unusual for the compression.