React Native (Expo)

You need to buy a RICOH THETA camera to use the RICOH THETA API. If you do not have a RICOH THETA camera, you cannot use the API.

Sounds to me like you just want to use your camera phone to take 360’s and nothing else.

If that is the case, some Android cameras come with that capability, there are also apps you can get, and Google Street View will let you make a photo sphere. There is a good chance you're going to have a lot of stitching errors.

You're much better off buying a 360 camera. I have several. If you want the best quality, a THETA Z1 is the way to go. If you need to use it in water, I would suggest an Insta360.

Theta is great but not designed for action shots.

Hi @craig, but the THETA API will work with Expo, right? And is the process the same as React Native?

There is a React Native demo in this repository.

Community information including the complete build process is here:

theta-client | Developer information for RICOH THETA

I’m planning to update the documentation with a theta cloud-based simulator. Let me know if you don’t have access to a physical camera and I can prioritize the documentation on the simulator and fake storage with sample media.

Hi @craig, thanks for the response, but there is no example for Expo! I tried the same process as with the React Native CLI, but in the end it shows an error that I can't use the Expo Go app. It means the theta-client API will not work in Expo, I guess. Any idea how I can make it work in Expo, or any reference?

Will it work with node-rest-client and a normal GET request?

If you do not have a physical camera, I set up a fake-theta cloud server with sample media yesterday. It’s in the community documentation.

Do you have a physical device camera?

Are you using Access Point (AP) mode where the camera is the WiFi hotspot? Or, are you using Client Mode where the camera connects to an office router?

I am using a RICOH THETA SC2 in AP mode, and I don't think there is a Client Mode on it. I am running my Expo app from Expo Go on my physical device. Both my Mac and Android device are connected to the same WiFi in order to run the Expo app. I am not sure how to connect my RICOH. I found an npm package for using the ThetaHttpClient, but I am not sure what to put in its hostname and base URL.

It's like I want to connect the RICOH SC2 (or any version) via USB and be able to use it for taking 360 pictures, which I will use in my app. I am using an Expo app, so I can't switch to other tech like Expo CLI or React Native CLI. I want to use the ThetaHttpClient SDK APIs, but I am not sure how I can connect the RICOH over USB and get the hostname and all.

You can only use the WebAPI over WiFi. You cannot use it over USB.

This is the specification for the WebAPI

theta-api-specs/theta-web-api-v2.1 at main · ricohapi/theta-api-specs · GitHub
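Since the WebAPI is plain JSON over HTTP, you can exercise it from any HTTP client. Below is an untested sketch of sending camera.takePicture from JavaScript; is the camera's default address in AP mode, and buildCommand is a hypothetical helper, not part of any RICOH SDK.

```javascript
// Hedged sketch: POST a WebAPI command as JSON.
// buildCommand is a hypothetical helper (not part of any RICOH SDK).
function buildCommand(name, parameters = {}) {
  return {name, parameters};
}

// is the THETA's default IP when the camera is the WiFi hotspot (AP mode).
async function takePicture(baseUrl = '') {
  const response = await fetch(`${baseUrl}/osc/commands/execute`, {
    method: 'POST',
    headers: {'Content-Type': 'application/json;charset=utf-8'},
    body: JSON.stringify(buildCommand('camera.takePicture')),
  });
  // the camera replies with a command id and a state such as 'inProgress'
  return response.json();
}
```

The same shape works for every command in the spec; only the `name` and `parameters` fields change.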

If you want to build an application to communicate over the USB cable, you must use the USB API.

theta-api-specs/theta-usb-api at main · ricohapi/theta-api-specs · GitHub

I used the USB API with Flutter in this video on a Raspberry Pi

You can try to use libgphoto2 with React Expo to access the USB API.

I have not tried it.

If you get it working, please post your results.

I tried to use this npm library for Expo over WiFi. It takes pictures, but I don't know much about gphoto2.
The installation and configuration documentation is not readable.

There are two different ways of connecting to the camera.

  1. WiFi
  2. USB

If you use WiFi, you need to use the WebAPI.

If you use USB, you need to use the USB API.

If you are using the WebAPI, you can try to use the npm library.

If you are using the USB API, you can try to use gphoto2.

I don’t know if the THETA works with gphoto2. I have not tried it. Does something like this work with the THETA SC2?

var gphoto2 = require('gphoto2');
var GPhoto = new gphoto2.GPhoto2();

GPhoto.list(function (list) {
  if (list.length === 0) return;
  var camera = list[0];
  console.log('Found', camera.model);
  // get configuration tree
  camera.getConfig(function (er, settings) {
    console.log(settings);
  });
});

I think I will use the WebAPI, but when I use cameraTakePicture({baseUrl:}) from the npm package, I am not getting a good response. Even though I am using await, the response is still in progress or something, I guess. The API is returning me:


That is normal.

Use /osc/state to get the _latestFileUrl.

theta-api-specs/ at main · ricohapi/theta-api-specs · GitHub

/osc/commands/status will show you the status of the camera

theta-api-specs/ at main · ricohapi/theta-api-specs · GitHub

This article explains the concept
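The flow above (take a picture, poll /osc/commands/status, then read _latestFileUrl from /osc/state) can be sketched like this in JavaScript. This is untested; the field names follow the WebAPI v2.1 spec, and isDone is a helper added for illustration.

```javascript
// Helper added for illustration: polling stops once the command reports 'done'.
function isDone(statusResponse) {
  return statusResponse.state === 'done';
}

// Untested sketch against the camera's default AP-mode address (
async function waitForLatestFile(commandId, base = '') {
  let status;
  do {
    const res = await fetch(`${base}/osc/commands/status`, {
      method: 'POST',
      headers: {'Content-Type': 'application/json'},
      body: JSON.stringify({id: commandId}),
    });
    status = await res.json();
  } while (!isDone(status));

  // once the capture has finished, /osc/state reports _latestFileUrl
  const stateRes = await fetch(`${base}/osc/state`, {method: 'POST'});
  const state = await stateRes.json();
  return state.state._latestFileUrl;
}
```

In a real app you would add a short delay between polls instead of looping as fast as possible.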

Thanks a lot for the reference! I managed to get the URL (I read one of your threads), which is something like “”. But when I use this URL in my <Image style={{width:100,height:200,borderWidth:1}} source={{uri:this.state.dataUrl}} />
image tag, the image is not showing.
Do I need something else to access it, like a connection or something?

The URL is a GET endpoint, so it should work like a normal file on the Internet. You can also get the thumbnail.

You may need to wait until the file is fully downloaded.

Connect your computer to the THETA with WiFi, then use curl, Postman, or Insomnia to send a state command. Then, drop the URL into a web browser. You should see the image come up in the browser.

Test your app with a fake API tester like this:

If it works, then swap in the THETA URL.

Hi, thanks! I can see the image now. But I have a question: I am using the web URLs, and if I want to check an API response like getLivePreview in Postman, how can I do it? Currently I am connecting my laptop to the RICOH WiFi and then using Postman, but it's not working; it's giving a 503 error. Can you guide me? I want to see the response of getLivePreview.


It’s a stream.

See these articles

livePreview, MotionJPEG on RICOH THETA - Acquiring the Data Stream

RICOH THETA SC2 livePreview MotionJPEG Single Frame Extraction

This example is in Dart. The main idea is that normally, the HTTP request will close the HTTP connection. There is likely an option to keep the stream open with the JavaScript library you are using. You can then listen to the stream and get the individual JPEG frames to display.

import 'dart:convert';
import 'dart:io';

void main() async {
  Uri apiUrl = Uri.parse('');
  var client = HttpClient();
  Map<String, String> body = {'name': 'camera.getLivePreview'};
  var request = await client.postUrl(apiUrl)
    ..headers.contentType = ContentType("application", "json", charset: "utf-8");
  request.write(jsonEncode(body));
  var response = await request.close();
  response.listen((List<int> data) {
    // each chunk of the MotionJPEG stream arrives here
  });
}

If you are familiar with Python, you can also quickly test and view it with this article.

This code has an example of displaying the preview in react native. Maybe you can use it as a reference.

Hi, thanks! I am using this library. I tried to use addListener, but I really can't debug my app: since I am using the web APIs, I have to connect to the RICOH WiFi, and once I connect I lose the Expo node connection.

constructor(props) {
    super(props);
    this.state = { thetaIP: '', dataUrl: '', isLoading: false, live: null };
    this.thetaClient = new ThetaHttpClient({
      hostname: '',
      axiosConfig: {},
      auth: { user: 'THETAYP00153401.OSC', pass: '00153401' },
    });
    this.onPressTakePicture = this.onPressTakePicture.bind(this);
    this.livePreview = this.livePreview.bind(this);
    this.getStatus = this.getStatus.bind(this);
}

componentDidMount() {
    let subscribing = this.thetaClient.addListener('cameraGetLivePreview', event => {
      // handle preview events here
    });
    // note: returning a cleanup function from componentDidMount has no effect
}

async livePreview() {
    console.log('take start');
    const res = await this.thetaClient
      .cameraGetLivePreview({ baseURL: `http://${this.state.thetaIP}` })
      .catch(e => {
        console.error('cameraGetLivePreview:', e);
      });
    console.warn(res, 'cameraGetLivePreview res');
}

Doing something like this.
I know I am doing it wrong, I guess.

I tried to use addListener, but I really can't debug my app: since I am using the web APIs, I have to connect to the RICOH WiFi, and once I connect I lose the Expo node connection.

Unless your home Internet router is at 192.168.1.x, you can put another WiFi adapter into your computer and connect it to both the Internet and your camera at the same time. If you have access to the Internet router, you can assign it a different IP address range.

auth: { user: 'THETAYP00153401.OSC', pass: '00153401' }

Unless the RICOH THETA is in Client Mode (which the SC2 does not support), you do not need the username and password.

Does it work without auth?

this.state = { thetaIP: '',dataUrl:'',isLoading:false,live:null };

What does live:null refer to?

I have not used the library you are referring to.

Here is my code. Take picture is working, but cameraGetLivePreview is not working. I tried to debug it, but it's not working at all, and I don't know why. I tried with the fetch command as well; the debugger does not stop at the breakpoint, and there is no console error.

import React, {useEffect, useRef, useState} from 'react';
import {ThetaHttpClient} from '';
import {View, Image, Button} from 'react-native';

const LivePreview = () => {
  const [image, setImage] = useState('');
  const [clickUrl, setClickUrl] = useState('');
  const [isLoading, setIsLoading] = useState(false);
  const [imageData, setImageData] = useState('');
  const THETA_IP_ADDRESS = '';

  const thetaClient = new ThetaHttpClient({
    hostname: '',
    axiosConfig: {},
    auth: {user: 'THETAYP00153401.OSC', pass: '00153401'},
  });

  const previewUrl = '';
  const payload = {
    name: 'camera.getLivePreview',
    parameters: {},
  };

  const handlePreviewData = data => {
    const reader = new FileReader();
    reader.onload = () => {
      setImageData(reader.result);
    };
    reader.readAsDataURL(data);
  };

  const streamPreview = async () => {
    const response =
      // await fetch(previewUrl, {
      //   method: 'POST',
      //   headers: {
      //     'Content-Type': 'application/json;charset=utf-8',
      //   },
      //   body: JSON.stringify(payload),
      // });
      await thetaClient
        .cameraGetLivePreview({baseURL: ''})
        .catch(e => {
          console.log(e, 'streamPreview');
        });

    const reader = response.body.getReader();

    while (true) {
      const {done, value} = await reader.read();
      if (done) break;
      const stringValue = new TextDecoder().decode(value);
      const boundaryIndex = stringValue.indexOf('---osclivepreview---');
      if (boundaryIndex !== -1) {
        const imageData = stringValue.slice(boundaryIndex);
        // extract and display the frame here
      }
    }
  };

  useEffect(() => {
    return () => {
      // clean up the stream on unmount
    };
  }, []);

  const getStatus = async lastId => {
    const axiosConfigBase = {baseURL: ''};
    const res = await thetaClient.oscCommandsStatus(lastId, axiosConfigBase);
    if (res.data.state === 'inProgress') {
      // still processing; poll again
      await getStatus(lastId);
    } else if (res.data.state === 'done') {
      console.warn(res, 'its complete');
    }
  };

  const onPressTakePicture = async () => {
    console.log('take start');
    const res = await thetaClient
      .cameraTakePicture({baseURL: ''})
      .catch(e => {
        console.error('error:', e);
      });
    // console.warn(res,"cameraTakePicture");
    if (res?.data?.state === 'inProgress') {
      await getStatus(res.data.id);
    }
  };

  return (
    <View style={{flex: 1, marginTop: '20%', alignItems: 'center'}}>
      <Button
        style={{borderWidth: 1}}
        title={'Live preview'}
        onPress={streamPreview}
      />
      <Image
        style={{borderWidth: 1, height: 100, width: 100}}
        source={{uri: imageData}}
        alt="Live preview"
      />
      <Button
        style={{borderWidth: 1}}
        title={'Take picture'}
        onPress={onPressTakePicture}
      />
      <Image
        style={{borderWidth: 1, height: 100, width: 300}}
        source={{uri: clickUrl}}
        alt="Live preview"
      />
    </View>
  );
};

export default LivePreview;

Try this to read a stream and display it to the console for testing.

const body = {'name': 'camera.getLivePreview'};
const response = await fetch('', {
	method: 'POST',
	body: JSON.stringify(body),
	headers: {'Content-Type': 'application/json'}
});
const data = response.body;

const reader = data.getReader();

while (true) {
	const {value, done} = await reader.read();
	if (done) break;
	console.log('Received', value);
}


fake-theta no camera version

const body = {'name': 'camera.getLivePreview'};
const response = await fetch('', {
	method: 'POST',
	body: JSON.stringify(body),
	headers: {'Content-Type': 'application/json'}
});
const data = response.body;

const reader = data.getReader();

while (true) {
	const {value, done} = await reader.read();
	if (done) break;
	console.log('Received', value);
}


expected output

Tested with RICOH THETA X.

Received Uint8Array(13140) [
  154,  66, 105,   9, 166, 147,  64,  10,  90, 155, 145,  77,
   38, 144, 154,   0, 121, 106, 137, 159,  61,  41, 174, 212,
  176, 198, 100,  97, 143, 186,  40,   2,  72, 163,  50,  31,
  106, 191,  12,  33,  71,   2, 146,  24, 192,   2, 173,  34,
  208,   0, 171,  82, 129,  72,   5,  60,  80,  32,   2, 156,
   41,   5,  45,  48,  22, 138,  40, 160,   5, 162, 138,  40,
    1, 104, 164, 165, 160,   2, 138,  40, 160,   5, 162, 146,
  150, 128,  10,  40, 162, 128,  22, 138,  74,  90,   0,  41,
  105,  41, 104,   0,
  ... 13040 more items
Received Uint8Array(12525) [
  120, 181,  98,  49,  36, 150, 249, 104, 219,   0, 118, 237,
   86, 173, 245,  72, 159,  11,  48, 242, 143, 169, 233,  85,
  102, 251, 141, 244,  53, 156, 188, 163, 231, 168, 197,  84,
  100, 209,  50, 138, 102, 165, 222, 179,  28,  74, 198,  37,
   46,  71, 115, 210, 185, 251, 173,  74, 238, 231, 115,  73,
   39, 238, 201,  35, 104, 192,  31,  79, 122, 146, 114,   2,
   50, 142,  73,  24,   2, 179,  87,  12, 141, 146, 114,  62,
  232, 237,  90,  39, 115,  54, 172, 105, 198,  75,   0,  73,
  201, 245, 173,  72,
  ... 12425 more items

viewing stream as images

I have not tested the code below, but reading through it, the primary algorithm looks like it could be used on the stream.
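As a rough, untested sketch of that algorithm: MotionJPEG frames are plain JPEGs, so you can scan the byte stream for the JPEG start-of-image (0xFF 0xD8) and end-of-image (0xFF 0xD9) markers and cut out one complete frame at a time. extractJpegFrame is a name made up for illustration.

```javascript
// Scan a chunk of the MotionJPEG byte stream and return the first complete
// JPEG frame (including both markers), or null if no full frame is present.
function extractJpegFrame(bytes) {
  let start = -1;
  for (let i = 0; i < bytes.length - 1; i++) {
    if (bytes[i] === 0xff && bytes[i + 1] === 0xd8) {
      start = i; // found start-of-image marker
      break;
    }
  }
  if (start === -1) return null;
  for (let i = start + 2; i < bytes.length - 1; i++) {
    if (bytes[i] === 0xff && bytes[i + 1] === 0xd9) {
      return bytes.slice(start, i + 2); // include end-of-image marker
    }
  }
  return null; // frame is still incomplete; wait for more chunks
}
```

Each returned frame could then be base64-encoded into a data URI for the Image component. In practice you also need to buffer bytes across chunks, since a frame rarely arrives in a single read.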

Advice on React Native Expo Courses?

As I don’t know React Native, I am thinking of taking an online course.

I found these two:

I’m more interested in the Meta course on Coursera as it has a higher rating. I’ve already watched a few of the videos and it looks good.

Are you taking an online course for React Native Expo?