Using RICOH THETA SDK for iOS - An Introduction

The RICOH THETA API v2.1 SDK for iOS is available for iOS developers. It provides a helpful sample app and code that simplifies development, making it a key tool for iOS developers building mobile apps that take advantage of the unique hardware strengths of the RICOH THETA 360 degree camera line.

NOTE: In my tests, the CPU usage is consistently high, over 100%. If you are evaluating the iOS SDK, be aware that it may not work for you. Ideally, this warning saves you time. It would help the community if you report your feedback back here. Does the SDK work for you? Is CPU usage high for you?

My environment

  • On a MacBook Air (M1, 2020) laptop
    • macOS Big Sur 11.5.2
  • iPhone 12
    • iOS 14.7.1
  • USB-C to Lightning cable
  • RICOH THETA Z1
    • Firmware 2.00.1

There are two main items that you want to pay attention to when you first get the SDK:

Download the right SDK

There is a RICOH Camera SDK. This is NOT what you want. The download page has multiple SDKs for different RICOH cameras. Make sure the name clearly says RICOH THETA. You want the RICOH THETA API v2.1 SDK for iOS.

This link has a URL anchor that will point you to the correct part of the downloads page: https://api.ricoh/downloads/#ricoh-theta-api

Download the right API version

There are three main API versions. They are based on Google’s open source Open Spherical Camera API with some additions specific to THETA.

One main difference between v2.0 and v2.1 is a streamlined way of checking the state of the camera before sending a command; v2.0 requires more steps.

You want the newest version 2.1.
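
For example, v2.0 required opening a session with camera.startSession and passing the sessionId along with each command; v2.1 drops sessions entirely, so a command is a single POST to /osc/commands/execute. As a rough sketch of the v2.1 style, here is taking a picture using the plain Web API rather than the SDK wrapper, assuming the camera is at its default access point address of 192.168.1.1:

    import Foundation

    // Sketch only: take a picture with THETA API v2.1 in one POST, with no session handling.
    // Assumes the phone has joined the camera's Wi-Fi (default address 192.168.1.1).
    let url = URL(string: "http://192.168.1.1/osc/commands/execute")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["name": "camera.takePicture"])

    URLSession.shared.dataTask(with: request) { data, _, error in
        if let error = error {
            print("takePicture failed: \(error)")
        } else if let data = data, let body = String(data: data, encoding: .utf8) {
            // The response JSON reports the command state, e.g. "inProgress" with an id to poll.
            print(body)
        }
    }.resume()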

Main Components Needed

  • Xcode (30GB download)
  • An Apple Developer account (free version available)
  • RICOH THETA API v2.1 SDK for iOS
    • Includes both source and a project: ricoh-theta-sample-for-iosv2.xcodeproj

Xcode includes iOS Simulators, but you may want to test on a physical device, preferably your target platform. In my case, I am going to build ricoh-theta-sample-for-iosv2.xcodeproj and run it on my iPhone 12.

Build the project.

In my case, I had to go into Signing & Capabilities to set the Team and Bundle Identifier.

It’s running on my iPhone!

I press Connect. Nothing happens.

Duh! I need to connect to my THETA Z1 through Wi-Fi. On the THETA, the Wi-Fi icon blinks if it’s not connected and turns solid when it is.
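
If pressing Connect does nothing, a quick way to confirm the phone can actually reach the camera (outside of the sample app) is a GET to the Web API's /osc/info endpoint. A minimal sketch, assuming the default access point address; note that plain HTTP from an iOS app needs an App Transport Security exception in Info.plist:

    import Foundation

    // Minimal connectivity check: GET /osc/info returns the model, firmware version,
    // and supported API levels as JSON. Assumes the default camera address 192.168.1.1.
    let infoURL = URL(string: "http://192.168.1.1/osc/info")!
    URLSession.shared.dataTask(with: infoURL) { data, _, error in
        if let error = error {
            print("Camera not reachable: \(error)")   // e.g. the phone is not on the camera's Wi-Fi
        } else if let data = data, let body = String(data: data, encoding: .utf8) {
            print("Camera info: \(body)")
        }
    }.resume()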

I try again. It’s connected and running. I can see a live preview. I can see a list of 360 degree images on my iPhone. Cool!
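
For context (this is not the sample app's exact code): in API v2.1 the live preview comes from the camera.getLivePreview command, which keeps the HTTP response open and streams motion JPEG data, so the app receives it chunk by chunk through a URLSession delegate. A rough sketch:

    import Foundation

    // Sketch: request the live preview (motion JPEG stream) with camera.getLivePreview.
    // The response never finishes; each delegate callback delivers the next chunk of the
    // stream, and JPEG frames are extracted by finding SOI/EOI markers in the bytes.
    final class PreviewReceiver: NSObject, URLSessionDataDelegate {
        func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
            print("received \(data.count) bytes of preview data")
        }
    }

    let receiver = PreviewReceiver()
    let session = URLSession(configuration: .default, delegate: receiver, delegateQueue: nil)

    var previewRequest = URLRequest(url: URL(string: "http://192.168.1.1/osc/commands/execute")!)
    previewRequest.httpMethod = "POST"
    previewRequest.setValue("application/json", forHTTPHeaderField: "Content-Type")
    previewRequest.httpBody = try? JSONSerialization.data(withJSONObject: ["name": "camera.getLivePreview"])

    session.dataTask(with: previewRequest).resume()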

I take a test capture. I press Capture. I hear the shutter noise from the THETA. The new picture is listed.

I open the new picture. It’s me giving a big thumbs up.
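
Behind the scenes, listing the images in API v2.1 is done with the camera.listFiles command. A hedged sketch, where the parameter values are only illustrative:

    import Foundation

    // Sketch: list recent images on the camera with camera.listFiles (API v2.1).
    // Parameter values are illustrative; see the API docs for the full set of options.
    let parameters: [String: Any] = ["fileType": "image", "entryCount": 10, "maxThumbSize": 640]
    let command: [String: Any] = ["name": "camera.listFiles", "parameters": parameters]

    var listRequest = URLRequest(url: URL(string: "http://192.168.1.1/osc/commands/execute")!)
    listRequest.httpMethod = "POST"
    listRequest.setValue("application/json", forHTTPHeaderField: "Content-Type")
    listRequest.httpBody = try? JSONSerialization.data(withJSONObject: command)

    URLSession.shared.dataTask(with: listRequest) { data, _, error in
        if let data = data, let text = String(data: data, encoding: .utf8) {
            print(text)   // JSON with one entry per file: name, fileUrl, size, dateTime, ...
        } else if let error = error {
            print("listFiles failed: \(error)")
        }
    }.resume()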

Issue with CPU Usage

In my limited testing, the CPU usage was consistently over 100%. Depending on your situation, this may make the SDK unusable.

Next Steps

The full API documentation is here: https://api.ricoh/docs/theta-web-api-v2.1/

theta360.guide has many useful docs, how-tos, videos and more. Just two examples: Capture Video using theta iOS SDK and Auto-connect to the camera using Ricoh Theta iOS SDK

If you are interested in contributing your experience using the iOS SDK, we would welcome the help! The community can benefit from your knowledge. Please reply to this post.

I tested the livePreview from a Z1 with firmware 2.00.1 on September 1, 2021. It works fine on Android 11. See the example video below, made with the SDK.

Equipment

  • Google Pixel 2, Android 11 (physical device)
  • RICOH THETA Z1 with firmware 2.00.1
  • Wi-Fi: 5GHz, mobile phone and Z1 are connected in access point (AP) mode with the camera at 192.168.1.1
  • RICOH THETA Android SDK

Results

  • Phone CPU utilization is fine
  • livePreview and all other functionality work fine

Initial test with iOS 14.5

Need to test more to isolate the problem. I was not able to replicate it. Will keep trying.

I can now replicate the problem consistently on a physical device, an iPhone 7.

Would really appreciate it if anyone has a Swift version of this sample app that they can share!

@jcasman I am also seeing CPU utilization in the 80-100% range running on an iPhone 12 mini.

If you’re using livePreview in the SDK, look at the code that handles the buffer for each chunk of data in the MJPEG stream. It may be iterating over the entire accumulated stream buffer instead of just the current chunk.

Thanks @craig, I am using the livePreview in the SDK, and this is the code taken from the sample app that causes 80-100% CPU usage. Do you happen to have sample code that iterates only through the current chunk that you can share with the community?

    repeat {
        var soi: Int?
        var eoi: Int?
        var i = 0

        dataBuffer.withUnsafeBytes { (bytes: UnsafePointer<UInt8>) -> Void in
            // Scan forward for the JPEG start-of-image (SOI) marker, 0xFF 0xD8.
            while i < dataBuffer.count - 1 {
                if JPEG_SOI[0] == bytes[i] && JPEG_SOI[1] == bytes[i + 1] {
                    soi = i
                    i += 1
                    break
                }
                i += 1
            }

            // Keep scanning for the JPEG end-of-image (EOI) marker, 0xFF 0xD9.
            while i < dataBuffer.count - 1 {
                if JPEG_EOI[0] == bytes[i] && JPEG_EOI[1] == bytes[i + 1] {
                    i += 1
                    eoi = i
                    break
                }
                i += 1
            }
        }

        // Wait for more data if a complete frame is not in the buffer yet.
        guard let start = soi, let end = eoi else {
            return
        }

        // Emit the complete frame, then trim it from the front of the buffer.
        let frameData = dataBuffer.subdata(in: start..<(end + 1))
        completionHandler(frameData, nil, nil)
        dataBuffer = dataBuffer.subdata(in: (end + 1)..<dataBuffer.count)
    } while dataBuffer.count >= 4

I have also tried the following, but it was still yielding 80-100% CPU usage:

        var soi: Int?
        var eoi: Int?
        var i = 0

        dataBuffer.withUnsafeBytes { bytes in
            while i < self.dataBuffer.count - 1 {
                if self.JPEG_SOI[0] == bytes[i] && self.JPEG_SOI[1] == bytes[i + 1] {
                    soi = i
                    i += 1
                    break
                }
                i += 1
            }

            while i < self.dataBuffer.count - 1 {
                if self.JPEG_EOI[0] == bytes[i] && self.JPEG_EOI[1] == bytes[i + 1] {
                    i += 1
                    eoi = i
                    break
                }
                i += 1
            }
        }

        guard let start = soi, let end = eoi else {
            return
        }

        let frameData = dataBuffer.subdata(in: start..<(end + 1))
        completionHandler(frameData, nil, nil)
        dataBuffer = dataBuffer.subdata(in: (end + 1)..<dataBuffer.count)
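
One untested direction, sketched here with illustrative names (not from the SDK): keep a running scan position so each pass only examines newly appended bytes, instead of restarting the end-of-image search from the top of the accumulated buffer every time a chunk arrives.

    import Foundation

    // Untested sketch: remember how far the buffer has already been scanned so that each
    // incoming chunk only examines new bytes. Names mirror the sample code but are illustrative.
    final class MJPEGFrameExtractor {
        private let JPEG_SOI: [UInt8] = [0xFF, 0xD8]
        private let JPEG_EOI: [UInt8] = [0xFF, 0xD9]
        private var dataBuffer = Data()
        private var scanIndex = 0                 // where the next EOI search resumes
        var onFrame: ((Data) -> Void)?

        func append(_ chunk: Data) {
            dataBuffer.append(chunk)

            while dataBuffer.count >= 4 {
                var soi: Int?
                var eoi: Int?

                dataBuffer.withUnsafeBytes { (bytes: UnsafeRawBufferPointer) -> Void in
                    // The SOI marker sits near the front of the trimmed buffer, so this loop is short.
                    var i = 0
                    while i < dataBuffer.count - 1 {
                        if bytes[i] == JPEG_SOI[0] && bytes[i + 1] == JPEG_SOI[1] { soi = i; break }
                        i += 1
                    }
                    guard let start = soi else { return }

                    // Resume the EOI search where the last unsuccessful search stopped.
                    i = max(scanIndex, start + 2)
                    while i < dataBuffer.count - 1 {
                        if bytes[i] == JPEG_EOI[0] && bytes[i + 1] == JPEG_EOI[1] { eoi = i + 1; break }
                        i += 1
                    }
                    if eoi == nil { scanIndex = i }   // remember progress for the next chunk
                }

                // No complete frame yet; wait for more data.
                guard let start = soi, let end = eoi else { return }

                // Deliver the frame, then trim it (and anything before it) from the buffer.
                onFrame?(dataBuffer.subdata(in: start..<(end + 1)))
                dataBuffer = dataBuffer.subdata(in: (end + 1)..<dataBuffer.count)
                scanIndex = 0
            }
        }
    }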

Can you tell us what camera model you are using? Are you using a Z1?

Hi @craig, I’m currently using a RICOH THETA V.