Swift Library for Open Spherical Camera API

I have not used this library, but I know that good libraries can make a big difference in how easy (or not) it is to build something.

If you’re developing for iOS, this covers multiple OSC versions and clearly distinguishes between the Level 1 (Theta v2.0) methods and the Level 2 (Theta v2.1) methods. Even better, it includes a sample iOS app (requires Swift 3.0+ and Xcode 8.0+).

If anyone tries this out, please let me know. I’d like to hear any opinions.

I am using this library to show the live preview from the THETA camera. It’s a really good library, but with SceneKit it’s really slow. I don’t understand why it is so slow.

1 Like

Hi Rahul, I created a Swift library to replace SceneKit, which was not working well at all at the time: https://github.com/gsabran/DDDKit
You might be able to use it with this Ricoh library. For live preview you might have to adapt this class: https://github.com/gsabran/DDDKit/blob/master/DDDKit/Classes/DDDVideoTexture.swift

2 Likes

@gsabran Thanks, I will check it out.

1 Like

@rahul2681993 Just curious, did you make any progress?

@jcasman In my case, the live preview’s frame rate was slow and the quality was low. I fixed it. In terms of speed, I think it’s slow because of the Wi-Fi communication between the app and the camera.

1 Like

@rahul2681993 Glad you were able to fix it. Anything you can post here? How did you fix it? What did you do?

I used the sample iOS project in the original post as a reference. Eventually I was able to put a UIImageView inside a UIViewRepresentable so it can appear in a SwiftUI struct. I was also able to put the OSC ThetaCamera object inside an ObservableObject, so data from callbacks can eventually be accessed in SwiftUI. The trick I found to getting a somewhat decent preview is to check the system time; if enough time has passed, I process the image data and update the preview image. UIImageView can’t handle 100 updates each second, so I tried updating 4 or 5 times each second.
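Here is a rough sketch of that structure (simplified, not my exact code); the PreviewView and PreviewModel names and the 0.2-second update interval are just placeholders:

import SwiftUI
import UIKit

// Wraps a UIImageView so the preview can appear inside a SwiftUI view hierarchy.
struct PreviewView: UIViewRepresentable {
    @ObservedObject var model: PreviewModel

    func makeUIView(context: Context) -> UIImageView {
        let view = UIImageView()
        view.contentMode = .scaleAspectFit
        return view
    }

    func updateUIView(_ uiView: UIImageView, context: Context) {
        uiView.image = model.previewImage
    }
}

// Holds the latest frame and throttles how often SwiftUI is asked to redraw.
final class PreviewModel: ObservableObject {
    @Published var previewImage: UIImage?
    private var lastUpdate = Date.distantPast
    private let minInterval: TimeInterval = 0.2  // roughly 5 updates per second

    // Call this from the live preview callback with one complete JPEG frame.
    func handleFrame(_ data: Data) {
        let now = Date()
        guard now.timeIntervalSince(lastUpdate) >= minInterval else { return }
        lastUpdate = now
        guard let image = UIImage(data: data) else { return }  // drop corrupt frames
        DispatchQueue.main.async { self.previewImage = image }
    }
}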

2 Likes

Additionally, I found that you should initialize the UIImage object in the background, then afterwards assign the newly created UIImage object to your UI on the main thread.

I found that the UI would update very slowly because if you create the UIImage with the ‘data’ from the ‘osc.getLivePreview’ function, the ‘data’ in the callback was sometimes corrupt, which caused errors in UIImage(data: data) and froze the UI.
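For example, a minimal sketch of that pattern (updatePreview is just a placeholder helper that receives the frame bytes from the preview callback):

import UIKit

// Decode the JPEG bytes off the main thread, then assign the finished image on
// the main thread. UIImage(data:) returns nil for corrupt bytes, so bad frames
// are simply dropped before they can reach the UI.
func updatePreview(with data: Data, in imageView: UIImageView) {
    DispatchQueue.global(qos: .userInitiated).async {
        guard let image = UIImage(data: data) else { return }  // corrupt frame, skip it
        DispatchQueue.main.async {
            imageView.image = image
        }
    }
}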

2 Likes

Thanks! This is very useful.

When you create the UIImage object in the background, are you able to get the framerate above 5fps (as you mentioned in your earlier post)?

Can you let us know what camera you are using?

I am personally struggling to get good framerates displayed from the RICOH THETA SC2. It is sending 30fps, but I can’t display it on my UI. Interestingly, I can display close to 30fps from the Z1, X, and V models.

Even more interesting for me is that I can save the SC2 frames to local storage at close to 30fps and the frames look fine.

Are you creating an array of UIImage objects and using it as a buffer?

In my case, I am not using Swift, but I may be having the same problem with Flutter/Dart.

I have a RICOH THETA SC2. I update my UI every 0.1 seconds, so I get 10fps. Unfortunately, if I try to update too often, the GUI freezes, and instead of getting 30fps I get 1 frame every 5 seconds.

At 10fps I noticed 2 things:

  1. I show the camera live preview on my iPad. If the camera points at the iPad, the FPS becomes worse.
  2. If I hold the camera in the air I get an OK FPS. But if I cover 1 lens, the camera gets a much better FPS with the other lens.

Conclusion? I suspect the camera does some kind of processing internally, and what you record affects the FPS.

For the next month I’m going to work on the rest of my app, and later come back to improving the FPS.

2 Likes

Thanks for this information. I have similar problems with the RICOH THETA SC2 using Flutter/Dart in my own app. However, the official RICOH THETA mobile app, which I use as a baseline, seems to run camera.getLivePreview fine.

I suspect that the SC2 is doing something to the data that the V, Z1, and X do not, as my app runs camera.getLivePreview fine with the V, Z1, and X. On the SC2, I had to lower the framerate, same as you. This is not satisfying, as the official mobile app shows that the SC2 can deliver a smooth preview. Thanks for sharing your experience with the framerate on the SC2 getLivePreview; I now feel part of a group of people working to solve a common problem, instead of thinking I’m the only one struggling to get a smooth preview from the SC2.

Just so you know, there is an older iOS SDK written in Objective-C.

I’m trying to see if there is anything newer.

I found the problem of why my iPad Swift program had a very laggy live preview.

In this Swift library, the live preview is assembled in the file /OpenSphericalCamera/LivePreviewDelegate.swift. The problem is that about 50 data segments are loaded to form a buffer of roughly 80,000 bytes, and each time a segment is appended, the function parses the whole buffer TWICE.

Long story short, LivePreviewDelegate evaluates about 1 million bytes. The solution is to check only the first 2 bytes and the last 2 bytes (the JPEG start-of-image and end-of-image markers). This approach takes about 100 comparisons instead of 1 million.

Below is the simplified function: I removed unused variables and the catch block, and added comments. FYI, in this code I only evaluate frames bigger than 70,000 bytes, since my camera streams frames that are about 79,000 bytes.

class LivePreviewDelegate: NSObject, URLSessionDataDelegate {
    var validStart = false
    var validEnd = false

    // JPEG start-of-image and end-of-image markers
    let JPEG_SOI: [UInt8] = [0xFF, 0xD8]
    let JPEG_EOI: [UInt8] = [0xFF, 0xD9]

    let completionHandler: ((Data?, URLResponse?, Error?) -> Void)
    var dataBuffer = Data()
    var frameCount = 0
    var frameTime = 0

    init(completionHandler: @escaping ((Data?, URLResponse?, Error?) -> Void)) {
        self.completionHandler = completionHandler
    }

    @objc func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        DispatchQueue.global(qos: .background).sync {
            self.dataBuffer.append(data)

            // most frames are about 79_000 bytes, so don't evaluate small buffers
            if self.dataBuffer.count < 70_000 {
                return
            }

            // condition to prevent the buffer from growing without bound
            if self.dataBuffer.count > 1_000_000 {
                print("360 live frame buffer error, ITS TOO BIG, remove all")
                self.dataBuffer.removeAll()
                return  // buffer is now empty, nothing left to evaluate
            }

            // evaluate only the first 2 and the last 2 bytes of the 70k+ byte buffer
            if self.dataBuffer[0] == self.JPEG_SOI[0] &&
                self.dataBuffer[1] == self.JPEG_SOI[1] {
                validStart = true
            } else {
                validStart = false
                self.dataBuffer.removeAll()
                return  // buffer did not start with SOI; discard and wait for more data
            }

            if self.dataBuffer[self.dataBuffer.count - 2] == self.JPEG_EOI[0] &&
                self.dataBuffer[self.dataBuffer.count - 1] == self.JPEG_EOI[1] {
                validEnd = true
            } else {
                validEnd = false  // reset so a stale flag cannot pass an incomplete frame
            }

            // if either the start or the end of the frame is not valid, the frame is
            // not complete yet; return so the next data segment can be appended
            if !validStart || !validEnd {
                return
            }

            validStart = false
            validEnd = false
            self.completionHandler(self.dataBuffer, nil, nil)

            // code to confirm start and end bytes
//            if self.dataBuffer.count > 0 {
//                print("360 frame buffer start: 0x\(String(format: "%X", self.dataBuffer[0]))  end: 0x\(String(format: "%X", self.dataBuffer[self.dataBuffer.count - 1]))")
//            } else {
//                print("360 frame buffer is empty \(self.dataBuffer.count)")
//            }

            // code to confirm the frame rate is about 30 fps
//            let currentTime = Int(NSDate().timeIntervalSince1970)
//            if frameTime == currentTime {
//                frameCount = frameCount + 1
//            } else {
//                print("360 live frame count: \(frameCount)")
//                frameCount = 1
//                frameTime = currentTime
//            }

            self.dataBuffer.removeAll()
        }
    }

    @objc func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        session.invalidateAndCancel()
        self.completionHandler(nil, nil, error)
    }
}
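If you want to test the delegate outside the library, a rough harness might look like the sketch below. It POSTs camera.getLivePreview to the OSC execute endpoint; the 192.168.1.1 address is the camera’s usual default, but treat the request details as assumptions and verify them against your camera and the library’s own session setup.

import Foundation

// Hypothetical harness: start the OSC live preview stream and feed the MJPEG
// bytes into LivePreviewDelegate, which calls back with one JPEG frame at a time.
func startLivePreview(onFrame: @escaping (Data) -> Void) -> URLSession {
    let delegate = LivePreviewDelegate { data, _, error in
        if let data = data {
            onFrame(data)  // a complete JPEG frame, ready for UIImage(data:)
        } else if let error = error {
            print("live preview ended: \(error)")
        }
    }
    let session = URLSession(configuration: .default, delegate: delegate, delegateQueue: nil)

    // camera.getLivePreview is a POST to the OSC command execution endpoint
    var request = URLRequest(url: URL(string: "http://192.168.1.1/osc/commands/execute")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["name": "camera.getLivePreview"])
    session.dataTask(with: request).resume()
    return session
}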


2 Likes

Wow, thanks for posting this!

There was a similar problem with the livePreview on iOS in the RICOH SDK (ObjC) on GitHub.

I know the person below solved the problem, but the solution was not posted due to restrictions at his company.

1 Like

FYI, I edited my previous comment. I simplified the function again:

  • only check frames once the buffer is at least 70,000 bytes, since my camera gives me frames that are about 79,000 bytes
  • removed unused variables, an unneeded ‘repeat’ loop, and an unneeded catch block
  • use Bool variables as flags to identify a valid frame
  • added comments to the code
  • left in commented code that lets you check your fps
  • FYI, my camera gives me about 30 frames per second, and each frame is about 79k bytes

2 Likes

Thanks again for posting your solution.

If you look at the previous posts, domainxh had a similar problem and was looking for a Swift solution. I think many people are looking for this solution and either don’t post or spend a long time with their own one-off solution. So glad that you were able to share your solution here.

BTW, the community awarded you a community thanks award.

We’re trying to get an email for you to send a small gift from the community. Can you send an email that would work with PayPal to jcasman@oppkey.com?