Thanks for sharing, Craig! This is a good result, as the camera was already a bit warm when it started!
At around 23°C or less it could run for hours; those 2°C actually make quite a difference.
Of course, in the same room and at the same quality as RICOH's live streaming plug-in, mine could stream for 24 hours, and at 24 FPS vs. 12 FPS on RICOH's.
I was able to get streaming running on the THETA X at 4K, 30 fps, but only using the android.hardware.Camera class. I'm facing some serious issues with camerax.jar; it looks like RICOH did not implement 100% of the methods from android.hardware.Camera, and some surface methods are missing.
I was able to significantly improve the heating issues on the THETA X… it should be able to record or stream for at least 20 minutes with my plugin.
@craig ,
On the THETA X, when I use the android.hardware.Camera class I have a preview and a stream, and everything works, but I can't select the stitched camera preview. I guess that's why RICOH created another abstract class. But when I replace it with theta360.hardware.Camera or com.theta360.pluginlibrary.factory.Camera, I get the following error at runtime:
I/CameraControl: InitCamera 1
D/CameraControl: cameracontrol open():false
W/ow.hdrstreamin: CheckJNI: method to register "setPreviewSurfaceWithFlag" not in the given class. This is slow, consider changing your RegisterNatives calls.
E/ow.hdrstreamin: ----- class 'Ltheta360/hardware/Camera;' cl=0x159ca3e8 -----
objectSize=284 (224 from super)
access=0x8008.0001
super='java.lang.Class<java.lang.Object>' (cl=0x0)
vtable (6 entries, 11 in super):
0: void theta360.hardware.Camera.F()
1: void theta360.hardware.Camera.G(android.graphics.SurfaceTexture)
2: void theta360.hardware.Camera.H()
3: void theta360.hardware.Camera.finalize()
4: void theta360.hardware.Camera.setPreviewSurface(android.view.Surface)
5: void theta360.hardware.Camera.startPreview()
It seems like the method setPreviewSurfaceWithFlag is not registered in the camera class (the vtable above does list setPreviewSurface, but not the WithFlag variant).
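As a quick sanity check before debugging the JNI registration, you can use plain Java reflection to see which methods a class actually declares. This is just a sketch; on the camera you would pass "theta360.hardware.Camera" to Class.forName, while here a standard class is used so the snippet runs anywhere:

```java
import java.lang.reflect.Method;
import java.util.Arrays;

public class MethodCheck {
    // Returns true if the class itself declares a method with the given name.
    static boolean declaresMethod(Class<?> cls, String name) {
        return Arrays.stream(cls.getDeclaredMethods())
                     .anyMatch(m -> m.getName().equals(name));
    }

    public static void main(String[] args) {
        // On the device: Class.forName("theta360.hardware.Camera")
        Class<?> cls = String.class;
        System.out.println(declaresMethod(cls, "substring"));                 // true
        System.out.println(declaresMethod(cls, "setPreviewSurfaceWithFlag")); // false
    }
}
```

Running this against the theta360 class on the device would confirm whether the missing-method error matches what the class file actually contains.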
Thanks for this information. We will report back to RICOH.
BTW, when I played the file recorded by the HDR Wireless Live Streaming plug-in, I was initially not able to play it in 360 view. After adding metadata with this tool, it now plays in VLC.
I didn’t try the HDR Wireless Live Streaming plug-in with the X.
If you're referring to video taken with the WebAPI, the metadata for 360 video is included on both the X and the Z1. I think it is fairly straightforward to add the ProjectionType: equirectangular metadata to the files.
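For reference, the metadata that injector tools write for this is the Spherical Video V1 RDF/XML blob, stored in a uuid box inside the video track. A minimal sketch of building that XML in plain Java (tag names follow the Spherical Video V1 spec; the stitcher name is just a placeholder):

```java
public class SphericalXml {
    // Builds the Spherical Video V1 RDF/XML metadata blob, as written by
    // injector tools such as Google's spatial-media. Tag set per the V1 spec.
    static String buildV1Metadata(String stitcher) {
        return "<?xml version=\"1.0\"?>"
             + "<rdf:SphericalVideo"
             + " xmlns:rdf=\"http://www.w3.org/1999/02/22-rdf-syntax-ns#\""
             + " xmlns:GSpherical=\"http://ns.google.com/videos/1.0/spherical/\">"
             + "<GSpherical:Spherical>true</GSpherical:Spherical>"
             + "<GSpherical:Stitched>true</GSpherical:Stitched>"
             + "<GSpherical:StitchingSoftware>" + stitcher + "</GSpherical:StitchingSoftware>"
             + "<GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>"
             + "</rdf:SphericalVideo>";
    }

    public static void main(String[] args) {
        System.out.println(buildV1Metadata("my-plugin"));
    }
}
```

YouTube and players like VLC look for this blob (or the newer V2 boxes) to decide whether to render the file as 360.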
Hi,
my plugin doesn't run on the X yet. I'm making some changes on my machine; the version in the store is for the Z1 only.
About spatial metadata, I was referring to videos recorded by the Z1: when I upload them to YouTube they aren't recognized as 360, so metadata injection is needed. You also had to inject spatial metadata for the video recorded with my plugin…
I'm not sure what YouTube requires: is it enough to set global metadata on an MP4 file before uploading, or is track-level metadata needed? The latter takes a lot of time, as the whole MP4 is processed and saved to another file.
I'm not sure how the WebAPI actually does this. Can I execute a WebAPI command directly from my plugin to set spatial metadata on a file I just recorded and saved with my plugin?
Do you know if global metadata is enough for YouTube to detect a video as 360?
Is it possible that the THETA desktop app now adds the correct metadata to recorded videos, so that once stitched they are detected as 360 by YouTube?
@craig,
I checked most of the MP4 and metadata specifications and there is currently a gap. I know it's straightforward to use a tool, or even code a solution, to add the correct spatial metadata to videos recorded on the Z1. The problem is that it requires re-encoding the complete file… This is quite time-consuming, so I was searching for a way to avoid it. There is only one potential library I could look at, but my initial tests to make it work on the Z1 failed. I may need to customize the whole library… Clearly, my goal would be to get from the Z1:
video that we could upload directly to YouTube, by simply adding the required metadata to the video file
add the metadata required by the RICOH THETA desktop app to stitch video files recorded by my own plugin. Here I think sensor data will still be missing, but I may be able to find that as well if I have infinite time available
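One note on the re-encoding concern: injecting the spherical uuid box only touches the MP4 box tree (the moov/trak structure), not the encoded samples, so in principle a remux-style insert that splices the new box and fixes up parent box sizes avoids re-encoding entirely. A minimal sketch of walking top-level MP4 boxes in plain Java, run here against a synthetic buffer rather than a real file:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class BoxWalker {
    // Lists top-level MP4 box types: each box starts with a 32-bit
    // big-endian size followed by a 4-char type (64-bit "largesize"
    // boxes are ignored for brevity).
    static List<String> topLevelBoxes(ByteBuffer buf) {
        List<String> types = new ArrayList<>();
        while (buf.remaining() >= 8) {
            int size = buf.getInt();
            byte[] t = new byte[4];
            buf.get(t);
            types.add(new String(t, StandardCharsets.US_ASCII));
            buf.position(buf.position() + size - 8); // skip the payload
        }
        return types;
    }

    // Builds an empty box header plus a zeroed payload, for testing.
    static ByteBuffer box(String type, int payloadLen) {
        ByteBuffer b = ByteBuffer.allocate(8 + payloadLen);
        b.putInt(8 + payloadLen);
        b.put(type.getBytes(StandardCharsets.US_ASCII));
        b.position(0);
        return b;
    }

    public static void main(String[] args) {
        // Synthetic "file": ftyp + moov + mdat headers only.
        ByteBuffer file = ByteBuffer.allocate(64);
        for (String t : new String[] {"ftyp", "moov", "mdat"}) {
            file.put(box(t, 4));
        }
        file.flip();
        System.out.println(topLevelBoxes(file)); // [ftyp, moov, mdat]
    }
}
```

Locating moov (and trak inside it) this way is the first step an injector takes; a real implementation would then copy the file while inserting the uuid box and adjusting the enclosing sizes.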
Can your plugin record long videos, one hour plus, without a network connection? I think some people want to overcome the artificial 25-minute limit on Z1 videos.
It seems like there is enough storage on the Z1 (51 GB) to save around 110 minutes, but the camera will stop the video at 25 minutes.
You may be able to get even longer videos by reducing the file size with a different compression or bitrate.
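The arithmetic behind that estimate can be sketched as below; the ~62 Mbit/s bitrate is simply what falls out of Craig's 51 GB / 110 min figures, not a measured value:

```java
public class RecordingMath {
    // Minutes of video that fit in the given storage at the given bitrate.
    static double maxMinutes(double storageGB, double mbps) {
        double bits = storageGB * 1e9 * 8;  // storage in bits
        return bits / (mbps * 1e6) / 60.0;  // bits -> seconds -> minutes
    }

    public static void main(String[] args) {
        // 51 GB at ~62 Mbit/s comes out to roughly 110 minutes.
        System.out.printf("%.0f min%n", maxMinutes(51, 62));
    }
}
```

Halving the bitrate (e.g. via H.265 or a lower quality setting) roughly doubles the duration that fits in the same storage.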
Yes, @craig, it can record much longer. The version currently in the store can record in H.264 and H.265; without stitching, in dual-fisheye mode, it can record much longer. In a 23-24°C room it should record for more than an hour at 24 FPS in HDR quality. Stitching isn't perfect in HDR mode; I'm doing my best to improve that later. In regular non-HDR mode, RICOH's regular stitching is used, and it can record even longer, as less heat is generated.
Is that 5-minute-per-recording limit really in place currently?
There are two settings: 5 minutes (default) and 25 minutes. A normal camera user cannot record longer than 25 minutes. It is a problem for some people.
How would a videographer (not a developer) stitch the dual-fisheye video into equirectangular format so that they could upload it to YouTube or Facebook?
For the videos taken with the camera using the WebAPI, the official desktop application from RICOH can stitch the video. However, this may not work with the video from your plug-in.
Is it possible to test the save-video-to-camera-storage function without a Wi-Fi connection?
I'm looking for the best option, but there are tools like Mistika VR that can do the stitching. PTGui Pro may be important for creating a custom stitching parameter file that can be reused afterwards… so it should be a one-time task. Mistika can also do optical-flow stabilization. I can imagine an HDR-recorded, H.265-encoded video, and I wonder how good it would look. I think it would look great!
Do you know someone who could help with this workflow?
For now, no, as I want to control usage of my plugin, so to start recording, the Z1 currently has to be connected to a network. BUT to start in RECORDING mode it's enough to have a very poor network connection, just good enough to connect to flow.tours and pull the configuration data…
IMO, pulling configuration data from your system is a good feature as it may allow you to charge a reasonable amount for your hard work in development.
However, your users may run into a couple of different scenarios:
they may not have enough bandwidth onsite for a good streaming experience. If the resolution of the stream is lower, isn't the resolution of the saved file also lower?
they may have issues configuring a client-mode network on the Z1 onsite, at someplace like a live event hall
@biviel Based on the video views, it seems like there is interest in your plug-in. Are you getting analytics from the RICOH Plug-in Store on your plug-in downloads?
I have a solution for offline usage after an online activation, by saving a unique key to the plugin's local system storage… for, say, 30 days or longer. This isn't in this version yet. I would love to see people start using this plugin, and I will make decisions accordingly.
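A time-limited offline activation like that can boil down to caching the activation timestamp and checking its age at startup. A minimal sketch in plain Java; the 30-day window and the key handling are assumptions for illustration, not the plugin's actual logic:

```java
import java.time.Duration;
import java.time.Instant;

public class ActivationCache {
    // Assumed offline grace period; the real value is a product decision.
    static final Duration GRACE = Duration.ofDays(30);

    // True if the cached activation timestamp is still within the grace
    // period (and not from the future, e.g. after a clock rollback).
    static boolean isActivationValid(Instant activatedAt, Instant now) {
        return !activatedAt.isAfter(now)
            && Duration.between(activatedAt, now).compareTo(GRACE) <= 0;
    }

    public static void main(String[] args) {
        Instant now = Instant.parse("2022-06-01T00:00:00Z");
        System.out.println(isActivationValid(Instant.parse("2022-05-10T00:00:00Z"), now)); // true
        System.out.println(isActivationValid(Instant.parse("2022-04-01T00:00:00Z"), now)); // false
    }
}
```

On-device, the timestamp (and a signed key) would live in the plugin's private storage so a network connection is only needed once per grace period.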
Where can I see the number of downloads of my plugin? I wasn't aware of this… thanks!
The power consumption management is intended to lower heat. At 4K 15 fps, my first test with the camera lasted 10 minutes before thermal shutdown. I am doing a second test with a small $10 fan powered from an external USB port.
The greater the power consumption, the greater the heat. It's like a laptop getting hot when you encode a video: if a device consumes more power, it will get hotter.
I’ve been streaming at 4K 15fps for 90 minutes. I’m going to stop it now as I’m going home for the day.