[Elphel-support] Record full res and stream compressed at the same time?

Elphel Support support-list at support.elphel.com
Sun Jul 23 18:46:05 PDT 2017


David,

That will need modification of the drivers ( https://git.elphel.com/Elphel/linux-elphel ): while in the FPGA the sensor and compressor channels can be paired in any combination, the driver currently maps them one-to-one.


Do you have the Eclipse-based development environment installed and set up? All the header files responsible for communication with the FPGA are missing from the git repository - they are generated/updated from the Verilog source files.

And what is your bottleneck? What is the bandwidth of the cable? Maybe you can stream JP4 and decode it on the fly with GStreamer?
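As a rough sketch of that idea (the element names jp462bayer and bayer2rgb2 come from the community JP4 GStreamer plugins, and the camera address, port, and Bayer order are assumptions - adjust them for your setup):

```
# Hypothetical pipeline: receive the camera's RTP/JPEG stream, decode the
# JP4 container, rearrange the macroblocks into a Bayer mosaic, then
# demosaic to RGB for display.
gst-launch rtspsrc location=rtsp://192.168.0.9:554 ! rtpjpegdepay \
  ! jpegdec ! queue ! jp462bayer \
  ! "video/x-raw-bayer, format=(string)grbg" \
  ! bayer2rgb2 method=0 ! ffmpegcolorspace ! autovideosink
```

This keeps the full sensor data on the wire (JP4 is essentially losslessly-packed Bayer blocks inside a JPEG container) and moves the demosaicing cost to the receiving PC.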


Andrey


---- On Sun, 23 Jul 2017 17:44:50 -0700 David McPike <davidmcpike at gmail.com> wrote ---- 

Andrey, can you give me some advice or point me to documentation on how I would accomplish the following:

- Run two compressors
- The compressor used by camogm would write JP4 raw or 100% JPEG quality MJPEG images to disk
- Imgsrv would serve images at various JPEG qualities as we set them


Do I need to read circbuf frame metadata to figure this out?  What's the right way?  Running multiple sensors in each camera isn't achievable for us at this time.


Thanks!
David




On Sat, Jul 22, 2017 at 10:12 PM David McPike <davidmcpike at gmail.com> wrote:

Hi Andrey,


Our use case would be one sensor per camera, running around 10-12 fps, recording video to local disk, with the surface ROV pilot viewing streamed images.  I would like to maximize the image quality stored on disk while minimizing latency for images delivered to the surface over a limited-bandwidth network path.  Using a single fps is totally fine.


I can't recall our typical resolution.  I don't believe we ran the camera at full resolution in the past, so I'll keep your input in mind and try to finalize our requirements.


Thanks much!






On Sat, Jul 22, 2017 at 9:53 PM Elphel Support <support-list at support.elphel.com> wrote:

David,

Do you mean resolution or JPEG quality?

Different JPEG qualities can be achieved with the same FPGA code but different software, provided you use fewer than all 4 channels: run 2 compressors from the same sensor channel.

Different resolutions simultaneously can only be achieved if the sensor runs at full resolution and the FPGA reduces the resolution. That mode is not implemented, and it would be limited to a single frame rate (the lower of the two), defined by the sensor. So it is not possible to combine low-fps/high-resolution and high-fps/low-resolution output from the same sensor; what can be done, similarly to the 353, is short interruptions of the stream to grab full-resolution images (a simple application can record them to the SSD).
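For illustration, grabbing such a full-resolution frame could look like the following (the camera address is an assumption; imgsrv listens on port 8081 on the 353):

```
# Hypothetical: fetch the most recently acquired frame from imgsrv.
# /img returns the image at the current circbuf read pointer.
wget -O snapshot.jpeg http://192.168.0.9:8081/img
```

A small script or the pilot's software could call this during the brief streaming interruptions and archive the results alongside the camogm recordings.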

Probably the most practical solution (having 4 sensor channels on the same camera) is to dedicate one sensor to streaming and the others to high-resolution recording.

Andrey

---- On Sat, 22 Jul 2017 18:45:23 -0700 David McPike <davidmcpike at gmail.com> wrote ---- 

Hi All,

I can't seem to find a clear explanation on this.  We are still primarily using the 353 cameras.  Is there a way to locally record high quality images via camogm and at the same time retrieve JPEG images with higher compression via imgsrv?


Thanks much,
David







 _______________________________________________ 
Support-list mailing list 
Support-list at support.elphel.com 
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com 
















