[Elphel-support] Questions regarding "zoom in ... now enhance" blog post

Andrey Filippov support-list at support.elphel.com
Wed Apr 13 18:38:09 PDT 2011


On Tue, Apr 12, 2011 at 7:28 AM, Florent Thiery
<florent.thiery at ubicast.eu>wrote:

> Hello,
>
> First, please let me introduce our use case; we are trying to use Elphel
> cameras (353 + Computar 4-8mm 1/2") to get maximum resolution (> FullHD), 25
> fps video.
>
> Our current problems concern image quality when zooming in on details
> (blurred images); in other words, we are trying to improve the rendering
> quality as much as possible to get cleaner images. In this context, last
> year we implemented http://code.google.com/p/gst-plugins-elphel/, but
> changing only the debayering algorithm did not improve the quality enough
> for our application (at least not compared to the processing overhead).
>
> I was wondering about the method described in the awesome article "Zoom in
> ... now enhance":
>
>    - are there any specifics about the method being for Eyesis only (my
>    guess is it's not)?
>
Florent, sure, it can be used with a single camera too. But it may be too
slow for processing videos - it currently takes 2-3 minutes/frame on an i7
with >8 GB of RAM (in multi-threaded mode).

>
>    - regarding the calibration, what are the invariable factors? Is the
>    calibration required for:
>       - every camera model/generation (depending on camera/sensor
>       manufacturing design/process variations)?
>       - every lens model (depending on lens model)?
>       - every lens tuning (zoom level / focus / iris ...)?
>       - climatic condition changes (temperature, ...)?
>
This was designed for "fixed everything" (though we did not notice
degradation with changing temperatures). But there is a significant
difference between lenses, even of the same model. The lenses we used have
no zoom, and an iris does not make much sense for such lenses: with 2.2 um
pixels you cannot close the lens beyond ~f/4-5.6 because of diffraction,
small sensors do not provide much control over DoF, and using the iris to
limit light is not really needed with ERS sensors - they handle very short
exposures perfectly. And of course, the focus setting would influence the
results too, as well as the lens orientation.
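
As a rough illustration (a back-of-the-envelope sketch, not part of our
calibration software; the 0.55 um wavelength is just assumed mid-visible
light), the Airy disk diameter is about 2.44 * lambda * N:

    # Diffraction blur vs. pixel pitch for a 2.2 um sensor
    wavelength_um = 0.55   # assumed green light
    pixel_um = 2.2         # pixel pitch mentioned above
    for f_number in (2.8, 4.0, 5.6, 8.0):
        airy_um = 2.44 * wavelength_um * f_number
        print("f/%.1f: Airy disk %.1f um = %.1f pixels"
              % (f_number, airy_um, airy_um / pixel_um))

Already at f/4 the diffraction blur spans more than two pixels, and at f/8
almost five, so stopping the lens down further only throws away resolution.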


>    - the hidden question behind this is: how can this technique be used in
>    production?
>
>
Working with wide angle lenses (45x60 degrees FOV) we do not have enough
room to capture the test pattern in a single shot, so the software is able
to combine multiple shots where the test pattern covers just a fraction of
the frame. We did not work on optimizing the computational time of the
calibration, so it takes several hours to process data from 8 camera
modules. The current calibration is designed only for aberration
correction, but I'm now working on precise distortion calibration (with
about 1/10 pixel precision). We plan to use it for panorama stitching, and
it can also be used for making measurements with the camera. This
calibration will use the same pattern we use for aberration correction,
just at a closer range, so the pattern will cover the whole FOV (slight
defocus is not a problem here).
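
For those wondering how such a distortion calibration could be used
afterwards: assuming it produces a dense per-pixel map of source
coordinates (this is a generic sketch with OpenCV and placeholder names,
not our actual code), applying it to a frame would look roughly like this:

    import cv2
    import numpy as np

    img = cv2.imread("frame.jpg")   # placeholder file name
    h, w = img.shape[:2]
    # Identity maps as stand-ins; a real calibration would fill these
    # with sub-pixel (here, ~1/10 pixel) corrections per output pixel.
    map_x, map_y = np.meshgrid(np.arange(w, dtype=np.float32),
                               np.arange(h, dtype=np.float32))
    corrected = cv2.remap(img, map_x, map_y, cv2.INTER_CUBIC)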

>
>    - For a given camera/lens combination, could a public database of
>       tuning data reduce the calibration requirement (in a similar fashion to
>       A-GPSes which download correction data from the network to increase
>       performance on low-quality reception and/or chips
>       http://en.wikipedia.org/wiki/Assisted_GPS)?
>
Our goal was to have precise individual lens correction; we did not
experiment with correcting for the lens model - it is probably possible,
but with less correction, of course. The software has multiple tuning
parameters, so it should be possible to do that.

>
>    - is there a hope of having such a feature (in the long term)
>       integrated in the camera itself (i.e. grabbing an mjpeg stream that has
>       had the corrections applied right before the encoding)?
>
>
Not in the near future, at least. We now heavily rely on post-processing;
the camera's role is just to capture everything the sensor can provide, in
as raw a form as possible.

Andrey

>
> Thanks
>
> Florent
>
> _______________________________________________
> Support-list mailing list
> Support-list at support.elphel.com
> http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com
>
>