[Elphel-support] Problems to detect timestamps

Jennifer Valle jvs1192 at gmail.com
Fri Jan 22 08:51:51 PST 2016


Hello,

We have followed the J15 wiring instructions to connect the cameras, but when
I set the trigger parameter values (
http://wiki.elphel.com/index.php?title=Trigger) on the master camera, it does
not respond and I have to restart it. The slave camera, in contrast, accepts
the parameters correctly.
What is the problem?

Kind regards,

Jennifer
Jennifer,

The simplest setup is to program all cameras (master and slaves) identically;
the master camera should be triggered by the external trigger just like all
the other cameras. There are 4 pins on the connector - 2 are an output pair,
2 an input pair. Cabling should be as described on the wiki page -
http://wiki.elphel.com/index.php?title=10369#J15_-_SYNC_.28external.29 ,
here is the diagram itself:
http://wiki.elphel.com/index.php?title=File:Sync_cable.png

http://wiki.elphel.com/index.php?title=Trigger shows settings/examples.
TRIG_CONDITION selects which signal the camera listens to, and there are 3
valid options: 0 (use own FPGA), 0x200000 - use the external connector
(4-pin telephone style connector), and 0x20000 - when camera boards are
connected with flex cables; this option is used in multi-camera systems
such as Eyesis4pi, not in the regular cameras.
TRIG_OUT can be 0 (no output), 0x800000 to send the signal to the telephone
connector (OK to have on "slave" cameras - those pins are just not
connected, so there is no harm in driving them), or 0x80000 to drive the
internal connectors (as in Eyesis4pi).
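These parameters can be set over HTTP. A minimal sketch, assuming the standard parsedit.php interface described on the wiki (the camera IP address, the "immediate" flag, and the helper function name are assumptions for this example):

```python
# Sketch: build the parsedit.php URL that programs triggered mode over HTTP.
# Parameter names follow the wiki Trigger page; the defaults below program a
# camera as an external-trigger "master" at 1 fps.
def trigger_setup_url(camera_ip,
                      trig=4,                   # triggered mode
                      trig_condition=0x200000,  # listen to the external (J15) connector
                      trig_out=0x800000,        # drive the external connector
                      trig_period=96000000):    # 1 fps at the 96 MHz clock
    params = {
        "TRIG": trig,
        "TRIG_CONDITION": "0x%x" % trig_condition,
        "TRIG_OUT": "0x%x" % trig_out,
        "TRIG_PERIOD": trig_period,
    }
    query = "&".join("%s=%s" % kv for kv in params.items())
    return "http://%s/parsedit.php?immediate&%s" % (camera_ip, query)

print(trigger_setup_url("192.168.0.9"))
```

Because all cameras can be programmed identically, the same URL (with a different IP) can be sent to master and slaves alike.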

As for the frame rate, I would start with some slow rate (like 1 fps -
trig_period = 96000000) to make it easier to compare timestamps, and only
increase the frame rate once you get what you expect. Also make sure you
have enough light (exposure time is added to the frame readout time).
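TRIG_PERIOD counts ticks of the 96 MHz clock, so converting a desired frame rate to a period value is a one-line calculation; a quick sketch:

```python
# TRIG_PERIOD is expressed in ticks of the 96 MHz FPGA clock,
# so the value for a desired frame rate is simply clock / fps.
CLOCK_HZ = 96_000_000

def trig_period(fps):
    return round(CLOCK_HZ / fps)

print(trig_period(1))   # 96000000 - the slow starting rate
print(trig_period(25))  # 3840000  - 25 fps
```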

Andrey

---- On Mon, 18 Jan 2016 08:44:48 -0800 *Jennifer Valle <jvs1192 at gmail.com
<jvs1192 at gmail.com>>* wrote ----

Hi,

Thanks for the information that you have provided us. It was very useful
and we will use the tool imgsrv. Now we would like to ask you about camera
settings.

We captured with two cameras and get different timestamps for each frame.
Can this depend on the parameters trig_cond and trig_out (which are
currently 0)? We have trig = 4 mode with trig_period = 4800000 in both
cameras. In the master camera external trig = 0, in the slave it is 1, and
trig_delay is 0 in both.

What parameters would have to be modified to get the same timestamp of the
two cameras for a framerate of 25fps for example?

Kind regards,

Jennifer



2016-01-11 3:13 GMT+01:00 support-list <support-list at support.elphel.com>:

Jennifer,

In our records I found that your cameras all have 10369 I/O boards, ordered
specifically for synchronization of the cameras, so I suppose you or others
in your team know how to use them.

Cameras should be placed in triggered mode, where each camera performs as two
independent devices: one just generates periodic trigger pulses (with
attached timestamps), and the other (the camera itself) waits for the
trigger and then initiates the frame acquisition sequence (so if individual
cameras have different exposures, the _beginning_ of each exposure will
match, not the center). The synchronization cable wiring defines the
"master": the output pair of wires on its connector is soldered to all the
input pairs in parallel (including the input on the master itself). That
means all the cameras can be programmed identically as "masters" - the
outputs from all but the real "master" will simply not be used.

The electronic rolling shutter sensors have two pixel pointers running in
scanline order - an erase pointer and a readout pointer - and the delay
between the erase pointer and the readout one defines the exposure time. In
free-running mode (not synchronized) the pointers can be in different
frames - while the readout is in frame #0, the erase one may already be in
frame #1 (the next one); the longest exposure time in that mode is just the
frame period - as soon as a pixel is read out, it is erased and exposure
starts for the next frame.

In triggered mode this is not possible, so when the trigger arrives the
sensor starts erasing line by line, starting from the first line. One
exposure time later the readout pointer starts, and erasing of the next
frame cannot start until the full frame is read out. This makes the minimal
frame period (the value programmed into the FPGA sync pulse generator) equal
to the sum of the frame readout time and the exposure time (or the maximal
anticipated exposure time if you use the default autoexposure mode).
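That constraint can be sketched as a short calculation (the 10 ms exposure and 70 ms readout below are assumed example numbers, not values from your cameras):

```python
# In triggered mode the next trigger must not arrive before the current
# exposure plus the full frame readout have finished.
def min_trigger_period_s(exposure_s, readout_s):
    return exposure_s + readout_s

# Example with assumed numbers: 10 ms exposure, 70 ms frame readout
period_s = min_trigger_period_s(0.010, 0.070)
max_fps = 1.0 / period_s
print("min period: %.3f s -> max %.1f fps" % (period_s, max_fps))
```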

Sensor readout time can be calculated using the sensor datasheet, available
for download on the ON Semiconductor web site. The pixel clock is 96 MHz
(96 MPix/s). This time includes not just pixel readout, but also vertical
(rather small) and horizontal (large) blanking.

The FPGA processes 53.33 MPix/s in JPEG mode and 80 MPix/s in JP4 mode (we
recommend this mode, which provides you with raw Bayer mosaic data). The
FPGA has the full frame period to compress the image, and there is no
"blanking", so for the FPGA the minimal synchronization period is just
slightly above the total number of pixels divided by the FPGA pixel rate
(53.33M or 80M). JP4 mode is always faster than the sensor; JPEG is slower
for large frames, but faster for smaller ones.

The other frame-rate-limiting factor is the network bandwidth, and while
sensor readout and FPGA processing do not depend on compression quality, the
bandwidth certainly does. Sending images over 100 Mbps Ethernet provides
approximately 10 MB/s of data. If you are trying to push as much data as
you can, adjust the image quality by acquiring a test image of the scene and
looking at the file size - the maximal frame rate will be just 10 MB/s
divided by the image size.
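As a sketch of that bandwidth estimate (the 400 KB frame size is an assumed example):

```python
# On 100 Mbps Ethernet a camera delivers roughly 10 MB/s, so the
# bandwidth-limited frame rate is 10 MB/s divided by the compressed
# image size measured from a test frame.
LINK_BYTES_PER_S = 10e6

def bandwidth_limited_fps(image_bytes):
    return LINK_BYTES_PER_S / image_bytes

print(bandwidth_limited_fps(400_000))  # a 400 KB test frame -> 25.0 fps
```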

10 MB/s is achieved if the network can handle 100 Mbps from each camera. As
you have 16 of them, you will need to split them into 2 groups (if you are
using GigE switches). On the host PC you need to run either a multi-threaded
program or simply one script (that reads from imgsrv) per camera,
simultaneously.

This will provide you with sets of precisely timestamped images; each
channel will have images with exactly the same timestamp value, so you
should not have any problems matching images from different channels.

There are multiple programs available to process the JP4 format (and perform
client-side demosaicing) - in C, Java, and even JavaScript code that
converts images using just the HTML5 canvas.

Andrey


_______________________________________________
Support-list mailing list
Support-list at support.elphel.com
http://support.elphel.com/mailman/listinfo/support-list_support.elphel.com
