[Elphel-support] UAV photogrammetry application

Andrey Filippov andrey at elphel.com
Mon Oct 18 21:41:34 PDT 2010


On Mon, Oct 18, 2010 at 9:51 PM, Richard Antecki <richard at clearboxsystems.com.au> wrote:

> We have mounted an Elphel 353 (8.0.8.41) on an unmanned helicopter for use
> in aerial mapping research applications.  For accurate positioning of the
> images, time synchronisation is very important as we need to correlate the
> time the photo was taken with the navigation solution of the helicopter at
> that particular instant (raw GPS data providing only one input to the
> navigation solution).  Even with the ntp sync at bootup, the timestamps on
> the captured photos have been seen to be up to 10 seconds out.  We require
> time accuracy of less than 1ms for this research application.
>

Richard,

There is just a regular crystal oscillator that drives the clock, but there
are several things you can do to increase the clock precision. The timer in
the FPGA (which drives the timestamps) uses a 24-bit accumulator to generate
1-microsecond pulses, so it is possible to adjust the clock rate in steps of
+/- 1/(2^23). It is also possible to measure the temperature (if you've got
the 10369 interface board) and use it to calculate a correction by
calibrating the clock at several temperatures.
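
For example, the arithmetic for converting a measured drift into adjustment
steps looks like this (a rough Python sketch; the actual register interface
is not shown, I only assume the correction is applied in increments of
1/2^23 of the nominal rate):

    # Back-of-envelope arithmetic for the clock adjustment described above.
    STEP = 1.0 / 2 ** 23                  # smallest rate adjustment, ~0.12 ppm

    def correction_steps(measured_drift_ppm):
        """Number of +/- 1/2^23 steps needed to cancel a measured drift (ppm)."""
        drift = measured_drift_ppm * 1e-6     # fractional frequency error
        return round(-drift / STEP)

    # Example: the camera clock measured 10 ppm fast against GPS time
    print(correction_steps(10.0))             # about -84 steps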

On the other hand, the easiest thing is what you suggested - to use the GPS time.


>
>
>
> We have a U-Blox 5 GPS receiver which spits out NMEA GPS time information
> at 1 second intervals.  I understand that the camera can take an external
> GPS serial input and embed the location in the EXIF data.  Am I correct in
> assuming that by default, the camera will include this data in the EXIF
> header of all future images captured, until a new NMEA update is received?
> So with this alone, the GPS time in the EXIF data will be accurate to +/- 1
> sec?
>

No, the precision will be much higher: it will be limited by the jitter of the
serial communication with the GPS, and since you can average the error over
many frames, the remaining error should be rather small. The crystal
oscillator in the camera has +/-50 ppm tolerance, but its variation is much
smaller. I would recommend running the sensor in triggered mode (TRIG=4) with
the trigger generated internally in the FPGA and the period programmed in
sensor clock periods (by default 1/96,000,000 sec). At each frame the GPS data
is copied to a synchronized buffer, so when you read frames from the camera
each Exif header will match the last data received from the GPS. Of course, as
the GPS is updated only once per second, there will be multiple frames with
different timestamps but the same GPS time. It should not be difficult to
write a program that processes long frame sequences (timestamp / GPS time
pairs) and probably gets to ms resolution (you will have to account for the
different time at which each scan line is exposed).
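
A rough sketch of such a program (Python with NumPy; the input file name,
column order and the line time value are placeholders for illustration):

    import numpy as np

    # One (camera_timestamp, gps_time) pair per frame, both in seconds,
    # collected over a long sequence.
    cam, gps = np.loadtxt("frames.txt", unpack=True)

    # The GPS time only changes once per second, so model
    #     gps = slope * cam + offset
    # and fit slope/offset by least squares; the one-second quantization
    # averages out over many frames. slope - 1 is the clock drift.
    A = np.vstack([cam, np.ones_like(cam)]).T
    (slope, offset), *_ = np.linalg.lstsq(A, gps, rcond=None)
    print("offset %.6f s, drift %.2f ppm" % (offset, (slope - 1.0) * 1e6))

    # Rolling shutter: each scan line starts its exposure one line time after
    # the previous one, so the corrected time of row y in a frame is roughly
    #     t(y) = slope * cam + offset + y * line_time
    line_time = 36e-6   # example value only; compute it from your sensor setup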


>
> I’ve found a nmea2exif.c file in the source.  Does this need to be started
> manually or is it run automatically somehow?
>


At boot time the camera looks for a compass and/or GPS attached and starts the
appropriate programs. But we have tried only some devices, so it may fail to
detect others. The boot messages should tell you; you may need to tweak the
detection script (it is in PHP) to recognize your device.
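
If you want to check your receiver on a PC first, the detection idea is simply
to look for NMEA sentences on the serial port. A minimal illustration (Python
with pyserial; the port name and baud rate are assumptions for a u-blox 5, and
this is not the camera's actual PHP script):

    import serial   # pyserial

    def looks_like_nmea(port="/dev/ttyUSB0", baud=9600, max_lines=20):
        """Return True if the device emits NMEA sentences within a few lines."""
        with serial.Serial(port, baud, timeout=2) as s:
            for _ in range(max_lines):
                line = s.readline().decode("ascii", errors="replace").strip()
                if line.startswith("$GP") or line.startswith("$GN"):
                    return True
        return False

    print(looks_like_nmea())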

>
>
> The U-Blox can also spit out a timing pulse at exactly 1 sec intervals,
> which correlates to 1 sec GPS time increments.  We could potentially use
> this pulse to trigger the camera on 1 sec boundaries.  Is the sensor
> triggered on the rising edge of the trigger pulse or the falling edge?
>

The camera (not the sensor itself - the trigger is processed in the FPGA) can
be triggered at either polarity (there is a register that specifies it), but I
would recommend running the sensor at a higher frame rate and using the
timestamp + GPS time approach described above. With just 1 FPS at the sensor
it will be extremely difficult to use autoexposure - the loop lag is 3 frames
(which would be 3 seconds).
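
For reference, converting a desired frame rate into the internal trigger
period at the default 96 MHz sensor clock is just a division (the name of the
parameter that takes this value is not shown here; check the camera
documentation):

    # Internal trigger period in sensor clock cycles, assuming the default
    # 96 MHz sensor clock mentioned above.
    SENSOR_CLOCK_HZ = 96000000

    def trigger_period(fps):
        return round(SENSOR_CLOCK_HZ / fps)

    print(trigger_period(5))   # 19200000 clock cycles -> 5 frames per second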



> The NMEA timestamps are received after the timing pulse to which they
> relate, so the EXIF timestamp data in the image captured by the timing pulse
> trigger should be accurate, but exactly 1 second behind.  This is a
> potential solution.  The drawback with this method would be that we can only
> take photos at fixed 1 second intervals.
>

Yes, definitely.


>
>
> We tested a method using an external trigger at a known time, using camogm
> to capture images, and noted some strange behaviour when the camera is in
> trigger mode (such as Apache freezing).  The internal clock also seems to be
> affected whilst in trigger mode.  Just prior to enabling trigger mode, an
> image was captured using camogm with the filename “1287443109_363240.jpeg”,
> which is the correct time.  After trigger mode was enabled, the next set of
> images captured were named as follows:
>
>
>
> -268435456_1048575.jpeg
>
> -1610612736_1048575.jpeg
>
> -000000008_1048575.jpeg
>
> -000000001_1048575.jpeg
>
> -134217728_1048575.jpeg
>
>
>
> The timestamps captured in the EXIF data were also highly incorrect, e.g.
> “EXIF_DateTime=2106:02:06 06:28:15”
>

There is insufficient data to guess what happened. What were the frame rate
and image parameters? Did you try triggered mode for the sensor (TRIG=4) but
with the trigger generated internally in the FPGA (with the period programmed
there) at the same rate? Does it have the same problems?
It could be that the trigger pulse was recognized as several pulses instead of
one; the sensor behaviour is rather difficult to predict when it is triggered
faster than it can handle - the FPGA has a "de-glitcher", but it may still let
several pulses through. Additionally, the current implementation of camogm
does not allow a variable frame rate - each time it detects a different frame
period it starts a new segment.
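
One way to check for double triggering from your side is to look at the
spacing of the frame timestamps: with a fixed trigger period they should be
evenly spaced. A small sketch (Python; how you collect the timestamps - from
the Exif headers or the camogm file names - is up to you, and the sample
numbers are made up):

    def short_intervals(timestamps, expected_period, tolerance=0.5):
        """Return (index, dt) for every inter-frame gap shorter than
        tolerance * expected_period."""
        pairs = zip(timestamps, timestamps[1:])
        return [(i, b - a) for i, (a, b) in enumerate(pairs)
                if (b - a) < tolerance * expected_period]

    print(short_intervals([0.0, 1.0, 1.02, 2.0, 3.0], expected_period=1.0))
    # -> [(1, 0.02...)] : the frame after index 1 arrived far too early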

Additionally, your example shows that you use individual JPEG images - that
will not work for long recordings. When the file directory grows, the CPU load
needed to handle it increases dramatically; CPU usage is lowest when you use a
*.mov container with a reasonable number of frames per file. Camogm is
designed to switch files so that the next segment starts at exactly the next
frame after the last one in the previous file. When you use mov, each frame
still has a full Exif header. For the best image quality at higher frame rates
we would also recommend using JP4 (not JPEG) encoding of the image frames.
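
As a sketch of how switching to .mov could look (Python, writing to camogm's
command pipe; the pipe location and the exact command names here are
assumptions - please check the camogm sources in your firmware for the real
interface; JP4 is selected through the camera's image parameters, not through
camogm):

    # Assumed location of camogm's command pipe on the camera.
    CAMOGM_PIPE = "/var/state/camogm_cmd"

    def camogm(cmd):
        with open(CAMOGM_PIPE, "w") as pipe:
            pipe.write(cmd + "\n")

    camogm("format=mov")           # record .mov containers instead of jpegs
    camogm("prefix=/var/hdd/")     # where the segments should be written
    camogm("start")
    # ... record ...
    camogm("stop")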


>
>
> From what I’ve read so far, the external trigger method has some negatives,
> such as the auto-exposure, white balancing etc not being able to function
> correctly, requiring all these parameters to be set manually beforehand.
> Can anybody see any other potential issues with this application?
>
> Thoughts/suggestions on any of the above would be appreciated.
>
>
>
Hope my suggestions will help.


Andrey