Tuesday, May 12, 2015

A modern GNSS front-end

Earlier this year I had the occasion, and the privilege, of trying out a new front-end produced by NTLab, the NT1036. I thought it would be interesting to share the experience with the GNSS crowd.
The kit arrived as two separate boards, a control board and the actual chip evaluation board, plus a CD with the control software and a detailed data-sheet. The control board connects seamlessly to the evaluation board by means of a single flat cable with RJ12 ends. Although the suggested supply voltage is 3.0V ±5%, it was very convenient to use the same cable to power the board at 3.3V. Having a single common supply also avoids stray currents on the control lines. In the end the chip worked fine in this configuration, so I assume it was a safe choice.
The chip has many unique characteristics that make it suitable for a modern GNSS receiver. The ones of greatest interest to me are the following:
  • Four independent input channels
  • Two wideband VCO banks, on high and low RNSS bands, which can be routed with great flexibility amongst the four mixers, in particular allowing:
    • GPS/Glonass L1+L2 or L1+L5
    • GPS/Beidou L1+L2 or L1+L5
    • All 4 channels on either L1 or L2/L5
  • 3.0V supply voltage and low power dissipation (ideal for USB-powered devices)
  • Analog or digital output options for IF (real-only, which I like best) and clock lines.
  • Small, easy to assemble package
Obviously, the killer applications for this kind of chip are compact antenna arrays and multi-frequency, multi-constellation hardware and software receivers.
With a lot of test equipment at hand one could really crack a nut like this one. With my limited hardware, however, I decided to use my SdrNav40 board and slightly modify its firmware so that it ignores the 4 on-board RF channels and instead captures the evaluation kit outputs and clock.
Figure 1: Test setup, with 4 way power splitter and SdrNav40 powering the antenna.
Figure 2: Closeup of the test setup
Two tests were particularly useful for me: GPS/Glonass L1+L2, and all four channels on L1. The first should dispel any doubt about the potential fields of application of the chip. The second should satisfy my curiosity about the phase behaviour of common-LO (Local Oscillator), multiple-input front-ends.
The GUI to control the configuration of the NT1036 is incredibly rich and professional: low hanging fruit for a curious engineer.
Figure 3: NT1036 configuration tool: general settings tab, where the synthesizers can be programmed
Figure 4: NT1036 configuration tool: channels 1 and 2 tab
Figure 5: NT1036 configuration tool: main chip blocks tab
For GPS/Glonass reception the tuner offers a default configuration with the two VCO banks tuned midway between GPS L1 and Glonass G1 (and similarly for L2/G2), so that GPS ends up in high-side mixing and Glonass in low-side mixing. Configurable IF filter banks select one or the other. The separation between the centre frequencies (about 26 MHz and 20 MHz for the high and low RNSS bands respectively) suggests an L1 plan in which an FS of about 52 MHz puts both carriers around FS/4 for ease of down-conversion. Setting FS to 53 MHz (derived from an integer PLL) places GPS L1 at 14.58 MHz.
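Just to make the arithmetic explicit, here is the plan as I understand it; the 1590 MHz LO value (30 × 53 MHz from the integer PLL) and the use of the Glonass channel-0 centre are my own assumptions, so treat this as a back-of-the-envelope check rather than the chip's actual settings.

#include <stdio.h>

int main(void)
{
    double fs = 53e6;             /* sampling frequency                  */
    double lo = 30.0 * fs;        /* assumed LO: 30 x 53 MHz = 1590 MHz  */
    double l1 = 1575.42e6;        /* GPS L1 carrier                      */
    double g1 = 1602.0e6;         /* Glonass G1 centre (channel 0)       */

    double if_gps = lo - l1;      /* high-side mixing: 14.58 MHz         */
    double if_glo = g1 - lo;      /* low-side mixing:  12.00 MHz         */

    printf("GPS L1 IF: %.2f MHz (FS/4 = %.2f MHz)\n", if_gps / 1e6, fs / 4e6);
    printf("GLO G1 IF: %.2f MHz\n", if_glo / 1e6);
    return 0;
}

Plots that everyone likes follow.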
Figure 6: PSD of samples acquired in high-injection mode on L1 at about 50Msps
Figure 7: Histogram and time series of the signal acquired with NT1036 (sign and magnitude output)
Figure 8: Results of GPS satellite acquisition.

I plan to continue my tests on the chip, time permitting, and time is always in short supply!
Till next time...

Tuesday, September 9, 2014

At ION GNSS+ 2014

To whoever happens to be in Tampa these days: I will be around too.



Feel free to come and chat!

Tuesday, May 20, 2014

Galileo RTK with NV08C-CSM hw 4.1

Being European, I am often confronted with skepticism about Galileo and compelled to justify its delays and its usefulness. Explaining why Galileo is better and needed is beyond the scope of my blog, but IMHO there is one key selling point that not many people stress.

Galileo was designed from the ground up in close collaboration with the USA: GPS and Galileo share L1 (1575.42 MHz) and L5/E5a (1176.45 MHz). In the future, a dual-frequency GPS+Galileo L1/L5 receiver will deliver products with an incredible (performance + availability) to silicon ratio.
According to scheduled launches there could be 12+ satellites supporting open L1+L5 by the end of this year already.
In the meantime, NVS gets there first in the mass-market receiver domain, delivering consistent carrier-phase measurements for GPS+Glonass+Galileo.

I have recently run zero-baseline double-differences in static, perfect visibility conditions using a high-end survey antenna:
Figure 1: Galileo double differences in static zero-baseline (NV08C-CSM hw4.1)
Using E11 as the reference satellite, the carrier phase noise is well contained within 0.01 cycles (about 2 mm).
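For reference, the cycles-to-millimetres conversion at E1/L1 (a trivial check of mine, but it keeps the numbers honest):

#include <stdio.h>

int main(void)
{
    double c      = 299792458.0;   /* speed of light, m/s             */
    double f_e1   = 1575.42e6;     /* Galileo E1 / GPS L1 carrier, Hz */
    double lambda = c / f_e1;      /* ~0.1903 m per cycle             */

    printf("0.01 cycles on E1 = %.1f mm\n", 0.01 * lambda * 1e3);
    return 0;
}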
With RTKLIB I ran a Galileo-only static IAR (Integer Ambiguity Resolution) and the result is as expected:

Figure 2: Static IAR with Galileo only (4 IOVs)
The combined GPS+Galileo static IAR looks like this:

Figure 3: Static IAR with GPS+Galileo
Note the 12 satellites above the 10° elevation mask used in the computation of carrier ambiguities :)

Understandably, Skytraq is working on GPS+Beidou carrier phase and I may publish some results on that too although visibility of Beidou MEOs is not great from here.

In the meantime, for people who wonder where the uBlox NEO6T stands in terms of GPS carrier phase noise under similar conditions, here is my result:
Figure 4: GPS double differences in static zero-baseline (uBlox NEO6T)
It shows noise levels similar to the NV08C-CSM.

Monday, May 12, 2014

GNSS carrier phase, RTLSDR, and fractional PLLs (the necessary evil)

A mandatory principle when processing GNSS signals, at least if one wants high accuracy carrier phase, is a well defined frequency plan. This entails knowing precisely how the Local Oscillator (LO) frequency is generated.
With RTL-SDR this is not a trivial task, given that both the R820T and the RTL2832U use fractional Phase Locked Loops (PLLs) to derive, respectively, the high-side mixing frequency and the Digital Down Conversion (DDC) carrier.
I guess most people use RTL-SDR with a 50 ppm crystal, so the kind of inaccuracies I am going to describe is buried under the crystal error... within reason.
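Just to put the two error sources side by side (my numbers, not a measurement):

#include <stdio.h>

int main(void)
{
    double f_l1 = 1575.42e6;   /* GPS L1 carrier, Hz */

    /* A 50 ppm crystal error at L1 is nearly 80 kHz, dwarfing the
     * fractional-PLL rounding errors (hundreds of Hz at most) discussed below. */
    printf("50 ppm at L1 = %.1f kHz\n", 50e-6 * f_l1 / 1e3);
    return 0;
}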

Let us start from the common call

> rtl_sdr -f 1575420000

This means "tune to 1575.42 MHz", but what is hidden is:
1) the R820T is set to 1575.42e6 + its IF (high-side mixing);
2) the RTL2832U down-converts the R820T IF to baseband;
...and there are approximations everywhere.

Now, the R820T has a 16-bit fractional PLL register, meaning that it can only be set to frequencies that are multiples of 439.45 Hz (28.8e6/2^16, exactly).
The RTL2832U, instead, has a 22-bit fractional PLL register, meaning that it can recover IFs in steps of 6.8665 Hz (28.8e6/2^22, exactly).
Of course, neither 1575.42e6 nor 3.57e6 is an exact multiple of either step, so one always ends up with a mismatch between what one thinks has been set and what has actually been set. Most of the time this is fine. For GNSS it is not, since the carrier is accumulated over long intervals and even a few tenths of a Hz will make it diverge from the truth.
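Here is how I picture the two step sizes and the resulting rounding errors; the nearest-step rounding below is my own convention, the actual register maths of the two chips lives in the driver:

#include <math.h>
#include <stdio.h>

/* Rounding error when a frequency is quantised to the nearest PLL step. */
static double rounding_error(double f, double step)
{
    return f - round(f / step) * step;
}

int main(void)
{
    double xtal     = 28.8e6;             /* reference clock, Hz                   */
    double step_tun = xtal / 65536.0;     /* R820T, 16-bit fraction: ~439.45 Hz    */
    double step_ddc = xtal / 4194304.0;   /* RTL2832U, 22-bit fraction: ~6.8665 Hz */

    printf("R820T    step %.6f Hz, error on 1575.42 MHz: %+.2f Hz\n",
           step_tun, rounding_error(1575.42e6, step_tun));
    printf("RTL2832U step %.6f Hz, error on 3.57 MHz   : %+.2f Hz\n",
           step_ddc, rounding_error(3.57e6, step_ddc));
    return 0;
}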
So I went down the route of characterising the necessary evil of fractional PLLs.

The first test I did was to set the tuner to 1575421875 Hz, which leads to a -1875 Hz centre frequency offset but is nicely represented in 16 bits using a 28.8 MHz reference (remember the R820T). In fact, 1575421875/28.8e6 = 54 + 0.7021484375 = 54 + [1011001111000000]/2^16 ...ok, well, actually it fits in 10 bits :)
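A tiny sanity check of that representation (the snippet is mine, just spelling out the claim):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* 1575421875 Hz against a 28.8 MHz reference: integer part 54,
     * fractional part an exact multiple of 2^-16 (in fact of 2^-10). */
    double   ratio = 1575421875.0 / 28.8e6;            /* 54.7021484375 */
    uint32_t frac  = (uint32_t)((ratio - 54.0) * 65536.0 + 0.5);

    printf("ratio = %.10f, frac = %u (0x%04X)\n", ratio, frac, frac);
    /* frac = 46016 = 0b1011001111000000: the six trailing zeros are
     * why it fits in 10 bits. */
    return 0;
}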

Here I found a small bug in the driver  and replaced the following messy (IMHO) code: 

/* sdm calculator */
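/* (as I read it: a greedy, MSB-first conversion of the fractional part
 * vco_fra, in kHz, into the 16-bit sdm word, truncating rather than rounding) */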
while (vco_fra > 1) {
    if (vco_fra > (2 * pll_ref_khz / n_sdm)) {
        sdm = sdm + 32768 / (n_sdm / 2);
        vco_fra = vco_fra - 2 * pll_ref_khz / n_sdm;
        if (n_sdm >= 0x8000)
            break;
    }
    n_sdm <<= 1;
}


with 

mysdm = (((vco_freq<<16)+pll_ref)/(2*pll_ref)) & 0xFFFF;
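For the record, this is how I read that one-liner; the standalone harness below is mine, and the 64-bit types, the VCO-divider-of-2 assumption and the test value are illustrative rather than taken from the driver:

#include <stdint.h>
#include <stdio.h>

/* round(vco_freq * 2^16 / (2 * pll_ref)), keeping the fractional 16 bits;
 * 64-bit intermediates avoid overflowing vco_freq << 16. */
static uint32_t sdm_word(uint64_t vco_freq, uint64_t pll_ref)
{
    return (uint32_t)(((vco_freq << 16) + pll_ref) / (2 * pll_ref)) & 0xFFFF;
}

int main(void)
{
    uint64_t pll_ref  = 28800000ULL;            /* 28.8 MHz reference          */
    uint64_t vco_freq = 2ULL * 1575421875ULL;   /* assuming a VCO divider of 2 */

    /* Expect 46016 = 0b1011001111000000, matching the hand calculation above. */
    printf("sdm = %u\n", sdm_word(vco_freq, pll_ref));
    return 0;
}

Note that the "+ pll_ref" term rounds to the nearest step instead of truncating.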

Then I modified the IF of the R820T from 3.57 MHz to 3.6 MHz, as it is only 30 kHz away and is nicely represented on 16 bits (3.6e6/28.8e6 = 1/8) ...ok, well, it actually fits in 3 :)
Modifying the IF also impacted the RTL2832U fractional register of course.
I still had a significant error (about 115 Hz), which I could measure by comparing the scaled code rate and the carrier rate (the two should be related by a factor of 1540, i.e. 1575.42 MHz / 1.023 MHz).
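The check itself is simple enough to sketch; the tracking-loop numbers below are hypothetical, picked so that the mismatch lands near the ~115 Hz I observed:

#include <stdio.h>

int main(void)
{
    /* Hypothetical tracking outputs for one satellite (illustrative only). */
    double code_doppler_chips = 2.05;     /* code rate offset, chips/s    */
    double carrier_doppler_hz = 3042.0;   /* measured carrier Doppler, Hz */

    /* On L1 C/A the carrier-to-code ratio is 1575.42e6 / 1.023e6 = 1540,
     * so the scaled code rate should match the carrier Doppler. */
    double predicted = 1540.0 * code_doppler_chips;
    printf("code-carrier mismatch: %.1f Hz\n", carrier_doppler_hz - predicted);
    return 0;
}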
After a long time wondering what could be happening, I decided to start tweaking the bits of the R820T.
One in particular, called PLL dithering, seemed suspicious. Disabling it roughly doubled the error, to about 220 Hz. Sad... but that made me recall the resolution of the tuner (439.45 Hz) and guess that there is a hidden 17th fractional bit, which toggles randomly when "dithering" and is fixed to 1 when "not dithering": its weight is about 220 Hz, so random toggling averages out to roughly half of that, which fits the 115 Hz nicely (the quick arithmetic after the links backs this up). A couple of references which could explain why are here:
http://petrified.ucsd.edu/~ispg-adm/pubs/KWangDissertation.pdf
http://www.ece.rochester.edu/users/friedman/papers/ISCAS_04_PLL.pdf
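The arithmetic behind that guess, spelled out (my numbers):

#include <stdio.h>

int main(void)
{
    double step16 = 28.8e6 / 65536.0;   /* 16-bit step: ~439.45 Hz         */
    double bit17  = step16 / 2.0;       /* weight of a 17th bit: ~219.7 Hz */

    printf("17th bit stuck at 1: ~%.0f Hz offset\n", bit17);          /* ~220 Hz */
    printf("17th bit dithering : ~%.0f Hz on average\n", bit17 / 2);  /* ~110 Hz */
    return 0;
}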

How sneaky! But I could nicely compensate for that 17th bit with the RTL2832U (which has 22 fractional bits).
So I have now rock-solid code-carrier assistance ^_^
Figure 1: Code-carrier mismatch when tracking a satellite with RTL-SDR
One step closer to integer ambiguity resolution?

Cheers,
Mic