I have been trying to receive ADS-B traffic using a "cantenna" and an RTL-SDR, and I am currently getting around 60 miles of coverage from the upstairs window. My setup is just the cantenna, the RTL-SDR and a Raspberry Pi. No pre-amp or special co-ax at this time, just a short run (about 70 cm) of RG58U plus a short length of wire making up the BNC-to-SMA adapter.
During my research I found a reference suggesting that calibrating the RTL-SDR might bring some improvement, so I downloaded and compiled kalibrate-rtl. The version cloned directly from the GitHub repository did not run on the Pi, which is another story, but this was ultimately resolved by downloading the ZIP package from this GitHub page:
GitHub - asdil12/kalibrate-rtl (arm_memory branch) - fork of http://thre.at/kalibrate/ for use with rtl-sdr devices
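For reference, the build sequence for that branch is roughly the standard autotools one described in the project's README; treat the commands below as a sketch (the extracted folder name depends on how you unpack the ZIP):

    cd kalibrate-rtl-arm_memory
    # generate the configure script, then build with the flags suggested in the README
    ./bootstrap && CXXFLAGS='-W -Wall -O3' ./configure
    make
    sudo make install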
This compiled and ran fine. However, I tried to calibrate using two options:
Option 1) use kalibrate to scan for GSM900 signals and use the strongest one to calibrate the ppm offset.
Option 2) use rtl_test -p (rough versions of both commands are sketched below)
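To be explicit about what I mean, the two invocations look roughly like this; the channel number and gain value are only placeholders, so substitute whatever kal's scan reports as the strongest channel:

    # Option 1: scan the GSM900 band, then calibrate against the strongest channel found
    kal -s GSM900 -g 40
    kal -c 17 -g 40

    # Option 2: let rtl_test measure the cumulative PPM error over time
    rtl_test -p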
The first option returned a value of -38.299. The second one settled at around 78 after a few minutes. I re-ran the first option and got -38.287, so a very similar reading again. Why are these two readings so significantly different, and which one should I trust/use?
I did play with both values, but to be fair I'm not sure I'm seeing much difference, perhaps only a subtle one. The same can be said for using --enable-agc or --gain -10 on the dump1090 command line; the difference seems minimal. I have yet to experiment with positive values for gain.
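To make the question concrete, the kind of invocation I am comparing looks something like the following; the flag names match the ones mentioned above, but the --ppm and --gain values are only placeholders taken from the readings in question:

    # apply the kalibrate-derived offset with a fixed tuner gain
    dump1090 --ppm -38 --gain 49.6 --net

    # or apply it while letting the dongle's AGC handle gain
    dump1090 --ppm -38 --enable-agc --net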