I tried to update my "new" F9P to the new (July 2019) firmware. It took
more than a day to do a simple task, but it reminded me why I hate
WinBlows, and u-center only slightly less.
I'm not disagreeing on that. I've somehow managed to spend 99% of my career in Linux (so pat myself on the back for a moment over that.) :-P I am pretty stuck on Google stuff though, so un-pat myself for that, I guess.
$ ubxtool -f /dev/ttyACM1 -p MON-VER
swVersion EXT CORE 1.00 (61b2dd)
extension ROM BASE 0x118B2060
extension FWVER=HPG 1.12
u-blox doc "UBX-18010854 - R07" says PROTVER 27.11 is the minimum version
for UBX-RXM-SFRBX, and now I have it.
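For what it's worth, newer firmware reports PROTVER as one of the MON-VER extension strings (HPG 1.12 reports PROTVER=27.11), so the 27.11 minimum can be checked in a few lines. A minimal sketch; the extension strings below are examples of what ubxtool prints, not a captured transcript:

```python
# Sketch: check whether the receiver's PROTVER meets the 27.11 minimum
# that UBX-18010854 R07 gives for UBX-RXM-SFRBX support.

def protver(extensions):
    """Return (major, minor) parsed from the PROTVER=... extension
    string in a MON-VER reply, or None if it isn't present."""
    for ext in extensions:
        if ext.startswith("PROTVER="):
            major, minor = ext.split("=", 1)[1].split(".")
            return int(major), int(minor)
    return None

exts = ["ROM BASE 0x118B2060", "FWVER=HPG 1.12", "PROTVER=27.11"]
print(protver(exts) >= (27, 11))  # True on this firmware
```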
gnssId 6 svId 21 reserved1 0 freqId 11 numWords 4
reserved2 12 version 2 reserved3 0
dwrd 5568142b 3f7091fc 1495a800 09520003
Ohhh, that looks legit, I think you have it.
As I only have one, I need a local base to look at RTK, but the ODOT
NTRIP caster insists I "send GGA" without saying what that means...
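For what it's worth, in NTRIP-speak "send GGA" usually means the client is expected to write a standard NMEA GGA sentence (its own position) back up the caster connection so the caster knows where the rover is. A sketch of building one, assuming decimal-degree input; the fix-quality, satellite-count, and altitude fields here are placeholders a real rover would fill from its fix:

```python
from datetime import datetime, timezone

def nmea_checksum(body):
    """XOR of all characters between '$' and '*', per NMEA 0183."""
    cs = 0
    for ch in body:
        cs ^= ord(ch)
    return "%02X" % cs

def make_gga(lat, lon):
    """Build a minimal GGA sentence from lat/lon in decimal degrees.
    Quality, satellites, HDOP, and altitude are placeholder values."""
    t = datetime.now(timezone.utc).strftime("%H%M%S.00")
    lat_h = "N" if lat >= 0 else "S"
    lon_h = "E" if lon >= 0 else "W"
    lat, lon = abs(lat), abs(lon)
    lat_s = "%02d%07.4f" % (int(lat), (lat - int(lat)) * 60)  # ddmm.mmmm
    lon_s = "%03d%07.4f" % (int(lon), (lon - int(lon)) * 60)  # dddmm.mmmm
    body = "GPGGA,%s,%s,%s,%s,%s,1,08,1.0,250.0,M,-32.0,M,," % (
        t, lat_s, lat_h, lon_s, lon_h)
    return "$%s*%s\r\n" % (body, nmea_checksum(body))

print(make_gga(44.97, -93.26))
```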
We proved last week the F9P PPP works with NRCAN.
I'm sure you know way more about most of this stuff than I do, but I can at least share the path I've been down.
In MN we have something called MnCORS, for which an account is needed. My understanding is that MN has set up a number of base stations across the state, all feeding into the same central system. I can send a request to the server (essentially "here I am") and it will craft custom/virtual RTCM messages for my location. I don't know exactly how this works under the hood, but I imagine some sort of interpolation between base stations based on the location I report?
For our system we have this little cell network xbee that runs micropython. The xbee is connected (2-way) to UART2 of the u-blox F9P.
So it's kind of a funky house of cards, but also kind of cool. When the system comes up, the micropython script on the xbee launches automatically and watches incoming messages from the u-blox, which is configured to send the NMEA GGA message out UART2. So assuming the u-blox receiver finds itself and begins generating GGA messages on UART2, the xbee sees them.
The GGA message has lat/lon in clear text. When the GGA messages start flowing, the xbee grabs the first one and crafts a custom request message for the MnCORS server, which is mostly just the GGA string slightly modified (also by this time, hopefully, the xbee has managed to connect up to the cell-phone-based interweb.) So the xbee / micropython script shoots off the "here I am, let's go" message to the MnCORS server via the xbee internet connection. (This is from memory; I could connect up and look at the exact script if you are curious about more specific details.)
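From memory the request is just a standard NTRIP client request with the GGA appended, something like the sketch below. The mountpoint and credentials here are made up for illustration, and the real script is micropython on the xbee rather than desktop Python:

```python
import base64

def build_ntrip_request(mountpoint, user, password):
    """Build the HTTP request an NTRIP client sends to a caster.
    After the caster answers "ICY 200 OK", the client writes its GGA
    sentence on the same socket and RTCM frames start flowing back."""
    auth = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    return (
        "GET /%s HTTP/1.0\r\n"
        "User-Agent: NTRIP xbee-relay/0.1\r\n"
        "Authorization: Basic %s\r\n"
        "\r\n" % (mountpoint, auth)
    )

# Mountpoint and account are placeholders, not real MnCORS values:
req = build_ntrip_request("VRS_RTCM3", "user", "secret")
print(req)
```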
(Through this project I learned the xbee can be more than a dumb radio link in that it can run code as well and forward stuff back and forth between ports and massage the data in between.)
So now, fingers crossed, the MnCORS server begins sending RTCM messages back through the cell network, which the xbee then receives and relays to the u-blox over its UART2.
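The relay step itself is just a byte pump. Sketched here with plain callables so the same loop works whether the source is a socket's recv and the sink is a serial port's write; the demo at the bottom uses in-memory streams, and the RTCM bytes are fabricated for illustration:

```python
import io

def relay(src_read, dst_write, chunk=512):
    """Pump bytes from src_read(chunk) to dst_write(data) until the
    source is exhausted; returns the byte count. On the rover this
    would be caster_socket.recv -> uart2.write, forwarding RTCM3
    frames (which start with a 0xD3 byte) to the F9P."""
    total = 0
    while True:
        data = src_read(chunk)
        if not data:
            return total
        dst_write(data)
        total += len(data)

# Demo with in-memory streams standing in for the socket and UART:
fake_rtcm = b"\xd3\x00\x13" + b"\x00" * 19   # made-up frame-ish bytes
dst = io.BytesIO()
n = relay(io.BytesIO(fake_rtcm).read, dst.write)
print(n)  # 22
```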
Then if everything works, the u-blox should eventually jump to RTK Fix and life is good. In my own tests, we were running in our lab through a GPS repeater. I managed to get RTK Float, but not Fix. I haven't yet gone outside to see if I could get an RTK fix by seeing the actual satellites firsthand.
For our lab's purposes, one of our core strengths (ours, not necessarily mine) is a 15-state EKF (Kalman filter) that estimates attitude and location using gyro, accelerometer, and GPS. We are hoping to investigate how much attitude accuracy improvement we could achieve with an RTK-fixed GPS versus your stock ublox8. Ideally this will give us better attitude (roll, pitch, yaw) estimates, and that will help some of our projects. But if we have to sit around for an hour waiting for a fix that may or may not happen, that's not good. And if we drop the fix during flight because we are maneuvering and seeing different parts of the sky, that's less helpful as well ... but we hope to do more investigation down the road to see what can be done and how well it could work for us.
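For context, a 15-state INS/GPS EKF of this kind is conventionally partitioned as position, velocity, attitude, gyro bias, and accel bias, three components each. The exact ordering and error-state formulation in our filter may differ; this sketch just shows the shape:

```python
from dataclasses import dataclass, field

def _vec3():
    return field(default_factory=lambda: [0.0, 0.0, 0.0])

# Illustrative 5 x 3 partitioning of a 15-state INS/GPS EKF state;
# our filter's actual layout may order these differently.
@dataclass
class InsState:
    pos: list = _vec3()    # lat, lon, alt
    vel: list = _vec3()    # NED velocity
    att: list = _vec3()    # roll, pitch, yaw
    gbias: list = _vec3()  # gyro bias
    abias: list = _vec3()  # accel bias

    def as_vector(self):
        return self.pos + self.vel + self.att + self.gbias + self.abias

print(len(InsState().as_vector()))  # 15
```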
This is probably overly blatant self-promotion ... sorry about that. Here's a 4-minute video of something we did based on our EKF solution. The EKF solution is used to generate the HUD graphics and also to fly the airplane with our own in-house-developed flight controller.

The video is of a fixed-wing UAV landing, with actual footage from an action cam mounted to the nose. We were landing about 30 minutes past sunset (part 107 rules), so it's pretty dark, especially on camera. Using the aircraft attitude data collected during the flight, I added a conformal HUD display and some augmented-reality elements in post-process (not real time (not yet.)) So what you are looking at are square gates that the aircraft is being guided through. After a minute or two of flying the mission, it jumps to the landing task, and continues to fly through the gates until we touch down. After landing I pick up the airplane and point it back along our approach path so you can see our actual approach in augmented reality.

You can see the sync between the computer graphics and the real world isn't perfect, and in some places isn't great; this is because our built-in attitude estimate (EKF) has errors. We are hoping a more accurate GNSS solution will help us achieve a more accurate attitude solution, so things like the graphics in the video will be much more locked in. Anyway, the flying part is fun when it works, not so fun when things go wrong, and we do have bad days once in a while, but this was a good day: