An idea: use satellites for navigation. No, not the satellite signals, but the satellites themselves. Use NORAD orbital elements data for satellites to deduce land coordinates using time and pixel coordinates of satellites observed. Low orbit satellites will be only observable for two hours or so after sunset and before sunrise, but there are enough medium Earth orbit satellites that are still bright enough for a small camera and are visible whole night.
If you see satellites then you likely see even more stars. Unlike satellites, the stars barely move relative to each other (actually they do, see "proper motion" [1]), so a catalogue of stars (two coordinate values and two proper-motion values) along with the time of observation is sufficient for use over decades, unlike NORAD orbital elements, which require regular updates. With stars you need just one image at a known time to find your location; with satellites it is much more complicated: you need to know where the sun is, and you need a few images of a satellite or even a video (likely on top of an image of stars anyway) to distinguish it from the stars and to solve the trajectory.
1. https://en.m.wikipedia.org/wiki/Proper_motion
How do you find your location from one image of stars? It is possible if you have a precise vertical but you don't have a precise vertical on a moving UAV. That is, you need an inertial system on top that will provide you with a vertical.
With satellite images, you don't need anything apart from time. And no, you don't need to "make a video to see satellites move". You start with your approximate location, take an image, and search for each satellite within the circle where it might be, starting with the slowest-moving ones, i.e. the furthest from you (they provide the poorest position precision because the parallax is small, but you have to start with something, and their search circles are also smaller). Having located those, you get better coordinates for yourself and the search circle for each satellite shrinks; then you can find the faster-moving satellites too and get precise coordinates for yourself.
You are right: to find a location from a star image you need a true horizon, but unless the UAV is pulling some Gs even a basic accelerometer will give you the horizon; the accuracy of that estimate limits the accuracy of your location.
Regarding satellites: so "starting with the slowest moving" requires a series of images, doesn't it? And how do you know "your approximate location"? From stars? In theory I understand what you're saying, but in practice it would be much more complicated, and the obtained accuracy would not be better than with the stars, since in either case you also need a horizon to know your location.
Know your approximate location: by dead reckoning. You will need coordinate fixes once every few minutes anyway and you know your direction precisely enough from the same stars, error only comes from wind direction not being precisely known. So we are speaking of correcting for at most tens of kilometers of error. 10km at a typical distance of 1000km to a low orbit sat is <1 degree and only about 10 arcmin to a typical medium earth orbit satellite.
Astrometry allows for locating objects down to about 0.2 pixel reliably and to 0.1 pixels in optimal conditions, so a typical wide-angle camera that might have about 40 arcsecond pixels will easily give 8 arcsecond precision, for a satellite 4000km away (about 2000km orbit at 30 degrees elevation), that's 170 meters of location error, which is more than good enough for navigation (final targeting is done by optical pattern recognition on the ground anyway).
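A quick sanity check of that arithmetic (a small sketch using only the figures quoted above and the small-angle approximation):

```python
import math

ARCSEC_PER_RAD = 206265.0

def ground_error_m(pixel_scale_arcsec, centroid_precision_px, slant_range_km):
    """Project an angular centroiding error onto a position error at the satellite's range."""
    err_rad = pixel_scale_arcsec * centroid_precision_px / ARCSEC_PER_RAD
    return err_rad * slant_range_km * 1000.0

# Figures from the comment above: 40"/pixel optics, 0.2 px centroiding,
# a ~2000 km orbit seen at 30 degrees elevation (~4000 km slant range).
print(ground_error_m(40, 0.2, 4000))   # ~155 m, the same order as the quoted ~170 m
```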
>since in either case you also need a horizon to know your location.
No you don't. Benefit of using satellites is that the source of coordinate data is the parallax of satellites vs stars. It works without having a vertical/horizon.
Simply put, we calculate that at a predicted location the satellite will be at a certain pixel distance from a few of the closest stars on the photo, and it will actually be a few pixels off that predicted point. The distance and direction of that error let us calculate the discrepancy between the predicted and real locations (and repeating this process on several satellites visible in the same photo lets us reduce the error by removing outliers - which might be noise/cosmic rays in the images, errors in star catalogs or orbital-element data, or satellites changing their orbits - and averaging the results).
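A toy numerical sketch of that parallax correction (hypothetical satellite geometry, no measurement noise, and a plain first-order least-squares solve; a real implementation would weight by range and reject outliers as described above):

```python
import numpy as np

# Dead-reckoning error we want to recover, in metres (ENU around the assumed position).
true_offset = np.array([8000.0, -3000.0, 0.0])

# Hypothetical satellite positions relative to the assumed observer position (metres).
sats = np.array([
    [ 1.5e6,  0.5e6, 3.5e6],
    [-2.0e6,  1.0e6, 3.0e6],
    [ 0.5e6, -2.5e6, 2.5e6],
    [ 2.5e6,  2.0e6, 4.0e6],
])

def unit(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

u_pred = unit(sats)                # predicted directions from the assumed position
u_obs = unit(sats - true_offset)   # "observed" directions from the true position

# First-order model: u_obs - u_pred ~ -(I - u u^T) @ offset / range
A, b = [], []
for sat, up, uo in zip(sats, u_pred, u_obs):
    r = np.linalg.norm(sat)
    A.append(-(np.eye(3) - np.outer(up, up)) / r)
    b.append(uo - up)
est, *_ = np.linalg.lstsq(np.vstack(A), np.hstack(b), rcond=None)
print(est)   # ~ [8000, -3000, 0] to within the linearisation error
```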
No problem: over a few hours satellite orbits don't change much, so ephemerides a day or two old are OK - especially for satellites in medium Earth orbits, which are the ones to be used (geostationary and other high-orbit ones are too dim and too far away to provide precise coordinates; low-orbit ones are not visible for most of the night).
Perhaps I am too paranoid, but I've been told to avoid doing any DIY in this field of study.
Apparently, or so I'm told, out of the many, many ways to end up on a list, building a working celestial navigation system can lead to some very inconvenient outcomes. Second only to ordering large quantities of certain chemicals online.
Is this true?
———
EDIT - from the paper, this is incorrect,
> The introduction of GPS caused the interest in celestial navigation to wither due to its relative inaccuracy. Consequently, celestial navigation is primarily seen only in space-based systems, whose orientation must be known to high levels of precision. Nonetheless, celestial navigation was identified as a desirable alternative to GPS [2], primarily due its robustness against potential jamming. Critically, few GPS-denied alternatives exist that are capable of using passive sensors to estimate global position at night or over the ocean. For this reason, celestial navigation remains an important topic of research.
The US and other militaries never stopped using these systems. They just stopped talking about them as much. Here's a literature search showing some of the slow & steady research on the topic,
You will likely raise a flag somewhere if you publicise what you are doing, but I highly doubt there would be any issues if you're working on this in private as a hobby.
As for chemicals, I can personally vouch that it is a terrible idea to order reagents (or even chemistry equipment) as an individual. I tried to teach myself organic synthesis in the summer before starting my doctoral studies, and ended up with MIB searching my house. Certainly on a list now :(
I remember watching a video about a dude who was building a mothership-launched glide drone that could land using camera vision. The idea was something like "the highest egg drop" or something like that. He was speaking with academics about his idea, who quickly told him to stop whatever he was doing because that would effectively be a forbidden military device. Guided artillery, basically.
Sadly I don't remember who it was, it was a fun story. I thought it was maybe Mark Rober or Joe Barnard but I really can't find it anymore.
Edit: found it! It was launched from a weather balloon, and it was both Mark and Joe. https://youtu.be/BYVZh5kqaFg
Happened to Codyslab - his videos of a uranium purification process are taken down now (but still on archive.org) - and possibly NileRed; no way to prove it, but he had a "making rocket fuels: part 1" that was never followed up on. Not totally sure though, as people like BPS.space on YouTube have some pretty in-depth tutorials on solid rocket motors (he does explicitly censor how to make the ignition component).
Uhm, there are plenty of videos about making solid rocket fuel. Model rocketry doesn't need any special permits until you get to launching large rockets.
The rationale mentioned was that it falls under the same heading as people interested in strong encryption: people who care about being unobservable might have something to hide. Maybe it's a good list? People you might want if you ever need to ramp up a new Bletchley Park? Probably not.
Whatever you do, don't broadcast on the airwaves, as in pirate radio. That really does put you on the list.
I don't believe they have the people to monitor everyone who knows "how to use grep" and put them on a list. It stands to reason: government civil servants are rarely from the top drawer.
Not NSA - you'd have someone from US Bureau of Industry and Security tracking you down (no pun) for most likely violating export controls if you were to openly share information on building the technology.
Openly publishing information in e.g. a book (or presumably a website) does not count as exporting. Releasing software is a bit more hazy, but has been defended[1].
Years ago, an acquaintance developed an autonomous flight controller for "real" helicopters. Cyclic-collective-tailrotor types. It would work on a full-size cargo helo just as well as an R/C model. He released it online, because why not? Drones are cool.
Some very nice gentlemen showed up and explained that he couldn't do that. He didn't get in any actual trouble that I'm aware of, but they "asked" him to take down the published code, and definitely not fix any of the bugs it had.
So, yeah, you're not wrong.
There are nuances to the rules, involving things that're openly published online, but I don't understand it in the least. A hacker's guide to ITAR would be an interesting document indeed.
I'm sure there are thousands of datasets of the night sky, and a camera, a gyroscope (to get camera angles), a clock, and basic image recognition/pattern matching are all you'd need.
Yeah. Celestial navigation is a pretty standard thing to study if you're planning on taking up sailing or learning about satellite positioning. Celestial navigation with drones raises more interesting possibilities, but I don't think defence of key strategic assets against drones relies on it being too difficult a problem to solve, and there are commercial solutions in the "drone navigation for GNSS-denied environments" space. I don't even think the people who jailbreak consumer drones specifically to remove the geofences that prevent them flying near restricted areas get into trouble, at least not until someone spots them flying at the end of a runway or outside a military base.
Plenty of homework assignments in graduate level aerospace engineering courses that are right up the alley of this paper. Star trackers as backup for GNSS would be of great interest to maritime vessels worried about spoofing. So there are plenty of non-military use cases for these algorithms.
Can't find a source at the moment but cool side anecdote to this...working from memory
Honeywell was largely the driving force behind developing terrain avoidance systems for commercial aircraft. Those initial systems worked based on comparing the terrain below to the flight profile of an aircraft using a radar altimeter.
There was a CFIT (controlled flight into terrain) accident (I want to say AA in Peru?) where the mountains basically got too tall too fast for that system to give the crew sufficient time to react. That caused Honeywell to go back and look at ways to make the system predictive rather than reactive - using a terrain database.
Honeywell bought/came into possession of a Russian worldwide terrain-altitude database to do the first generation of this. I can only imagine the US had the same thing, or something more accurate, but this was far enough back that the US government wasn't sharing.
You're right! I actually know about the system you're talking about! The US data was classified, and Donald Bateman, the engineer behind this, bought the data after the Soviet Union collapsed.
Ctrl+F gives 0 results for munitions or bombs. Seems like this is really about a $25 controller getting drones to within 4 km in GPS-denied environments, after which a $50 infrared camera + DSMAC finds targets to hit.
I suspect you could get this to FAR higher accuracy if you combined it with a recent upload of Starlink et al LEO constellation ephemera, an initial GPS fix at launch, and a planned flight path, because LEO constellations are bright foreground objects (high location-specific parallax differences against background stars) at apparent magnitude of about 5.0.
This is simultaneously not reliant on perfect vertical attitude sensing coming off the autopilot IMU; you can do it purely photometrically.
The limitation is that this is a dawn/dusk thing: in the middle of the night there isn't a ton of reflected light, and in the day you're limited by scattered daylight.
EDIT: Medium orbit satellites outside Earth's umbra but within view still provide some sort of visual fix. I wonder what the math is like for the GSO belt at midnight?
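For a sense of scale, the parallax from a modest position error is enormous for LEO objects compared with background stars (which show effectively none); a rough sketch, assuming a ~550 km Starlink-style altitude seen near the zenith:

```python
import math

def parallax_arcmin(baseline_km, range_km):
    """Apparent angular shift of an object seen from two points baseline_km apart."""
    return math.degrees(baseline_km / range_km) * 60

print(parallax_arcmin(4.0, 550))     # ~25' shift for a 4 km error against a LEO satellite
print(parallax_arcmin(4.0, 20000))   # ~0.7' against a ~20,000 km MEO satellite
```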
IMO it could synergize well with higher-end celestial navigation - there are optical sensors for daytime star tracking, but daylight sensitivity is the limitation, perhaps much less so when fixed on Starlink. So maybe feasible $$$ hardware can make daylight celestial Starlink navigation workable.
Bringing component costs down seems like it would be much more useful for increasing the capabilities / proliferation of lower-end loitering munitions. You can already pack redundant navigation systems into more expensive platforms that get them to the area of operations. But being able to replace a $20,000 inertial navigation system with a $200 board + IR camera makes a lot of somewhat-cheap smart munitions much smarter, and mitigates a lot of expensive electronic-warfare platforms.
Starlink ubiquity does seem to open a lot of indirect strategic applications, i.e. research using starlink transmissions as bi/multistatic illumination source to detect stealth flyers.
That's a great idea. In the earlier days when they had about 2500 satellites in LEO I built a small visualizer from the fleet TLE data and it was remarkably simple with the skyfield library.
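Something in the spirit of that visualizer, as a minimal sketch with the skyfield library (the CelesTrak URL, observer location, and elevation cut are placeholders, and it ignores whether the satellites are actually sunlit):

```python
from skyfield.api import load, wgs84

ts = load.timescale()
sats = load.tle_file(
    "https://celestrak.org/NORAD/elements/gp.php?GROUP=starlink&FORMAT=tle"
)
observer = wgs84.latlon(52.0, 13.0)   # hypothetical observer

t = ts.now()
for sat in sats[:50]:
    alt, az, distance = (sat - observer).at(t).altaz()
    if alt.degrees > 20:              # crude visibility cut
        print(f"{sat.name:24s} alt {alt.degrees:5.1f}  az {az.degrees:6.1f}  "
              f"range {distance.km:7.0f} km")
```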
If you're in the fringes of a GNSS denial area ADSB might be useful as well. Would need more hardware of course.
Yes it does, but unless we're talking about an entire system failure, GNSS denial does tend to have limits in range. I've picked up ADS-B traffic from well over 200 km with a simple ground antenna, so if you're in the fringes it could be a useful additional signal, for similar reasons to the satellites.
I would assume the same. Operation in GNSS-denied environments is critical for military navigation systems. Comparatively, for civilian uses, it's an addon that provides low accuracy, and potentially high development or equipment cost (Maybe not for a cel nav camera, but for Ring Laser Gyro INSs etc)
GNSS is very accurate and receivers are cheap, but its reliance on satellite signals makes it a liability in adversarial uses.
Cel nav isn't self-contained in the way an INS is, because you need a clear LOS to the stars. But, it's useful on a clear night when your GPS is jammed.
note: i used gpt to clean this up because i am ill and distracted by snowfall and cold. It muddied some of my points, but it removed a lot of PII and rambling. note over.
I noticed some commenters questioning details in the article, like the Wi-Fi triangulation and the earthquake survivor detector. While it's fair to discuss technical aspects, I believe the focus should be on the broader implications rather than dismissing the story based on perceived inconsistencies.
I haven’t dealt with clearances or compartmentalization in years, but I know how serious these matters are. Disclosing specific names, dates, or events carries severe consequences—this isn’t something covered by toothless NDAs. The penalties can include federal prison for treason. I’ve personally experienced the DoD investigating me just because I was listed as a reference. It’s an intimidating process, and it makes sense why people who fear being doxxed rewrite their stories, swapping out modular details to obscure sensitive information.
Regarding the Wi-Fi triangulation: this is well within the realm of possibility. Many years ago, I purchased a Hydra SDR radio with inexpensive RTL-SDR chips. With four matched antennas arranged in a line or an X, connected to a Raspberry Pi 4, I could triangulate signals and visualize the results on a map. The hardware wasn’t advanced, but it worked. Even in 2012, there were rumors about using Wi-Fi signals to see through walls. Whether or not the article is perfectly accurate, the point is to consider the ethical and societal consequences of such technologies, not to nitpick technical details.
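For what it's worth, the core of that multi-antenna direction finding is just the standard two-element interferometer relation; a hedged sketch (the generic formula, not the specific software that ships with those SDR kits):

```python
import math

def angle_of_arrival_deg(phase_diff_rad, freq_hz, spacing_m):
    """Bearing off boresight for a two-element array, from the inter-antenna phase difference."""
    wavelength = 3e8 / freq_hz
    s = phase_diff_rad * wavelength / (2 * math.pi * spacing_m)
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))

# Example: 2.4 GHz Wi-Fi, antennas half a wavelength apart (~6.25 cm);
# a 45-degree measured phase difference puts the emitter about 14.5 degrees off boresight.
print(angle_of_arrival_deg(math.radians(45), 2.4e9, 0.0625))
```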
As for the earthquake survivor detector, the underlying principle is related. Identifying survivors using leaked signals like Bluetooth or cellular emissions isn’t fundamentally different from using Wi-Fi for similar purposes. The scenarios may involve different actors—military versus contractors—but the capabilities are converging.
I’ve worked at a defense contractor that manufactured components for Boeing and McDonnell Douglas jets. While I avoided involvement with military projects, I know how extensive and layered the contractor ecosystem is. Comments suggesting "there are only a few" don’t align with my experience.
On a personal note, I’ve always struggled with the ethical implications of the work I’ve done. This has made my career difficult. I don’t judge others who take these roles—someone else will do the work if they don’t—but my own scruples have been a constant challenge. For example, I once worked on a project at a large entertainment company based on an idea I had years earlier. They demanded I eventually sit in the office and handle tier 3 phone calls. I had a minor breakdown in the stairwell; I didn't even let my children consume their content, but I was too jazzed to work on the thing that I had pitched to Apple 7 years earlier. That was over a decade ago, but I'm still annoyed at myself.
I believe stories like this should be taken seriously. Dismissing them based on perceived inconsistencies seems like rationalization, to me.
At this point, it's pretty clear that this type of functionality is out of the bag. Any significant actor can easily replicate this with minimal effort, given the advances in AI.
I didn't see an explanation of what strapdown meant in this context, so I dug one up:
"Traditional, stable-platform navigation systems commonly involve separate accelerators and fibers or laser-based gyroscopes, with all the components mechanically and rigidly mounted on a stable platform that is isolated from the moving vehicle. This leads to the drawbacks of large size, poor reliability, and high cost. In contrast, in strapdown navigation systems, the inertial sensors are fastened directly to the vehicle’s body, which means the sensors rotate together with the vehicle. "
https://www.mdpi.com/2504-446X/8/11/652
Or in short, the sensors are strapped down to the platform being measured - like your phone’s sensors for example.
Yes! It's in contrast to gimbaled systems. Putting the measuring instrument on a gimbal simplifies the math and often improves accuracy, but at the expense of needing a large moving assembly that consumes more power.
The ultimate example of this is the incredibly accurate and expensive and complicated floating Advanced Inertial Reference Sphere used on the Peacekeeper ICBM.
https://en.wikipedia.org/wiki/Advanced_Inertial_Reference_Sp...
Gyros on gimbals have other drawbacks, such as drifting and gimbal lock.
Just gimbal lock. Drifting happens to all of them.
But also the gimbal mechanisms themselves, their slow response time, etc.
Except that a gimbaled system, if enclosed in a single box/pod, can also be described as a strapdown system. The term speaks more to separate modularity than how the system functions internally.
I wonder if you could dangle the star tracker below the drone on a long string, decoupling its attitude from the attitude of the drone. A kevlar or spectra string capable of supporting 100 grams would be 20μm in diameter; 3 meters of it would weigh a milligram, which is significantly less than gimbals. A small weight a couple of meters below the star tracker on a carbon-fiber-composite rod would seem to be able to stabilize its attitude further except in yaw.
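A rough check of those string numbers (a sketch assuming generic Kevlar figures of ~3.6 GPa tensile strength and ~1.44 g/cm³ density; a real design would want a large safety factor for knots and fatigue):

```python
import math

strength_pa = 3.6e9        # assumed Kevlar tensile strength
density = 1440.0           # kg/m^3
load_kg, g, length_m = 0.100, 9.81, 3.0

area = load_kg * g / strength_pa                   # minimum cross-section at break load
diameter_um = 2 * math.sqrt(area / math.pi) * 1e6
mass_mg = area * length_m * density * 1e6

print(f"diameter ~ {diameter_um:.0f} um, mass of {length_m:.0f} m ~ {mass_mg:.1f} mg")
# ~19 um and ~1.2 mg, in line with the figures above
```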
Here's the camera used.[1]
It's not exotic. It's a 1936 × 1216 Sony sensor with a C-mount lens. That's below current phone camera resolution. It's monochrome, which makes sense in this application.
They have bigger collecting optics than a phone, and you get better sensitivity without the color filters.
I'm not clear on how they get their "down" reference. It's clear how they get heading; that's easy if you can see the stars. But you need an accurate horizon or vertical to get latitude and longitude. One degree of error in the vertical is maybe 100 km of error in position. How good are drone AHRS systems today in attitude? They have a correction system that works if you fly in a circle, but that just corrects for constant misalignment between camera and down reference.
[1] https://www.alliedvision.com/fileadmin/pdf/en/Alvium_1800_U-...
One degree of error in the vertical for the drone's control system, if it's hovering by blowing air downward at 5 meters per second, would be a ground speed of 87 mm/s (sin(1°)×5 m/s) in whichever direction the tilt is. Also, without any correction in the propeller speed, it would result in a loss of altitude averaging 0.76 mm/s ((1 − cos(1°))×5 m/s, or 2.7 m/hour). But that could also be caused by something like a mild downdraft, while the horizontal drift could be caused by an imperceptibly weak breeze.
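The arithmetic behind those figures, spelled out (a small sketch; the ~111 km per degree of great-circle arc is the only number not already in the comments above):

```python
import math

tilt = math.radians(1.0)
downwash = 5.0                                   # m/s, per the comment above

lateral = math.sin(tilt) * downwash              # horizontal drift from the tilt
sink = (1 - math.cos(tilt)) * downwash           # lost vertical thrust component

print(f"lateral drift ~ {lateral*1000:.0f} mm/s")                      # ~87 mm/s
print(f"sink rate ~ {sink*1000:.2f} mm/s ~ {sink*3600:.1f} m/hour")    # ~0.76 mm/s, 2.7 m/h
print(f"1 deg of vertical error ~ {1.0 * 111:.0f} km of position error on the ground")
```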
So I don't really know how this is normally done. If you can set the drone on the ground for a few minutes, you should be able to get a very good reference up-vector, but I don't know how long the MEMS gyros can preserve that up-vector without GNSS once it takes off.
At sea you can probably look at the horizon with a camera unless it's foggy.
Fun fact:
The SR-71 and U-2 planes had automated celestial navigation systems b/c GPS wasn't around when they came out.
There's a story in the book about Lockheed Martin's Skunk Works where they mention turning on the system while one of the planes was in the hangar and it locked on to a hole in the roof (the sun was shining through the hole and the system thought it was a star).
And it's a bit older than that: the SR-71's system derived from ICBM targeting systems,
https://en.wikipedia.org/wiki/Missile_guidance#Astro-inertia... ("the latter of which was adapted for the SR-71...")
(Actually the very first one, in that history, was an intercontinental cruise missile—a jet weapon that slightly predated (~1958) rockets powerful enough to cross oceans. ICBMs came a bit later. I'm pretty sure the first generation were pure-analog circuits, but I forgot where I read about that.)
- "pretty sure the first generation were pure-analog circuits"
This Wikipedia entry isn't what I had in mind, but it describes an interesting analog mechanism,
- "For guidance systems based solely on star tracking, some sort of recording mechanism, typically a magnetic tape, was pre-recorded with a signal that represented the angle of the star over the period of a day. At launch, the tape was forwarded to the appropriate time.[2] During the flight, the signal on the tape was used to roughly position a telescope so it would point at the expected position of the star. At the telescope's focus was a photocell and some sort of signal-generator, typically a spinning disk known as a chopper. The chopper causes the image of the star to repeatedly appear and disappear on the photocell, producing a signal that was then smoothed to produce an alternating current output. The phase of that signal was compared to the one on the tape to produce a guidance signal.[2]"
https://en.wikipedia.org/wiki/Star_tracker
Reminds me of the "the distance between the rails of a railway are due to the width of Roman horse drawn carts" story.
I've actually used this fact in a related way, for wayfinding.
Old school Open-CV was able to see tracks well from an onboard monocular camera, but calibration and scale was annoying. Track width is accurate enough that I was able to use it to input a bunch of head-end video to map the tracks.
It was mostly just a modified edge detect run where the tracks would approximately be. Once the tracks were found, you could automatically calculate the camera's height, lateral location, and angle.
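A rough sketch of that classic OpenCV pipeline (the file name and thresholds are placeholders; recovering camera height and offset from the known ~1.435 m gauge then follows from the pinhole model once the rail lines and their vanishing point are known):

```python
import cv2
import numpy as np

frame = cv2.imread("head_end_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frame
edges = cv2.Canny(frame, 50, 150)

# Only search the lower half of the image, roughly where the rails should be.
mask = np.zeros_like(edges)
mask[edges.shape[0] // 2:, :] = 255
edges = cv2.bitwise_and(edges, mask)

lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=20)
for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    slope = (y2 - y1) / (x2 - x1 + 1e-9)
    if abs(slope) > 0.3:               # discard near-horizontal clutter
        print("candidate rail segment:", (x1, y1), (x2, y2))
```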
My preferred one for EE folks is that reportedly the first Arduino boards (now 20 years old?) had a mistake in their eCAD where the second pair of headers was 0.05" instead of 0.1" apart. But it was too late by the time they caught it. And now, 20 years later, even high-end microcontroller boards ship with that same gap to be compatible.
Small correction, one pair is 0.2" apart (so skipping one 0.1" pitch space), but the other is 0.16".
Isn't that one a hoax though?
There are a few standards for rail gauge (track width). I know the US is on one standard (I think the narrow-gauge lines died out almost 100 years ago at this point). I know that Europe has two, or maybe more.
https://en.wikipedia.org/wiki/Standard-gauge_railway << This makes for fun reading if you're interested in that sort of thing.
Relevant passage:
A popular legend that has circulated since at least 1937[8] traces the origin of the 1,435 mm (4 ft 8+1⁄2 in) gauge even further back than the coalfields of northern England, pointing to the evidence of rutted roads marked by chariot wheels dating from the Roman Empire.[a][9] Snopes categorised this legend as "false", but commented that it "is perhaps more fairly labeled as 'Partly true, but for trivial and unremarkable reasons.'"[10] The historical tendency to place the wheels of horse-drawn vehicles around 5 ft (1,524 mm) apart probably derives from the width needed to fit a carthorse in between the shafts.[10] Research, however, has been undertaken to support the hypothesis that "the origin of the standard gauge of the railway might result from an interval of wheel ruts of prehistoric ancient carriages".[11]
I dislike the incorrect usage of "prehistoric". The Roman-era is not prehistoric.
in theory, the carts could predate Roman history.
Look up why torpedoes are almost universally 21" in diameter. The short version: because that was how big they were last time. There is no reason beyond 21" being used once upon a time and nobody wanting to break from it and have the old torpedoes not work in the new boats.
H I Sutton, a naval defense analyst, made a nice video on this topic.
[1]: https://www.youtube.com/watch?v=cuS0yhwSPMc
I understand these still do incorporate celestial navigation.
Since GPS is quite likely going to be unavailable at the time of use.
The sensor was sensitive enough that it could detect stars during daylight:
* https://theaviationgeekclub.com/the-sr-71-blackbird-astro-na...
* https://www.twz.com/17207/sr-71s-r2-d2-could-be-the-key-to-w...
* https://en.wikipedia.org/wiki/Missile_guidance#Astro-inertia...
That's insanely cool. What kind of cameras/telescopes are strong enough to do that? My guess is it was primarily hardware and not software because of compute limits.
Did the planes have to fly above clouds?
It would work on the ground; I believe the pilots (normally) had to get a fix before takeoff. You do need to see the sky without cloud cover, but spy satellites were less of a concern back then, so there was less risk of being overflown during a daylight setup. The cameras are basically visible-light telescopes with very narrow fields of view and good baffling. Only a few stars are bright enough that you can sight off them, but it can be done. The device does a scan, so it's only accepting a small area of the sky, and the initial fix can be sped up because you know where/when the aircraft is taking off. There are a lot of tricks to minimize the need for "plate solving", like knowing which direction the aircraft is pointing to within some tolerance.
Info here: https://www.sr-71.org/blackbird/manual/4/4-3.php
It wasn't exactly a simple instrument to use, and it relied on a ton of planned course information. You could also do a cold midair start after a power outage, but preflight would be much more preferable!
Some modern microwave telescopes like BICEP3 have an additional optical telescope for star pointing that are daylight-usable, but in summer you need to use a big baffle tube. The images are taken with a high sensitivity CCD camera and you can pick out brighter target stars surprisingly well in the images.
BICEP3 actually uses a >20 year old CCD camera with analog video output (BICEP Array uses newer cameras, with more modern sensors). Daytime star pointings are possible by using a low-pass filter to block visible light and take advantage of the sensitivity of CCD / CMOS sensors to the near infrared, where the daytime sky is more transparent, combined with baffling.
I would add it also uses an ancient analog TV for manual sighting in combination with the GUI for semi-auto centroiding. I always thought that was funny to see, but it seems to work well enough. Also, inserting that baffle is somewhat terrifying because it slots into a hole next to the main vacuum window and if you dropped it on the membrane, bad things would happen. Always fun to bump into Polies here :)
> Always fun to bump into Polies here :)
Definitely! I wasn't expecting to see a mention of BICEP while reading HN from Pole, particularly not on something as arcane as its star camera.
how hard would this be to set up for a total hardware noob? and how good or useful would the data be?
i know gaia data for instance is available for free but if one used just a homemade telescope could any useful celestial data be acquired?
It depends what you mean by useful. On its own, all you're doing is taking pictures of the sky and figuring out where the camera was pointing (and its field of view). Where it's useful is calibrating the pointing direction of other systems. It's fun to try the software at home (there is a public web interface), you just need a camera that can take long enough exposures to see stars without too much noise.
One of the more "useful" backyard astronomy tasks that is achievable for a dedicated amateur is variable star observation (eg AAVSO), because many stars don't need huge telescopes to observe and it's very expensive for a big observatory to stare at a single patch of sky for weeks. Nowadays we have instruments like LSST which is basically designed for this sort of surveying, but public data are still useful. And you do need to know exactly where you're pointing, so either you do this manually by pointing at a bunch of target stars, or you can use a guide scope that solves the field for you.
With images taken at night, you can run the images through Astrometry.net, which is a blind astrometric solver and will provide you with RA / Dec for most images, as long as you have at least a dozen or two stars visible. The code compares asterisms formed by multiple stars to index files built from Gaia or other similar data. This is the technique that's used more frequently for microwave telescopes located where there's a normal diurnal cycle, e.g., CLASS. The smaller the field of view, the higher the precision, but it also works fine with a camera with a zoom lens.
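If you want to try that from Python rather than the public web interface, there's a wrapper in astroquery; a hedged sketch (you need a free API key from nova.astrometry.net, and the file name is a placeholder):

```python
from astroquery.astrometry_net import AstrometryNet
from astropy.wcs import WCS

ast = AstrometryNet()
ast.api_key = "YOUR_API_KEY"

wcs_header = ast.solve_from_image("night_sky_frame.fits")   # blocks until solved or timed out
if wcs_header:
    ra, dec = WCS(wcs_header).wcs.crval     # sky coordinates of the reference pixel
    print(f"solved: field centre near RA {ra:.3f}, Dec {dec:.3f}")
else:
    print("no solution (too few stars, or the field isn't covered by the index files)")
```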
BICEP, however, is located at the South Pole on a moving ice sheet, requiring frequent updates to its pointing model, and has six months of continuous daylight, so daytime star pointing observations are required. This requires a different technique. Instead of looking at asterisms with multiple stars, the optical pointing telescope is pointed at a single star using an initial pointing model, the telescope pointing is adjusted until the star is centered, and the offset is recorded. This measurement process is repeated for the few dozen brightest stars, which acquires the data needed for refining the pointing model.
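Schematically, the offset-fitting step looks something like the toy simulation below; the pointing model here is deliberately trivial (constant azimuth/elevation offsets), whereas a real one also fits tilt, flexure, and collimation terms:

```python
import numpy as np

rng = np.random.default_rng(1)
n_stars = 30                         # a few dozen bright stars, as described above

# Pretend the mount has fixed encoder offsets (degrees); each "measurement" is the
# centroid offset recorded after pointing at one star, plus centroiding noise.
true_daz, true_del = 0.12, -0.05
meas_daz = true_daz + rng.normal(0, 0.01, n_stars)
meas_del = true_del + rng.normal(0, 0.01, n_stars)

# Fitting this trivial model is just an average; a real pointing model has terms
# that vary with az/el and is fit to the residuals by least squares.
print(f"fitted offsets: {meas_daz.mean():+.3f} deg az, {meas_del.mean():+.3f} deg el")
```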
Check out the CuriousMarc video series I linked under the OP, which gets into the sensor used and the encoding scheme.
It is kind of crazy and just a testament to people's creativity: the plane basically flies using an updated version of what medieval ships used for navigation. Totally mind-blown.
The excellent CuriousMarc YouTube channel just started a new video series refurbishing a B-52 astrotracker, going over all of this in some detail:
https://www.youtube.com/watch?v=GkEjLqu-JH0&list=PL-_93BVApb...
Recommended.
It also immediately occurred to me how much easier this should be on a copter, since you don't need a gimbaled platform :)
I recently watched this channels videos on B-52 Astro tracking navigation system repairs: https://www.youtube.com/watch?v=nkvN74wuT8w&list=PL-_93BVApb...
He's got a bunch of other vintage electronics stuff that's from the early space program as well, interesting stuff to see the insides of that gear.
Why would you think this has stopped? All military aircraft and missiles need to operate in GPS-denied environments and near-universally still have dead reckoning or celestial navigation.
This makes me realize how lame GPS is: a centralized system that will take everything down with it, should it ever go down.
I read that the US military wants a modernized version of celestial navigation to reduce dependence on GPS. With modern light amplification technology it might be able to work during the day.
They have some of these on ships already.
A 2021 PopMech article about the US military's revival of interest in celestial navigation: https://www.popularmechanics.com/military/research/a36078957... It mentions a handheld system designed for special forces units to use, but I assume that that would incorporate something like a camera gyro stabiliser, presumably making the calculations easier than when relying on "strapdown" sensors.
4 km seems kind of coarse. Could you combine it with knowledge of satellite imagery or something to increase precision?
The drone on Mars uses visual navigation based on known imagery of the terrain. I suspect that would be easier on this planet.
Not at sea.
400 bucks for sensors is a touch rough.
This would only work at night, right?
The full title is:
> An Algorithm for Affordable Vision-Based GNSS-Denied Strapdown Celestial Navigation
Emphasis mine.
In what kind of context do you expect drones to operate in an area where GNSS is disabled by electronic warfare devices? Do you really think that a $400 cost is of any issue for military use?
> Do you really think that a $400 cost is of any issue for military use?
If your name is Ukraine then yeah. Effectively halves the number of drones you can build
Sorry, this is wrong. They'd be based on long-range UAVs, not short-range FPVs. Ukraine has been building about half a dozen types of different long-range drones and is producing a good number of those every day now (some with jets, making them analogous to cruise missiles). They're much more expensive than a cheap FPV with an RPG warhead strapped to it.
> Effectively halves the number of drones you can build
You're confusing the price tag of an FPV drone (for which this tech has no use - 4 km precision is roughly the range of such a drone, so even without a positioning device you'd get that precision…) with that of a long-range drone, which is hundreds of times larger, even for Ukrainians.
FPV used in combat these days go a lot further than 4km, you can punch out to 25km+ with a good repeater setup easily, 10-15km with a good mast setup, and strikes using FPV quads out to 45km have been documented (but these are rare).
You just need to plan your battery selection and consider the electronic warfare environment to go the distance.
There’s also the optical fiber drones which come in spool lengths up to 20km…
You're picking nits here. Again, you barely even need any kind of positioning system in an FPV, let alone a positioning system with 4km accuracy.
Such a system only makes sense for use in long-range drones.
I wonder if GPS and the like will be used more for their clock features than for position. The emissions of celestial bodies are perfect fiducial markers [0,1], but connecting them to position still requires accurate timekeeping [2], as the paper notes:
Provided the use of an accurate clock, the results presented in this paper will not degrade over time.
0. https://www.twz.com/17207/sr-71s-r2-d2-could-be-the-key-to-w...
1. https://timeandnavigation.si.edu/multimedia-asset/nortronics...
2. https://www.rmg.co.uk/stories/topics/harrisons-clocks-longit...
They are perfect markers only as long as you can see them. Clouds and fog are your enemies here
That. However that works just fine for ICBMs and the like…
The future is more likely to be quantum accelerometers and quantum gyroscopes, as they have no “external dependency”.
More likely image-based navigation, where you just upload the entire imagery of the route and then use it to correct your inertial references.
Encrypted positioning information from low-orbit satellites is another option.
That doesn't work well in some conditions, and it's not new either. Some cruise missiles have TFR (terrain-following radar) and actually do this already.
It's also not really applicable when you are on a ballistic course at *very* high altitude; course correction has to happen early in that case, given the reentry speed/constraints.
I presume radio signal or certain frequencies of thermal would be viable for adverse weather conditions.
The frequencies used in GPS: Yes. The frequencies used for celestial nav: No.
I guess timekeeping is relatively easy? These systems would only operate independently for a few hours tops. I would imagine even a standard quartz movement would be accurate enough.
Depends on what you're using time for. If you are doing advanced anti-jamming for comms for instance, you want extremely accurate timing (more accurate means you can frequency hop faster and do better anti-jamming).
> I guess timekeeping is relatively easy...... would imagine even a standard quartz movement would be accurate enough.
Good Lord! How wrong can you get!
Very precise timing (often taken from GNSS for convenience) is needed for much of the modern world, from IP, cellular, and DAB networks to AC phase matching on the electrical mains grid. Quartz clocks are nowhere near accurate enough for these purposes.
This government report makes very sobering reading: https://www.gov.uk/government/publications/satellite-derived...
TLDR: Our dependence on GNSS for timing almost dwarfs that for navigation. And we urgently need to consider using backups (be that local atomic clocks, or long wave time signals).
In the context of position keeping I think it's not too bad.
If we focus on longitude, where I guess timing matters more, the equator moves at a speed of about 0.46 km/s, so being out by 1 second translates to roughly 0.46 km of error. That's small compared to the stated error of 4 km, and it will be smaller still away from the equator.
I'm working off the assumption that such a drone can sync up to an accurate time source at launch, and then only needs to maintain good timekeeping for its time in the air. I guess without the accurate initial time source it gets bad: being a minute out is suddenly ~30 km of error in the longitude direction.
Plus I think most decent quartz oscillators have a drift measured in single-digit PPM (or less) so even 100ms error over a single sortie would be surprising.
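To put numbers on that (a sketch assuming a 5 ppm crystal and a 5-hour sortie, both made-up but typical values):

```python
drift_ppm = 5.0
flight_hours = 5.0
equator_speed_km_s = 0.465          # Earth's rotation speed at the equator

clock_error_s = drift_ppm * 1e-6 * flight_hours * 3600
pos_error_km = clock_error_s * equator_speed_km_s

print(f"clock drift ~ {clock_error_s*1000:.0f} ms over {flight_hours:.0f} h")  # ~90 ms
print(f"worst-case longitude error ~ {pos_error_km*1000:.0f} m")               # ~42 m
```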
> Our dependence on GNSS for timing almost dwarfs that for navigation.
Galileo satellites also now sign the timestamp (IIRC) via a Merkle tree so you know it isn't spoofed.
I mean, considering celestial navigation was a thing long before we had accurate clocks… I’d venture they aren’t wrong at all. Or did you forget that people have been doing celestial navigation by hand for over two millennia?
Celestial navigation actually drove the development of accurate clocks
https://timeandnavigation.si.edu/navigating-at-sea/longitude...
Quartz clocks didn't overtake chronometers in terms of accuracy until the mid 20th century, and chronometers will still beat regular crystals like you'd find in cheap electronics.
> Celestial navigation actually drove the development of accurate clocks
That's true, but that still doesn't change the fact that you don't need nanosecond precision for this purpose. At the equator, 1 second of precision gives you roughly 500 m of accuracy, which is already much better than what the celestial imagery achieves here (4 km in the paper).
Clearly this method isn't limited by clock accuracy at all.
1 second of precision is a lot. A typical quartz resonator will drift by about 0.5 seconds per day at ambient conditions. In this paper they set the clock with GPS right before flight and they only fly for a few hours, so it's tolerable. But in a GPS-denied environment where you can't set the clock right before flight, i.e. exactly where you'd be using this instead of GPS, clock accuracy will become the dominant factor affecting your accuracy after a few days.
> But in a GPS denied environment where you can't set the clock right before flight
First of all I don't think the use case involves the drone's operators being deprived of GPS, but even if they were: you don't need GPS to get sub-second-accurate time, any internet connection will do it thanks to NTP. Sure, it's not as accurate as GPS, but it's still way more accurate than what you need for this to work. Heck, even sharing time through a phone call would work well enough.
Some clueless downvoting here!
The clock accuracy required for celestial navigation is on the order of seconds, not microseconds.
An idea: use satellites for navigation. No, not the satellite signals, but the satellites themselves. Use NORAD orbital elements data to deduce your ground coordinates from the time and the pixel coordinates of the satellites you observe. Low-orbit satellites will only be observable for two hours or so after sunset and before sunrise, but there are enough medium-Earth-orbit satellites that are still bright enough for a small camera and are visible the whole night.
If you can see satellites then you can likely see even more stars. Unlike satellites, the stars barely move relative to each other (actually they do, see "proper motion" [1]), so a catalogue of stars (two coordinate values and two proper-motion values per star) along with the time of observation is usable for decades, unlike NORAD orbital elements, which require regular updates. With stars you need just one image at a known time to find your location; with satellites it is much more complicated: you need to know where the sun is, and you need a few images of a satellite, or even a video (likely on top of an image of stars anyway), to distinguish it from the stars and to solve its trajectory.
1. https://en.m.wikipedia.org/wiki/Proper_motion
How do you find your location from one image of stars? It is possible if you have a precise vertical but you don't have a precise vertical on a moving UAV. That is, you need an inertial system on top that will provide you with a vertical.
With satellite images, you don't need anything apart from time. And no, you don't need to "make a video to see satellites move". You start with your approximate location, take an image, and search for each satellite within the circle where it could plausibly be, starting with the slowest-moving ones (the furthest from you; they give the poorest position precision because their parallax is small, but their search circles are also smaller, and you have to start somewhere). Having located those, you get a better estimate of your own coordinates, the search circle for each remaining satellite shrinks, and you can then find the faster-moving satellites too and get a precise position.
You are right: to find a location from a star image you need a true horizon. But unless the UAV is pulling some Gs, even a basic accelerometer will give you the horizon; the accuracy of that estimate will limit the accuracy of your location.
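As a concrete illustration of "a basic accelerometer gives you the horizon" (my own sketch; it assumes the UAV is momentarily unaccelerated, which is exactly the limitation mentioned above):

    # Estimate roll/pitch from a 3-axis accelerometer reading (in g). Any
    # sustained acceleration (turns, gusts, climbs) biases the result.
    import math

    def roll_pitch_deg(ax, ay, az):
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.hypot(ay, az))
        return math.degrees(roll), math.degrees(pitch)

    # ~1 degree of tilt error corresponds to roughly 100 km of position error
    # in classic celestial navigation:
    print(roll_pitch_deg(0.0, 0.017, 1.0))  # ~(1.0, 0.0) degrees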
Regarding satellites: so "starting with the slowest moving" requires a series of images, doesn't it? Then how do you know "your approximate location"? From stars? In theory I understand what you say but practically it would be much more complicated and the obtained accuracy would not be better than with the stars, since in either case you also need a horizon to know your location.
No you just "start looking" on a single image.
Know your approximate location: by dead reckoning. You will need coordinate fixes every few minutes anyway, and you know your direction precisely enough from the same stars; error only comes from the wind direction not being precisely known. So we are talking about correcting for at most tens of kilometers of error. 10 km at a typical distance of 1000 km to a low-orbit sat is <1 degree, and only about 10 arcmin to a typical medium-Earth-orbit satellite.
Astrometry allows locating objects to about 0.2 pixel reliably, and to 0.1 pixel in optimal conditions. So a typical wide-angle camera with roughly 40-arcsecond pixels will easily give 8 arcseconds of precision; for a satellite 4000 km away (about a 2000 km orbit seen at 30 degrees elevation), that's roughly 170 meters of location error, which is more than good enough for navigation (final targeting is done by optical pattern recognition on the ground anyway).
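Quick sanity check of those numbers (my own arithmetic, small-angle approximation):

    # Centroiding precision (pixels) times pixel scale (arcsec) gives angular
    # precision; cross-track position error ~ angle (radians) times range.
    import math

    ARCSEC_TO_RAD = math.pi / (180 * 3600)

    def cross_track_error_m(centroid_px, pixel_scale_arcsec, range_km):
        angle_rad = centroid_px * pixel_scale_arcsec * ARCSEC_TO_RAD
        return angle_rad * range_km * 1000.0

    print(cross_track_error_m(0.2, 40.0, 4000.0))  # ~155 m, same ballpark as the ~170 m above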
>since in either case you also need a horizon to know your location.
No you don't. Benefit of using satellites is that the source of coordinate data is the parallax of satellites vs stars. It works without having a vertical/horizon.
Simply put: we calculate that, from the predicted location, the satellite should appear at a certain pixel distance from a few of the closest stars in the photo, and it will be a few pixels off that predicted point. The distance and direction of that error let you calculate the discrepancy between the predicted and real location. Repeating this for several satellites visible in the same photo lets you reduce the error further, by removing outliers (which might be noise, cosmic-ray hits, errors in star catalogs or orbital-element data, or satellites that have changed their orbits) and averaging the results.
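A minimal numerical sketch of that correction step (my own formulation of the idea, not from the paper or the thread; it assumes the time, the star-derived camera attitude, and the orbital elements are already good, so the only unknown is the observer position):

    # Linearize the observed-minus-predicted direction to each satellite around
    # the assumed observer position, then solve for the position offset by
    # least squares across all visible satellites.
    import numpy as np

    def correct_position(r_assumed, sat_positions, observed_dirs):
        """All vectors in one Cartesian frame (e.g. ECEF, metres).
        observed_dirs: measured unit vectors toward each satellite."""
        A_blocks, b_blocks = [], []
        for r_sat, u_obs in zip(sat_positions, observed_dirs):
            rel = r_sat - r_assumed
            rho = np.linalg.norm(rel)
            u_pred = rel / rho
            # First order: d(u)/d(r_observer) = -(I - u u^T) / rho
            A_blocks.append(-(np.eye(3) - np.outer(u_pred, u_pred)) / rho)
            b_blocks.append(u_obs - u_pred)
        A = np.vstack(A_blocks)
        b = np.concatenate(b_blocks)
        delta, *_ = np.linalg.lstsq(A, b, rcond=None)
        return r_assumed + delta  # improved observer position estimate

Outlier rejection (RANSAC or simple residual thresholding) and iterating the linearization would slot in around the least-squares step.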
Yeah, and then you need to get refreshed orbital elements for those satellites. Not good if you're operating in an airtight (no data link) environment.
Celestial ephemerides don't change nearly as much.
No problem: over a few hours satellite orbits don't change much, so a day-old or two-day-old ephemeris is OK. That's especially true for those in medium Earth orbit, which are the ones to use (geostationary and other high-orbit satellites are too dim and too far away to provide precise coordinates; low-orbit ones are not visible for most of the night).
Perhaps I am too paranoid, but I've been told to avoid doing any DIY in this field of study.
Apparently, or so I'm told, out of the many, many ways to end up on a list, building a working celestial navigation system can lead to some very inconvenient outcomes, second only to ordering large quantities of certain chemicals online.
Is this true?
———
EDIT - from the paper, this is incorrect,
> The introduction of GPS caused the interest in celestial navigation to wither due to its relative inaccuracy. Consequently, celestial navigation is primarily seen only in space-based systems, whose orientation must be known to high levels of precision. Nonetheless, celestial navigation was identified as a desirable alternative to GPS [2], primarily due its robustness against potential jamming. Critically, few GPS-denied alternatives exist that are capable of using passive sensors to estimate global position at night or over the ocean. For this reason, celestial navigation remains an important topic of research.
The US and other militaries never stopped using these systems. They just stopped talking about them as much. Here's a literature search showing some of the slow & steady research on the topic,
https://scholar.google.com/scholar?q=astro-inertial+navigati...
Example systems that have been deployed in many (most? all???) American combat aircraft,
https://theaviationist.com/2021/09/10/lets-have-another-look...
https://www.gpsworld.com/honeywell-demonstrates-military-gra...
https://ieeexplore.ieee.org/document/290940
Alright. I'm ready to be on that list, Mr NSA agent.
You will likely raise a flag somewhere if you publicise what you are doing, but I highly doubt there would be any issues if you're working on this in private as a hobby.
As for chemicals, I can personally vouch that it is a terrible idea to order reagents (or even chemistry equipment) as an individual. I tried to teach myself organic synthesis in the summer before starting my doctoral studies, and ended up with MIB searching my house. Certainly on a list now :(
LLC or nonprofit, with a business address. At least for bio reagents, they won't ship to you otherwise.
Please tell me there is a blog post or something documenting this experience. Sounds like a fun read.
I remember watching a video about a guy who was building a mothership-launched glide drone that could land using camera vision. The idea was something like "the highest egg drop". He was speaking with academics about his idea, who quickly told him to stop whatever he was doing because it would effectively be a forbidden military device. Guided artillery, basically.
Sadly I don't remember who it was, it was a fun story. I thought it was maybe Mark Rober or Joe Barnard but I really can't find it anymore.
Edit: found it! It was launched from a weather balloon, and it was both Mark and Joe. https://youtu.be/BYVZh5kqaFg
This happened to Cody'sLab: his videos of a uranium purification process were taken down (still on archive.org). Possibly NileRed too; no way to prove it, but he had a "making rocket fuels: part 1" that was never followed up on. Not totally sure though, as people like BPS.space on YouTube have some pretty in-depth tutorials on solid rocket motors (he does explicitly censor how to make the ignition component).
Uhm, there are plenty of videos about making solid rocket fuel. Model rocketry doesn't need any special permits until you get to launching large rockets.
I guess once you have the permits it doesn't matter. The NileRed vids were more of the hypergolic variety.
That is probably why Eric Schmidt (ex Google CEO) develops his AI combat drones in Estonia instead of the US.
Afraid not. I don't have much of an online presence, so didn't think to write anything.
there's always a first time :)
You're definitely on the list of people worried about being on lists now.
The only people not on any lists, are boring people.
They're just kept on the list of all people not on a list.
Peaceful, not harmless.
But that's the bestest list!
I've also been told learning too much about linux or the nuclear reactions in power plants or bombs puts you on a list. I just assume I'm on several.
Learning too much about Linux puts you on a list? That can't be a thing. Isn't Linux itself entirely a civilian project?
The rationale mentioned was that it fell under the subheading of people interested in strong encryption: people who care about being unobservable might have something to hide. Maybe it's a good list? People you might want if you need to ramp up a new Bletchley Park? Probably not.
Whatever you do, don't broadcast on the airwaves, as in pirate radio. That really does put you on the list.
I don't believe they have the people to monitor everyone who knows how to use grep and put them all on a list. It doesn't stand to reason; government civil servants are rarely from the top drawer.
Not NSA - you'd have someone from US Bureau of Industry and Security tracking you down (no pun) for most likely violating export controls if you were to openly share information on building the technology.
Celestial tracking is a dual use technology (See 7A004 or 7A104) - https://www.bis.doc.gov/index.php/documents/regulations-docs...
Openly publishing information in e.g. a book (or presumably a website) does not count as exporting. Releasing software is a bit more hazy, but has been defended[1].
[1]: https://en.wikipedia.org/wiki/Pretty_Good_Privacy#Criminal_i...
Years ago, an acquaintance developed an autonomous flight controller for "real" helicopters. Cyclic-collective-tailrotor types. It would work on a full-size cargo helo just as well as an R/C model. He released it online, because why not? Drones are cool.
Some very nice gentlemen showed up and explained that he couldn't do that. He didn't get in any actual trouble that I'm aware of, but they "asked" him to take down the published code, and definitely not fix any of the bugs it had.
So, yeah, you're not wrong.
There are nuances to the rules, involving things that're openly published online, but I don't understand it in the least. A hacker's guide to ITAR would be an interesting document indeed.
> A hacker's guide to ITAR would be an interesting document indeed.
I suspect producing something called "a hacker's guide to ITAR" really would get you put on a list...
Knowing how stupid ITAR enforcement is, the guide would probably fall under ITAR :)
I'm sure there are thousands of datasets of the night sky, and a camera, a gyroscope (to get camera angles), a clock, and basic image recognition/pattern matching are all you'd need.
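As a minimal sketch of the catalogue side of that (my own illustration using astropy's standard coordinate machinery; the star coordinates, site, and time below are just example values):

    # Predict where a catalogue star should appear (alt/az) for a trial position
    # and time; comparing many such predictions against the image is the core of
    # the position fix. The coordinates are approximate ICRS values for Vega.
    import astropy.units as u
    from astropy.coordinates import AltAz, EarthLocation, SkyCoord
    from astropy.time import Time

    star = SkyCoord(ra=279.2347 * u.deg, dec=38.7837 * u.deg)
    trial_site = EarthLocation(lat=48.0 * u.deg, lon=11.0 * u.deg, height=500 * u.m)
    t = Time("2025-01-01T22:00:00", scale="utc")

    altaz = star.transform_to(AltAz(obstime=t, location=trial_site))
    print(f"predicted alt={altaz.alt.deg:.2f} deg, az={altaz.az.deg:.2f} deg")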
Yeah. Celestial navigation is a pretty standard thing to study if you're planning on taking up sailing or learning about satellite positioning. Celestial navigation with drones raises more interesting possibilities, but I don't think the defence of key strategic assets against drones relies on it being too difficult a problem to solve, and there are commercial solutions in the "drone navigation for GNSS-denied environments" space. I don't even think the people who jailbreak consumer drones specifically to remove the geofences that stop them flying near restricted areas get into trouble, at least not until someone spots them flying at the end of a runway or outside a military base.
Plenty of homework assignments in graduate level aerospace engineering courses that are right up the alley of this paper. Star trackers as backup for GNSS would be of great interest to maritime vessels worried about spoofing. So there are plenty of non-military use cases for these algorithms.
Can't find a source at the moment but cool side anecdote to this...working from memory
Honeywell was largely the driving force behind developing terrain avoidance systems for commercial aircraft. Those initial systems worked based on comparing the terrain below to the flight profile of an aircraft using a radar altimeter.
There was a CFIT (controlled flight into terrain) accident (I want to say AA in Peru?) where the mountains basically got too tall too fast to give the crew sufficient time to react with that system. That caused Honeywell to go back and look at ways to make the system predictive rather than reactive, using a terrain database.
Honeywell bought/came into possession of a Russian worldwide terrain altitude database to do the first generation of this. I can only imagine the US had the same thing, or something more accurate, but this was far enough back that the US government wasn't sharing.
You're right! I actually know about the system you're talking about! The US data was classified, and Donald Bateman, the engineer behind the system, bought the Russian data after the Soviet Union's collapse.
https://en.wikipedia.org/wiki/C._Donald_Bateman
https://www.flightsafetyaustralia.com/2023/05/don-bateman-en...
The amount of random "stuff" like this that I've accumulated over the years could fill a book that would be interesting only to me, lol.
Thanks for the link!
These days celestial navigation is trivial. See my comment here
https://news.ycombinator.com/item?id=42695079
I'm told quantum navigation is the new hotness for being on lists these days
Ctrl+F gives 0 results for "munitions" or "bombs". Seems like this is really about a $25 controller getting drones to within 4 km in GPS-denied environments, after which a $50 infrared camera + DSMAC finds targets to hit.
Thanks for the summary.
I suspect you could get this to FAR higher accuracy if you combined it with a recent upload of Starlink et al. LEO constellation ephemerides, an initial GPS fix at launch, and a planned flight path, because LEO constellations are bright foreground objects (high location-specific parallax against the background stars) at an apparent magnitude of about 5.0.
This is simultaneously not reliant on perfect vertical attitude sensing coming off the autopilot IMU; you can do it purely photometrically.
The limitation is that this is a dawn/dusk thing: in the middle of the night there isn't a ton of reflected light, and in the day you're limited by scattered daylight.
EDIT: Medium orbit satellites outside Earth's umbra but within view still provide some sort of visual fix. I wonder what the math is like for the GSO belt at midnight?
EDIT2: Or the Moon.
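Rough parallax scales for those options (my own arithmetic, ignoring whether the targets are actually bright enough to detect): with the few-arcsecond astrometric precision a small camera can manage, GEO is still usable in principle, while the Moon is marginal.

    # Apparent angular shift of a target (arcsec) when the observer's assumed
    # position is off by a given baseline, for various target ranges.
    import math

    RAD_TO_ARCSEC = 180 * 3600 / math.pi

    def shift_arcsec(baseline_km, range_km):
        return (baseline_km / range_km) * RAD_TO_ARCSEC

    for name, rng_km in [("LEO (550 km)", 550), ("GEO (36,000 km)", 36_000),
                         ("Moon (384,400 km)", 384_400)]:
        print(f"{name}: {shift_arcsec(10.0, rng_km):,.1f} arcsec per 10 km of position error")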
IMO this could synergize well with higher-end celestial navigation: there are optical sensors for daytime star tracking, but daylight sensitivity is a limitation, perhaps much less so when locked onto Starlink. So maybe feasible $$$ hardware could make daylight celestial/Starlink navigation workable.
Bringing component costs down seems like it would be much more useful for increasing the capabilities and proliferation of lower-end loitering munitions. You can already pack redundant navigation systems into more expensive platforms that get them to the area of operations. But being able to replace a $20,000 inertial navigation system with a $200 board + IR camera makes a lot of somewhat-cheap smart munitions much smarter, and undercuts a lot of expensive electronic warfare platforms.
Starlink's ubiquity does seem to open up a lot of indirect strategic applications, e.g. research into using Starlink transmissions as a bi-/multistatic illumination source to detect stealth aircraft.
That's a great idea. In the earlier days when they had about 2500 satellites in LEO I built a small visualizer from the fleet TLE data and it was remarkably simple with the skyfield library.
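Something along these lines still works (a rough sketch from memory; the CelesTrak URL, observer location, and time are just example values):

    # Propagate Starlink TLEs with skyfield and print alt/az and range for each
    # satellite as seen from an assumed observer location at a given time.
    from skyfield.api import load, wgs84

    ts = load.timescale()
    sats = load.tle_file(
        "https://celestrak.org/NORAD/elements/gp.php?GROUP=starlink&FORMAT=tle")
    observer = wgs84.latlon(51.5, -0.1, elevation_m=0)
    t = ts.utc(2025, 1, 1, 22, 0, 0)

    for sat in sats[:20]:
        alt, az, distance = (sat - observer).at(t).altaz()
        if alt.degrees > 10:  # only satellites reasonably above the horizon
            print(f"{sat.name}: alt {alt.degrees:.1f} deg, az {az.degrees:.1f} deg, "
                  f"range {distance.km:.0f} km")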
If you're on the fringes of a GNSS-denial area, ADS-B might be useful as well. It would need more hardware, of course.
Doesn't ADS-B get the location from GNSS?
Yes it does, but unless we're talking about an entire system failure, GNSS denial tends to have limits in range. I've picked up ADS-B traffic from well over 200 km with a simple ground antenna, so if you're on the fringes it could be a useful additional signal, for similar reasons to the satellites.
Just spitballing though really.
I would assume the same. Operation in GNSS-denied environments is critical for military navigation systems. Comparatively, for civilian uses, it's an add-on that provides only modest accuracy at potentially high development or equipment cost (maybe not for a cel-nav camera, but for ring laser gyro INSs etc.).
GNSS is very accurate and receivers are cheap, but its reliance on satellite signals makes depending on it a liability in adversarial uses.
Cel nav isn't self-contained in the way an INS is, because you need a clear LOS to the stars. But, it's useful on a clear night when your GPS is jammed.
Don't get distracted https://news.ycombinator.com/item?id=42388354
note: i used gpt to clean this up because i am ill and distracted by snowfall and cold. It muddied some of my points, but it removed a lot of PII and rambling. note over.
I noticed some commenters questioning details in the article, like the Wi-Fi triangulation and the earthquake survivor detector. While it's fair to discuss technical aspects, I believe the focus should be on the broader implications rather than dismissing the story based on perceived inconsistencies.
I haven’t dealt with clearances or compartmentalization in years, but I know how serious these matters are. Disclosing specific names, dates, or events carries severe consequences—this isn’t something covered by toothless NDAs. The penalties can include federal prison for treason. I’ve personally experienced the DoD investigating me just because I was listed as a reference. It’s an intimidating process, and it makes sense why people who fear being doxxed rewrite their stories, swapping out modular details to obscure sensitive information.
Regarding the Wi-Fi triangulation: this is well within the realm of possibility. Many years ago, I purchased a Hydra SDR radio with inexpensive RTL-SDR chips. With four matched antennas arranged in a line or an X, connected to a Raspberry Pi 4, I could triangulate signals and visualize the results on a map. The hardware wasn’t advanced, but it worked. Even in 2012, there were rumors about using Wi-Fi signals to see through walls. Whether or not the article is perfectly accurate, the point is to consider the ethical and societal consequences of such technologies, not to nitpick technical details.
As for the earthquake survivor detector, the underlying principle is related. Identifying survivors using leaked signals like Bluetooth or cellular emissions isn’t fundamentally different from using Wi-Fi for similar purposes. The scenarios may involve different actors—military versus contractors—but the capabilities are converging.
I’ve worked at a defense contractor that manufactured components for Boeing and McDonnell Douglas jets. While I avoided involvement with military projects, I know how extensive and layered the contractor ecosystem is. Comments suggesting "there are only a few" don’t align with my experience.
On a personal note, I've always struggled with the ethical implications of the work I've done. This has made my career difficult. I don't judge others who take these roles (someone else will do the work if they don't), but my own scruples have been a constant challenge. For example, I once worked on a project at a large entertainment company based on an idea I had years earlier. They demanded I eventually sit in the office and handle tier 3 phone calls. I had a minor breakdown in the stairwell; I didn't even let my children consume their content, but I was too jazzed to pass up working on the thing I had pitched to Apple 7 years earlier. That was over a decade ago, but I'm still annoyed at myself.
I believe stories like this should be taken seriously. Dismissing them based on perceived inconsistencies seems like rationalization, to me.
thanks for the link!
Ukrainian and Russian drones already do that. They use simple visual navigation for terminal guidance: https://www.rockingrobots.com/ukraine-drones-able-to-navigat...
At this point, it's pretty clear that this type of functionality is out of the bag. Any significant actor can easily replicate this with minimal effort, given the advances in AI.