“We developed a deep neural network that maps the phase and amplitude of WiFi signals to UV coordinates within 24 human regions. The results of the study reveal that our model can estimate the dense pose of multiple subjects, with comparable performance to image-based approaches, by utilizing WiFi signals as the only input.”
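The quoted abstract (this is the "DensePose from WiFi" result) describes a network that ingests WiFi channel state, amplitude and phase, and emits per-pixel body-part and UV predictions. Here is a toy numpy sketch of that input/output contract only; the layer sizes, CSI layout, and the randomly initialized MLP are all my illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

# Toy sketch (NOT the paper's architecture): map WiFi channel state
# (amplitude + phase per subcarrier / antenna pair) to per-pixel
# (part, u, v) predictions over 24 body regions, as in the quote.
# All shapes and layer sizes below are illustrative assumptions.

rng = np.random.default_rng(0)

N_SUBCARRIERS, N_ANTENNA_PAIRS = 30, 9   # assumed CSI layout
H, W = 8, 8                              # tiny output grid for the sketch
N_PARTS = 24                             # body regions, as in the quote

def csi_features(csi_complex):
    """Stack amplitude and unwrapped phase into one real feature vector."""
    amp = np.abs(csi_complex)
    phase = np.unwrap(np.angle(csi_complex), axis=0)
    return np.concatenate([amp.ravel(), phase.ravel()])

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Randomly initialized 2-layer MLP standing in for a trained network.
d_in = 2 * N_SUBCARRIERS * N_ANTENNA_PAIRS
d_hidden = 64
d_out = H * W * (N_PARTS + 2)            # part logits + (u, v) per pixel
W1 = rng.normal(0, 0.1, (d_in, d_hidden))
W2 = rng.normal(0, 0.1, (d_hidden, d_out))

def predict_dense_pose(csi_complex):
    x = csi_features(csi_complex)
    h = relu(x @ W1)
    out = (h @ W2).reshape(H, W, N_PARTS + 2)
    part_probs = softmax(out[..., :N_PARTS])      # which of the 24 regions
    uv = 1.0 / (1.0 + np.exp(-out[..., N_PARTS:]))  # (u, v) squashed to [0, 1]
    return part_probs, uv

csi = rng.normal(size=(N_SUBCARRIERS, N_ANTENNA_PAIRS)) \
    + 1j * rng.normal(size=(N_SUBCARRIERS, N_ANTENNA_PAIRS))
parts, uv = predict_dense_pose(csi)
print(parts.shape, uv.shape)  # (8, 8, 24) (8, 8, 2)
```

The point is just the signature: complex CSI in, a dense map of (region, u, v) out; the real system trains this mapping against image-based DensePose supervision.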
Here’s what they’re putting in the goggles that infantrymen wear now.
I don’t care to guess what the drones are packing.
What we know about drones is that they carry cameras that can discern individuals from an altitude of 10 km.
What we suspect is that the US has Hubble-sized spy satellites that can do almost the same. There were a lot of classified military STS (Space Shuttle) missions.
What is theoretically possible is that US drones and spy satellites could function as very large interferometric arrays (we already do this with astronomical telescopes) to dramatically increase angular resolution.
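A rough back-of-envelope shows why the array idea is tempting: diffraction-limited resolution is about θ ≈ 1.22 λ/D (Rayleigh criterion), and for an interferometric array D is effectively the baseline between platforms, not the size of any one mirror. The specific numbers below (2.4 m aperture, 100 km baseline, 550 nm light, 400 km altitude) are illustrative assumptions, and synthesizing optical baselines between free-flying spacecraft is far beyond anything publicly demonstrated:

```python
import math

# Rayleigh criterion: theta ~ 1.22 * wavelength / aperture (radians).
# For an interferometer, "aperture" becomes the baseline between platforms.
# All numbers are illustrative assumptions, not claims about real systems.

wavelength = 550e-9   # green visible light, meters
altitude = 400e3      # low Earth orbit, meters

def ground_resolution(aperture_m):
    theta = 1.22 * wavelength / aperture_m   # angular resolution, radians
    return theta * altitude                  # smallest resolvable ground feature, m

single = ground_resolution(2.4)      # one Hubble-sized (2.4 m) mirror
array = ground_resolution(100e3)     # two platforms 100 km apart

print(f"single 2.4 m aperture:       {single * 100:.1f} cm")
print(f"100 km synthesized baseline: {array * 1e6:.2f} micrometers")
```

Atmospheric turbulence, platform stability, and light-gathering area would all erode this in practice; the calculation only shows that the diffraction limit itself stops being the bottleneck.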
I’d believe it. When I was in the infantry 20 years ago we could see you 3 km away with the optics mounted on our machine guns, and from several kilometers with cameras mounted on towers. I don’t know how far those reached, but it was at least 5 km, because we were directing mortar fire with them and that’s about the range of the mortar system we were using.