Have you ever wondered how you can walk or jog, with your head bouncing up and down, while staying focused on a near or far object? Have you noticed how, while doing so, you can assess that object's distance, speed, and minute details quickly and accurately? Well, the reason you can do it so well is the way the mind uses memory burst images and retinal jitter, letting your visual cortex fill in the details quickly. All of this happens in milliseconds, using a brain that consumes barely 20 watts of power. Wow, talk about cutting-edge organic design and technology – impressive, my human colleague.
Of course, some animals and birds do it even better than we do, with much smaller brains. Consider, if you will, an owl, a hawk, or a bald eagle. The phrase "eagle eyes" is apropos here – think about it. Using biomimicry strategies, maybe we can make our UAV (Unmanned Aerial Vehicle) or drone video imagery sharper and more powerful – and consider for a moment how many applications that would affect. How do we get there with these concepts? Well, 3-axis gimbals are the most sought after by owners of small drones, but why stop at 3 axes when a 4.5- or 6-axis gyro-stabilized gimbal could deliver better video resolution and accuracy? It would certainly help stabilize the video camera, especially on quadcopter models, which are quite stable even in moderate turbulence.
Let's talk about strategies for a moment – how to get to that eagle-eye ability we see in nature. A patent, "Apparatus and Methods for Stabilizing and Reducing Vibration," US 9277130 B2, duly states: "Currently, there are mainly four vibration damping methods commonly used in photography and videography to reduce the effects of vibration on the image: software stabilization, lens stabilization, sensor stabilization and general stabilization of shooting equipment."
What if we were also working with visual recognition systems for burst images, focusing only on things that either meet our mission criteria or are complete anomalies? In the human mind, out-of-place things often trigger the N400 brain wave, evoking curiosity, nuance, or interest. We can program the same response using algorithms that direct the video camera to investigate, identify, and act. Or, as Colonel Boyd's "OODA Loop" strategy suggests: Observe, Orient, Decide, and Act. The fighter pilot who can cycle through that loop fastest should win the dogfight, provided he puts his energy and speed to good use. Good advice, even if we borrow it to discuss how best to program a UAS (Unmanned Aircraft System) to accomplish a task or mission.
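To make that borrowed OODA idea concrete, here is a minimal sketch in Python of the "Orient" step only – the labels, inputs, and the surrounding vision pipeline are all hypothetical, not anyone's real system. Detections that match the mission criteria get flagged for action, while detections outside our catalogue of expected objects are flagged as anomalies, a rough machine analogue of the N400 "that doesn't belong here" response:

```python
def orient(detections, mission_labels, known_labels):
    """OODA 'Orient' step for a camera-equipped UAS (illustrative only).

    - detections:     object labels seen in the current burst of frames
    - mission_labels: what the mission is looking for
    - known_labels:   the catalogue of expected, uninteresting objects

    Returns (matches, anomalies): mission-relevant objects to act on,
    and out-of-place objects worth a closer look.
    """
    matches = [d for d in detections if d in mission_labels]
    anomalies = [d for d in detections if d not in known_labels]
    return matches, anomalies

# The Decide/Act steps would then pick the highest-priority target and
# re-task the camera: investigate, identify, act.
```

In a real system the detections would come from an onboard recognition model, and "anomaly" would be a confidence score rather than simple set membership, but the loop structure is the same.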
In the article "Model-Based Video Stabilization for Real-Time Micro Aerial Vehicles", the abstract states: "The emerging branch of micro aerial vehicles (MAVs) has generated great interest in their indoor navigation capabilities, but they require high-quality video for teleoperated or autonomous tasks. A common problem with on-board video quality is the unwanted movement effect, and there are different approaches to solve it with mechanical stabilizers or video stabilization software. Very few video stabilization software packages can be applied in real time, and their algorithms do not take into account the intentional movements of the operator."
Indeed, that's the problem, and it's a real one if we hope to send drones on autonomous missions, whether delivering a package or working as a flying security guard for, say, a commercial construction site.
That article then proposes a way to solve some of these challenges: "A new technique is introduced for real-time video stabilization with a low computational cost, without generating false movements or decreasing performance. Our proposal uses a combination of transformations and rejection of outliers to obtain a robust estimate of inter-frame motion, and a Kalman filter based on a dynamic model."
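As a rough illustration of that recipe – not the authors' actual code – the two ingredients, outlier-rejected inter-frame motion estimation and Kalman smoothing of the camera path, can be sketched in a few lines of Python with NumPy. The feature matching itself is assumed to come from elsewhere (e.g. an optical-flow tracker):

```python
import numpy as np

def robust_shift(prev_pts, curr_pts, thresh=3.0):
    """Estimate the inter-frame camera translation from matched feature
    points, rejecting outlier matches (e.g. points sitting on moving
    objects) that disagree with the median displacement."""
    d = curr_pts - prev_pts                    # per-feature displacement
    med = np.median(d, axis=0)                 # robust central estimate
    keep = np.linalg.norm(d - med, axis=1) < thresh
    return d[keep].mean(axis=0)

class Kalman1D:
    """Tiny random-walk Kalman filter for one component of the camera
    trajectory. Smoothing the accumulated path, then warping each frame
    by (smoothed - raw), removes jitter while letting slow, intentional
    pans pass through."""
    def __init__(self, q=1e-3, r=1e-1):
        self.x, self.p, self.q, self.r = 0.0, 1.0, q, r
    def update(self, z):
        self.p += self.q                       # predict: inflate uncertainty
        k = self.p / (self.p + self.r)         # Kalman gain
        self.x += k * (z - self.x)             # correct toward measurement
        self.p *= 1.0 - k
        return self.x
```

Per frame, one would accumulate `robust_shift` into a raw trajectory, run one `Kalman1D` per axis, and warp the frame by the difference between the smoothed and raw paths; the paper's own method is more sophisticated, but this is the shape of the idea.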
Now, while there are people working on these things, it is evident that until the sensors, imaging, and equipment improve at such tasks, we will not be able to let drones work autonomously in a safe and efficient manner, and we will not reap the benefits we expect from these technologies. I hope you will consider my thoughts here, and my recommendation that we borrow strategies from nature to achieve such goals.
A.) "Vision Based Detection and Distance Estimation of Micro Unmanned Aerial Vehicles", by Fatih Gokce, Gokturk Ucoluk, Erol Sahin and Sinan Kalkan. Sensors 2015, 15(9), 23805-23846; doi:10.3390/s150923805
B.) Thesis: "Accelerated Object Tracking with Local Binary Features", by Breton Lawrence Minnehan, Rochester Institute of Technology; July 2014.
C.) "Model-Based Video Stabilization for Real-Time Micro Aerial Vehicles", by Wilbert G. Aguilar and Cecilio Angulo.
D.) "Real-Time Megapixel Multispectral Bioimaging", by Jason M. Eichenholz, Nick Barnett, Yishung Juang, Dave Fish, Steve Spano, Erik Lindsley and Daniel L. Farkas.
E.) "Enhanced Tracking System Based on Micro Inertial Measurements Unit to Measure Sensorimotor Responses in Pigeons", by Noor Aldoumani, Turgut Meydan, Christopher M. Dillingham and Jonathan T. Erichsen.
Source by Lance Winslow