
From Bird Feeders to Edge AI: Tracking Flight Paths with RealSense D456


Tracking Birds in 3D with RealSense, Geti, and OpenVINO


The challenge

What happens when you combine Intel’s edge AI stack with a backyard bird feeder? For Dr. Oliver Hamilton, it becomes a personal experiment in perception, inference, and flight path reconstruction.

The solution

Paired with a 13th Gen Intel® NUC (Core i5-1340P), the setup runs the full AI stack locally, with no cloud in the loop. It’s a perfect testbed for embedded inference in robotics and wildlife tracking alike.

The results

With the latest hardware, he moved from 70–80 fps capture rates to >130 fps inference performance and began aiming at 500 fps pipelines. Pushing these limits highlighted where the bottlenecks shifted: once camera I/O was solved, model optimization became the new frontier.

Gazing out of the kitchen window one morning, he was watching birds fly to and from the feeder when a question popped into his freshly caffeinated head… “I wonder if different birds approach the feeder differently?” Anecdotally, some birds take a multi-stop, cautious approach, while others zip in from afar with little hesitation. The hypothesis was that different species follow different flight paths.


To answer this, he’d need to detect and track birds in images and know their 3D positions. Luckily, he’s well versed in Intel’s AI and vision offerings, which are ideal for solving this.

Rugged Depth in the Rain

Oliver’s experiments began with the RealSense™ Depth Camera D456, an IP65-rated camera designed to withstand outdoor elements. Mounted near his bird feeder, it captures accurate depth data without worrying about the typical weather in the North of England, i.e. rain. “Putting the IP65 rating to the test” is not just a product demo; it’s a way to ensure reliability in uncontrolled, natural environments.

Capture:
First, only the RGB (colour) video data was recorded. Using the RealSense™ SDK, users can easily capture different data streams from the camera, such as RGB, depth, and IR.
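As a rough sketch of this capture step, the pyrealsense2 bindings can record the colour stream in a few lines; the resolution, frame rate, and file name below are illustrative assumptions, not Oliver’s actual settings.

```python
# Minimal colour-stream capture with the RealSense SDK (pyrealsense2).
# Resolution, frame rate, and output file are assumed values for illustration.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 1280, 720, rs.format.bgr8, 30)
config.enable_record_to_file("bird_feeder_capture.bag")  # optional .bag recording

pipeline.start(config)
try:
    while True:
        frames = pipeline.wait_for_frames()
        color_frame = frames.get_color_frame()
        if not color_frame:
            continue
        frame = np.asanyarray(color_frame.get_data())  # H x W x 3 BGR image
        # ... save the frame or hand it to the annotation/training pipeline ...
finally:
    pipeline.stop()
```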

Train:
He used the Geti™ computer vision training platform to annotate, train, and optimize a custom vision model to detect birds in the RGB (colour) stream from the RealSense™ D456 camera. To improve inference performance, he used the OpenVINO™ model that Geti™ exports automatically, which runs very efficiently on Intel hardware.
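As an illustration of the inference side, a Geti™-exported OpenVINO™ model can be loaded and compiled with the OpenVINO Runtime in a few lines; the model path, input layout, and output format below are assumptions, since Geti™ exports vary by task type.

```python
# Loading a Geti-exported OpenVINO detection model and running one frame through it.
# The model path and NCHW input layout are assumptions for illustration.
import cv2
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("bird_detector/openvino.xml")
compiled = core.compile_model(model, "AUTO")  # let OpenVINO pick CPU/GPU/NPU

def detect_birds(frame_bgr):
    n, c, h, w = compiled.input(0).shape                  # assumed static NCHW input
    blob = cv2.resize(frame_bgr, (int(w), int(h)))
    blob = blob.transpose(2, 0, 1)[np.newaxis].astype(np.float32)
    result = compiled([blob])[compiled.output(0)]
    return result  # raw detections (boxes/scores/classes); format depends on the model
```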

At this point, the system could ‘see’ and find birds in 2D images. The final phase would integrate the 3D data.

Deploy:
Running the OpenVINO™ detection model in real time on the edge, while simultaneously capturing depth information from the RealSense™ D456, he used the bounding box locations from the vision model to extract the 3D positions of the birds as they flew to and from the feeder.
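A sketch of that 3D extraction step, assuming the colour and depth frames are aligned: the centre of each detection box is looked up in the depth frame and deprojected into a 3D point with the RealSense SDK. The function and variable names are illustrative, not Oliver’s actual code.

```python
# Turning a 2D detection box into a 3D position (camera coordinates, metres).
# Assumes depth is aligned to colour, e.g. via rs.align(rs.stream.color).
import pyrealsense2 as rs

def bird_position_3d(depth_frame, bbox):
    """bbox = (x_min, y_min, x_max, y_max) from the detection model."""
    u = int((bbox[0] + bbox[2]) / 2)          # pixel at the centre of the box
    v = int((bbox[1] + bbox[3]) / 2)
    depth_m = depth_frame.get_distance(u, v)  # depth in metres at that pixel
    if depth_m == 0:
        return None                           # no valid depth, e.g. bird against open sky
    intrinsics = depth_frame.profile.as_video_stream_profile().intrinsics
    return rs.rs2_deproject_pixel_to_point(intrinsics, [u, v], depth_m)
```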

Plotting 3D Flight Paths

When multiple birds are detected, they can cause confusion between flight paths. To avoid false paths, Oliver implemented a velocity-based filter: if a new detection implies a jump faster than the expected airspeed of a small bird, around 8 m/s for a blue tit, the point is rejected. The result is a clean trajectory that looks like a true flight path rather than noise jumping between inconsistent points.
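A minimal sketch of such a velocity-based filter: the ~8 m/s threshold comes from the article, while the data structures and names are assumptions.

```python
# Reject 3D detections that imply an implausible jump between consecutive points.
import numpy as np

MAX_SPEED_MPS = 8.0  # roughly the airspeed of a small bird such as a blue tit

def accept_point(track, point, timestamp):
    """Append (timestamp, point) to track only if the implied speed is plausible."""
    point = np.asarray(point, dtype=float)    # (x, y, z) in metres
    if not track:
        track.append((timestamp, point))
        return True
    last_time, last_point = track[-1]
    dt = timestamp - last_time
    if dt <= 0:
        return False
    speed = np.linalg.norm(point - last_point) / dt
    if speed > MAX_SPEED_MPS:
        return False                          # probably a different bird or noise
    track.append((timestamp, point))
    return True
```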

Monty Python: The Airspeed Velocity of an Unladen Swallow

Oliver admits to looking up the ‘airspeed velocity of an unladen bird’ along the way. #MontyPython #ifyouknowyouknow

If you’re not a Monty Python fan, here’s the reference: in Monty Python and the Holy Grail (1975), King Arthur is challenged by a Bridgekeeper who demands: “What is the airspeed velocity of an unladen swallow?”

The knights argue over whether he means an African or a European swallow – an intentionally ridiculous bit of trivia that’s since become a classic geek in-joke.

BirdBoxBot, Trackertron, and the FPS Race

Oliver’s creativity didn’t stop at a single bird tracker. He’s previously built devices like BirdBoxBot, an AI-powered birdhouse that would tweet (or “X”) updates, and its cousin Trackertron, designed for high-speed motion tracking.
With the latest hardware, he moved from 70–80 fps capture rates to >130 fps inference performance and began aiming at 500 fps pipelines. Pushing these limits highlighted where the bottlenecks shifted: once camera I/O was solved, model optimization became the new frontier.

Full Intel Stack at Work

Oliver’s workflow is a microcosm of Intel’s AI vision ecosystem:

  • Capture: RealSense D456 depth streams
  • Train: Geti™ for dataset creation, labelling, and model training
  • Optimize: OpenVINO™ for model inference on Intel CPU/GPU/NPU hardware
  • Hardware: 13th Gen Intel® NUC for running in real-time at the edge
  • Visualize: 3D plotting tools from Intel Labs

By structuring his experiments as phase 1 (capture), phase 2 (train), and phase 3 (deploy), he mirrored production-ready pipelines. What started as a hobby became a case study in end-to-end computer vision deployment.

Why it Matters

Oliver’s backyard projects may seem playful, but they illustrate serious lessons for robotics, security, and edge AI:

  • Environmental robustness: If it works in the rain with birds, it’s likely to work in harsher industrial environments. The RealSense™ D456’s IP65 sealing against dust and water makes it suitable for everything from smart cities to outdoor robotics.
  • Depth accuracy and range: Unlike 2D cameras, the D456 provides precise 3D data that enables motion tracking, obstacle detection, and environment mapping – even in complex, dynamic settings.
  • Data cleanliness: Techniques such as velocity-aware filtering ensure results don’t suffer from spurious associations, a lesson that applies to both bird flight paths and industrial machine monitoring.
  • Scalability: Pushing pipelines past 100 fps shows how far edge inference can be stretched before needing specialized hardware, proving that the D456 can handle both prototype experiments and production deployments.
  • Engagement: His work inspires others by showing that applied AI can be both technically rigorous and fun – something that helps developers approach edge AI not just as work, but as discovery.

What’s next?

Learn how RightHand Robotics is using RealSense™ computer vision technology to revolutionize automated warehouse order fulfillment.

Robotics overview

See how RealSense™ cameras can revolutionize the robotics industry with the latest in depth‑sensing technology.