Welcome to my site

My name is Ignacio (Nacho) Mellado. I build intelligent machines with visual perception and control. On this website you will find a selection of my projects, experiments and random thoughts.

How I hacked a vintage C++ compiler to support exceptions before they were standard

TL;DR I hacked a 28-year-old C++ compiler to support exceptions. That’s two years before exceptions were even part of the first C++ standard!

The compiler is the Watcom C++32 Optimizing Compiler Version 11.0, from 1996, by Sybase, Inc. Around that time, some friends and I started a real-time graphics [...]

Fig. 1: The Watcom C++32 Optimizing Compiler Version 11.0 running on DOSBox.

Years later, I resumed maintaining the library. Real-time code was mostly written in assembly, but I started adding C++ support to integrate faster. However, with error checking everywhere, the code started [...]

Readying a MacBook Pro M2 Max for Tensorflow

I was looking for a development laptop that would let me prototype fairly large ML models locally. I'll be moving between countries in the coming months, and I would like to avoid depending [...]

I ended up getting myself a MacBook Pro M2 Max. Apple silicon is very power-efficient, and, most importantly, its shared memory architecture gives the GPU access to the entire RAM. In my case, that's [...]

Making Tensorflow work with Apple silicon can be straightforward... if you know how. Hopefully, this post will save someone the time I spent troubleshooting.

According to this Apple Developer guide, you need four things:

The Perceptive Portable Device
Can a smartphone perceive the environment like a human does? Portable devices are full of sensors, but they are still very limited in understanding what is happening from a human perspective: Where am I inside the building? Is my user healthy? Is the baby crying? This side project is my quest to give portable devices such capabilities.
Autonomous LinkQuad quadcopter with Computer Vision
MAVwork, my open-source framework for visual control of multirotors, now supports a new quadcopter from UAS Technologies Sweden. Everything was tested with a speed control application. Watch a semiautonomous flight of this new elegant drone.
Autonomous Pelican quadcopter with Computer Vision
See how the versatile Pelican from Ascending Technologies acquired basic automatic take-off, hover and landing capabilities thanks to MAVwork, the open-source framework for drone control. Watch the open MultirotorController4mavwork in action.
Camera localization with visual markers
There are tons of applications where it is key to have the accurate location of things in a workspace. With these cheap and easy-to-build visual markers, you can know the position and attitude of anything with a camera on it. They block less visual space and offer less air resistance than equivalent-size 2D codes.
MAVwork released for Parrot AR.Drone
MAVwork is a framework for drone control that was born in 2011 during a short research stay in the Australian Research Centre for Aerospace Automation (ARCAA). Read about the inception of MAVwork and watch a video of the first test controller for a Parrot AR.Drone with a Vicon system.
Laura: Self-driving public transportation. Prototype II.
Discover how this 12-ton truck was automated to drive itself with Computer Vision. This was the second prototype, after a Citroën C3, in a project led by Siemens to develop a self-driving public transportation system.
Laura: Self-driving public transportation. Prototype I.
Tall buildings blocking the GPS signal, lane markings and road signs hidden by traffic, ... Cities can be a very harsh environment for a driverless bus trying to know where it is and where to go. In this project, led by Siemens, I explored a solution with Computer Vision.