We're looking for a developer to create a library (in C++, Swift, or Objective-C) that lets a user interactively replace a wall color in a mobile app, preserving the color's luminance and saturation values and changing only the perceived hue. This is meant to work interactively (i.e. the user taps part of a wall and it gets flood-filled and repainted), not in real time (i.e. augmented reality); nonetheless, augmented-reality-like performance would be a bonus. User experience is *not* part of the job: only the graphics-related algorithmics.
Any implementation of such an algorithm will need to combine a flood-fill / edge-detection step with the color-replacement step. Most people online suggest using OpenCV for ease of implementation (see below; not our posting, just a similar example), but a pure OpenGL implementation could also be considered. Nonetheless, given the work OpenCV already provides, its use is highly encouraged.
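To make the intended pipeline concrete, here is a minimal, self-contained C++ sketch of the two steps: a tolerance-based flood fill from the tapped pixel, followed by a hue-only replacement in HSV space (saturation and value untouched). Everything here is illustrative, not a spec: the `RGB` struct, `recolorWall`, and the tolerance parameter are assumed names on a toy buffer. A production version would more likely operate on a `cv::Mat` via OpenCV's `cv::floodFill` and `cv::cvtColor`.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <queue>
#include <utility>
#include <vector>

// Hypothetical minimal pixel type; a real implementation would work on a
// cv::Mat (OpenCV) or a CVPixelBuffer on iOS.
struct RGB { uint8_t r, g, b; };

// RGB [0,255] -> HSV with h in [0,360), s and v in [0,1].
static void rgbToHsv(RGB c, float& h, float& s, float& v) {
    float r = c.r / 255.f, g = c.g / 255.f, b = c.b / 255.f;
    float mx = std::max({r, g, b}), mn = std::min({r, g, b}), d = mx - mn;
    v = mx;
    s = (mx == 0.f) ? 0.f : d / mx;
    if (d == 0.f)     h = 0.f;
    else if (mx == r) h = 60.f * std::fmod((g - b) / d, 6.f);
    else if (mx == g) h = 60.f * ((b - r) / d + 2.f);
    else              h = 60.f * ((r - g) / d + 4.f);
    if (h < 0.f) h += 360.f;
}

static RGB hsvToRgb(float h, float s, float v) {
    float c = v * s;
    float x = c * (1.f - std::fabs(std::fmod(h / 60.f, 2.f) - 1.f));
    float m = v - c;
    float r, g, b;
    if      (h < 60.f)  { r = c; g = x; b = 0; }
    else if (h < 120.f) { r = x; g = c; b = 0; }
    else if (h < 180.f) { r = 0; g = c; b = x; }
    else if (h < 240.f) { r = 0; g = x; b = c; }
    else if (h < 300.f) { r = x; g = 0; b = c; }
    else                { r = c; g = 0; b = x; }
    auto q = [m](float f) { return (uint8_t)std::lround((f + m) * 255.f); };
    return { q(r), q(g), q(b) };
}

// Flood-fill from the tapped pixel (sx, sy): every 4-connected pixel whose
// per-channel distance to the seed color is within `tol` gets its hue set to
// `newHue`; its saturation and value are preserved.
void recolorWall(std::vector<RGB>& img, int w, int h,
                 int sx, int sy, float newHue, int tol) {
    RGB seed = img[sy * w + sx];
    auto close = [&](RGB p) {
        return std::abs(p.r - seed.r) <= tol &&
               std::abs(p.g - seed.g) <= tol &&
               std::abs(p.b - seed.b) <= tol;
    };
    std::vector<char> seen(w * h, 0);
    std::queue<std::pair<int, int>> q;
    q.push({sx, sy});
    seen[sy * w + sx] = 1;
    while (!q.empty()) {
        auto [x, y] = q.front(); q.pop();
        float hh, ss, vv;
        rgbToHsv(img[y * w + x], hh, ss, vv);
        img[y * w + x] = hsvToRgb(newHue, ss, vv);  // replace hue only
        const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
        for (int k = 0; k < 4; ++k) {
            int nx = x + dx[k], ny = y + dy[k];
            if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
            int i = ny * w + nx;
            if (!seen[i] && close(img[i])) { seen[i] = 1; q.push({nx, ny}); }
        }
    }
}
```

Note that HSV "value" is only a rough stand-in for perceived luminance; a candidate could argue for doing the hue swap in a perceptual space such as CIELAB/LCh instead, and for a soft (gradient-based) fill boundary rather than a hard per-channel tolerance, to handle shadows and shading on the wall.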