Color picker for Building/Roof color #633
I don't remember exactly, but I think there were good reasons not to allow arbitrary colors in the building/roof color quests.
The problems with picking from the camera are:
I agree that the current mixed palette is very confusing.
Yeah, a few of the presented choices are similar/confusing... I'd go with a cleanup of those few circled values, but keep a small fixed palette using mostly (and preferably only, if possible) English names as values, instead of hex RGB colors. One could use taginfo to find popular values for this and similar tags (ignoring those present in SCEE, as those are obviously incorrectly self-reinforcing), and see how the list could be modified in BuildingColour.kt and RoofColour.kt.
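For illustration only, a trimmed, names-only palette could look roughly like the sketch below. The `ColourItem` class and the exact selection of values are assumptions made up for this example, not the actual SCEE source:

```kotlin
// Hypothetical sketch of a reduced, names-only palette, in the spirit of
// BuildingColour.kt / RoofColour.kt. ColourItem is a stand-in for whatever
// item type SCEE actually uses; the names and hex values are illustrative
// picks of common, taginfo-popular English colour names.
data class ColourItem(val osmValue: String, val argb: Int)

val buildingColours = listOf(
    ColourItem("white",  0xFFFFFFFF.toInt()),
    ColourItem("grey",   0xFF808080.toInt()),
    ColourItem("black",  0xFF000000.toInt()),
    ColourItem("red",    0xFFFF0000.toInt()),
    ColourItem("brown",  0xFFA52A2A.toInt()),
    ColourItem("yellow", 0xFFFFFF00.toInt()),
    ColourItem("beige",  0xFFF5F5DC.toInt()),
    ColourItem("green",  0xFF008000.toInt()),
    ColourItem("blue",   0xFF0000FF.toInt())
)
```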
This attracted my attention, because I'm also struggling with this and therefore usually ignore the matter. I feel the color of something is largely opinion-based and dependent on several factors (time of day, natural or artificial lighting, and who knows what else). Still, it fascinates me, so I've looked up how the Vespucci app tackles this.
My first thought was to let people choose between the CSS Web Colors. But after looking at them, I doubt the prevalence of these colors in house facades (https://www.w3.org/TR/css-color-3/#html4). At least in German-speaking countries, RAL colors are widely used for facade paints. But there are thousands of colors there; that would be too many. Perhaps, though, a selection could be defined from the Classic colors?
The benefit of using a color picker is that it already comes with a built-in palette (see link below), matching the selected color to the closest available shade. Instead of copying RGB hex codes, we could simply fetch the color name. In terms of accuracy, 10 phone cameras are likely to produce more consistent results than asking 10 people to manually select from a palette of 200 color options. Modern phone cameras are highly reliable. To address lighting, we could make this quest time-based and avoid color mapping at night, which wouldn't make sense regardless of the method used.
@mcliquid well, RAL is copyright-protected, so we can't use it. See the first sentence in your link: "Die technischen Werte der dargestellten RAL Farben sind urheberrechtlich geschützt." ("The technical values of the displayed RAL colors are protected by copyright.")
Edit: I forgot the link to the color palette the app uses: https://en.wikipedia.org/wiki/List_of_colors_%28alphabetical%29
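For context, "matching the selected color to the closest available shade" is typically just a nearest-neighbour search over the palette. A minimal sketch, assuming plain Euclidean distance in RGB (crude; perceptual spaces such as CIELAB track human judgement better):

```kotlin
// Nearest-colour lookup: return the palette entry with the smallest
// squared RGB distance to the sampled pixel. Euclidean RGB is only a
// rough proxy for perceived similarity, which is part of why very large
// palettes produce surprising matches.
data class NamedColour(val name: String, val r: Int, val g: Int, val b: Int)

fun nearest(r: Int, g: Int, b: Int, palette: List<NamedColour>): NamedColour =
    palette.minByOrNull { c ->
        val dr = c.r - r; val dg = c.g - g; val db = c.b - b
        dr * dr + dg * dg + db * db
    } ?: error("palette must not be empty")
```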
Maybe @gmikhail could weigh in here — would you be open to adding a feature that returns the color name when your app is called?
A little philosophy:
Before people invest effort, I should warn that I'm heavily against SCEE (being FOSS) depending on, or even recommending, 3rd-party non-FOSS apps in order to solve a quest, especially if there is no way to solve the quest without the external app. And unless I'm wrong, the suggested app does not look FOSS. In other words, I don't want SCEE getting F-Droid anti-feature flags because we promote some closed-source software, nor SCEE being associated with such practices. If we go the way of an external color-picker app, it should at the very least be one that is maintained and available on F-Droid.
I'm dubious how well it would work. Think of different daylight conditions. Have you tried it? What RGB colors do you get from the same roof on a sunny day, a cloudy day, at dusk, and at early dawn, at the very least? I doubt you get the same, or even similar, values. Now try with different camera software and different phone hardware. Perhaps if the color palette is very small, error rates might be comparable to a human's.
RGB hex codes are definitely a bad idea, so using a color name would be less bad. But I still prefer humans and the small fixed palette to such external apps; even if we found a FOSS external app which is suitable, I'd say:
[Citation Needed]
That is why trying to match a 200-color palette is a horrible idea (for computers or humans). It is madness trying to decide whether something is "Cerulean frost", "Carolina blue", "Dark sky blue" or "Glaucous", even with studio equipment and a stable, unchanging, known CRI; much less so in varying outside conditions. We should stick to, say, a 16-color fixed palette. Then a human can perhaps still make a mistake between a (dark shade of) "red" and "brown" at dusk.
That is way too simplistic... You'd at the very least need to carry a sheet of white paper with you at all times, and do AWB calibration on it (after positioning it properly to avoid shadows and creases) every time the lighting conditions change (i.e. a few hours pass, the sun becomes occluded by clouds, etc.). That is hugely impractical, and without it, auto white balance is poor even for regular photography, and horrible for attempting color matching.
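For what it's worth, the white-sheet calibration described above amounts to a per-channel (von Kries style) scaling. A minimal sketch, where all inputs are assumed to be the camera's raw channel readings:

```kotlin
// Von Kries style white-balance correction against a reference white:
// scale each channel so that the measured white-paper patch maps to pure
// white. whiteR/whiteG/whiteB are the camera's readings of the white
// sheet under the current lighting.
fun whiteBalance(r: Int, g: Int, b: Int, whiteR: Int, whiteG: Int, whiteB: Int): Triple<Int, Int, Int> {
    fun scale(v: Int, w: Int) = (v * 255 / w.coerceAtLeast(1)).coerceIn(0, 255)
    return Triple(scale(r, whiteR), scale(g, whiteG), scale(b, whiteB))
}
```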
I like that too. A very small color palette is an absolute requirement for even attempting to claim any kind of verifiability. Rule of thumb: if you cannot recite the whole palette by name from memory, it's too big.
Yes, it is impossible, so a very small palette must be used.
It's even worse than that. While you can precisely and verifiably measure a paving stone¹ and the results will come out the same every time², you can't do that with colors, not even close. Every time some mapper tries, they'll get a different result than you did. The only protection against that is making the scale coarser and coarser (i.e. the palette smaller and smaller) until such verifiability errors disappear or become negligible.
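The coarsening argument can be made concrete with a toy model: two readings of the same surface differ by some noise, and the chance they land in the same palette bucket grows as the palette shrinks. A sketch with purely illustrative numbers:

```kotlin
// Toy illustration of "coarser palette => fewer verifiability errors":
// quantize hue into N equal buckets and check whether two noisy readings
// of the same surface still agree.
fun sameBucket(hue1: Double, hue2: Double, buckets: Int): Boolean {
    fun bucket(h: Double) = ((h % 360.0) / 360.0 * buckets).toInt().coerceAtMost(buckets - 1)
    return bucket(hue1) == bucket(hue2)
}

// Readings of 10 and 25 degrees disagree in a 24-bucket palette but
// agree in an 8-bucket one:
// sameBucket(10.0, 25.0, 24) == false
// sameBucket(10.0, 25.0, 8)  == true
```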
First of all, I'm glad you liked my app. In the Color Picker settings, you can enable a function that automatically copies the HEX color value to the clipboard after pressing the button. Then you can paste it into any app as plain text. The color name is not copied, for the sake of universality.
You are right that Color Picker is not FOSS, but the built-in palettes that come with the app are free to use. The palette sources are specified in the Color Picker settings; the source was mainly Wikipedia: https://en.wikipedia.org/wiki/List_of_colors_(alphabetical)
From the discussion, I understand that you are looking for a simple and universal way to specify the color of a roof. I am not familiar with the project in question, but if you can't specify the HEX color value manually, it's better to use a predefined color palette with a limited number of colors, as @mnalis suggested earlier. You can use either a ready-made palette, like the HTML Colors, or your own simplified palette based on spectral colors: chromatic colors (the primary colors of the rainbow: red, orange, yellow, green, blue, indigo, violet) plus achromatic colors (black, white, gray).
You might also want to take a look at the 2014 Material Design palette (e.g. Red 500): https://m2.material.io/design/color/the-color-system.html I still use the Material Design color palette from 2014 in many things and am happy with its simplicity and variety. A small note: in the third (current) iteration of Material Design, Google abandoned the predefined color palette, and modern palettes are generated from any selected color. Therefore, this particular palette is not mentioned in the latest documentation.
There's a limit to this. If you reduce it to two colors, like black and white, no one can confirm anything. The current palette is similarly unbalanced: too coarse in some areas, too fine in others. A refined, verifiable palette, like the one on Wikipedia, could work, since it grew organically from people's desire to differentiate colors. But with so many colors, it's tough for humans to handle; that's where the app comes in. If you take a picture of a house with 10 modern phones, the colors will resemble reality much more closely than if a person selects them manually. Plus, the human isn't completely out of the loop: you get a preview of the color the app chooses, and you confirm it only if it looks right to you. Here's an example of how the app's chosen color compares to a natural-looking camera view:
@mnalis wrote:
This sounds a lot like how Microsoft viewed open source as a "cancer" two decades ago, just reversed. I strongly disagree. I use plenty of free software, but I also see no problem using proprietary tools when they work better. I'm not suggesting we replace the manual palette. Instead, like the Measure app in SC, we could offer a button that lets users open the external app when needed.
@gmikhail thanks for taking the time to respond - much appreciated!
You noted that HEX values are used for universality, but in this case, the universality of color names might be more useful than 8-bit RGB. This project already has two apps: StreetComplete (and its variant, SCEE) and the Measure app, which helps with measurements using ARCore. My suggestion is to follow the same approach with your app for color: SCEE could trigger your app, which would return the color name instead of the HEX value. Hope this makes sense!
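To make the suggestion concrete, the hand-off could look like a standard Android result contract. Everything in this sketch is an assumption: the action string, the extra name, and the contract shape are invented for illustration, not Color Picker's actual API:

```kotlin
import android.app.Activity
import android.content.Context
import android.content.Intent
import androidx.activity.result.contract.ActivityResultContract

// Hypothetical integration sketch: launch an external colour picker and
// receive a colour *name* back. The action and extra key are made up; a
// real integration would use whatever contract the picker app publishes.
class PickColourName : ActivityResultContract<Unit, String?>() {
    override fun createIntent(context: Context, input: Unit) =
        Intent("com.example.colorpicker.PICK_COLOR_NAME") // hypothetical action

    override fun parseResult(resultCode: Int, intent: Intent?): String? =
        if (resultCode == Activity.RESULT_OK)
            intent?.getStringExtra("color_name") // hypothetical extra
        else null
}
```

SCEE could then call `registerForActivityResult(PickColourName()) { name -> ... }` and fall back to the manual palette when no compatible picker app is installed, mirroring how the Measure button behaves.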
I agree that the app should either implement the picker itself or at least rely on a FOSS app, and that means the picker is probably not coming soon. Yet the palette could be improved right now, before the picker matter is resolved. In most European cities I've visited, I would come across houses to which I honestly could not assign any of the palette's colors (for instance pale green or pink; and it's even worse for roof colors). And on the other hand, some colors are way too close (e.g. light yellow and beige are indiscernible, at least on my phone).
As an experiment, I took a few pictures with 4 different phones, one after the other (i.e. same time of day and same color temperature, which would otherwise likely be the even more variable factor). Here are the results of samples of several building facades and roofs:
While some look relatively similar, some are quite different (e.g. 8 or 11).
And here, using 3 different color averaging methods from https://devpicker.com/image-average-color on e.g. pic 1 & pic 8 for easy comparison (all values in each column would ideally be the same, or at least very close). Also included is the closest UNIX rgb.txt color name.
e.g. Picture 1:
e.g. Picture 8:
There seems to be too much difference against the rgb.txt colors for the picture matcher to be useful across different phones. And that is even though the examples above should be much closer to each other than if:
If someone manages to at least get consistent (yet sensible, unlike the CSS matches above) named-color results for the samples in the table above, I'll take the extra time to take the same pics in different lighting conditions, so we can see how the color matcher fares with that much harder problem. My conclusion is: unless someone can come up with a much better color matcher, the camera approach is likely not going to be very usable, i.e. verifiable. Thus, we're probably best off improving the existing fixed-color palette table in SCEE (i.e. getting rid of hex colors without names and of overly similar colors, while adding a few new named colors which might be wanted). Suggestions about which colors to throw out and which to bring in are most welcome! (Those would be needed even if we did find a way to use the camera to find the closest color in that 4x7 palette, so please do give suggestions regardless of whether you prefer camera or manual input.)
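For anyone who wants to reproduce the averaging step locally rather than via the web tool, here is a minimal sketch of a per-channel mean over a sample region, using only the standard Android `Bitmap`/`Color` APIs (the square region and the lack of linear-light conversion are simplifications):

```kotlin
import android.graphics.Bitmap
import android.graphics.Color

// Average colour of a square sample region, mimicking what tools like
// devpicker.com do: a plain per-channel arithmetic mean. Averaging in
// gamma-encoded sRGB (as here) slightly darkens mixed regions; a
// stricter version would convert to linear light first.
fun averageColour(bitmap: Bitmap, cx: Int, cy: Int, radius: Int): Int {
    var r = 0L; var g = 0L; var b = 0L; var n = 0L
    for (y in (cy - radius).coerceAtLeast(0)..(cy + radius).coerceAtMost(bitmap.height - 1)) {
        for (x in (cx - radius).coerceAtLeast(0)..(cx + radius).coerceAtMost(bitmap.width - 1)) {
            val p = bitmap.getPixel(x, y)
            r += Color.red(p); g += Color.green(p); b += Color.blue(p)
            n++
        }
    }
    return Color.rgb((r / n).toInt(), (g / n).toInt(), (b / n).toInt())
}
```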
@mnalis can you do the same with your eyes for comparison on a different building? |
I'm not sure what you are asking here; can you elaborate the exact steps of the experiment you propose? I obviously cannot extract exact numbers from my eyeballs as I can with digital photos 😄. Or was that perhaps a rhetorical question? With the research above I've tried to collect samples to determine how hard it might be to do useful computer color detection (as it turns out: very hard with existing common methods, unless someone can come up with a new algorithm for better color detection and/or future phones come with significantly better cameras as regards color reliability, and not just "more megapixels", which seems to be the current trend). Human eyes have better dynamic range than today's cameras; but you're correct that they also have issues with detecting the correct white balance and colors in different lighting conditions; the most popular recent example is probably the Blue and Black Dress (which many humans identify as a Gold and White Dress instead!). The human brain, however, usually seems to be significantly better at such guesses than current software; e.g. when I was looking at those buildings while taking the pictures, I would've still said those roofs were brown, even where camera & computer estimated grey / plum. So IMHO at least the fixed-palette quest still seems to make sense for SCEE (even if not for vanilla SC), even if all colour tags are inherently somewhat subjective...
Not rhetorical. The question was: isn't it expected that the results are different in different lighting conditions? I mean, if I go out at sunset, the colors of a roof will appear completely different than on a summer day. I was looking at the neighbor's house, which has a dark red roof; 3 hours ago, on a foggy winter day, it appeared dark brown to my eyes. So the question for me is: is there really a large difference between a camera sensor and a human eye? In the end, we look at most pictures made by a camera and are sufficiently satisfied with how they represent the scene and mimic real life.
Well, there are two types of dynamic range: one is per scene, and one takes into account the adjustments of sensitivity, aperture, and exposure time. While the highly miniaturized sensors in phones do have a more limited dynamic range per scene, higher-quality sensors do surpass the human eye's capabilities in terms of dynamic range. But this doesn't matter much, as you wouldn't expect a roof to be over- or underexposed if you point the camera at it: the camera will adjust exposure time, aperture, and sensor sensitivity so that the center of the image falls within its dynamic range. Your example, however, has nothing to do with dynamic range or natural human vision, as the scene is not viewed by humans directly. We're instead looking at an image presented on better or worse calibrated screens with different white balances, in rooms with lighting of different color temperatures and brightnesses. It's obvious that those differences in presentation will lead to different outcomes of perception.
I'll keep this deliberately very brief, but I hope my feedback is not in vain. First of all, thank you very much for the extensive tests! A hell of a lot of time and good work has gone into this, and I'd say it was worth it. All in all, I would say that reducing the possible color values to the 140 color names of the W3C would already create a very good basis. Of course, there will be situations where the principle of exact repeatability cannot be applied, but we also have to be satisfied with a certain amount of fuzziness in OSM. If I enter a POI, e.g. a hydrant, with my smartphone, it may deviate by 2-3 meters from the exact position because my GPS is not good. Someone else moves the hydrant to what they consider to be the correct position on another smartphone, which may end up being 2-3 meters off in another direction. Then a third person comes along and aligns the hydrant with the aerial image, which is also 2 meters off because it is poorly aligned. Whether a house is displayed in a 3D application with exactly one shade or a slightly different one is secondary: OpenStreetMap thrives on the idea that things can always be improved afterwards. Let's start by classifying things as red or green; the exact shade can be refined later.
Good question. But not really, if you intend to tag it with existing tags. And that is a huge problem, yes? OSM depends on verifiability. Small subjective differences might be workable; e.g. while I might call something one name, you might pick a neighboring shade, and we'd still be close. However, bigger differences put the existence of the whole tag in question. If your error bars get almost as big as the range of information recorded, the information becomes useless¹.
It is not just the raw sensor; much of it is in software (i.e. human brain vs. camera software). I'm not a biologist, so I certainly have no idea why I seem to match colors much better than my Samsung phone. I suspect (judging by some optical illusions) that the brain does some serious whackery, NN-matching previous experiences against the current situation. But the current results make me wary of phone cameras in their current state (whatever the underlying reason might be); e.g. that roof (which is bright orange in reality) getting identified as plum (!!) is hardly usable IMHO. Perhaps phone-camera picture-taking could be improved, but it is a lot of work (for users or developers or both):
Agreed; and that was my intention behind those tests: to measure how big those differences get in real conditions. Unfortunately, it seems that using the camera as a truth source is not viable at this time, unless we get much better software solutions.
Yes, that was my conclusion too, but I could never put it so short and concisely 😃 TL;DR: So, IMHO, stay with the manual (human-chosen) palette, but improve the list we have. The colour=* wiki seems to favour the 16 basic CSS 1.0 colors, but lists some taginfo favorites that (IMHO rightly) overtake those in usage. So my suggestion would be to use that 16-color CSS chart with a few colors replaced, e.g. get rid of
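For reference, here are the 16 basic HTML 4.01 / CSS 1.0 color keywords written out as a palette table. Which entries to drop and which facade-typical names to add instead is exactly the open question above; the swap candidates named in the comment are only one example:

```kotlin
// The 16 basic HTML 4.01 / CSS 1.0 colour keywords as a starting point
// for a small fixed palette. Which ones to swap out (e.g. rarely-painted
// "fuchsia"/"aqua") for facade staples (e.g. "beige"/"brown") is left as
// the open suggestion the comment asks for.
val cssBasicColours = mapOf(
    "black"  to 0x000000, "silver"  to 0xC0C0C0,
    "gray"   to 0x808080, "white"   to 0xFFFFFF,
    "maroon" to 0x800000, "red"     to 0xFF0000,
    "purple" to 0x800080, "fuchsia" to 0xFF00FF,
    "green"  to 0x008000, "lime"    to 0x00FF00,
    "olive"  to 0x808000, "yellow"  to 0xFFFF00,
    "navy"   to 0x000080, "blue"    to 0x0000FF,
    "teal"   to 0x008080, "aqua"    to 0x00FFFF
)
```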
Use case
Currently the color of a building/roof needs to be estimated by eye. Apart from the limited list of colors, which are hard to match to the real world, this is often awkward, as there is very little difference between some of them, say `lightgrey` vs `#cccccc`, `lightyellow` vs `beige`, or `grey` vs `#708090`, making me often wonder what to choose.
Proposed Solution
There's a neat app called Color Picker where you can average the color directly from the camera image, with a configurable circle size.
It would be nice if SCEE could work together with the dev of that app, so that it can be called like the Measure app, and when the user taps the color copy button below, the color is copied back to SCEE (either as an RGB color or as a color name).