
Color picker for Building/Roof color #633

Open
RubenKelevra opened this issue Sep 2, 2024 · 22 comments

Comments

@RubenKelevra

Use case
Currently the color of a building/roof needs to be estimated by eye. Apart from the limited list of colors, which is hard to match to the real world, this is often awkward, as there is very little difference between some of the choices, say lightgrey vs #cccccc, lightyellow vs beige, or grey vs #708090, which often leaves me wondering what to pick.

[Screenshot: SCEE colour selection palette]

Proposed Solution
There's a neat app called Color Picker where you can average the color directly from the camera image, with a configurable circle size.

[Screenshot: Color Picker app on Google Play]

It would be nice if SCEE could work together with the app's developer so that it can be called like the Measure app: when the user taps the color copy button, the color is sent back to SCEE (either as an RGB value or as a color name).

@Helium314
Owner

I don't remember exactly, but I think there were good reasons not to allow arbitrary colors in the building/roof color quests.

@rusty-snake

The problems with picking from the camera are:

  • it depends on (day)light conditions
  • it depends on the device camera's color "correction"

I agree that the current mixed palette is very confusing.

@mnalis
Collaborator

mnalis commented Sep 4, 2024

Yeah, a few of the presented choices are similar/confusing... I'd clean up those few circled values, but keep a small fixed palette using mostly (and preferably only, if possible) English names (instead of hex RGB codes) as values.

One could use taginfo to find popular values for that and similar tags (ignoring those present in SCEE, as those are obviously self-reinforcing), and see how the list could be modified in BuildingColour.kt and RoofColour.kt.
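Something like this could pull those numbers (a minimal sketch against taginfo's /api/4/key/values endpoint; the regex-based JSON handling is just for brevity and assumes the "value" and "count" fields are adjacent in the response, so a real implementation would use a proper JSON parser such as kotlinx.serialization):

```kotlin
import java.net.URI

// Fetch the most-used values of a tag key from the taginfo API, sorted by
// count, to inform which named colours belong in the palette.
fun topTagValues(key: String, count: Int = 30): List<Pair<String, Long>> {
    val url = "https://taginfo.openstreetmap.org/api/4/key/values" +
        "?key=$key&sortname=count&sortorder=desc&rp=$count&page=1"
    val json = URI(url).toURL().readText()
    // Crude extraction of "value"/"count" pairs; see caveat above.
    return Regex(""""value":"([^"]+)","count":(\d+)""")
        .findAll(json)
        .map { it.groupValues[1] to it.groupValues[2].toLong() }
        .toList()
}

fun main() {
    topTagValues("building:colour").forEach { (value, n) -> println("$n\t$value") }
}
```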

@meiphoo7-Mae

This attracted my attention, because I'm also struggling with this and therefore usually ignore this matter. I feel the color of something is largely opinion-based and dependent on several factors (time of day, natural or artificial lighting, and who knows what else). Still, it fascinates me, so I've looked up how the Vespucci app tackles this.
To my surprise it offers only the colors blue, brown, yellow, grey, red, white and black, or the user can type whatever they see fit.
TBH: I like this. The difference between these colors is pretty clear, and if they're not appropriate in rare cases, there's always the free choice available.

@mcliquid

mcliquid commented Sep 4, 2024

My first thought was to be able to choose between the CSS Web Colors. But after looking at them, I doubt these colors are prevalent on house facades. (https://www.w3.org/TR/css-color-3/#html4)

At least in German-speaking countries, RAL colors are widely used for facade paints. But there are thousands of colors there; that would be too many.
Even the Classic color palette already has 216 shades.
(https://www.ral-farben.de/alle-ral-farben)
(https://de.m.wikipedia.org/wiki/RAL-Farbe)

But perhaps a selection can be defined from the Classic colors?

@RubenKelevra
Author

RubenKelevra commented Sep 4, 2024

The benefit of using a color picker is that it already comes with a built-in palette (see link below), matching the selected color to the closest available shade. Instead of copying RGB hex codes, we could simply fetch the color name.

In terms of accuracy, 10 phone cameras are likely to produce more consistent results than asking 10 people to manually select from a palette of 200 color options. Modern phone cameras are highly reliable.

To address lighting, we could make this quest time-based, avoiding color mapping at night, which wouldn't make sense regardless of the method used.

@mcliquid well, RAL is copyright protected; we can't use it. See the first sentence in your link: "The technical values of the depicted RAL colours are protected by copyright." ("Die technischen Werte der dargestellten RAL Farben sind urheberrechtlich geschützt.")

Edit: I forgot the link to the color palette the app uses:

https://en.wikipedia.org/wiki/List_of_colors_%28alphabetical%29

@RubenKelevra
Author

Maybe @gmikhail could weigh in here — would you be open to adding a feature that returns the color name when your app is called?

@meiphoo7-Mae

A little philosophy:
Let's assume you encounter a road that's not surveyed by someone else. There's a quest about the surface of the road. The surface is clearly made out of paving stones, so that's the answer to the quest.
But hey, wait a minute... There are dozens of types of paving stones. So are you going to get on your knees and measure whether a stone is 10 × 2.5 cm, to distinguish it from stones that are 9.6 × 4.5 cm?
Well, I don't think so, because the answer "paving stones" is sufficient.
So why is it so important to distinguish one shade of color from the probably 43493 very similar other shades of the same color?
A paving stone is a paving stone and likewise a color is a color.

@mnalis
Collaborator

mnalis commented Sep 5, 2024

would you be open to adding a feature that returns the color name when your app is called?

Before people invest effort, I should warn that I'm heavily against SCEE (being FOSS) depending on, or even recommending, 3rd-party non-FOSS apps in order to solve a quest, especially if there is no way to solve the quest without the external app.

And unless I'm wrong, the suggested app does not look FOSS. In other words, I don't want SCEE getting F-Droid anti-feature flags because we promote some closed-source software, nor SCEE being associated with such practices. If we go the way of an external color-picker app, it should at the very least be one that is maintained and available on F-Droid.

The benefit of using a color picker is that it already comes with a built-in palette (see link below), matching the selected color to the closest available shade.

I'm dubious how well it would work. Think of different daylight conditions. Have you tried it? What RGB colors do you get for the same roof on a sunny day, on a cloudy day, at dusk, and at early dawn, at the very least? I doubt you get the same, or even similar, values. Now try with different camera software and different phone hardware.

Perhaps if the color palette is very small, error rates might be comparable to a human's.

[...] matching the selected color to the closest available shade
Instead of copying RGB hex codes, we could simply fetch the color name.

RGB hex codes are definitely a bad idea, so using a color name would be less bad. But I still prefer humans and the small fixed palette over such external apps. Even if we found a suitable FOSS external app, I'd say:

  • the color picker / measurement app should be freely available at least on F-Droid (and Google Play & Huawei etc.; not to mention soon also on iPhones)
  • it should still match to the closest color in a small fixed color palette (e.g. a 16-color one)
  • there should still be an interface for solving the quest without the external app (e.g. like the measurement quests work when AR software is not available)

In terms of accuracy, 10 phone cameras are likely to produce more consistent results

[Citation Needed]

than asking 10 people to manually select from a palette of 200 color options. Modern phone cameras are highly reliable.

That is why trying to match a 200-color palette is a horrible idea (for computers or humans). It is madness trying to decide whether something is "Cerulean frost" or "Carolina blue" or "Dark sky blue" or "Glaucous", even with studio equipment and a stable, unchanging, known CRI; much less so in varying outside conditions.

We should stick to, say, a fixed 16-color palette. Then a human can at worst perhaps mistake (a dark shade of) "red" for "brown" at dusk.

To address lighting, we could make this quest time-based, avoiding color mapping at night, which wouldn't make sense regardless of the method used.

That is way too simplistic... you'd at the very least need to carry a sheet of white paper with you at all times and do an AWB calibration on it (after properly positioning it to avoid shadows/creases) every time the lighting conditions change (i.e. a few hours pass, or the sun becomes occluded by clouds, etc.). That is hugely impractical; and without it, AutoWhiteBalance is poor even for regular photography, and horrible for attempting color matching.
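For illustration, the manual correction being described amounts to something like this (a minimal sketch, not a recommendation):

```kotlin
// Given the measured RGB of the known-white reference sheet, rescale each
// channel so that the reference becomes neutral. Channels are 0..255.
fun whiteBalance(pixel: IntArray, referenceWhite: IntArray): IntArray =
    IntArray(3) { ch ->
        (pixel[ch] * 255.0 / referenceWhite[ch]).toInt().coerceIn(0, 255)
    }

// E.g. warm light that renders the paper as (255, 240, 200) would correct a
// wall pixel (200, 180, 150) to roughly (200, 191, 191) -- visibly less warm.
```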

To my surprise it offers only the colors blue, brown, yellow, grey, red, white and black, or the user can type whatever they see fit.
TBH: I like this. The difference between these colors is pretty clear, and if they're not appropriate in rare cases, there's always the free choice available.

I like that too. A very small color palette is an absolute requirement for even attempting to claim any kind of verifiability. Rule of thumb: if you cannot recite the whole palette by name from memory, it's too big.

@mnalis
Collaborator

mnalis commented Sep 5, 2024

So why is it so important to distinguish one shade of color from the probably 43493 very similar other shades of the same color?

Yes, it is impossible, so a very small palette must be used.

A paving stone is a paving stone and likewise a color is a color.

It's even worse than that. While you can precisely and verifiably measure a paving stone1 and the results will come out the same every time2, you can't do that with colors, not even close. Every time some mapper tries, they'll get a different result than you. The only protection against that is making the scale more and more coarse (i.e. the palette smaller and smaller) until such verifiability errors disappear / become negligible.

Footnotes

  1. As evidenced by a plethora of paving_stones sub-tags 😄

  2. well, technically it won't be exactly the same, but the errors will be so small they'd be the same for all practical purposes. It might be 23.007 mm one time and 22.997 mm another time, but you'll write it down as 23 mm either way.

@gmikhail

gmikhail commented Sep 5, 2024

Maybe @gmikhail could weigh in here — would you be open to adding a feature that returns the color name when your app is called?

First of all, I'm glad you liked my app. In the Color Picker settings, you can enable automatic copying of the HEX color value to the clipboard after pressing the button. Then you can paste it into any app as plain text. The color name is not copied, for the sake of universality.

Before people invest effort, I should warn that I'm heavily against SCEE (being FOSS) depending on, or even recommending, 3rd-party non-FOSS apps in order to solve a quest, especially if there is no way to solve the quest without the external app.

You are right that Color Picker is not FOSS, but the built-in palettes that come with the app are free to use. The palette sources are listed in the Color Picker settings; the main source was Wikipedia: https://en.wikipedia.org/wiki/List_of_colors_(alphabetical)


From the discussion, I understand that you are looking for a simple and universal way to specify the color of the roof.

I am not familiar with the project in question, but if you can't specify the HEX color value manually, it's better to use a predefined color palette with a limited number of colors, as @mnalis suggested earlier.

You can use either a ready-made palette, like HTML Colors, or your own simplified palette based on spectral colors: chromatic colors (primary colors of the rainbow) + achromatic colors (black, white, gray):

Red #FF0000
Orange #FF8000
Yellow #FFFF00
Green #00FF00
Cyan #00FFFF
Blue #0000FF
Violet #8000FF
Black #000000
White #FFFFFF
Gray #808080
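Hard-coded, that minimal palette could look like this (a sketch only; the names and values are taken straight from the list above, the structure is purely illustrative):

```kotlin
// The simplified spectral palette: 7 chromatic + 3 achromatic colors.
val spectralPalette: Map<String, Int> = mapOf(
    "red" to 0xFF0000, "orange" to 0xFF8000, "yellow" to 0xFFFF00,
    "green" to 0x00FF00, "cyan" to 0x00FFFF, "blue" to 0x0000FF,
    "violet" to 0x8000FF, "black" to 0x000000, "white" to 0xFFFFFF,
    "gray" to 0x808080,
)
```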

You might also want to take a look at the 2014 Material Design palette: https://m2.material.io/design/color/the-color-system.html
If you take one shade as the basis for each color, for example 500 (used by default to represent the color in the first iteration of the palette: https://m1.material.io/style/color.html#color-color-palette), you get 21 colors in total:

Red 500 #F44336
Pink 500 #E91E63
Purple 500 #9C27B0
Deep Purple 500 #673AB7
Indigo 500 #3F51B5
Blue 500 #2196F3
Light Blue 500 #03A9F4
Cyan 500 #00BCD4
Teal 500 #009688
Green 500 #4CAF50
Light Green 500 #8BC34A
Lime 500 #CDDC39
Yellow 500 #FFEB3B
Amber 500 #FFC107
Orange 500 #FF9800
Deep Orange 500 #FF5722
Brown 500 #795548
Gray 500 #9E9E9E
Blue Gray 500 #607D8B
Black #000000
White #FFFFFF

I still use the Material Design color palette from 2014 in many things and am happy with its simplicity and variety.

A small note: In the third (current) iteration of Material Design, Google abandoned the pre-defined color palette and modern palettes are generated based on any selected color. Therefore, in the latest documentation, this particular palette is not mentioned.

@RubenKelevra
Author

RubenKelevra commented Sep 5, 2024

While you can precisely and verifiably measure a paving stone and the results will come out the same every time, you can't do that with colors, not even close. Every time some mapper tries, they'll get a different result than you. The only protection against that is making the scale more and more coarse (i.e. the palette smaller and smaller) until such verifiability errors disappear / become negligible.

There's a limit to this. If you reduce it to two colors, like black and white, no one can confirm it. The current palette is similarly unbalanced—too coarse in some areas, too fine in others.

A refined, verifiable palette, like the one on Wikipedia, could work since it organically grew from people's desire to differentiate colors. But with so many colors, it's tough for humans to handle. That's where the app comes in.

If you take a picture of a house with 10 modern phones, the colors will resemble reality much more closely than if a person selects them manually.

Plus, the human isn't completely out of the loop. You get a preview of the color the app chooses, and you can confirm it if it looks right to you.

Here's an example of how the app's chosen color compares to a natural-looking camera view:

[Screenshot: Color Picker's chosen color next to the camera view]

@mnalis wrote

Before people invest effort, I should warn that I'm heavily against SCEE (being FOSS) depending on, or even recommending, 3rd-party non-FOSS apps in order to solve a quest, especially if there is no way to solve the quest without the external app.

This sounds a lot like how Microsoft viewed open source as a "cancer" two decades ago, just reversed. I strongly disagree. I use plenty of free software, but I also see no problem using proprietary tools when they work better.

I’m not suggesting we replace the manual palette. Instead, like the Measure app in SC, we could offer a button that lets users open the external app when needed.

@RubenKelevra
Author

@gmikhail thanks for taking the time to respond - much appreciated!

Maybe @gmikhail could weigh in here — would you be open to adding a feature that returns the color name when your app is called?

In the Color Picker settings, you can enable automatic copying of the HEX color value to the clipboard after pressing the button. Then you can paste it into any app as plain text. The color name is not copied, for the sake of universality.

You noted that HEX values are copied for the sake of universality, but in this case the universality of color names might be more useful than raw RGB values.

This project already has two apps: StreetComplete (and its variant, SCEE) and the Measure app, which helps with measurements using ARCore. My suggestion is to follow the same approach with your app for color: SCEE could trigger your app, which would return the color name instead of the HEX value.
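A purely hypothetical sketch of the SCEE side of such an integration, modeled on how the measure app is launched; the intent action and result extra are invented here, since Color Picker exposes no such API today:

```kotlin
import android.app.Activity
import android.content.ActivityNotFoundException
import android.content.Intent
import androidx.activity.result.contract.ActivityResultContracts
import androidx.fragment.app.Fragment

// Invented action/extra names -- an agreed-upon contract would be needed.
private const val ACTION_PICK_COLOR = "com.example.colorpicker.PICK_COLOR"
private const val EXTRA_COLOR_NAME = "color_name"

class ColourQuestFragment : Fragment() {
    private val pickColour = registerForActivityResult(
        ActivityResultContracts.StartActivityForResult()
    ) { result ->
        val name = result.data?.getStringExtra(EXTRA_COLOR_NAME)
        if (result.resultCode == Activity.RESULT_OK && name != null) {
            onColourPicked(name) // hypothetical SCEE-side handler
        }
    }

    fun launchExternalColourPicker() {
        try {
            pickColour.launch(Intent(ACTION_PICK_COLOR))
        } catch (e: ActivityNotFoundException) {
            // Fall back to the manual palette if the app is not installed.
        }
    }

    private fun onColourPicked(name: String) { /* apply quest answer */ }
}
```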

Hope this makes sense!

@Chepycou

Chepycou commented Dec 2, 2024

I agree that the app should either implement the picker itself or at least rely on a FOSS app, and that means the picker is probably not something coming soon.

Yet the palette at least could be improved right now, before the picker matter is resolved. In most European cities I've visited, I would come across houses that I could not honestly assign to one of the colors of the palette (for instance pale green or pink; and it's even worse for the roof colors). And on the other hand, some colors are way too close (e.g. light yellow and beige are indistinguishable, at least on my phone).

@mnalis
Collaborator

mnalis commented Jan 7, 2025

As an experiment, I took a few pictures with 4 different phones, one after the other (i.e. same time of day and same color temperature -- which is likely to be an even more variable factor). Here are the results for samples of several building facades and roofs:

[Table of image swatches: samples img1-img11 of several building facades and roofs, one column per phone (rm3, rm6, p30, s23); images not reproduced here]

While some look relatively similar, some are quite different (e.g. img8 or img11).
(source images here if someone wants to play more)

@mnalis
Collaborator

mnalis commented Jan 7, 2025

And here are 3 different color-averaging methods for e.g. pic1 & pic8, from https://devpicker.com/image-average-color, for easy comparison (all values in each column would ideally be the same, or at least very close). Also included are the closest UNIX rgb.txt color as well as the CSS color name (using https://shallowsky.com/colormatch).
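Assuming "simple" is a per-channel arithmetic mean and "sqrt" a per-channel root-mean-square, as those names usually denote (the site doesn't document the exact formulas, and the "dominant" clustering method is omitted here), the two differ like this:

```kotlin
import kotlin.math.sqrt

// "simple": arithmetic mean per channel. "sqrt": square the channel values,
// average, then take the square root -- this weights bright pixels more.
// Each pixel is an IntArray of (r, g, b), channels 0..255.
fun averageColour(pixels: List<IntArray>, useSqrt: Boolean): IntArray =
    IntArray(3) { ch ->
        val mean = if (useSqrt)
            sqrt(pixels.map { it[ch].toDouble() * it[ch] }.average())
        else
            pixels.map { it[ch].toDouble() }.average()
        mean.toInt().coerceIn(0, 255)
    }
```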

ex. Picture 1:

| Phone | RGB simple | sqrt | dominant sqrt | CSS |
| --- | --- | --- | --- | --- |
| rm3 | #8ba5b5 LightSkyBlue3 | #8ca6b6 LightSkyBlue3 | #91abbb LightSkyBlue3 | silver |
| rm6 | #96a8b8 gray66 | #97a9b8 gray66 | #97a9b8 gray66 | silver |
| p30 | #9aafbf SlateGray3 | #9bb1c0 SlateGray3 | #acc2d2 LightSteelBlue | silver |
| s23 | #88a6c9 LightSkyBlue3 | #89a7ca LightSkyBlue3 | #90aed1 LightSkyBlue3 | silver |

ex. Picture 8:

| Phone | RGB simple | sqrt | dominant sqrt | CSS |
| --- | --- | --- | --- | --- |
| rm3 | #ba9b8f RosyBrown | #ba9c8f RosyBrown | #b99a8e RosyBrown | silver |
| rm6 | #a69176 LightYellow4 | #a69176 LightYellow4 | #a69176 LightYellow4 | gray |
| p30 | #b09c81 RosyBrown | #b29e84 RosyBrown | #beaa8f PeachPuff3 | gray |
| s23 | #baaea0 gray68 | #baaea1 gray68 | #c5b9ab AntiqueWhite3 | silver |

There seems to be too much variation in the rgb.txt color matches for the picture matcher to be useful across different phones.
Even the much-reduced CSS color namespace is not fully consistent, and that comes with a great loss of precision due to the much smaller namespace (basically all of those picture 1 and picture 8 colors end up closest to "silver" or "gray", even if to the eye they are more rosy/brownish/yellowish/peachy/bluish...).
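The name matching used above is essentially a nearest-neighbour search; a minimal sketch with a naive squared distance in RGB space (a perceptual metric such as CIELAB ΔE would do somewhat better):

```kotlin
// Tiny excerpt of a named palette; a full CSS or rgb.txt table would go here.
val namedColours = mapOf(
    "silver" to 0xC0C0C0, "gray" to 0x808080, "white" to 0xFFFFFF,
    "rosybrown" to 0xBC8F8F, "lightsteelblue" to 0xB0C4DE,
)

fun closestName(rgb: Int): String = namedColours.minByOrNull { (_, ref) ->
    val dr = ((rgb shr 16) and 0xFF) - ((ref shr 16) and 0xFF)
    val dg = ((rgb shr 8) and 0xFF) - ((ref shr 8) and 0xFF)
    val db = (rgb and 0xFF) - (ref and 0xFF)
    dr * dr + dg * dg + db * db
}!!.key

// e.g. closestName(0x8ba5b5) -> "silver", matching the rm3 CSS cell above.
```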

And note that the examples above should be much closer to each other than if:

  • the cameras were used at different times of day and brightness
  • even more so in different lighting white balance: e.g. cloudy vs. sunny skies skew white balance a lot (not to mention early mornings or late evenings), and while AWB improves things slightly, it never really works well -- even with professional cameras, you usually need to carry a piece of white paper with you to calibrate WB manually for the light source if you want anything close to reproducible colors. example, example2
  • camera "vividness", "natural colors" and other "beautifying" filters that many modern phones ship with and enable by default (example)

If someone manages to at least get consistent (yet sensible, unlike the CSS matches above) named-color results for the samples in the table above, I'll take extra time to shoot the same pics in different lighting conditions, so we can see how the color matcher fares with that much harder problem.


My conclusion is: unless someone can come up with a much better color matcher, the camera approach is likely not going to be very usable, i.e. verifiable.

Thus, we're probably best off improving the existing fixed-color palette table in SCEE (i.e. getting rid of hex colors without names and of near-duplicate colors, while adding a few new named colors that might be wanted).

Suggestions about which colors to throw out and which to bring in are most welcome! (Those would be needed even if we did find a way to use the camera to find the closest color in that 4x7 palette, so please give suggestions regardless of whether you prefer camera or manual input.)

@mnalis
Collaborator

mnalis commented Jan 19, 2025

And for those interested, some more example pictures as promised, taken with the same phone (a Samsung Galaxy S23+) but with different cameras on different days (again, ideally each column should have the same average color value).

We can see that the differences are striking even for the same phone, depending just on the time of day and the amount of clouds -- or even for the same phone in the same weather at the same time of day, just using a different zoom level (which uses a different camera):

[Table of image swatches: scenes (green b1, blue b1, gray b2, l_roof b2, r_roof b2, peach b2, white b2, grey) sampled under five conditions (sunny sun, sunny shade, evening cam1-cam3); images not reproduced here]

Or numerically, as SQRT averages with the closest color names:

| Day / scene | green b1 | blue b1 | gray b2 | l_roof b2 | r_roof b2 | peach b2 | white b2 | grey |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| sunny (sun) | #699a74 DarkSeaGreen4 | n/a | n/a | #654d43 gray32 | #948180 seashell4 | #ab8182 RosyBrown | #acada8 gray67 | #767771 gray46 |
| sunny (shade) | #385f46 DarkSlateGray | #9eb5cb SlateGray3 | #b7b7b3 gray71 | n/a | n/a | n/a | n/a | n/a |
| evening cam1 | n/a | #505a72 gray37 | #605d67 gray38 | #4d3c5b gray30 | #72608c plum4 | #865b73 pink4 | #9a899b gray58 | #725852 gray37 |
| evening cam2 | n/a | #9095b1 gray61 | #67666e DimGray | #443440 gray24 | #816178 pink4 | #8c5e61 LightPink4 | #947773 MistyRose4 | #6d5663 grey38 |
| evening cam3 | n/a | #697894 SlateGray4 | #79727f gray47 | n/a | n/a | n/a | n/a | n/a |

@RubenKelevra
Author

@mnalis can you do the same with your eyes for comparison on a different building?

@mnalis
Collaborator

mnalis commented Jan 20, 2025

@mnalis can you do the same with your eyes for comparison on a different building?

I'm not sure what you're asking here; can you elaborate the exact steps of the experiment you propose? I obviously cannot extract exact numbers from my eyeballs as I can with digital photos 😄. Or was that perhaps a rhetorical question?


With the research above I've tried to collect samples to determine how hard it might be to do useful computer color detection (as it turns out: very hard with existing common methods, unless someone can come up with a new algorithm for better color detection and/or future phones come with significantly better cameras as regards color reliability -- and not just "more megapixels", which seems to be the current trend).

Human eyes have better dynamic range than today's cameras; but you're correct that they also have issues with detecting the correct white balance and colors in different lighting conditions; the most popular recent example is probably the Blue and Black Dress (which is often identified by different humans as a Gold and White Dress instead!).

Also, the human brain seems to usually be significantly better at such guesses than current software -- e.g. when I was looking at those buildings while taking the pictures, I would've still said those roofs were brown, even where camera & computer estimated grey / plum.


So IMHO at least the fixed palette quest still seems to make sense for SCEE (even if not for vanilla SC), even if all colour tags are inherently somewhat subjective...

@RubenKelevra
Author

@mnalis can you do the same with your eyes for comparison on a different building?

I'm not sure what you're asking here; can you elaborate the exact steps of the experiment you propose? I obviously cannot extract exact numbers from my eyeballs as I can with digital photos 😄. Or was that perhaps a rhetorical question?

Not rhetorical. The question was: isn't it expected that the results differ in different lighting conditions?

I mean, if I go out at sunset, the colors of a roof will appear completely different than on a summer day. I was looking at the neighbor's house, which has a dark red roof; three hours ago, on a foggy winter day, it appeared dark brown to my eyes.

So the question for me is: is there really a large difference between a camera sensor and a human eye? In the end, we view most scenes as pictures taken by a camera and are sufficiently satisfied with how they represent real life.

Human eyes have better dynamic range than today's cameras; but you're correct that they also have issues with detecting the correct white balance and colors in different lighting conditions; the most popular recent example is probably the Blue and Black Dress (which is often identified by different humans as a Gold and White Dress instead!)

Well, there are two types of dynamic range: one is per scene, and one includes the adjustments possible via sensitivity, aperture, and exposure time. While the highly miniaturized sensors in phones do have a more limited per-scene dynamic range, higher-quality sensors do surpass the human eye's capabilities in terms of dynamic range.

But this doesn't matter much, as you wouldn't expect a roof to be over- or underexposed if you point the camera at it. The camera will adjust the exposure time, aperture, and sensor sensitivity so that the center of the image falls within its dynamic range.

Your example, however, has nothing to do with dynamic range or natural human vision, as the scene is not viewed by humans directly. We're instead looking at an image presented on screens of varying calibration quality and white balance, in rooms lit at different color temperatures and brightnesses.

It's obvious that those differences in presentation will lead to different perceptions.

@mcliquid

I'll keep this deliberately very brief, but I hope my feedback is not in vain. First of all, thank you very much for the extensive tests! A hell of a lot of time and good work has gone into this, and I'd say it was worth it.

All in all, I would say that reducing the possible color values to the 140 color names of the W3C would already create a very good basis. Of course, there will be situations where the principle of exact repeatability cannot be applied. But we also have to be satisfied with a certain amount of fuzziness in OSM.

If I enter a POI, e.g. a hydrant, with my smartphone, it may deviate by 2-3 meters from the exact position because my GPS fix is poor. Someone else moves the hydrant to what they consider the correct position on another smartphone, which may end up 2-3 meters off in a different direction. Then a third person comes along and aligns the hydrant with the aerial imagery, which is itself 2 meters off because it is poorly aligned.

Whether a house is displayed in a 3D application with #816178 or with #806279, the delta doesn't matter for most if not all users.

OpenStreetMap thrives on the idea that things can always be improved afterwards. Let's start by classifying things in red or green. Whether it is #00FF00 or #2FBC2F in the end can always be changed if it is relevant.

@mnalis
Collaborator

mnalis commented Jan 20, 2025

The question was: isn't it expected that the results differ in different lighting conditions?

Good question. But not really -- not if you intend to tag it with the existing tags. And that is a huge problem, yes? OSM depends on verifiability.

Small subjective differences might be workable -- e.g. while I might call something smoothness=bad and someone else would categorize it as smoothness=very_bad (or I might call something colour=pink and another person light red), there is still useful information recorded; only the error bars are larger than we'd like.

However, bigger differences put the whole tag's existence in question. If your error bars get almost as big as the range of information recorded, the information becomes useless1.

So the question for me is: is there really a large difference between a camera sensor and a human eye?

It is not just the raw sensor; much of it is in software (i.e. human brain vs. camera software). I'm not a biologist, so I certainly have no idea why I seem to match colors much better than my Samsung phone. I suspect (judging by some optical illusions) that the brain does some serious whackery, NN-matching previous experiences to the current situation.

But the current results make me wary of phone cameras in their current state (whatever the underlying reason might be) -- e.g. that roof (which is bright orange in reality) getting identified as plum (!!) is hardly usable IMHO. Perhaps phone-camera capture could be improved, but it is a lot of work (for users or developers or both):

  • camera software made for such color recognition could be a lot smarter. It could take into account GPS coordinates, time of day, the weather forecast and AQI for the given location, and use those (especially in combination with a captured piece of sky!) to calibrate the white balance -- and via that, the real color of the nearby roof (ideally also auto-detected by AI) -- much better than cameras currently do. That might work much closer to how the human brain works, or maybe even better? But it seems like quite some work to implement.2

  • another way would be to require users to do a manual WB calibration, i.e. software auto-matching against a predefined color chart which users would need to color-print at home and bring with them on surveys (or at least a sheet of white paper), to be presented in outdoor conditions before capturing colors (and re-calibrated after some time). As this by definition requires manual actions and preparations, I don't think many users would be happy with that requirement, though.

Whether a house is displayed in a 3D application with #816178 or with #806279, the delta doesn't matter for most if not all users.

Agreed; and that was my intention behind those tests -- to measure how big those differences get in real conditions. Unfortunately, it seems that using the camera as a truth source is not viable at this time, unless we get much better software solutions.

OpenStreetMap thrives on the idea that things can always be improved afterwards. Let's start by classifying things in red or green. Whether it is #00FF00 or #2FBC2F in the end can always be changed if it is relevant.

Yes, that was my conclusion too, though I could never have put it so short and concisely 😃


TL;DR: So, IMHO, stay with the manual (human-chosen) palette, but improve the list we have. The colour=* wiki seems to favour the 16 basic CSS 1.0 colors, but lists some taginfo favorites that (IMHO rightly) overtake those in usage.

So my suggestion would be to use that 16-color CSS chart with a few colors replaced -- e.g. get rid of lime (people could use green instead) and aqua (teal or blue could be used instead) and bring in orange and brown (taginfo favorites) instead; see the sketch below. I actually think that giving too much choice is detrimental here, and 16 sounds like a reasonable number (see e.g. Choice overload). Does that make sense to people? Would you have some tweaks? Or (having read all of the above -- I know it is much, sorry!) do you think there is a better actionable short-term option?
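Spelled out (hex values are just the standard CSS ones; the actual list structure in BuildingColour.kt would differ), that proposed 16-color chart would be:

```kotlin
// CSS 1.0 basic colors, minus lime and aqua, plus orange and brown.
val proposedPalette = mapOf(
    "white" to 0xFFFFFF, "silver" to 0xC0C0C0, "gray" to 0x808080,
    "black" to 0x000000, "red" to 0xFF0000, "maroon" to 0x800000,
    "orange" to 0xFFA500, "brown" to 0xA52A2A, "yellow" to 0xFFFF00,
    "olive" to 0x808000, "green" to 0x008000, "teal" to 0x008080,
    "blue" to 0x0000FF, "navy" to 0x000080, "purple" to 0x800080,
    "fuchsia" to 0xFF00FF,
)
```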

Footnotes

  1. E.g. if one mapper would routinely perceive some road as smoothness=good and another that same road as smoothness=very_horrible, that would make the tag useless. The same problem arises if someone calls something colour=white and for another it is obviously colour=blue, or if to me it is colour=gold and for someone else it's obviously colour=black (to use that dress example); then the whole colour tag would be so subjective as to become effectively useless, and should be deprecated.

  2. on the other hand, if we do build such tech, we can probably do automated edits at scale using, say, Panoramax and Mapillary imagery, and record such color information with much less needless human manual effort.
