Copilot stops working on gender-related subjects
#72603
21 comments · 7 replies
This comment was marked as off-topic.
-
This is ridiculous. I literally have all the genders written out in the same file, so there isn't even any 'assumption', and it still refuses to work. It is infuriating.
-
Good luck debugging! Sometimes you can add an x_ at the front or something to get around bugs like that. Or create codewords/encryptions, basically.
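The codeword workaround described above can be sketched as follows. All names here are hypothetical: the idea is just that the literal blocked term never appears as a single token in the source file, and a small mapping translates the codeword back at the program boundary.

```python
# Hypothetical workaround sketch: use an internal codeword instead of the
# blocked term, and map it back to the real field name when needed.
X_GNDR_FIELD = "gndr"  # internal codeword

def decode_field_name(codeword: str) -> str:
    """Translate internal codewords back to real column names."""
    # The real name is built by concatenation so the literal word never
    # appears as one token in this file.
    aliases = {"gndr": "gen" + "der"}
    return aliases.get(codeword, codeword)

print(decode_field_name(X_GNDR_FIELD))
```

This obviously pollutes the codebase with indirection, which is why commenters treat it as a stopgap rather than a fix.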
-
Wow, that's awful. Devs, please fix this!
-
Also frustrated by this issue. Working in the fashion industry, demographic classifications like age and gender are integral to the domain model and entirely apolitical. This idea of "banned words" is inherently biased, an American bias in particular. It has long been established that ban-list content filtering works poorly. This artifact of the propagandized American political environment isn't necessary; funding schools and improving public literacy is a better way to keep AI-generated propaganda in its place than naive measures like this...
-
This is idiotic. Technology should not be mixed with ideology. We don't care about this type of thing when we are working, and we shouldn't have to!
-
This behavior is still present today, and it seems like it's not going away anytime soon. So frustrating, and completely unnecessary.
-
It also stops with
-
I'm working on a project that has fields defined related to fish biology. This has been an interesting thread to find.
-
I am bumping this discussion just to add a voice: the hospital I work at is attempting to harmonize how old and new systems store things like sex and gender identity. This is in an effort to model the social complexities of the topic in our database, as health outcomes have been proven to be demonstrably better when doctors honor a patient's preferred name and gender expression. I am completely unable to get Copilot to assist me in this task. I can, and will, work around the issue, but I must pose this question: if a rule prevents an engineer from improving patient outcomes, is that an acceptable sacrifice in order to curb misuse?
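A minimal sketch of the kind of harmonized record this commenter describes. The field names here are assumptions for illustration, not the hospital's actual schema; the point is that modern models separate sex assigned at birth, gender identity, and preferred name as distinct fields:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PatientDemographics:
    # Legacy systems often stored a single "sex" code; newer models
    # separate the clinical and social dimensions of the topic.
    legal_name: str
    preferred_name: Optional[str]   # honored in all patient-facing UI
    sex_assigned_at_birth: str      # e.g. "female", "male", "intersex"
    gender_identity: Optional[str]  # free-text or coded value
    pronouns: Optional[str]

    def display_name(self) -> str:
        """Use the preferred name whenever one is recorded."""
        return self.preferred_name or self.legal_name

p = PatientDemographics("Jane Doe", "Jay", "female", "non-binary", "they/them")
print(p.display_name())
```

Exactly the kind of file where, per this thread, several field names would reportedly silence Copilot's completions.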
-
Really?! And here I thought there was a problem with my auto-complete!! 😅 Copilot.stops.working.on.gender.related.subjects.mp4
-
I thought AI would bring the Terminator timeline... but it seems we are in some sort of stand-up comedy AI timeline.
-
If this is a way to reliably turn off any AI "help", I'd consider this a feature, not a bug. Figuring out how to add enough spiciness to turn off Copilot, yet not get fired from your job, may be difficult, though.
-
Not able to replicate this with PowerShell, with both the GPT and Claude models. I created a Gender variable and asked Copilot to fill in sample data. Is it language-specific?
-
I had this issue where it would not even complete a sentence containing "fuck spez"
-
Also unable to replicate this with the free tier of Copilot for VSCode. Maybe this is no longer the case, or it is region-locked?
-
I'm still experiencing this on WebStorm while working on a Next.js project. It also happens in IntelliJ IDEA while developing Flutter apps. gender_issues.mp4
-
It's not a bug, it's the best feature ever if we don't want these AI things all over our projects. If adding "trans" or "gay" in a comment is enough to stop them, see it as a power-off button and use it everywhere!
-
Someone asked in a previous comment if this might be location-dependent. I tried this from a non-US country, and Copilot did not disappoint me:
-
I just tried Copilot again on VSCode, and it seems to work again for
-
As some people already mentioned here or here, Copilot purposely stops working on code that contains words from GitHub's hardcoded ban list, such as `gender` or `sex`. I am labelling this as a bug because this behavior is unexpected and undocumented.

I guess you might be embarrassed by what your AI says when it autocompletes gender-related matters, since it is probably trained on biased datasets. However, disabling autocompletion is not a great solution, since gender can be very present in many legitimate domains, for which Copilot shutting down on most of the files is disappointing and pretty annoying.

I don't have any elegant solution in mind, and I know this seems like an edge case, but I hope this will be taken into account someday.
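A minimal repro of the kind of file the post describes. The code itself is ordinary and runs fine; the reported bug is only that Copilot's autocompletion reportedly goes silent while a file like this is being edited:

```python
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int
    gender: str  # merely having this field name reportedly halts Copilot

# Ordinary demographic grouping, as found in countless real codebases.
people = [Person("Alice", 30, "female"), Person("Bob", 25, "male")]
by_gender = {}
for person in people:
    by_gender.setdefault(person.gender, []).append(person.name)
print(by_gender)
```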