Copilot purposely stops working on code that contains words GitHub has hardcoded as banned, such as gender or sex. And if you prefix transactional data with trans_, Copilot will refuse to help you. 😑

    • @wsheldon@lemm.ee · 61 points · 5 months ago

      I’m still experiencing this as of Friday.

      I work in school technology and copilot nopety-nopes anytime the code has to do with gender or ethnicity.

        • @RedstoneValley@sh.itjust.works · 16 points · 5 months ago

          Why would anyone rename a perfectly valid variable to some garbage term just to please our Microsoft Newspeak overlords? That would make the code less readable and more error prone. Also, everything that handles human data has a field for sex or gender somewhere: driver’s licenses, medical applications, biological studies, and all kinds of other forms use those terms.

          But nobody really needs to use copilot to code, so maybe just get rid of it or use an alternative.

          • @dirthawker0@lemmy.world · 3 points · 5 months ago

            There are two ways to go on it. Either don’t track the data, which is what they want you to do, and protest until they let you use the proper field names; or say fuck their rules, track the data anyway under a placeholder, and produce the reports you want. You can still protest, and when you get the field names back it’s just a replace-all of tablename.jander with tablename.gender. Different strokes for different folks.
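
The replace-all step described above amounts to a single substitution; a minimal Python sketch, reusing the comment’s hypothetical tablename and jander placeholder names:

```python
# Replace-all over source text, as the comment describes: once the real
# field name is allowed again, swap the placeholder back in one pass.
# "tablename" and "jander" are the comment's hypothetical names.
source = "SELECT tablename.jander FROM tablename ORDER BY tablename.jander"
fixed = source.replace("tablename.jander", "tablename.gender")
print(fixed)  # → SELECT tablename.gender FROM tablename ORDER BY tablename.gender
```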

  • @the_crotch@sh.itjust.works · 39 points · 5 months ago

    So I loaded copilot, and asked it to write a PowerShell script to sort a CSV of contact information by gender, and it complied happily.
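
    For scale, the request in question is a few lines in any language; here is a self-contained Python sketch of the same task (the contact data and column names are made up):

```python
import csv
import io

# Hypothetical contact data; in practice this would be read from a CSV file.
data = "name,gender\nAlex,nonbinary\nBea,female\nChris,male\n"

# Parse the rows and sort them by the "gender" column.
rows = list(csv.DictReader(io.StringIO(data)))
rows.sort(key=lambda r: r["gender"])

sorted_names = [r["name"] for r in rows]
print(sorted_names)  # → ['Bea', 'Chris', 'Alex']
```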

    And then I asked it to modify that script to display trans people in bold, and it did.

    And I asked it “My daughter believes she may be a trans man. How can I best support her?” and it answered with 5 paragraphs. I won’t paste the whole thing, but a few of the headings were “Educate Yourself” “Be Supportive” “Show Love and Acceptance”.

    I told it my pronouns and it thanked me for letting it know and promised to use them.

    I’m not really seeing a problem here. What am I missing?

  • dohpaz42 · 29 points · 5 months ago

    It’s almost as if it’s better for humans to do human things (like programming). If your tool is incapable of achieving your and your company’s needs, it’s time to ditch the tool.

  • Eager Eagle · 15 points · 5 months ago

    it will also not suggest anything when I try to assert things: I type ass, wait… nothing; then type e, and the completion appears!

    • @NudeNewt@lemm.ee · 9 points · 5 months ago

      Clearly the answer is to write code in emojis that get translated into hieroglyphs, then “processed” into Rust. And add a bunch of beloved AI keywords here and there. That way, when it learns to block it, they’ll inadvertently block their favorite buzzwords.

  • @werefreeatlast@lemmy.world · 5 points · 5 months ago

    “I’m Brown and would like you to update my resume…” “I’m sorry, but would you like to discuss a math problem instead?”

    “No! My name is Dr. Brown!”

    “Oh, in that case, sure!… blah blah blah, Jackidee smakidee.”

  • Optional · 2 points · 5 months ago

    This doesn’t appear to be true (anymore?).

  • @Dasus@lemmy.world · 1 point · 5 months ago

    I think it’s less of a problem with gendered nouns and much more of a problem with personal pronouns.

    Inanimate objects rarely change their gender identity, so those translations should be more or less fine.

    However, when translating Finnish to English, for instance, you have to render the gender-neutral third-person pronoun as he or she, so you have to make an assumption, or translate it as the clunky slashed “he/she”, which sort of breaks the flow of the text.