As a medical doctor I make extensive use of digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure, this process is soon to be replaced by AI-powered transcription, trained on each doctor’s voice. As I understand it, the resulting model is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably already enough recordings of me on the Internet to recreate my voice, but that’s beside the point. Also, the question is about educating them, not a legal one.

How do I present my case? I’m not willing to have a non-local AI transcribe my voice, but I don’t want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a “cloud solution”. Unfortunately they are totally ignorant of the field of technology, so the explanations/examples need to translate for a layperson.

  • @Spyder@lemmy.ml

    Do your patients know that their information is being transcribed in the cloud, which means it could potentially be hacked, leaked, tracked, and sold? How does this foster a sense of distrust, and harm the patients’ progress?

    Could you leverage this information, and the possibility of being sued if information is leaked, with the bureaucrats?

  • Lath

    Dunno, maybe collect news reports of every private digital data leak in recent years and show how unsafe it really is?

  • macniel

    Shouldn’t that be a HIPAA violation? Like, you can’t in good conscience guarantee that the patient data isn’t being used for anything but healthcare.

    • @FlappyBubble@lemmy.mlOP

      My question is not a legal one. There are probably legal obstacles for my hospital in this case, but HIPAA is not applicable in my country.

      I’d primarily like to get your opinions on how to effectively present my case to my bosses against using a non-local model for this.

      • 520

        Look to your local health privacy laws. Most countries control this so tightly that such a use of AI is illegal.

        Your question is not a legal one, but a legal argument can be a very persuasive one.

    • @Szymon@lemmy.ca

      It is until they prove it isn’t, which they might not be able to do. Many trusted 23andMe, only to see their private data stolen. Make the company prove the security measures in place and the methods ensuring privacy, because you’ll essentially be liable for any failures of the system that stem from a lack of due diligence.

      • @lewdian69@lemmy.world

        Voice-recognition dictation has been used in the medical field for over a decade, probably even longer. My regional health system of multiple hospitals and clinics has been using an electronic solution, like Dragon dictation, since at least 2012. Unfortunately, in this case OP is being overly paranoid and behind the times. I’m all for privacy, but the HIPAA implications have already been well sorted out. They need to either learn to type faster or use the system provided, which will increase their productivity and save the health system an FTE that used to be spent on their transcriptionist and can now be used more directly to care for patients.

        • Boozilla

          It is true that Dragon and similar apps have been used for years. But I don’t think it’s fair to say OP is being paranoid and a Luddite. Data breaches in the cloud are a weekly occurrence, and OP wanting to protect their voice/biometrics is not foolish; it’s smarter than the average bear. You can change a compromised password. You can’t change your biometrics or voice.

          Also, those products were used on local networks for many years before they entered the cloud. They gradually reduce our privacy over time, getting people numb to it.

          • @Szymon@lemmy.ca

            I think the issue is more that you’re sending confidential health data to a third party, which is where you lose control. You don’t know the intentions of people looking to steal that data, and you need to consider the worst possible outcomes and guard against them. AI training is just one option. Get creative: what could you do with a doctor’s voice and their patients’ private medical histories?

            The simplest solution is to stop the arrangement until the company can prove data security on their end, or to implement an offline solution on local servers not connected to the internet.

  • DontMakeMoreBabies

    You’re going to lose this fight. Admin types don’t understand technology and, at this point, I imagine neither do most doctors. You’ll be a loud minority because your concerns aren’t concrete enough and ‘AI is so cool. I mean it’s in the news!’

    Maybe I’m wrong, but my organization just went full ‘we don’t understand AI so don’t use it ever,’ which is the other side of the same coin.

    • @FlappyBubble@lemmy.mlOP

      I understand the fight will be hard, and I’m not getting into it if I can’t present something they will understand. I’m definitely in a minority, both among the admin staff and my peers, the doctors. Most are totally ignorant of the privacy issue.

  • 7heo

    I would have work sign a legal waiver stating that, from the moment I use the technology, none of the recordings or transcriptions of me can be used to incriminate me in case of alleged malpractice.

    In fact, since both are generated, or can be generated, in a way that sounds very assertive while also introducing incredibly wild mistakes, in a potentially life-and-death situation, I would have them legally recognise that this potentially nullifies my work, and have them take the entire legal responsibility for it.

    As you can see in the recent example involving Air Canada, a policy was invented out of thin air, and that policy is now costing the company. In the case of a doctor, if the wrong sedative or the wrong medication were administered, or if the wrong diagnosis was communicated to the patient, etc., all of that could have serious consequences.

    All while sounding like you (using your phrasings, etc.) and being extremely assertive.

    A human doing that job will know not to deviate from the recording. An AI? “Antihistaminic” and “antiasthmatic” aren’t too far off, and that’s just one example off the top of my head.
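
    To make that concrete for a lay audience, here’s a minimal sketch in plain Python of how few single-character edits separate confusable medical terms. The word pairs are illustrative examples I picked, not taken from any real transcription error log:

    ```python
    # A minimal sketch: edit distance between confusable medical terms.
    # The pairs below are illustrative examples, not real ASR error data.

    def levenshtein(a: str, b: str) -> int:
        """Classic dynamic-programming edit distance."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, start=1):
            curr = [i]
            for j, cb in enumerate(b, start=1):
                cost = 0 if ca == cb else 1
                curr.append(min(prev[j] + 1,          # deletion
                                curr[j - 1] + 1,      # insertion
                                prev[j - 1] + cost))  # substitution
            prev = curr
        return prev[-1]

    pairs = [("antihistaminic", "antiasthmatic"),
             ("hydroxyzine", "hydralazine")]
    for a, b in pairs:
        print(f"{a} / {b}: {levenshtein(a, b)} edits apart")
    ```

    A handful of edits, or one mumbled syllable, is all that separates two very different treatments; a human transcriptionist flags the ambiguity, an AI just confidently picks one.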

  • @BurningRiver@beehaw.org

    I would suggest that the first action item would be to ask, in writing, for 1) the data protection policy and 2) the privacy policy. I would then either pick them apart, or find someone who works in cybersecurity (or the right lawyer) to do that. I’ve done it a few times and talked my employer out of a few dodgy products, because the policies clearly tried to absolve the vendor of any potential liability. Now, whether the policies truly limit liability would have to be tested in court.

    You could also talk about how data protection, encryption, identity and access management, and governance are actually really expensive, but I’d start by poking holes in the actual policies to create doubt.

  • Boozilla

    Will they allow you to use your own non-cloud solution? As long as you turn in text documents and they don’t have to pay a person to transcribe, they should be happy. There are a number of speech-to-text apps you can run locally on a laptop, phone, or tablet.
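
    As one hedged example (an assumption on my part, not something OP’s IT would necessarily allow): the open-source Whisper model runs fully offline once its weights are downloaded, so the audio itself never leaves the device. A minimal sketch:

    ```python
    # A minimal sketch of local speech-to-text with the open-source
    # openai-whisper package (pip install openai-whisper; needs ffmpeg).
    # "dictation.wav" is a placeholder filename.
    import whisper

    model = whisper.load_model("base")          # small model, CPU is fine
    result = model.transcribe("dictation.wav")  # all processing happens locally
    print(result["text"])                       # hand in the text, keep the audio
    ```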

    But of course, it’s sometimes about control and exercising their corporate authority over you. Bosses get off on that shit.

    Not sure which type of doctor you are, but there’s a general shortage of NPI people. I hope you can fight back with some leverage. Best of luck.

    • @FlappyBubble@lemmy.mlOP

      It will not be possible to use my own software. The computer environment is tightly controlled. If this is implemented, my only input method for the medical records will be the AI transcriber (stupidity).

      I’m a psychiatrist in the field of substance abuse and withdrawal. Sure, there’s a shortage of us too, but I want the hospital to understand the problem, not just me getting to keep an old-school secretary by threatening to go to another hospital.

      • Boozilla

        I was afraid that might be the case. Was hoping they would let you upload the files as if you had typed them yourself.

        Maybe find some studies/articles on transcription bots getting medical terminology and drug names wrong. I’m sure that happens. AI is getting scary good, but it’s far from perfect, and this is potentially a low-probability-but-dangerous-consequences kind of scenario. Unfortunately, the marketers of this software probably have canned responses to these types of concerns. Management is going to hear what they want to hear.

        • @FlappyBubble@lemmy.mlOP

          Thanks for the advice, but I’m not against AI models transcribing me, just a cloud model specifically trained on my voice without any control by me. A local model, or preferably a general local model, would be fine. What makes me sad is that the people behind this are totally ignorant of the problem.

          • Boozilla

            I understand, and we’re basically on the same page. I’m not fully anti-AI, either. Like any tool, it can be used for good or evil. And you are right to have concerns about data stored in the cloud. The tech bros will mock you for it and then… oh look, another data breach, has it been five minutes already? :)

            • @FlappyBubble@lemmy.mlOP

              Yes, I agree. Broadening the scope a little, frankly I’m just waiting for a big leak of medical records. The system we use is a bird’s nest of different software, countless APIs, and all sorts of database backends. Many systems stem from MS-DOS, just embedded in a slightly more modern integrated environment. There are just so many flaws that I’m amazed a leak hasn’t happened (or at least surfaced) yet.

  • @tonyn@lemmy.ml

    Stop using the digital voice recorder and type everything yourself. This is the best way to protect your voiceprint in this situation. It doesn’t work well as a protest or to educate your colleagues, but I suppose that’s one thing you can use your voice for. Since AI transcription is a cost-saving measure, there will be nothing you can do to stop its use. No decision maker will choose the more expensive option with a higher error rate on morals alone.

    • @FlappyBubble@lemmy.mlOP

      Unfortunately the interface of the medical records system will be changed when this is implemented. The keyboard input method will be entirely removed.

  • Norgur

    Okay, so two questions:

    1. Are you in a country that falls under the GDPR?
    2. Is this training data to be per person?
    • @FlappyBubble@lemmy.mlOP

      I work in Sweden, and it falls under the GDPR. There are probably GDPR implications, but as I wrote, the question is not a legal one. I want my bosses to be aware of the general issue, as this is but the first of many similar problems.

      The training data is to be per person, resulting in a model tailored to every single doctor.

      • Norgur

        I think you can use the GDPR to your advantage here. Someone has to have tried this, right? So they could put in a GDPR request, demanding all the data stored about them.

  • The Doctor

    The personalized data model will be trained on your voice. That means that it’s going to be trained on a great deal of patient medical history data (including PII). That means it’s covered by HIPAA.

    I strongly doubt the service in question meets even the most minimal of requirements.

  • @datavoid@lemmy.ml

    Personally, I’d be more worried about leaking patient information to an uncontrolled system than having a voice model made.

    • @FlappyBubble@lemmy.mlOP

      That’s another issue, and it doesn’t lessen the importance of this one. Both are important, but separate: one is about patient data, the other about my voice model. Also, in this case I have no control over the medical records, and they’re already stored outside the hospital.

  • umami_wasabi

    So what’s your concern? I’m a bit confused.

    1. Using cloud to process patient data? Or,
    2. Collecting your voice to train a model?
    • DessertStorms

      Yeah, I’d be sooooo confident and reassured if I knew my doctor was prioritising the security of their voice over the security of my information… /s

      (yes, it can be both, but this post doesn’t seem at all concerned with one, and entirely with the other)

  • Boozilla

    I had another idea. You might be able to use something that distorts your voice so that it doesn’t sound anything like you, but the AI can still transcribe it to text. There are some cheap novelty devices on Amazon that do this, and also some more expensive pro audio gear that does the same thing. Just a thought.
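
    The same idea works in software if a hardware box isn’t allowed. A rough sketch, assuming librosa and soundfile are installed (filenames are placeholders, and whether transcription accuracy survives the shift is something you’d have to test):

    ```python
    # A rough sketch: pitch-shift a dictation so it no longer sounds
    # like you, while hopefully staying intelligible to the transcriber.
    import librosa
    import soundfile as sf

    y, sr = librosa.load("dictation.wav", sr=None)              # keep original sample rate
    shifted = librosa.effects.pitch_shift(y, sr=sr, n_steps=4)  # shift up 4 semitones
    sf.write("dictation_shifted.wav", shifted, sr)
    ```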

    • @FlappyBubble@lemmy.mlOP

      Sure but what about my peers? I want to get the point across and the understanding of privacy implications. I’m certain that this is just the first of many reforms without proper analysis of privacy implications.

      • Boozilla

        I agree that getting the point across and having them rethink this whole thing is a much better way of handling this than using a tech solution. I’m just pessimistic that you can change their minds, and you might need a plan B.

  • Gaia [She/Her]

    Unfortunately a guy I know works for a gov hospital and they’ve used such technology for over a decade at this point. It seems unavoidable.

  • ZILtoid1991
    1. Go to the Minecraft servers of OpenAI and similar corporations.
    2. Find a room called “AI server room”, all while avoiding or defeating the mobs protecting the area.
    3. Destroy everything there.
    4. Go to the offices.
    5. Destroy everything there.