Neat, but I don’t think LLMs are the way to go for this sort of thing

BolexForSoup

I don’t mind so long as all results are vetted by someone qualified. Zero tolerance for unfiltered AI in this kind of context.

Skua

If you need someone qualified to examine the case anyway, what’s the point of the AI?

FaceDeer

Why do skilled professionals have less-skilled assistants?

Skua

Usually to do work that needs to be done but does not need the direct attention of the more skilled person. The assistant can do that work by themselves most of the time. In the example above, though, the AI is doing all of the most challenging work, and the doctor is then checking all of it.

BolexForSoup

deleted by creator

Skua

In the example you provided, you’re doing it by hand afterwards anyway. How is a doctor going to vet the work of the AI without examining the case in as much detail as they would have without the AI?

BolexForSoup

Input symptoms and patient info -> it spits out the odds they have x, y, or z -> the doctor looks at that as a supplement to their own work, or to surface unlikely possibilities they haven’t thought of because they’re a bit unusual. Doctors aren’t gods; they can’t recall everything perfectly. It’s as useful as a toxicology report or any other information they get.

I am not doing my edits by hand. I am not using a blade tool and spooling film, and I am not processing it. My computer does everything for me; I simply tell it what to do and it spits out the desired result (usually lol). Without my eyes and knowledge, the inputs aren’t good and the outputs aren’t vetted. With a person, both are satisfied. This is basically how all computer usage works, and AI tools are no different: input -> output, with quality depending on the computer/software and on who is handling it.

TL;DR: Garbage in, garbage out.

The AI can examine hundreds of thousands of data points in ways that a human cannot

Skua

In the test here, it literally only handled text. Doctors can do that. And if you need a doctor to check its work in every case, it has saved zero hours of work for doctors.

Residents need their work checked too. I don’t understand your point.

BolexForSoup

asdfasfasf

Skua

> how high processing power computers with AI/LLM’s can assist in a lab and/or hospital environment

This is an enormously broader scope than the situation I actually responded to, which was LLMs making diagnoses and then having their work checked by a doctor.
