You've been diagnosed with terminal projected gradient descent

Subject: General Tech | April 20, 2018 - 01:10 PM |
Tagged: security, scary, health, PGD

Researchers have demonstrated that a projected gradient descent (PGD) attack can fool medical imaging systems into seeing things that are not there.  A PGD attack subtly perturbs the pixels of an image to convince an image recognition model to falsely identify the presence of something, in this case in medical scans.  The researchers successfully fooled classifiers in three tests, a retinal scan, a chest X-ray, and a dermatological scan for cancerous moles, regardless of their level of access to the underlying model.  Take a look over at The Register for more information on this specific attack as well as the general vulnerability of image recognition software.
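To make the mechanism concrete, here is a minimal sketch of an L-infinity PGD attack. It uses a toy logistic-regression "classifier" in plain NumPy as a stand-in for a medical imaging model (the real study attacked deep networks); the function and parameter names are illustrative, not from the paper. The core idea is the same: repeatedly step the input in the direction that increases the model's loss, then project it back into a small epsilon-ball around the original image so the change stays imperceptible.

```python
import numpy as np

def pgd_attack(x, y, w, b, eps=0.1, alpha=0.02, steps=20):
    """L-infinity PGD against a logistic-regression model (illustrative).

    x     -- original input ("image" pixels in [0, 1])
    y     -- true label (0.0 or 1.0)
    w, b  -- weights and bias of the toy classifier
    eps   -- maximum per-pixel perturbation (the projection radius)
    alpha -- step size per iteration
    """
    x0 = x.copy()
    x_adv = x.copy()
    for _ in range(steps):
        # Forward pass: p = sigmoid(w.x + b), binary cross-entropy loss.
        z = x_adv @ w + b
        p = 1.0 / (1.0 + np.exp(-z))
        # Gradient of the loss w.r.t. the input (sigmoid + BCE simplifies to this).
        grad = (p - y) * w
        # Ascend the loss, then project back onto the eps-ball around x0
        # and onto the valid pixel range.
        x_adv = x_adv + alpha * np.sign(grad)
        x_adv = np.clip(x_adv, x0 - eps, x0 + eps)
        x_adv = np.clip(x_adv, 0.0, 1.0)
    return x_adv
```

With a small enough eps the perturbed image looks unchanged to a human, yet the classifier's output can be pushed across the decision boundary, which is exactly the failure mode demonstrated against the medical models.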


"Medical AI systems are particularly vulnerable to attacks and have been overlooked in security research, a new study suggests."


Source: The Register

May 1, 2018 | 09:46 AM - Posted by ET3D

Under what circumstances can such an attack happen? I mean, if you have the level of access that allows you to modify a medical image, you could do pretty much anything to it; there's no particular reason for an attack that fools AI only.
