I had meant to post a technical article today on a new SQL Server 2014 feature that could help with troubleshooting performance problems, but I ran into an environmental snag. So look for that in the coming weeks.

Instead I thought I would write a quick post sharing an article on how Google’s DeepMind AI is using eye scans to spot diseases earlier.


Right now AI companies are being bought up left and right. I can see a time in the next 10-15 years when a dozen players, such as Google and IBM, have developed APIs for their AI systems that vendors can plug into, using those diagnostic capabilities in interesting new ways.

What will have to happen before people accept diagnosis and care from online avatars driven by these AIs?

What can we do to utilize these AIs to lower out-of-pocket costs and reduce the number of appointments and office visits for patients?

What can we build to notify doctors when an interactive AI has produced a diagnosis that an actual doctor should follow up on?
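To make that last question concrete, here is a minimal sketch of the kind of triage rule such a system might apply. Everything in it, the `AiDiagnosis` fields, the severity labels, and the confidence threshold, is a hypothetical illustration, not any real vendor's API:

```python
# Hypothetical sketch: deciding when an AI-generated diagnosis should
# be escalated to a human doctor. All names and thresholds here are
# invented for illustration -- no real vendor API is implied.

from dataclasses import dataclass

@dataclass
class AiDiagnosis:
    patient_id: str
    condition: str        # condition the AI identified
    confidence: float     # model confidence, 0.0 - 1.0
    severity: str         # "low", "moderate", or "high"

def needs_doctor_follow_up(dx: AiDiagnosis,
                           confidence_floor: float = 0.85) -> bool:
    """Flag a diagnosis for human review when the AI is unsure,
    or when the finding is serious regardless of confidence."""
    if dx.severity == "high":
        return True                          # always escalate serious findings
    return dx.confidence < confidence_floor  # escalate low-confidence calls

# A confident, low-severity result could stay with the AI avatar;
# a high-severity result is routed to a doctor no matter what.
routine = AiDiagnosis("p-001", "mild dry eye", 0.93, "low")
urgent = AiDiagnosis("p-002", "diabetic retinopathy", 0.97, "high")

print(needs_doctor_follow_up(routine))  # False
print(needs_doctor_follow_up(urgent))   # True
```

The real design work would be in choosing those thresholds, and in deciding who is accountable when the rule gets it wrong.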

How can we balance patient privacy concerns with delivering personalized actionable information?

The next generation of medical AI is going to be pretty amazing. Most people already use websites and research they find online to question or confirm their doctor's diagnosis and advice. However, creating that same trust between patient and computer is going to be tricky.


What do you think about the evolution of healthcare AI?

Let us know in the comments below.