December 5, 2016 | Written by: Roger Strukhoff
Categorized: OpenPOWER | Power servers | Power Systems
Cognitive computing, artificial intelligence, machine learning and deep learning are creating a lot of excitement in enterprise IT. However, it is continuous improvement in hardware infrastructure that has made software breakthroughs in these areas possible.
OpenPOWER in action
I was reminded of this as I watched a project demonstration of IBM OpenPOWER hardware working with Docker and TensorFlow software to process medical images on a large scale as part of the medical industry’s fight against cancer. The work was overseen by Indrajit Poddar, a senior technical staff member at IBM, and Andrei Yurkevich, CTO of Altoros.
Docker is a well-known open-source software container platform that has gained great traction over the past year. It allows developers to package software into a complete file system, including code, runtime, tools and libraries, and run it in any environment without modification. TensorFlow is a breakthrough software library that Google developed for numerical computation. Google made it an open-source project earlier this year.
In the project demo I saw, Docker and TensorFlow were powered by as many as eight GPUs on an OpenPOWER system to examine 100,000 large medical images. The goal was to see how well TensorFlow, running within Docker containers, could analyze potential cancer diagnoses compared with results previously determined by cancer specialists.
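The scale of that workload, 100,000 images spread across up to eight GPUs, hints at the basic data-parallel pattern involved. The sketch below is purely illustrative (it is not the project's actual code, and `shard_workload` is a name invented for this example): it shows how such a batch might be divided into near-equal contiguous shards, one per device.

```python
def shard_workload(num_items, num_devices):
    """Split num_items into num_devices near-equal contiguous shards.

    Returns a list of (start, end) index ranges, one per device.
    """
    base, extra = divmod(num_items, num_devices)
    shards = []
    start = 0
    for device in range(num_devices):
        # The first `extra` devices absorb one leftover item each.
        size = base + (1 if device < extra else 0)
        shards.append((start, start + size))
        start += size
    return shards

# At the demo's scale, each of the eight GPUs would receive a
# contiguous 12,500-image range.
shards = shard_workload(100_000, 8)
print(shards[0])   # (0, 12500)
print(shards[-1])  # (87500, 100000)
```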
The project used learning algorithms to build a system that can detect recurring instances of cancer in images that would normally be reviewed by doctors. The side-by-side image below compares how a doctor sees the image (left) with how the machine sees it (right).
With the assistance of the GPUs, TensorFlow completed its analysis in only 90 minutes and came within 0.5 percent of the results found by doctors. Of course, the project is not meant to replace doctors, but to deliver a transformational tool that accomplishes large, complex tasks with astonishing speed. Doctors and other medical professionals on their teams could use it to study many images quickly, then drill down to determine next steps with individual patients. Check out the technical recap.
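To make the "within 0.5 percent" comparison concrete, here is a minimal, hypothetical illustration of that kind of side-by-side evaluation. The labels below are made up, and `agreement_rate` is a name chosen for this sketch, not a function from the project; the point is only how a model's per-image calls can be scored against specialists' previously recorded diagnoses.

```python
def agreement_rate(model_calls, doctor_calls):
    """Fraction of images on which the model's call matches the doctors'."""
    assert len(model_calls) == len(doctor_calls)
    matches = sum(m == d for m, d in zip(model_calls, doctor_calls))
    return matches / len(model_calls)

# Toy data: 1 = cancer detected, 0 = not detected.
doctor_calls = [1, 0, 1, 1, 0, 0, 1, 0]
model_calls  = [1, 0, 1, 1, 0, 0, 1, 1]  # disagrees on the last image

print(f"{agreement_rate(model_calls, doctor_calls):.1%}")  # 87.5%
```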
New generations of transformation
You can imagine that similar tests, in healthcare and in other compute-intensive industries such as energy exploration, logistics and transportation, meteorology, aerospace and economic analysis, will yield generations of helpful tools that address big, difficult problems.
Enterprises and industries can be transformed not only to achieve new efficiencies but also, as in the project described here, to save lives.
A key point is the use of open-source technology, both hardware and software, throughout this project. Open-source initiatives benefit mightily from large, global communities that develop them under the governance of their peers. Open-source projects have a transparency that proprietary technology will never match.
To check out the technologies used in this project, head to OpenPOWER, Docker, TensorFlow and the TensorFlow GitHub page, where you can also find and join their communities.