In March 2018, IBM announced a significant partnership with Apple that brings together Apple's Core ML technology and IBM Watson's AI services to help developers more easily create AI-powered iOS apps.
The benefits of this partnership were the focal point of IBM's challenge at the TechCrunch Hackathon at VivaTech 2018. The 24-hour coding event brought together hundreds of developers, technologists, engineers and specialists in other fields from around the world to create new solutions aimed at making the world a better place, using AI.
Sixty-four teams competed across five challenges, each sponsored by a different organization. Each challenge had a grand prize of 5,000 euros from its sponsoring company, and TechCrunch awarded another 5,000 euros to the top team overall.
All 12 teams that participated in the IBM challenge created working prototypes within the 24 hours, which is a phenomenal achievement. Keep in mind that not every hackathon results in working prototypes. This reflects the ease of use of the code patterns and the potential of Watson when paired with Apple's Core ML.
Creating AI-powered iOS apps
The IBM challenge called on the hackers to build AI-powered apps for iOS, showcasing the development power of both the talented participants and the AI solutions used to build intelligent technologies of the future. Those participating in the IBM challenge had access to IBM "code patterns" that provided all the parts and instructions they needed to get started creating their apps.
In addition to the available code patterns, our team at IBM decided to challenge ourselves to create and publish our own iOS app using the two technologies. Members of my team, including David Okun, Anton McConville, and Sanjeev Ghimire, created WatsonML, a timed object identification app using visual recognition, which showcased what can be created with Watson and Core ML technologies. We released the code in the form of a code pattern in time for the event, and many developers ultimately used it as a model for their own creations.
One of the benefits of creating an app with Watson and Core ML is that it can operate fully offline. Because the trained model runs on the device rather than constantly communicating with the cloud, people can use the visual recognition software as a communication tool without needing access to the internet, a critical function for those in underdeveloped areas or dealing with the aftermath of natural disasters.
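For readers curious how on-device inference looks in practice, here is a minimal sketch using Apple's Vision and Core ML frameworks. The `WatsonModel` class is a hypothetical placeholder: in a real project you would train a model (for example, with Watson Visual Recognition), export it to Core ML format, and add the .mlmodel file to Xcode, which generates a class like it.

```swift
import UIKit
import Vision
import CoreML

// A sketch of on-device image classification with Core ML and Vision.
// "WatsonModel" is a hypothetical generated model class, not a real API;
// substitute the class Xcode generates from your exported .mlmodel file.
func classify(image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: WatsonModel().model) else {
        completion(nil)
        return
    }

    // The request runs entirely on the device: no network round trip,
    // which is what makes fully offline operation possible.
    let request = VNCoreMLRequest(model: model) { request, _ in
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Since the model file ships inside the app bundle, classification keeps working with no connectivity at all.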
How teams accelerated development to create working prototypes
An impressive aspect of the hackers' performance in the IBM challenge at the TechCrunch Hackathon was their ability to create a working prototype for their final pitches.
These prototypes represented a wide range of applications, all using Watson and Core ML technologies that could operate offline. Among the solutions were:
- AID, which can identify early signs of a possible stroke based on facial cues using visual recognition.
- Check My Skin, an app that quickly gives a first assessment of whether a mole is a melanoma, with 90 percent accuracy.
- Help, Watson, an app that determines which hospital you should go to based on traffic, wait times and other variables impacting how quickly one can receive help.
- Joiyce, a solution that empowers sign language speakers to communicate in real time with others who use a different sign language or a spoken language.
- Rescop, an app that uses drones and visual recognition to locate people who need rescue assistance, like those who are stranded on their roofs after flooding from a hurricane.
The IBM challenge winner was Joiyce, whose motto was "Giving a voice to those who want to be heard."
Using the TechCrunch Hackathon as a template for building apps to benefit business and society
One striking feature of the event was how diverse the teams were: a wide range of ages and ethnicities were present at the event, as well as significant gender diversity.
Seeing the turnout and results of the TechCrunch Hackathon at VivaTech 2018, I'm optimistic about the future of technology, inspired by the people creating it, and excited to see how it will improve the world.
The winning IBM challenge team, Joiyce, making their pitch.
The fast-paced nature of the event should encourage developers and consumers alike about technology's potential to change the world for the better. If these functioning apps were created in 24 hours, imagine what can be done in a month or a year, when we have more time to invest in our endeavors.
At the event — and in general — there was so much interest in and excitement around the development opportunities created by AI. Someone even approached me asking if we could do a workshop at the United Nations to educate their team on the power behind the tech to elicit meaningful change.
It's worth noting that this recent wave of AI and intelligent technologies seems to be driving a greater emphasis on using technology to create social good. While the tech industry has something of a reputation for being self-serving and capitalistic, I think most people in it want to give back, even if they don't necessarily know how.
One way IBM is addressing this is through the Call for Code. Natural disasters have been growing in severity and frequency over the past decade. While we cannot stop them from happening, we can use technology to lessen their impact. We're encouraging developers worldwide to use their skills and our technology to help solve the problems natural disasters create.
VivaTech functions as an annual showcase of futuristic technologies poised to change the world, and this year, the TechCrunch Hackathon seemed to emphasize the goal of improving society through technology. It's exciting to see what we can do with teamwork, brilliant minds, and access to the world's best technology solutions. I look forward to seeing all the ways today's technology could soon improve lives around the world.
This article was originally published on Mobile Business Insights.