Machine learning has the potential to revolutionize many aspects of our lives, and it's starting to enter game development through IBM's Watson.
For instance, developers could integrate cloud-based AI services into their games to dynamically adjust the difficulty progression curve based on a player's behavior and performance. If the player is zipping through a series of easy puzzles with no problems, then Watson could detect that and quickly move the player to more advanced levels in order to keep the game challenging and interesting.
I got a sampling of how a number of innovative game designers have started to integrate machine learning resources last week at an Intel Buzz Workshop presentation by IBM's Interactive Media CTO George Dolbier. He showed off sample code for integrating Watson into Unity with IBM's Watson Developer Cloud API, and gave a number of use cases for integrating machine learning into VR experiences.
I caught up with George to talk about where machine learning networks can add value, the future of interactive narratives with AI chatbots, and conversational commerce and conversational interfaces in the Experiential Age. Ars Technica recently premiered a sci-fi short film written by a recurrent neural network, and George and I also talk about how AI systems like Watson have the potential to empower humans to do more of what humans do best with our imagination and creativity.
Here’s some of the Unity code that calls the Tradeoff Analytics API as a part of the Watson Developer Cloud.
— KentBye Voices of VR (@kentbye) June 23, 2016
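The Unity sample in the tweet above called the Tradeoff Analytics API, which scored a set of options against competing objectives. As a sketch of the kind of request body such a call would send, here is a "problem" payload assembled in Python. The field names follow the service's dilemma format as I recall it, and the game-themed keys and values are invented for illustration; the service itself has since been retired, so treat the exact schema as an assumption.

```python
import json

# Hypothetical Tradeoff Analytics "problem": which weapon best balances
# maximizing damage against minimizing weight? Keys and values are
# made-up examples; the schema is reconstructed from memory.
problem = {
    "subject": "weapon_choice",
    "columns": [
        {"key": "damage", "type": "numeric", "goal": "max"},
        {"key": "weight", "type": "numeric", "goal": "min"},
    ],
    "options": [
        {"key": "sword",  "values": {"damage": 70, "weight": 20}},
        {"key": "dagger", "values": {"damage": 40, "weight": 5}},
        {"key": "club",   "values": {"damage": 45, "weight": 25}},
    ],
}

# The Unity code would POST this JSON to the service, which returned
# the Pareto-optimal options; here we just serialize the payload.
body = json.dumps(problem)
```

Note that the club is dominated here (less damage and more weight than the sword), which is exactly the kind of option the tradeoff analysis would filter out before presenting choices to the player.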
Here’s a brief marketing video about the Watson Tradeoff Analytics feature that George talks about in the podcast:
I curated a Twitter list of over 100 AI & machine learning experts, and I’ll be tweeting more about AI on a new Twitter account at @VoicesofAI
— KentBye Voices of AI (@voicesofai) June 27, 2016