… on the way to work, I stop by the supermarket for a bag of ground coffee. My wife has a special brand she likes. I run down the aisle scanning what seems to be dozens of varieties and spot my wife’s favorite. Damn. They’re out. I begin to turn away when I see, two shelves down, a blue package. It’s my wife’s favorite brand, a packet that someone had put back in the wrong place! …
… the intelligence analyst sighs and picks up another aerial photograph taken hours ago by a high-flying surveillance drone. Taking a sip from her fourth coffee that morning, she examines it with a magnifying glass. Almost certainly, there will be nothing again, but if there’s the slightest hint of a terrorist training camp or chemical weapons production facility …
The first story above is about something that you and I have probably done in one way or another many times in our lives. It’s not hard. For us, that is. For a computer, that same task of recognizing an item out of its regular position and turned in a different direction is almost unfathomably hard. As good as computers are at solving mathematical equations that would make our heads spin, they are often useless at navigating the real world.
The second story above is but one example of why it’s important for computers to get better at exactly that. An AI system that can be taught to recognize patterns and relationships in noisy, unclear data could reduce the number of pictures on that analyst’s desk from hundreds to tens, allowing her to focus on the photos that might really matter.
The key challenge in creating AI that can handle this task effectively lies in training these systems to learn for themselves from real-world models: essentially, giving them a picture of the world, letting them navigate it, and letting them learn from their mistakes. Unfortunately, we can’t just send them out into the real world to do that, both because we would have to know everything about the world to spot all the mistakes ourselves, and because their mistakes might be costly or dangerous!
What we need are very large, custom-created data sets that we can let the AI stumble through and learn from. We can create these by hand, of course, but that would be a long and laborious process, which also means an expensive one. Further, anything created by people will have its creators’ biases built in by default, meaning it won’t be a good model of the real world.
Or we could build a distributed synthetic data platform that generates huge, custom, hyper-accurate data sets, providing the deep learning environment needed to train the next generation of AI. And this is exactly what the Neuromation project is doing.
Neuromation is seeking to solve the problem of creating these large synthetic data sets by tapping the huge computational power of the individual GPUs currently mining cryptocurrencies across the globe. These miners would be given the option to help create data sets on demand and earn the Neuromation token (Neurotoken, or NTK) in exchange, which they can then trade on the open market for the cryptocurrency of their choice. NTK can in turn be bought on the open market by those wishing to create large data sets for AI deep learning.
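The two-sided flow described above (buyers spend NTK, miners earn it for generating data sets) can be sketched as a simple ledger. This is purely an illustrative model: the `Marketplace` class, its method names, and the amounts are assumptions for the sketch, not the actual Neuromation protocol.

```python
# Illustrative sketch of the NTK marketplace flow described in the article.
# All names and numbers here are hypothetical, not the real protocol.

class Marketplace:
    def __init__(self):
        self.balances = {}  # participant name -> NTK balance

    def credit(self, who, amount):
        """Add NTK to a participant's balance (e.g. bought on the open market)."""
        self.balances[who] = self.balances.get(who, 0) + amount

    def pay_for_job(self, buyer, miner, ntk_cost):
        """Buyer spends NTK on a dataset job; the miner who renders it earns the NTK."""
        if self.balances.get(buyer, 0) < ntk_cost:
            raise ValueError("insufficient NTK")
        self.balances[buyer] -= ntk_cost
        self.credit(miner, ntk_cost)

market = Marketplace()
market.credit("ai_lab", 100)                   # a lab buys 100 NTK on the open market
market.pay_for_job("ai_lab", "gpu_miner", 40)  # a miner generates a data set for 40 NTK
print(market.balances)                         # {'ai_lab': 60, 'gpu_miner': 40}
```

The miner would then trade the earned NTK on an exchange for the cryptocurrency of their choice, closing the loop the article describes.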
Some projects currently underway that would be able to benefit from Neuromation are:
- Project Maven, a US Defense Department project to train and utilize AI under battlefield conditions.
- Innovation DX, a project to help doctors better use imaging to make quick and accurate diagnoses.
- Let’s Enhance, software that will allow for the enhancement of low-resolution photos far beyond what is currently possible.
Neuromation is currently gearing up for an ICO of its NTK token. 1,000,000,000 tokens are planned to be created, with 700,000,000 being distributed to investors and 300,000,000 being reserved for liquidity.
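As a quick sanity check of those allocation figures (reading the investor figure as 700,000,000, since 700,000 plus 300,000,000 would not sum to the stated 1,000,000,000 total):

```python
# Sanity check of the stated NTK allocation.
# Figures are from the article, with the investor number read as 700,000,000
# so the split adds up to the stated 1,000,000,000 total supply.
total_supply = 1_000_000_000
investors = 700_000_000
liquidity = 300_000_000

assert investors + liquidity == total_supply

print(f"investor share:  {investors / total_supply:.0%}")   # 70%
print(f"liquidity share: {liquidity / total_supply:.0%}")   # 30%
```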
Dates of the sale are:
Pre-sale starts: October 25th, 2017 (GMT)
Pre-sale ends: When the ICO begins
Main sale starts: November 28th, 2017 (GMT)
Further, Neuromation has just been awarded a positive rating by ICORating.com, where their review can be read.