Data Required to Train A.I. Falling 50% Every Two Years
AI entrepreneur Ben Vigoda has claimed that the amount of data required to train an artificial intelligence system is falling at a rate of 50% every two years.
The rate has been highlighted for its similarity to Moore’s law – the observation that transistor counts, and with them computing power, double roughly every two years.
One implication of this precipitous fall is that AI may become democratised: if anyone with access to 500 pictures can train a system, then Facebook and the other tech giants lose some of their relative algorithmic advantage.
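To see how quickly such a trend compounds, here is a minimal sketch. The starting figure of one million examples is a hypothetical, chosen purely for illustration – the article does not state a baseline:

```python
def data_required(initial_examples: float, years: float, halving_period: float = 2.0) -> float:
    """Examples needed after `years`, if requirements halve every `halving_period` years."""
    return initial_examples * 0.5 ** (years / halving_period)

# Hypothetical task needing 1,000,000 labelled examples today
for years in (0, 2, 10, 22):
    print(f"after {years:2d} years: ~{data_required(1_000_000, years):,.0f} examples")
```

Under these assumed numbers, the requirement drops to roughly 500,000 examples after two years, about 31,000 after ten, and into the hundreds after two decades – the scale at which "anyone with access to 500 pictures" becomes plausible.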
This may come as welcome news to startups, which will one day be able to access AI insights that only large corporations can benefit from at present.
Some speculate, however, that the particular dynamics of AI will cause the trend to have the opposite effect.
If knowledge is a network, in which expertise in one field can benefit expertise in another, there may still be advantages for big firms.
In this case, having access to a broad range of small data sets may be the way to train a powerful AI.
In any case, the trend will reduce the cost of performing current AI operations.
Rob May, co-founder and CEO of Talla, argues:
“Lowering the cost of producing a smart model of something means we get more smart models of more things, and thus more widely distributed AI. At least in the near term, that should mean big gains for society.”