“Gone are the days of data engineers manually copying data around again and again, delivering datasets weeks after a data scientist requests them.” These are Steven Mih’s words about the revolution that artificial intelligence is bringing to the daunting world of big data.
By the time the term “big data” was coined, data had already accumulated massively, with no means of handling it properly. In 1880, the US Census Bureau estimated that it would take eight years to process the data from that year’s census, and predicted that processing the following decade’s census would take more than ten years. Fortunately, in 1881, Herman Hollerith created the Hollerith Tabulating Machine, inspired by a train conductor’s punch card. Although an operator still had to feed data through the machine’s “counters” by hand, it was exponentially faster than manual counting. In 1943, the British built the Colossus, a machine to crack Nazi codes; it scanned 5,000 characters per second, reducing weeks of work to mere hours. These inventions opened the world’s eyes to the manifold benefits of automating the handling of data.
This is exactly what artificial intelligence is meant to do: perform tasks more efficiently by mimicking our abilities to learn and solve problems. As technology advances at breakneck speed, benchmarks that once defined AI become outdated by the day; recognizing text through optical character recognition (OCR), once considered a feat, is now taken for granted as a basic computer function. At the same time, data keeps growing, fuelled by its democratization and the spread of the Internet of Things (IoT). While no technology ever has been, or will be, a magic bullet for industry at large, the leaps and bounds these two worlds are making have forged a synergistic relationship between them: AI is useless without data, and data is insurmountable without AI.