Samsung has unveiled a prototype for a “digital eye”, built using IBM’s neuromorphic (brain-like) TrueNorth processors.
Unlike most processors, TrueNorth is a manycore design with 4,096 computing cores. Each core simulates 256 neurons, and each neuron can be connected to up to 256 other neurons through artificial “synapses”. The chip is designed to process large amounts of data while drawing as little as 70mW of power.
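To give a sense of that layout, here is a rough sketch in Python of a single core with 256 neurons and a 256×256 synaptic crossbar, using a simple integrate-and-fire update. The parameter names, weights and update rule are illustrative assumptions, not IBM's actual TrueNorth programming model (which uses its own tooling).

```python
import numpy as np

# Illustrative layout only: 4,096 cores, each with 256 neurons and a
# 256x256 synaptic crossbar, as described in the article.
CORES = 4096
NEURONS_PER_CORE = 256

class NeuroCore:
    def __init__(self, threshold=1.0, leak=0.01, seed=0):
        rng = np.random.default_rng(seed)
        # Each neuron receives input from up to 256 axons via the crossbar.
        self.weights = rng.normal(0, 0.1, (NEURONS_PER_CORE, NEURONS_PER_CORE))
        self.potential = np.zeros(NEURONS_PER_CORE)
        self.threshold = threshold
        self.leak = leak

    def step(self, spikes_in):
        """Advance one tick: integrate incoming spikes, leak, fire, reset."""
        self.potential += self.weights @ spikes_in
        self.potential -= self.leak
        fired = self.potential >= self.threshold
        self.potential[fired] = 0.0          # reset neurons that spiked
        return fired.astype(float)           # outgoing spike vector

# Drive one core with a sparse random spike train for a few ticks.
core = NeuroCore()
spikes = (np.random.default_rng(1).random(NEURONS_PER_CORE) < 0.05).astype(float)
for _ in range(5):
    spikes = core.step(spikes)
print("neurons firing on last tick:", int(spikes.sum()))
```

Because each core only updates when spikes arrive, most of the chip can sit idle at any given moment, which is one reason this style of hardware can run on so little power.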
Samsung has integrated the chip into its Dynamic Vision Sensor, which uses the processor to analyse individual camera pixels for changes at an effective rate of around 2,000 frames per second, fast enough to accurately track movement in 3D space. This change-driven approach mirrors the way the human retina responds to its surroundings, earning the sensor its nickname, “digital eye”.
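The core idea of a change-driven sensor can be sketched in a few lines: instead of shipping full frames, only pixels whose brightness changes beyond a threshold generate events. The threshold value and event format below are illustrative assumptions, not Samsung's actual sensor output.

```python
import numpy as np

def pixel_events(prev_frame, curr_frame, threshold=15):
    """Emit (row, col, polarity) events for pixels whose brightness changed
    by more than `threshold`, in the spirit of a dynamic vision sensor that
    reports per-pixel changes rather than full frames."""
    diff = curr_frame.astype(int) - prev_frame.astype(int)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    return [(r, c, 1 if diff[r, c] > 0 else -1) for r, c in zip(rows, cols)]

# Two synthetic 8-bit frames: a bright "object" moves one pixel to the right.
prev = np.zeros((4, 6), dtype=np.uint8)
curr = np.zeros((4, 6), dtype=np.uint8)
prev[1:3, 1:3] = 200
curr[1:3, 2:4] = 200

print(pixel_events(prev, curr))
# Only the changed pixels generate events; unchanged pixels stay silent,
# which keeps the data rate manageable even at very high sampling speeds.
```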
Samsung showcased the digital eye in the form of a TV system that could recognise hand gestures, finger waves and even finger pinches from up to three metres away. The company believes the system could be useful for self-driving cars, 3D mapping and gesture recognition. There is no word yet on whether the digital eye will be released to the consumer market.
Scientists are currently testing the technology for detecting computer attacks, enabling autonomous drone flight and advancing AI research, among other applications.