Driven by technological advances and the rise of machine learning, the volume of data being generated has grown rapidly. Global data production reached 64.2 zettabytes in 2020 and is expected to hit 181 zettabytes by 2025. This growth touches fields such as the physical sciences, computer science, medical sciences, speech recognition, computer vision, and natural language processing, and the resulting large datasets place heavy computational demands on hardware systems.
The processing power needed for modern AI workloads is currently doubling roughly every 3.5 months on average, far faster than the historical pace of hardware improvement. To keep up with this expansion, hardware capacity would have to double at the same rate. One proposed solution is to increase the dimensionality of the data that such hardware can process in parallel. Although multiplexing in space and wavelength has been used to handle two-dimensional data, a hardware implementation of three-dimensional processing has been missing.
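As a rough illustration of that gap (a back-of-the-envelope calculation, not a figure from the research; the 24-month doubling time assumed for conventional hardware is a generic Moore's-law-style estimate):

```python
# Toy arithmetic: how a 3.5-month doubling time compounds over one year,
# compared with an assumed ~24-month doubling time for conventional hardware.
def growth_factor(doubling_months: float, horizon_months: float) -> float:
    """Multiplicative growth over the horizon for a given doubling time."""
    return 2.0 ** (horizon_months / doubling_months)

ai_demand = growth_factor(doubling_months=3.5, horizon_months=12)    # ~10.8x per year
conventional = growth_factor(doubling_months=24.0, horizon_months=12)  # ~1.4x per year

print(f"AI compute demand per year:     {ai_demand:.1f}x")
print(f"Conventional hardware per year: {conventional:.1f}x")
print(f"Gap after one year:             {ai_demand / conventional:.1f}x")
```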
Consequently, researchers from the Universities of Oxford, Muenster, Heidelberg, and Exeter have developed photonic-electronic hardware that can process three-dimensional (3D) data, considerably improving the parallelism of data processing for artificial intelligence (AI) tasks.
The researchers used radio-frequency modulation to add a further degree of parallelism to photonic communications, effectively giving the data an extra dimension. They achieved this by combining wavelength multiplexing with non-volatile memories distributed in space. With this system they reached a parallelism of 100, roughly two orders of magnitude higher than techniques that exploit only the spatial and wavelength dimensions.
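A minimal sketch of the dimension-counting idea: parallelism is the product of the independent physical channels, so adding a radio-frequency dimension multiplies, rather than adds to, the channel count. The specific split below (4 × 5 × 5) is an assumption chosen only to illustrate a parallelism of 100, not the paper's exact configuration.

```python
# Illustrative channel counts (assumptions, not the reported hardware layout).
n_space = 4        # spatially separated inputs (e.g., separate waveguides)
n_wavelength = 5   # wavelength-multiplexed channels per input
n_rf = 5           # radio-frequency tones per wavelength (the added dimension)

channels_2d = n_space * n_wavelength           # space + wavelength only
channels_3d = n_space * n_wavelength * n_rf    # with RF modulation added

print(f"2D (space x wavelength):      {channels_2d} parallel channels")
print(f"3D (space x wavelength x RF): {channels_3d} parallel channels")
```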
The research team has since advanced this work by adding a further parallel dimension to the processing capacity of their photonic matrix-vector multiplier chips. By encoding the data onto multiple radio frequencies, this improvement, known as higher-dimensional processing, raises parallelism to a level surpassing their prior results.
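As a software analogy (this mirrors only the mathematics, not the photonic implementation, and the port and channel counts are assumptions), the effect of the extra dimension is that one stored weight matrix is applied to many multiplexed input vectors in a single pass, much like a batched matrix-vector product:

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_outputs = 8, 8   # assumed port counts, for illustration only
n_channels = 100             # parallel channels (e.g., wavelength x RF tones)

W = rng.normal(size=(n_outputs, n_inputs))   # weights held in the chip's memories
X = rng.normal(size=(n_inputs, n_channels))  # one input vector per multiplexed channel

# One batched multiply stands in for the chip performing all 100
# matrix-vector products simultaneously in the analog domain.
Y = W @ X
print(Y.shape)  # (8, 100): one output vector per parallel channel
```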
To test the hardware in a real-world setting, the team assessed the risk of sudden death in patients with heart disease by examining electrocardiograms (ECGs). Analyzing 100 ECG recordings simultaneously, the system identified patients at risk of sudden death with 93.5% accuracy.
The researchers also assert that, with only a modest increase in the number of inputs and outputs, this approach could outperform state-of-the-art electronic processors. That scalability could translate into a roughly 100-fold improvement in compute density and energy efficiency.
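A hedged back-of-the-envelope sketch of why modest input/output scaling compounds so strongly: operations per pass grow with the product of port counts and parallel channels, so increases in each dimension multiply together. The port and channel numbers below are illustrative assumptions, not figures reported by the team.

```python
# Operations performed in one parallel pass of a crossbar-style multiplier.
def ops_per_pass(n_in: int, n_out: int, n_channels: int) -> int:
    """Multiply-accumulate operations in a single pass."""
    return n_in * n_out * n_channels

baseline = ops_per_pass(n_in=8, n_out=8, n_channels=1)      # single-channel chip
scaled = ops_per_pass(n_in=16, n_out=16, n_channels=100)    # more ports + 3D multiplexing

print(f"Baseline ops per pass: {baseline}")
print(f"Scaled ops per pass:   {scaled}")
print(f"Gain:                  {scaled / baseline:.0f}x")
```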
Check out the paper. All credit for this research goes to the researchers on this project.