io.net Adds Apple Silicon Chips Support for AI and ML Computing

The recently launched decentralized physical infrastructure network (DePIN) io.net plans to integrate Apple silicon chips into its artificial intelligence (AI) and machine learning (ML) offerings.

io.net's Solana-based decentralized network powers machine learning and artificial intelligence computations using GPU processing power sourced from dispersed data centers, cryptocurrency miners, and decentralized storage providers.

io.net announced its beta platform launch and a new partnership with Render Network in November 2023 at the Solana Breakpoint conference in Amsterdam.

With its most recent upgrade, io.net claims to be the first cloud service to support clustering of Apple silicon chips for machine learning. Engineers around the world can now pool Apple chips for machine learning and artificial intelligence computing.
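io.net has not published the details of its clustering stack, so the following is only a rough, hypothetical sketch of what pooling several Apple machines for an ML workload can look like in general, using PyTorch's standard distributed package with the CPU-friendly gloo backend; none of the names or settings here come from io.net.

```python
# Hypothetical sketch: several Apple machines cooperating on one ML workload.
# This uses PyTorch's generic "gloo" backend, not io.net's software; each Mac
# would set RANK, WORLD_SIZE, MASTER_ADDR, and MASTER_PORT before launching.
import torch
import torch.distributed as dist

def main() -> None:
    # Join the cluster described by the environment variables above.
    dist.init_process_group(backend="gloo", init_method="env://")
    rank, world = dist.get_rank(), dist.get_world_size()

    # Each node contributes a local result; all_reduce sums it cluster-wide,
    # the basic communication primitive behind distributed training and
    # batched inference.
    local = torch.full((4,), float(rank))
    dist.all_reduce(local, op=dist.ReduceOp.SUM)
    print(f"node {rank}/{world} sees combined tensor {local.tolist()}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

In a real deployment a scheduler would also shard the model or the input batch across nodes; the sketch only shows the communication step.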

io.net has previously focused on making inexpensive GPU computing resources available for AI and ML use cases. To pay its GPU and CPU computing providers, the platform uses Solana's blockchain.

Tory Green, COO of io.net, says Solana's architecture is well suited to the volume of transactions and inferences io.net will enable: putting the clustered GPU computing power the infrastructure sources to work involves thousands of inferences, each generating related microtransactions.
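io.net's actual billing and settlement code is not public; the toy sketch below only illustrates the scale argument Green is making, namely that metering every inference produces a large number of tiny per-provider payouts that would ultimately settle on Solana. All names, prices, and data structures are invented for the example, and no on-chain transfer is performed.

```python
# Hypothetical metering sketch: each batch of inferences served by a provider
# is recorded and priced, producing many small per-provider payouts.
from collections import defaultdict
from dataclasses import dataclass

PRICE_PER_INFERENCE = 0.0001  # assumed price, in an arbitrary token unit

@dataclass
class InferenceJob:
    provider: str   # address of the GPU/CPU provider that served the batch
    count: int      # number of inferences served in this batch

def settle(jobs: list[InferenceJob]) -> dict[str, float]:
    """Aggregate per-provider payouts; a real system would pay these on-chain."""
    payouts: dict[str, float] = defaultdict(float)
    for job in jobs:
        payouts[job.provider] += job.count * PRICE_PER_INFERENCE
    return dict(payouts)

# Thousands of small jobs quickly add up to many micro-payouts per round.
jobs = [InferenceJob(provider=f"provider_{i % 50}", count=100) for i in range(5000)]
print(len(settle(jobs)), "providers to pay this settlement round")
```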

With the upgrade, io.net users can contribute computing power from a wide range of Apple silicon chips: the M1, M1 Pro, M1 Max, and M1 Ultra; the M2, M2 Pro, M2 Max, and M2 Ultra; and the M3, M3 Pro, and M3 Max.

According to io.net, Apple's M3 processors, which support up to 128 GB of unified memory, surpass the 80 GB memory capacity of Nvidia's top-of-the-line A100 graphics cards. The company also says the upgraded neural engine in the M3 series is 60% faster than in the M1 series.

Model inference, which involves feeding real-time data into an AI model to generate predictions or solve problems, is another area where these processors shine thanks to their unified memory architecture; a brief sketch of such on-device inference appears after the quote below. According to io.net founder Ahmad Shadid, support for Apple chips could help computing supply keep pace with the growing demand for machine learning and artificial intelligence compute.

The founder stated:

“This is a massive step forward in democratizing access to powerful computing resources, and paves the way for millions of Apple users to earn rewards for contributing to the AI revolution.”
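As a rough illustration of the on-device model inference described above, the sketch below runs a small network on an Apple Silicon Mac through PyTorch's Metal Performance Shaders (MPS) backend, which keeps model weights and input data in the chip's unified memory. This is a generic example, not io.net's software stack, and the model is only a placeholder.

```python
# Minimal sketch of model inference on an Apple Silicon Mac via PyTorch's MPS
# backend. Illustrative only; not io.net's actual software.
import torch
import torch.nn as nn

# Fall back to CPU if the MPS backend is unavailable (e.g. on Intel Macs).
device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")

# A stand-in model; in practice this would be a trained network.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
model.eval()

# "Real-time data" arriving as a batch of feature vectors.
batch = torch.randn(32, 128, device=device)

with torch.no_grad():
    predictions = model(batch).argmax(dim=1)

print(predictions.cpu().tolist())
```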

With Apple hardware now supported, millions of Apple product owners can contribute their devices' idle computing power to AI and ML workloads.
