A recent paper from a group of Huawei researchers is calling for “embodied” artificial intelligence (AI) as the next big thing for the field, poking holes in the argument for scaling large language models.

Achieving artificial general intelligence (AGI) will not be a walk in the park for frontier companies, given the absence of a consensus on how the concept is defined, according to the paper. Generally, AGI refers to intelligence that matches or exceeds human abilities, including the capacity to learn and to solve problems across domains.

Despite the leaps made by large language models (LLMs), AI is far from achieving superintelligence, with current models described as “static and unable to evolve with time and experience.” The researchers argue that the best path to superintelligent AI systems is embodiment rather than the widely favored approach of scaling LLMs.

“It is a prevalent belief that simply scaling up such models, in terms of data volume and computational power, could lead to AGI,” the paper read. “We contest this view. We propose that true understanding, not only propositional truth but also the value of propositions that guide us how to act, is achievable only through E-AI agents that live in the world and learn of it by interacting with it.”

For the researchers, giving AI a body capable of interacting with its surroundings offers several benefits, chief among them a path to general intelligence. Four components are necessary for embodiment, with perception at the top of the list.

The second required component is the ability to take action based on perceived data, which the researchers subdivide into reactive and goal-directed actions. Reactive actions serve primarily for “self-preservation,” while goal-directed actions are necessary to achieve “complex, high-level objectives.”

Memory is another essential component embodied agents must possess to achieve general intelligence. Finally, the ability to learn from memory is a distinguishing factor for intelligent AI systems, with simulators and other emerging technologies offering new learning pathways.
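
These four components map naturally onto a simple agent loop. The sketch below is purely illustrative and is not drawn from the Huawei paper: the class, method names, and toy environment are hypothetical, intended only to show how perception, reactive and goal-directed action, memory, and learning could fit together.

```python
# Illustrative only: a toy agent showing how perception, action, memory, and
# learning could fit together. Names and structure are hypothetical and are
# not taken from the Huawei paper.
import random
from collections import deque


class ToyEnvironment:
    """A stand-in world that returns noisy observations."""

    def observe(self):
        # Perception input: observations may be noisy or uncertain.
        return {"hazard": random.random() < 0.1, "position": random.randint(0, 9)}

    def step(self, action):
        # Outcome of an action; here just a random reward placeholder.
        return {"reward": random.random()}


class EmbodiedAgent:
    """Minimal agent with the four components described in the paper."""

    def __init__(self, memory_size=1000):
        # Memory: a bounded store of past experience.
        self.memory = deque(maxlen=memory_size)

    def perceive(self, environment):
        # Perception: read the current (possibly noisy) state of the world.
        return environment.observe()

    def act(self, observation, goal=None):
        # Reactive action for self-preservation takes priority...
        if observation["hazard"]:
            return "retreat"
        # ...otherwise pursue a high-level, goal-directed objective.
        return f"move_toward:{goal}" if goal is not None else "explore"

    def remember(self, observation, action, outcome):
        # Memory: record each experience for later learning.
        self.memory.append((observation, action, outcome))

    def learn(self):
        # Learning: update behaviour from a sample of stored experience.
        # A real system might run this inside a simulator, as the paper notes.
        if self.memory:
            batch = random.sample(list(self.memory), min(len(self.memory), 32))
            _ = batch  # placeholder for a model or policy update


if __name__ == "__main__":
    env, agent = ToyEnvironment(), EmbodiedAgent()
    for _ in range(100):
        obs = agent.perceive(env)
        action = agent.act(obs, goal=5)
        agent.remember(obs, action, env.step(action))
    agent.learn()
```

The perceive-act-remember-learn cycle shown here is only a scaffold; the paper's argument is that grounding each of these steps in real interaction with the world, rather than in static training data, is what makes the difference.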

Challenges to embodying AI

The paper highlights several downsides associated with embodying AI, including “noise and uncertainty,” which can affect the system’s decision-making abilities.

Hardware limitations pose another challenge for embodying AI, given the cost and energy consumption rates of GPU clusters.

There is also the concern that embodied AI systems will operate from an “egocentric” perspective, which may open up a whole new set of issues. Given the novelty of the technique, the researchers predict ethical dilemmas around communication with humans, data collection methods, and outputs.

In order for artificial intelligence (AI) to work right within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek’s coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.

Watch: What does blockchain and AI have in common? It’s data
