How can new MIT chips help with neural networks?
New scientific work on neural networks may reduce their power and resource requirements to the point where engineers could put their powerful capabilities into much more diverse sets of devices.
That could have a huge impact on many parts of our lives, from how we prepare food to how we visit the doctor to how we get around in cars or on public transportation.
Some of this groundbreaking work is on display at MIT, where electrical engineering and computer science students are exploring how to improve the design and construction of AI/ML systems.
Specifically, the efforts of Avishek Biswas, an MIT graduate student, and his colleagues are attracting attention in the technology press.
TechCrunch describes how this evolution of neural network science could promote “computing at the edge,” putting more powerful technologies into portable, battery-powered devices.
Forbes says that Biswas’ breakthrough could “put artificial intelligence inside your blender.”
The MIT scientists' advances are making waves partly because it is easy to see how they could affect consumer technologies, as well as those used in government and business.
Essentially, the processor evolution that Biswas describes involves co-locating functions on a chip. In a Science Daily article, the writer explains how most traditional processors store memory outside the processing area and shuttle data back and forth between the two. This constant movement of data consumes a lot of power.
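To see where all that data movement comes from, consider the core workload of a neural network layer: a long run of multiply-accumulate operations. The sketch below is purely illustrative (it is not MIT's chip design); each operand fetched inside the inner loop stands in for a trip between memory and processor on a conventional chip.

```python
# Illustrative sketch: a neural-network layer's core work is many
# multiply-accumulate (MAC) operations. On a conventional processor,
# each operand below is fetched from memory, and that data movement,
# not the arithmetic itself, dominates the energy cost.

def dense_layer(inputs, weights):
    """Compute one layer's outputs as dot products of inputs and weights."""
    outputs = []
    for neuron_weights in weights:          # one weight vector per neuron
        acc = 0.0
        for x, w in zip(inputs, neuron_weights):
            # one MAC; on a CPU, both x and w travel memory -> processor
            acc += x * w
        outputs.append(acc)
    return outputs

print(dense_layer([1.0, 2.0, 3.0], [[0.5, -1.0, 0.25]]))  # [-0.75]
```

Performing these accumulations inside the memory itself, as the in-memory approach does, removes most of those round trips.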
Biswas talks about the “dot product,” the core operation that makes neural networks work. The team is also considering binary weights to simplify the arithmetic; binary representation itself has been fundamental to computer science since before the first personal computers were invented.
By pursuing these kinds of hardware changes, scientists are giving the machine learning and artificial intelligence tools that are reshaping our technologies far more versatility. As computing moves from purely deterministic, step-by-step programs toward systems that mimic human brain activity, we are embarking on a new adventure with much more powerful technologies at our fingertips.