ARM unveils two new AI chip designs to ride the machine learning wave

13:45, 14 February 2018. Source: theverge.com




ARM’s new chip designs could find their way into smartphones, but also gadgets like security cameras and drones. © Provided by The Verge

British chip designer ARM is the latest firm to prime the AI pump with specialized hardware, unveiling two new processor designs on February 13 that it promises will deliver “a transformational amount of compute capability” for companies building machine learning-powered gadgets.

The designs are for the ARM Machine Learning (ML) Processor, which will speed up general AI applications from machine translation to facial recognition; and the ARM Object Detection (OD) Processor, a second-generation design optimized for processing visual data and detecting people and objects. The OD processor is expected to be available to industry customers at the end of this month, while the ML processor design will be available sometime in the middle of the year.



“These are new, ground-up designs, not based on existing CPU or GPU architectures,” ARM’s vice president of machine learning, Jem Davies, told The Verge.

ARM says AI chips will eventually “trickle down” to all mobile devices

As with all of ARM’s chips, the company will not manufacture the processors itself, but will instead license the designs to third-party manufacturers. In the past, ARM’s customers have included chipmakers like Broadcom, but also hardware firms like Apple, which tweaks ARM’s designs for its own devices. The ML processor will primarily be of interest to makers of tablets and smartphones, while the OD processor could be put to a more diverse range of uses, from smart security cameras to drones.

Davies said the company was already in talks with a number of phone makers interested in licensing the ML chip, but would not name any specific firms. At the moment, specialized AI processors appear only in high-end devices, like Apple’s latest crop of iPhones and Huawei’s Mate 10. But Davies is confident that the ubiquity of AI applications means these chips will quickly become standard-issue across a range of price points.




“Our belief from talking to the market is that this will trickle down very, very fast indeed,” Davies told The Verge. “In China they’re already talking about putting this in entry-level smartphones from next year.”

These chip designs will not just be useful for smartphones, though; they will also help power the next generation of Internet of Things (IoT) devices. Like many companies developing AI chips, ARM is evangelical about the importance of edge computing, meaning processing is done on-device rather than sending data back to the cloud. This has been a big factor in phone companies’ adoption of AI chips, as on-device computation has a number of advantages over cloud computing. It’s more secure, as the data can’t be intercepted in transit; it’s quicker and more reliable, as users don’t have to wait for their data to be processed by remote servers; and it costs less, for both the customer and the provider.
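The bandwidth side of that argument is easy to illustrate with a back-of-envelope sketch. All the figures below (frame rate, frame size, event counts) are illustrative assumptions, not numbers from ARM or The Verge; the point is only the orders-of-magnitude gap between uploading raw camera frames for cloud inference and uploading only the detection events an on-device chip produces.

```python
# Hypothetical comparison for a smart security camera: bytes per day
# uploaded if every frame goes to the cloud for inference, versus
# on-device inference that uploads only detection events.
# Figures are illustrative assumptions, not ARM specifications.

SECONDS_PER_DAY = 60 * 60 * 24

def cloud_bytes_per_day(fps: int, frame_bytes: int) -> int:
    """Bytes/day if every frame is uploaded for cloud-side inference."""
    return fps * frame_bytes * SECONDS_PER_DAY

def edge_bytes_per_day(events_per_day: int, event_bytes: int) -> int:
    """Bytes/day if inference runs on-device and only events are sent."""
    return events_per_day * event_bytes

# Assumed: 15 fps with ~50 KB compressed frames,
# versus ~100 detection events a day at ~1 KB each.
cloud = cloud_bytes_per_day(15, 50_000)
edge = edge_bytes_per_day(100, 1_000)

print(f"cloud: {cloud / 1e9:.1f} GB/day")  # continuous frame upload
print(f"edge:  {edge / 1e6:.1f} MB/day")   # detection events only
```

Under these assumptions, streaming frames costs tens of gigabytes per device per day while event-only upload stays around a tenth of a megabyte, which is why on-device inference scales where cloud inference would not.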



The artificial intelligence boom is prompting chip companies to build specialized hardware. ARM is only the latest firm to get involved, following new chips from the likes of Intel, Google, and Qualcomm.

“Google said that if every user used voice search for just three minutes a day the company would have to double the number of servers it has,” notes Davies. As more smart devices start running more intensive AI applications, he says, “there just won’t be enough bandwidth available online. You’re going to break the internet.” Davies adds that although today’s chip designs are targeted at mobile devices, the broader chip architecture could scale up to provide AI chips for servers as well.

Patrick Moorhead, principal analyst at Moor Insights & Strategy, told The Verge that the new chip designs made sense for ARM, as more companies transition their computational workload from analytics to machine learning. However, he thought that the impact these chips would have on the mobile industry would be limited. “The mobile market is flat now, and I think this new kind of capability will help drive refresh [consumers upgrading their phones] but not increase sales to where smartphones are growing again,” said Moorhead.

ARM, of course, isn’t alone in trying to ride the AI wave with optimized silicon. Qualcomm is working on its own AI platform; Intel unveiled a new line of AI-specialized chips last year; Google is building its own machine learning chips for its servers; and, trying to take advantage of this moment of upheaval, ambitious startups like Graphcore are entering the industry, fueled by venture capital and eager to unseat incumbents.

As Davies puts it, this is a “once-in-a-generation inflection,” with everything to play for. “This is something that’s happening to all of computing.”


