Digital Sovereignty & Cyber Security
From cyberwar to digital encryption, security issues to state sovereignty
Scooped by Philippe J DEWOST

Taiwan – at the Center of a Worldwide Go Game Between China and the US

Silicon Geopolitics
Philippe J DEWOST's insight:
A must-read paper on how Taiwan's fate could impact Western tech in the next decade.
Scooped by Philippe J DEWOST

The Looming Battle Over AI Chips


For some years now, there has been a tension between the world's largest tech companies—Alphabet, Amazon.com, Facebook, Apple, Microsoft, Baidu, and Alibaba Group Holding—and the chip companies they rely on, especially Intel and Nvidia.

While the giants buy massive quantities of Intel's (ticker: INTC) microprocessors and Nvidia's (NVDA) graphics chips, or GPUs, to power their data centers, they are also in an arms race to have the best artificial-intelligence-based machine-learning functions. Because of this, there was always the possibility the giants might decide to buy fewer off-the-shelf parts and make their own custom chips to get an edge on one another.

That prospect burst onto the scene again last week as Bloomberg reported that job listings at Facebook (FB) and remarks by unnamed sources indicate that the social-networking giant is working on making its own chips.

The development, if true, is not surprising. Barron’s wrote 2½ years ago about how AI might push the giants to make their own parts (“Watch Out Intel, Here Comes Facebook,” Oct. 31, 2015). One of the chief sources in that article was none other than Facebook’s guru of machine learning, Yann LeCun.

Facebook declined to make LeCun available, but in that 2015 interview he outlined a dilemma Facebook confronts with machine learning that has probably not changed since then.

Facebook receives hundreds of millions of photographs from its users on a daily basis. Its computers must analyze, within a couple of seconds of a picture being uploaded, whether to show that picture to one of your friends, whether to block it for questionable content, and how to tag it with your friends' names using facial recognition—all examples of machine learning.
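To make that decision loop concrete, here is a minimal, runnable sketch of such a per-upload pipeline. Every "model" below is a trivial stand-in invented for illustration; none of this is Facebook's actual API.

```python
# Sketch of a per-upload ML decision pipeline; the model functions are
# placeholders returning canned answers, purely for illustration.

def classify_content(photo):
    return "ok"                      # stand-in for a content-moderation model

def detect_faces(photo):
    return ["face_1", "face_2"]      # stand-in for a face detector

def match_friend(face, friends):
    return friends[0] if friends else None   # stand-in for face recognition

def rank_for_feed(photo, friend):
    return 0.8                       # stand-in for a feed-ranking model

def process_upload(photo, friends):
    # 1. Moderation: block questionable content before it spreads.
    if classify_content(photo) == "questionable":
        return {"action": "block"}
    # 2. Facial recognition: suggest tags from the uploader's friends.
    tags = [match_friend(face, friends) for face in detect_faces(photo)]
    # 3. Ranking: decide which friends should see the photo at all.
    audience = [f for f in friends if rank_for_feed(photo, f) > 0.5]
    return {"action": "show", "tags": tags, "audience": audience}

print(process_upload("photo.jpg", ["alice", "bob"]))
```

Each of those steps is a separate neural-network inference, and it has to finish within a couple of seconds of upload, hundreds of millions of times a day; that is the workload straining the hardware.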

As LeCun explained, machine learning is breaking the current generation of chips. The amount of media continues to rise, and more complex media is coming down the road. Imagine a future in which people upload 3-D models of places they've been from their next-generation smartphones.

“The amount of infrastructure if we use the current type of CPU [central processing unit] is just going to be overwhelming,” he remarked.

LeCun said that Facebook is receptive to Intel or another vendor making its own neural-network processor, but he warned, “If they don’t, then we’ll have to go with an industry partner who will build hardware to specs, or we’ll build our own.”

LeCun and Facebook may have decided now is the time to go it alone. Intel's plans for its own AI chip have not yet borne fruit in terms of shipping parts. Nvidia is the undisputed leader in AI chips, and that brings with it a certain anxiety about relying on a single vendor.


Nvidia, moreover, increasingly views its software for programming its chips, called CUDA, as a kind of vast operating system that would span all of the machine learning in the world, an operating system akin to what Microsoft (MSFT) was in the old days of PCs. That sort of preeminence is doubtless disturbing to the giants, who want their AI to have a unique flavor and advantage.
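That reach is visible in everyday machine-learning code: mainstream frameworks route work through CUDA whenever an Nvidia GPU is present. A small illustration using PyTorch (one framework among several; the snippet assumes PyTorch is installed):

```python
# How CUDA quietly becomes the substrate: the same script uses Nvidia
# hardware via CUDA when available and falls back to CPU otherwise.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b  # on an Nvidia GPU, this matmul dispatches to CUDA kernels

print(f"ran on: {c.device}")
```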

But the main reason for custom chips is that Facebook and the others simply think they can do better. Chips for machine learning rely on algorithms and data, and the giants know both of those more intimately than the chip makers. They have the intellectual property that really matters.

LeCun and other scholars of machine learning know that if you were starting with a blank sheet of paper, an Nvidia GPU would not be the ideal chip to build. Because of the way machine-learning algorithms work, they are bumping up against limitations in how a GPU is designed. GPUs can actually degrade the performance of a machine-learning neural network, LeCun observed.

“The solution is a different architecture, one more specialized for neural networks,” said LeCun.

All that seemed mere speculation back in 2015, but it may now be a conclusion Facebook and others can’t avoid. Alphabet’s Google has already made its own chip—the TPU, as it’s called—for machine learning. Google and its brethren have the funds to pursue almost limitless experiments to see what they can make.

At the same time, AI chip start-ups such as Silicon Valley’s Cerebras Systems are pursuing radically new chip designs. Although Cerebras is in stealth mode, its work appears to rest on a completely different kind of math than what GPUs use—“sparse matrix” math—which may be better suited to machine learning.
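As a rough illustration of why that distinction matters, sparse-matrix math stores and multiplies only the non-zero entries, which pays off when a network's weights are mostly zeros (after pruning, for instance). A sketch with NumPy and SciPy; this is a generic illustration of sparse math, not a description of Cerebras's actual design:

```python
# Dense vs. sparse matrix-vector multiply: with ~99% zeros, the sparse
# form stores and touches only ~1% of the entries.
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
n = 2000
dense = rng.random((n, n))
dense[dense < 0.99] = 0.0        # zero out ~99% of the entries
csr = sparse.csr_matrix(dense)   # compressed sparse row format

x = rng.random(n)
y_dense = dense @ x              # touches all n*n entries
y_sparse = csr @ x               # touches only the non-zeros

print(f"non-zeros: {csr.nnz} of {n * n} ({csr.nnz / n ** 2:.1%})")
print("results match:", np.allclose(y_dense, y_sparse))
```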

The risks to Nvidia are minimal at present. The company can still sell tons of chips to every company that doesn't have the deep pockets of Facebook or Google. The reality of machine learning and chip design, however, points to a future in which Nvidia's role diminishes. Advanced Micro Devices (AMD) is Nvidia's closest competitor, and it has an opportunity as the challenger. Intel, Qualcomm (QCOM), and Broadcom (AVGO) may also prove to be contenders, but their ability to compete is probably less than that of the start-ups building the right designs from scratch.

As for mergers and acquisitions, it's unlikely Facebook wants to buy Nvidia or any large chip maker, though the smallest companies, such as Cerebras, could be exceptions. The giants already know that at the end of the day, the most valuable intellectual property in AI is found in the algorithms chugging away in their own data centers.

Philippe J DEWOST's insight:

Where is Europe?

Silicon is difficult, as hardware usually is, yet in the end it is the ultimate lever of control. Why else would China try to push out any foreign chip (as it did with Intel in its latest supercomputer)?

Rescooped by Philippe J DEWOST from cross pond high tech

Why Microsoft Says ARM Chips Can Replace Half of Its Data Center Muscle

A few announcements that came out of last week's Open Compute Summit in Santa Clara and the Google Cloud Next conference in San Francisco, however, showed that while Intel's lead may be massive, it is under a bigger threat than may have appeared.
Philippe J DEWOST's insight:

Wintel is coming to an end. ARM's rise in the server space was forecast yet totally misunderstood in France. Hoping that we will draw the lesson and look far more seriously into the open source RISC-V.

Rescooped by Philippe J DEWOST from cross pond high tech

SoftBank is buying ARM for $32 billion — because everything’s a computer now


Japan’s SoftBank is buying U.K.-based chip design firm ARM Holdings for about $32 billion, according to the FT.

Why? Everything is a computer now, and ARM has been one of the winners of the mobile revolution.

ARM designs chips — but doesn't actually make them — for a huge variety of devices. It dominates the market for smartphones — Apple is a big client, as is Samsung — and its chips show up in other consumer gadgets, as well as more industrial devices and "internet of things" sensors.

The number of chips containing ARM processors reached almost 15 billion in 2015, up from about 6 billion in 2010.
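A back-of-the-envelope check of what those two data points imply as an annual growth rate:

```python
# Implied compound annual growth rate of ARM-based chip shipments,
# from ~6 billion (2010) to ~15 billion (2015).
chips_2010, chips_2015, years = 6e9, 15e9, 5
cagr = (chips_2015 / chips_2010) ** (1 / years) - 1
print(f"implied growth: {cagr:.1%} per year")  # ~20.1%
```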

The move is a big one for SoftBank CEO Masa Son after his would-be successor, former Google executive Nikesh Arora, stepped away from the company last month. (Talks presumably started while Arora was still there.)

One key question is whether other firms will let SoftBank purchase ARM or whether there will be a bidding war. Apple, arguably ARM's most important client, and Intel, which lost the mobile chip war to ARM, are both potential buyers.

The offer is already a generous multiple. As the FT notes, it’s some 70 times ARM’s net income last year. That’s around the same price-to-earnings ratio as Facebook stock.
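A quick check of what that multiple implies about ARM's earnings:

```python
# What "some 70 times last year's net income" implies for ARM.
offer = 32e9        # SoftBank's offer, in USD
pe_multiple = 70    # price-to-earnings ratio cited by the FT
print(f"implied net income: ${offer / pe_multiple / 1e9:.2f} billion")  # ~$0.46B
```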

Philippe J DEWOST's insight:

As Brexit has removed ARM from Europe, will Europe be left as the impotent witness of what we shall call an ARM's race?

This news echoes the announcement of the world's new #1 supercomputer, which is Chinese again but, more interestingly, no longer features any Intel processors inside, only domestic Loongson chips.

The silicon race is on its way to a US-Asia bipolar configuration, with Europe left alone due to the combined effect of Brexit and ARMXit: time for investi(gati)ng (in) open source hardware architectures such as RISC-V ...

Philippe J DEWOST's curator insight, July 18, 2016 1:14 AM

ARM's takeover by SoftBank is the tech Brexit of this summer.

This thunderclap in a blue ocean (pardon me, sky) might trigger a war in which we will all of a sudden remember how important it is to have a war chest.

There are underlying geopolitics at work, as evidenced by the progress made by the Chinese Loongson processor now powering the world's #1 supercomputer.

It might also signal the beginning of the end of the ARM era, and should have more people focusing on open source silicon architectures such as RISC-V.

Scooped by Philippe J DEWOST

New Intel firmware boot verification bypass enables low-level backdoors

By replacing a PC's SPI flash chip with one that contains rogue code, an attacker can gain full, persistent access.
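One way to detect such tampering is to dump the SPI flash and compare it against a known-good image taken while the machine was still trusted (for example with flashrom's read mode). A minimal sketch of the comparison step; the file names are hypothetical placeholders:

```python
# Compare an SPI flash dump (e.g. from `flashrom -r firmware_dump.bin`)
# against a known-good image. File names here are placeholders.
import hashlib
import sys

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

if sha256_of("firmware_dump.bin") != sha256_of("firmware_known_good.bin"):
    sys.exit("WARNING: flash contents differ from the known-good image")
print("flash image matches the known-good hash")
```

The obvious caveat: a dump produced by compromised firmware can lie, so the read should be done with an external programmer clipped onto the chip.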
Philippe J DEWOST's insight:
Open sourcing may be one of the only ways to clean up such a mess.
Scooped by Philippe J DEWOST

Kernel-memory-leaking Intel processor design flaw forces Linux, Windows redesign

A fundamental design flaw in Intel's processor chips has forced a significant redesign of the Linux and Windows kernels to defang the chip-level security bug.

Programmers are scrambling to overhaul the open-source Linux kernel's virtual memory system. Meanwhile, Microsoft is expected to publicly introduce the necessary changes to its Windows operating system in an upcoming Patch Tuesday: these changes were seeded to beta testers running fast-ring Windows Insider builds in November and December.

Crucially, these updates to both Linux and Windows will incur a performance hit on Intel products. The effects are still being benchmarked; however, we're looking at a ballpark figure of five to 30 per cent slowdown, depending on the task and the processor model. More recent Intel chips have features – such as PCID – to reduce the performance hit. Your mileage may vary.
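The slowdown comes from the fix (kernel page-table isolation) making every user-to-kernel transition more expensive, so syscall-heavy workloads are hit hardest. A crude way to feel it is to time a cheap system call in a tight loop before and after patching; a rough sketch, not a rigorous benchmark:

```python
# Crude microbenchmark of syscall round-trip cost. Page-table isolation
# adds work to every user/kernel transition, so this figure grows on a
# patched kernel (numbers include Python's own call overhead).
import os
import time

N = 200_000
start = time.perf_counter()
for _ in range(N):
    os.getppid()  # a near-trivial system call
elapsed = time.perf_counter() - start
print(f"~{elapsed / N * 1e9:.0f} ns per call")
```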
Philippe J DEWOST's insight:
Another reason to seriously consider open source hardware architectures ?
Scooped by Philippe J DEWOST

Samsung's Possible Defection From ARM to RISC-V is a huge signal in the IoT chip war to come


Could Samsung be the first big defection from ARM since the SoftBank takeover?

It was always thought that, when ARM relinquished its independence, its customers would look around for alternatives.
The nice thing about RISC-V is that it’s independent, open source and royalty-free.
And RISC-V is what Samsung is reported to be using for an IoT CPU in preference to ARM.
Now SoftBank made a point of saying that its takeover of ARM was to get into IoT. If Samsung is now going to RISC-V for its IoT CPU, this affects the scale of SoftBank's aspirations and may persuade others to defect to RISC-V.
The Samsung RISC-V MCU is said to be aimed squarely at the ARM Cortex M0.
Nvidia and Qualcomm are already using RISC-V in the development of GPU memory controllers and IoT processors.
Although, as Intel found, it is almost impossible to replace an incumbent processor architecture in a major product area (which means that ARM's place as the incumbent architecture in cellphones is secure), there is at the moment no incumbent processor architecture in IoT or MCUs, so these markets are up for grabs by any aspiring rival processor architecture.

Philippe J DEWOST's insight:

The x86 architecture gave Intel dominance of the large PC market before hitting the smartphone wall.

Cortex architectures gave ARM dominance of the much larger smartphone market before hitting the SoftBank wall.

RISC-V may be the next architecture for the even larger IoT market (in volume at least).

Intel is a US corporation; ARM was once a British company and is now under a Japanese flag. The nice thing about RISC-V is that it is an independent, open source, and royalty-free architecture.

This will have consequences over the next decade in the computing race between the US and Asia (think Loongson and now ARM), and may be an opportunity for Europeans to step in and avoid remaining "The Pacific" of cyber tests.
