Larrabee and Intel's acquisition of Neoptica

Wednesday 28th November 2007, 12:12:00 PM, written by Arun

On October 19th, Neoptica was acquired by Intel in relation to the Larrabee project, but the news only broke on several websites in the last two days. We take a quick look at what Intel bought, and why, in this analysis piece.

Neoptica's Employees

  • 8 employees (including the two co-founders) according to Neoptica's official website.
  • 3 have a background with NVIDIA's Software Architecture group: Matt Pharr (editor of GPU Gems 2) and Craig Kolb, who were also Exluna co-founders, and Geoff Berry. Tim Foley also worked there as an intern.
  • 2 are ex-Electronic Arts employees: Jean-Luc Duprat (who also worked at Dreamworks Feature Animation) and Paul Lalonde.
  • Nat Duca comes from Sony, where he led the development of the RSX tool suite and software development partnerships.
  • Aaron Lefohn comes from Pixar, where he worked on GPU acceleration for rendering and interactive film preview.
  • Pat Hanrahan was also on the technical advisory board. He used to work at Pixar, where he was the chief architect of the RenderMan Interface protocol. His PhD students were also responsible for the creation of both Brook and CUDA.

Neoptica's Vision

  • Neoptica published a technical whitepaper back in March 2007. Matt Pharr also gave a presentation at Graphics Hardware 2006 that highlights some similar points.
  • It explained their perspective on the limitations of current programmable shading and their vision of the future, which they call "programmable graphics". Much of their argument rests on the value of 'irregular algorithms' and on the GPU's inability to construct complex data structures on its own.
  • They argue that a faster link to the CPU is thus a key requirement, with efficient parallelism and collaboration between the two. Only the PS3 allows this today.
  • They further claim that the ability to perform many round trips between the CPU and the GPU every frame could make new algorithms possible and improve efficiency, and they argue for the demise of the unidirectional rendering pipeline.
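To make the round-trip idea concrete, here is a toy simulation of the kind of loop Neoptica describes: within a single frame, the CPU refines an irregular data structure (a set of tessellation patches) based on per-patch feedback from the GPU, something the unidirectional pipeline cannot express. Everything here is invented for illustration; the GPU pass is stubbed out as a plain function and the error model is deliberately trivial.

```python
def gpu_measure_error(patches):
    # Stub for a GPU pass: report a screen-space error per patch.
    # In this toy model the error is simply the patch size.
    return {pid: size for pid, size in patches.items()}

def render_frame(initial_patches, max_error):
    """Refine patches until every one is below max_error, counting
    how many CPU<->GPU round trips the frame needed."""
    patches = dict(initial_patches)   # patch id -> size
    round_trips = 0
    while True:
        errors = gpu_measure_error(patches)   # GPU -> CPU readback
        round_trips += 1
        too_coarse = [pid for pid, e in errors.items() if e > max_error]
        if not too_coarse:
            break
        # CPU side: subdivide offending patches into four children --
        # the irregular, data-structure-building work that a 2007-era
        # GPU could not do on its own.
        for pid in too_coarse:
            size = patches.pop(pid)
            for i in range(4):
                patches[f"{pid}.{i}"] = size / 2.0
    return patches, round_trips

patches, trips = render_frame({"root": 8.0}, max_error=1.0)
```

The point of the sketch is that the number of round trips depends on the scene, which is exactly why a fast CPU-GPU link matters: each iteration of the loop is a readback plus a resubmission.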

Neoptica's Proposed Solution

  • Neoptica claims to have developed a deadlock-free high-level API that abstracts the concurrency between multiple CPUs and GPUs, even though the two are programmed in C/C++ and Cg/HLSL respectively.
  • These systems "deliver imagery that is impossible using the traditional hardware rendering pipeline, and deliver 10x to 50x speedups of existing GPU-only approaches."
  • Of course, the claimed speed-ups likely apply to algorithms that simply don't fit the GPU's architecture, so it would be more accurate to say a traditional renderer couldn't work that way at all than to claim huge performance benefits.
  • Given that only the PS3 and, to a lesser extent, the Xbox 360 have a wide bus between the CPU and the GPU today, we would tend to believe next-generation consoles were their original intended market for this API.
  • Of course, Neoptica doesn't magically change these consoles' capabilities. The advantage of their solution would be to make exotic and hybrid renderers which benefit from both processors (using CELL to approximate high-quality ambient occlusion, for example) much easier to develop.
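A minimal sketch of how an API like the one Neoptica describes might stay deadlock-free: express the frame's interdependent CPU and GPU tasks as a dependency graph (a DAG) and schedule them in topological order, so a task only runs once its inputs exist and a cyclic dependency (the deadlock case) is detected outright. The task names and the hybrid-frame structure below are made up for illustration, loosely following the article's CELL ambient occlusion example; this is not Neoptica's actual API.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical hybrid frame: the CPU approximates ambient occlusion
# while the GPU rasterises, then a GPU pass composites both results.
# Each entry maps a task to the set of tasks it depends on.
tasks = {
    "cpu:build_scene": set(),
    "cpu:ambient_occ": {"cpu:build_scene"},
    "gpu:rasterise":   {"cpu:build_scene"},
    "gpu:composite":   {"cpu:ambient_occ", "gpu:rasterise"},
}

# static_order() yields a valid execution order; a cycle in the graph
# (i.e. a potential deadlock) would raise graphlib.CycleError instead.
order = list(TopologicalSorter(tasks).static_order())
```

Note that "cpu:ambient_occ" and "gpu:rasterise" have no dependency on each other, so a real scheduler would be free to run them concurrently on their respective processors, which is the source of the hybrid speed-up.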

Intel's Larrabee

  • Larrabee is in several ways (but not all) a solution looking for a problem. While Intel's upcoming architecture might have inherent advantages in the GPGPU market and parts of the workstation segment, it wouldn't be so hot as a DX10 accelerator.
  • In the consumer market, the short-term Larrabee strategy seems to be to add value rather than to try to replace traditional GPUs. This could be accomplished through physics acceleration, for example, which ties in with the Havok acquisition.
  • Unlike PhysX, however, Larrabee is fully programmable through a standard ISA. This makes it possible to add more value and possibly accelerate some algorithms GPUs cannot yet handle, thus improving overall visual quality.
  • In the GPGPU market, however, things will be very different, and there is some potential for the OpenGL workstation market too. We'll see if rumours about the latter turn out to be true or not.
  • The short-term consumer strategy seems to make Larrabee a Tri- and Quad-SLI/Crossfire competitor, rather than a real GPU competitor. But Intel's ambitions don't stop there.
  • While Larrabee is unlikely to excel in DX10 titles (and thus not be cost-competitive for such a market), its unique architecture does give it advantages in more exotic algorithms, ones that don't make sense in DX10 or even DX11. That (and GPGPU) is likely where Intel sees potential.

Intel's Reasoning for Neoptica

  • Great researchers are always nice to have on your side, and Intel would probably love to have a next-generation gaming console contract. Neoptica's expertise in CPU-GPU collaboration is very valuable there.
  • Intel's "GPU" strategy seems to be based around reducing the importance of next-generation graphics APIs, including DX11. Their inherent advantage is greater flexibility, but their disadvantage is lower performance for current workloads. This must have made Neoptica's claims music to their ears.
  • Furthermore, several of Neoptica's employees have experience in offline rendering. Even if Larrabee didn't work out for real-time rendering, it might become a very attractive solution for the Pixars and Dreamworks of the world due to its combination of high performance and (nearly) infinite flexibility.
  • Overall, if Intel is to stand a chance of conquering the world of programmable graphics in the next five years, they need an intuitive API that abstracts the details while making developers remember how inflexible some parts of the GPU pipeline remain both today and in DirectX 11. Neoptica's employees and expertise certainly can't hurt there.

While we remain highly skeptical of Intel's short-term and mid-term prospects for Larrabee in the consumer market, the Neoptica and Havok acquisitions seem like splendid decisions to us, as they both potentially expand Larrabee's target market and reduce risk. In addition, there is always the possibility that as much as Intel loves the tech, they also love the instant 'street cred' in the graphics world they get from picking up that group of engineers.

We look forward to the coming years as these events and many others start making their impact.

