While we are still waiting for an official Arc launch, Intel showcased the architecture of Arc, as well as the A770 and the A750, in an ‘Ask Us Anything‘ session. We have some pretty juicy details to share. You can watch the full video here.
The Arc range
To start, we present to you the complete Arc range. Although we have seen this image before, Intel reassures us that Arc is on its way and not cancelled. The initial launch lineup will feature only 4 GPUs, namely:
- Arc A380 (launched)
- Arc A580 (still to be launched)
- Arc A750 (still to be launched)
- Arc A770 8/16 GB (still to be launched)
The blue team said some time ago that the 8 GB variant of the Arc A770 will be exclusive to AIB partners. Only the A770 16 GB will get the ‘Limited Edition‘ treatment.
XeSS and RT performance
Intel then presented its RT performance, which is approaching Nvidia’s. We dug into XeSS technology yesterday. In short, its image quality is smoother compared to DLSS. Under extreme test conditions (bushes, trees) that push most AA (anti-aliasing) technologies to their limits, XeSS falls prey to blurry edges and miscalculated shadows.
The blue team has already shown its RT performance, and it is getting closer and closer to NVIDIA territory; however, real hands-on experience will tell the whole story.
A770 and A750
Coming to the main topic of today’s discussion: the Arc A770 and the Arc A750. This is the most detailed spec sheet we have seen so far. The A770 (16 GB) and the A750 share the same TDP (225 W). Intel has stated that the Arc A770 can overclock to 2.7 GHz on air cooling. Additionally, we are seeing support for PCIe Gen 4 x16. These GPUs come with a standard 3-year warranty and support Windows 10/11 and Ubuntu.
Interestingly, the Arc series “officially” only supports DX12 Ultimate. This is where Intel’s architecture begins to break down, because the performance difference between DX11 and DX12 is night and day. Overall, the specs seem solid for first-generation GPUs. They could have actually threatened both Nvidia and AMD if they had launched on time. Still, Intel has done its homework, and we have to give them that.
How Intel’s RT Works
Here comes the “nerdy” part. Let’s start by understanding some terminology.
Basic rasterization works like this: you have some 3D objects inside a scene which, using certain algorithms, are flattened into a 2D image (with a 3D effect) on your screen. Your screen basically works in the x-y coordinate plane, with no concept of a third (z) coordinate.
After that, each pixel is shaded with colors using another algorithm built into the program. This is how you see images on your screen.
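The projection step above can be sketched in a few lines of Python. This is a simplified pinhole-camera model for illustration only; the function name and the focal-length value are our own, not part of any real rasterizer:

```python
# Minimal sketch of rasterization's first step: squashing a 3D point
# onto the 2D screen plane. The z coordinate is "used up" by the
# division, which is what creates the 3D effect on a flat display.

def project_point(x, y, z, focal_length=1.0):
    """Perspective-project a 3D point onto the 2D (x, y) screen plane."""
    if z <= 0:
        raise ValueError("point is behind the camera")
    screen_x = focal_length * x / z
    screen_y = focal_length * y / z
    return screen_x, screen_y

# Two points at the same (x, y) but different depths land at different
# screen positions: the farther one ends up closer to the centre.
near = project_point(1.0, 1.0, 2.0)   # (0.5, 0.5)
far = project_point(1.0, 1.0, 4.0)    # (0.25, 0.25)
```

Once every triangle's vertices are projected this way, the shading step described above fills in the pixels between them.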
Ray tracing requires accurately reflecting light after it hits a surface, but how do you know which surface reflects how much light? To answer this, a ray is projected (ray casting) from your camera (screen) as an imaginary line that strikes the triangles (the building blocks of 3D models) in a scene. But there are millions of these triangles, and you need a lot of rays for all the lighting effects. This is where BVHs come into play.
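The ray/triangle strike itself can be sketched with the Möller–Trumbore intersection test, a standard textbook algorithm we use here purely for illustration (it is not Intel's hardware implementation):

```python
# Möller–Trumbore ray/triangle intersection: returns the distance t
# along the ray to the hit point, or None on a miss.

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:                    # ray parallel to the triangle's plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:              # hit point falls outside the triangle
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)
    return t if t > eps else None       # only hits in front of the camera count
```

One test like this is cheap; the problem is running it against millions of triangles for many rays per pixel, which is exactly the cost a BVH is designed to cut down.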
BVH stands for Bounding Volume Hierarchy. Each object is surrounded by an imaginary box made up of smaller boxes. It works as follows: the ray first hits the big box (A), which then opens up the smaller boxes (B); after hitting the right box in B, the ray enters even smaller boxes (C) for precise ray tracing.
A BVH is a data structure stored in the GPU’s cache memory for quick access when needed later. This differs from NVIDIA’s approach and could open up different opportunities for game developers. It will be interesting to see where this technology goes in a few years.
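The A → B → C descent described above can be sketched as a toy BVH in Python. The node layout and the "slab" box test below are a textbook simplification for illustration, not Intel's actual hardware structure:

```python
# Toy Bounding Volume Hierarchy: a ray only descends into boxes it
# actually hits, so whole subtrees of triangles are skipped at once.

class Node:
    def __init__(self, lo, hi, children=None, triangles=None):
        self.lo, self.hi = lo, hi          # box corners (min, max)
        self.children = children or []      # the smaller boxes (B, C, ...)
        self.triangles = triangles or []    # leaf payload: candidate triangles

def ray_hits_box(origin, direction, lo, hi):
    """Slab test: does the ray enter the box before it exits it?"""
    tmin, tmax = 0.0, float("inf")
    for o, d, l, h in zip(origin, direction, lo, hi):
        if d == 0.0:
            if not (l <= o <= h):
                return False    # parallel to this slab and outside it
            continue
        t1, t2 = (l - o) / d, (h - o) / d
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(node, origin, direction, hits):
    if not ray_hits_box(origin, direction, node.lo, node.hi):
        return                          # big box missed: prune everything inside
    hits.extend(node.triangles)         # leaf: triangles for the exact ray test
    for child in node.children:
        traverse(child, origin, direction, hits)
```

A ray that misses box A never pays for any triangle inside A, which is why the hierarchy scales to scenes with millions of triangles.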
In short, Intel has truly outdone itself; such persistence, even after so many setbacks, shows that Intel is here to stay. Including RT in the first iteration was never necessary, but they did, and with surprisingly good results. Likewise, XeSS also looks great, but with a few caveats. These GPUs are expected to launch “any time now”, given how Intel is fueling the Arc hype train.
(Featured image credit goes to hothardware)