First Intel Xeon W-3500 review lands with shocking realization — Intel excels at scientific computing and ML, but lags desperately everywhere else



Intel Xeon W-3500.

Credit: Intel

Puget Systems has published a detailed content creation review of the Intel Xeon W-3500 series. The company’s latest workstation processors are an update of the W-3400 series, offering increased core counts and cache while maintaining the same core architecture. These new chips aim to address Intel’s lagging performance in the high-end desktop (HEDT) content creation space compared to AMD’s Threadripper series.

Puget Systems had a full retail sample of the high-end Xeon w9-3595X but used pre-production samples for the other models, meaning real-world performance might vary slightly. For consistency, all benchmarking used standardized test setups with controlled RAM speed and cooling.

In Adobe After Effects, an application benefiting from multicore CPUs, Intel’s processors showed some performance improvements, although AMD’s Threadripper held the lead. Similarly, the Xeon processors showed only incremental gains in Adobe Premiere Pro and DaVinci Resolve, with Threadripper still leading in single-threaded and multi-core performance.

Showing some gains

For video editing and motion graphics, the Xeons performed respectably but failed to surpass AMD’s offerings. Premiere Pro showed minor overall improvements, with RAW codec performance a bright spot. DaVinci Resolve further highlighted AMD’s dominance, although Intel resolved the previous issue of odd-core-count models underperforming.

Adobe Photoshop tests confirmed that these high-core count processors weren’t the best choice due to the application’s latency sensitivity and single-core reliance. AMD’s Threadripper dominated here as well.

In Unreal Engine tests and CPU rendering benchmarks (Cinebench, V-Ray, Blender), the Xeons showed some gains, particularly in Blender with a 10-15% improvement. However, AMD’s higher-core models were faster, completing tasks notably quicker.

Blender benchmarks, Intel Xeon W-3500.

Summing up

At the end of its review, Puget Systems said, “The new Intel Xeon W-3500 family of processors is a fine refresh to an existing product stack but leaves a lot to be desired if Intel wants to compete with AMD in the HEDT space for Content Creation. As is typical, the performance gain depends a lot on the particular application, but, in general, gains are from 0-20% with a bias towards multi-threaded applications due to the increased core count.”

Puget noted that while it didn’t test the new chips for scientific computing and HPC/ML applications in this review (as the focus was on content creation), this is one area where the Intel Xeon W-3500 series will shine, and it plans to run a comparison on that front in the future.


The post First Intel Xeon W-3500 review lands with shocking realization — Intel excels at scientific computing and ML, but lags desperately everywhere else appeared first on 247 News Center.
