My Westmere Xeon Brothers: How's Your Performance?

Hey gang, so I’ve mostly been playing Pokemon X and I have to say I’m enjoying it. However, my understanding (as with most emulators) is that Citra is extremely single-core dependent, and it only seems to use 1 thread out of 24 on my dual Xeon rig (X5675’s). Those chips offer decent single-threaded performance for their generation, but it’s only about half of what we see today from Intel’s best single-core performance. They’re not overclocked, just stock. I’d bet a single Xeon W from that generation, which is easily overclockable, could see huge gains. Or of course one of these X-series chips with a board that allows overclocking…

I’ve seen things dip as low as about 40%, but 60-120% is a more typical spread. Dips happen especially in battles with more than one Pokemon on-screen; whether it’s 2 or 4, performance seems similar. In the overworld, performance is decent, usually ranging from 90-110%, with huge gains indoors. I of course have accurate multiplication and geometry disabled.

I’m running an RX 480 with this, and I can increase resolution indefinitely without performance changing, so I know my CPU is the bottleneck. Do you guys see linear performance gains? In other words, do the latest Intel offerings, overclocked (which have about double the single-threaded performance of these old Xeons), only see about a 2x performance gain? Or do other things, like AVX and AVX2 support (introduced with Sandy Bridge and Haswell respectively), lead to disproportionate performance gains? These old Xeons have neither. How does Ryzen fare, point for point, on single-threaded performance?
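If anyone wants to check what their own CPU supports, here’s a rough sketch (Linux-only, reads `/proc/cpuinfo`; the path and flag names are standard on Linux, but this obviously won’t work on Windows):

```python
# Quick check (Linux) for which SIMD extensions the CPU reports.
# Westmere parts like the X5675 top out at SSE4.2 -- AVX arrived with
# Sandy Bridge and AVX2 with Haswell, so both should show "no" there.
import re


def cpu_flags(path="/proc/cpuinfo"):
    """Return the set of feature flags from the first 'flags' line."""
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass  # not Linux, or /proc unavailable
    return set()


flags = cpu_flags()
for ext in ("sse4_2", "avx", "avx2"):
    print(f"{ext}: {'yes' if ext in flags else 'no'}")
```

On Windows you’d check something like CPU-Z instead, but the idea is the same.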

Also, are we going to see any kind of multi-threading with Citra? Is there any support under the hood for it, even if it’s kind of experimental or unstable? It goes without saying that older rigs like mine, or mobile CPUs from a few generations ago, would see huge benefits, but I know it ain’t easy and I don’t mean to suggest that it is.

Is there any kind of unified data that Citra collects to show performance in certain games on current builds, across an assortment of CPUs? That could actually be EXTREMELY useful data to collect IMO.

Sorry for the novel lol. Either way, loving Citra so far, even if it’s not perfect on my old rig. Thanks for what you guys do, and for the community support. Always appreciated.

Remember that AMD’s OpenGL drivers are less than optimal; in fact, you may experience higher performance with a weaker Nvidia GPU than with a stronger AMD GPU (on your current setup).

Maybe… But I can pretty much run any resolution from native to (what, almost 4K?) with no performance loss, so I’m thinking the issue is pretty much the CPU. But then again, it’s always worth trying.

Maybe I’ll try to grab a 750 Ti or something for cheap just to see. But I doubt it’ll help much.

Quick bump. Would love more feedback on this. :slight_smile: