I finally got around to picking up Valhalla and ran into a very odd issue when benchmarking my settings. First off, I'm fully aware that Valhalla does not support SLI, and I'm also familiar with how both Origins and Odyssey behaved if you left SLI enabled: you'd not only get a slight hit to FPS but also a noticeable amount of stutter. Not a big deal really, and the only reason I'm still using SLI is that I usually upgrade my GPU every other generation. I was due to replace my pair of 1080 Tis with the 3000 series, but we all know how easy it is to acquire one.
Back to the issue at hand, if you want to call it that. On my first benchmark I'd forgotten to disable SLI, but I let it run through out of curiosity, so when my non-SLI run came in 13 FPS lower than the SLI-enabled run, I was a bit surprised. At first I thought maybe Ubisoft had added support for it, but both the in-game HUD and MSI Afterburner show 0% utilization on the second GPU when SLI is enabled, so I'm really not sure where the boost, albeit small, is coming from. I even wondered whether PhysX was being automatically offloaded to the second GPU without showing up in the usage readings, but I dismissed that after manually dedicating the second GPU to PhysX and remembering that I don't think any AC title since Black Flag has used PhysX in the first place.
Anyway, I just thought I'd ask and see if anyone else has encountered similar behavior or can explain what's causing it. Like I said, it's not really an issue; if anything, it's handy not having to disable SLI when I want to play Valhalla.