• 4 Posts
  • 71 Comments
Joined 2 years ago
Cake day: June 9th, 2023


  • I vaguely recall playing one of the two about 20 years ago (judging by the screenshots, I think it was the second game). It was a bonus game on a CD from a computer or gaming magazine. Even two decades ago, and that soon after release, it already felt unbelievably dated and clunky. The PC port was also complete garbage: lots of bugs, awful visuals even by PS1 port standards, and poor controls.

    If you’re nostalgic for these games, they might be worth revisiting (although you’re probably remembering them as more impressive than they actually were), but if you’re not, I doubt they’re worth picking up, even with the improvements from GOG.

    Just to compare these two to another dinosaur game from that era, one that received similarly poor reviews to the PC version of Dino Crisis: Trespasser was far more sophisticated and fun, in my opinion at least - and certainly a technical marvel by comparison. It’s not just that it’s fully 3D, with huge open areas (not possible on PS1, of course), but also the way it pioneered physics interaction. My favorite unscripted moment was a large bipedal dinosaur at the edge of the draw distance stumbling - made possible by the procedural animations - and bumping into the roof of a half-destroyed building, causing it to collapse. That’s outrageous for 1998! I’ve only ever seen this happen once at that spot in the game, so it’s certainly not scripted.





  • Looking at the screenshots, I thought it was a port of a mid-gen PS4 game, but apparently, it’s a one-year-old former PS5 exclusive. Then again, this might explain the modest hardware requirements. You don’t often see the minimum GPU on an AAA open-world game being a GTX 1060 6 GB (a card from 2016) anymore. Perhaps it’ll run well on the Steam Deck, which is always appreciated. Reviews are solid enough that I might pick it up on sale.




  • DdCno1@beehaw.org to PC Gaming@lemmy.ca · Crysis VR MOD · 1 month ago

    It’s based on the Xbox 360/PS3 console port of the game. People figured this out pretty quickly, because a VTOL flying mission that these consoles couldn’t handle was missing from the remaster as well (it was later added back in with a patch). Colors are oversaturated, and texture, object and lighting quality are down- or sidegraded (many assets aren’t worse on a purely technical level, just different without being better, for no apparent reason, as if the outsourced Russian devs had a quota of changed assets to fill). Lots of smaller and larger physics interactions are gone, because they were never part of that old console port. The AI (quite a big selling point of the original, on top of the graphics and physics) is simplified as well. The sprinkles of ray-tracing features added here and there, as well as some nicer water physics, do not make up for the many visual and gameplay deficiencies. On top of that, there are game-breaking glitches that weren’t part of the original.

    The biggest overall problem I have with it is that it just doesn’t look and feel like Crysis anymore and instead has the look of a generic tropical Unity Engine survival game. The original had a very distinct visual identity, a muted, realistic look, but with enough intentional artistic flourishes to make it more than just a groundbreaking attempt at photorealism. You can clearly see this if you compare the original hilltop sunrise to the remaster. Crysis also had an almost future-milsim-like approach to its gameplay that is now a shell of its former self.

    I will admit that to the casual player, many of these differences are minor to unnoticeable. If you haven’t spent far too much time with the original, you’re unlikely to spot the vast majority of them and might only notice how ridiculously saturated everything looks.

    At the very least, you can still buy the original on PC. On GOG, it’s easy to find, but on Steam, it’s hidden for some reason [insert speculation as to why here]: if you use Steam’s search function, only the remaster appears. You have to go to the store page of the stand-alone add-on Crysis Warhead (the only Crysis game that did not receive the remaster treatment, likely because it was never ported to consoles), which can be purchased in a bundle with the original as the Maximum Edition (this edition also does not appear in the search results): https://store.steampowered.com/sub/987/




  • This statement simply isn’t correct. I can procure much faster chips as a consumer, even at the low end. This isn’t the fastest single board computer either, not by a long shot. Like I said in another comment, it’s only about as fast as a 2010 Macbook Pro. That’s not “very fast” by any metric.

    I’m using a Core i3-N305 based single-board computer (Odroid H4) for my Plex server, and it easily performs twice as well at just 3W more - while being x86 and fully compatible with any relevant OS, without having to modify boot loaders and drivers or worry about incompatibilities. Even reduced to the 12W power draw of this chip, it would still easily outperform the Rockchip, and that would allow for a smaller heat sink. Best of all, its MSRP is nearly the same as that of the CM3588 with the RK3588 (admittedly without RAM). You’d have to do something to the rear IO to make it slim enough for use in a laptop project, but that’s trivial on a project like this.




  • This is a highly impressive project, especially for a high school senior, but it should be stressed that it is nowhere near as powerful as a similarly priced modern laptop. It’s a legendary school project, impressive enough to open doors to universities and lay the foundation for a successful career in the computer industry, but not really something you should try to build yourself if you’re looking for a laptop in this price range.

    A Geekbench 5 single-core score of 492 and a multi-core score of 2019 points are about comparable to those of a MacBook Pro from fifteen years ago. There is a small NPU present on the chip, which the old MacBook doesn’t have, but if that’s not important to your use case (which is very likely), then this device is not suitable for anything but the most basic tasks and will feel sluggish with any current software. There’s a reason the video barely shows the device in use: it just wouldn’t be very pleasant to watch.




  • Can we stop with the fake frame nonsense? They aren’t any less real than other frames created by your computer. This is no different from the countless other shortcuts games have been using for decades.

    Also, input latency isn’t “sacrificed” for this. There is about 10 ms of overhang with 4x DLSS 4 frame gen, which is easily compensated for by the increase in frame rate.

    The math is pretty simple on this: at 60 fps native, a new frame needs to be generated every 16.67 ms (1000 ms / 60). Leaving out latency from the rest of the hard- and software (since it varies a lot between different input and output devices and even from game to game - not to mention that there are many games where the graphics and e.g. physics frame rates differ), this means that with three more frames generated per “non-fake” frame, we are seeing a new frame on screen every 4.17 ms (assuming the display can output 240 Hz). The system still accepts input and visibly moves the viewport based on user input between “fake” frames using reprojection, a technique borrowed from VR (where even older approaches work exceptionally well in my experience, at otherwise unplayably low frame rates - provided the game doesn’t freeze), which means that we arrive at 14.17 ms of latency with the overhang, but four times the visual fluidity.

    It’s even more striking at lower frame rates: let’s assume a game is struggling to run at the desired settings and just about manages to achieve 30 fps (current example: Cyberpunk 2077 at RT Overdrive settings and 4K on a 5080). That’s one native frame every 33.33 ms. With three synthetic frames, we get one frame every 8.33 ms. Add 10 ms of input lag and we arrive at a total of 18.33 ms, close to the 16.67 ms input latency of native 60 fps. You cannot tell me that this wouldn’t feel significantly more fluid to the player. I’m pretty certain you would actually prefer it over native 60 fps in a blind test, since the screen gets refreshed 120 times per second.
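    The arithmetic above can be sketched in a few lines. This is just an illustration of my numbers, not anything Nvidia publishes; the 10 ms overhang and the 4x factor are the assumptions from the examples above:

    ```python
    # Frame-interval and latency math for multi-frame generation.
    # Assumptions (from the examples above, not official figures):
    # a fixed ~10 ms frame-gen overhang and a 4x generation factor.

    def frame_interval_ms(fps: float) -> float:
        """Time between frames at a given frame rate, in milliseconds."""
        return 1000.0 / fps

    def framegen(native_fps: float, factor: int = 4, overhead_ms: float = 10.0):
        """Return (displayed frame interval, estimated total latency) in ms."""
        displayed_interval = frame_interval_ms(native_fps) / factor
        latency = displayed_interval + overhead_ms
        return displayed_interval, latency

    # 60 fps native with 4x generation -> 240 Hz output:
    interval, latency = framegen(60)    # ~4.17 ms per frame, ~14.17 ms latency

    # 30 fps native with 4x generation -> 120 Hz output:
    interval2, latency2 = framegen(30)  # ~8.33 ms per frame, ~18.33 ms latency
    ```

    Plug in any native frame rate and you can see how quickly the added overhang is dwarfed by the shorter display interval.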

    Keep in mind that the artifacts from previous generations of frame generation, like smearing and shimmering, are pretty much gone now, at least based on the footage I’ve seen, and frame pacing appears to be improved as well, so there really aren’t any downsides anymore.

    Here’s the thing though: all of this remains optional. If you feel the need to be a purist about “real” and “fake frames”, nobody is stopping you from ignoring this setting in the options menu. Developers will, however, increasingly be using it, because it enables higher settings that were previously impossible to run on current hardware. No, that’s not laziness; it’s exploiting hardware and software capabilities, just like developers have always done.

    Obligatory disclaimer: my card is several generations behind (RTX 2080, which means I can’t use Nvidia’s frame gen at all, not even 2x, but I am benefiting from the new super resolution transformer and ray reconstruction), and I don’t plan on replacing it any time soon, since it’s more than powerful enough right now. I’ve been using a mix of Intel, AMD and Nvidia hardware for decades, depending on which suited my needs and budget at any given time, and I’ll continue to use this vendor-agnostic approach. My current favorite combination is AMD for the CPU and Nvidia for the GPU, since I think it’s the best of both worlds right now, but this might change by the time I make the next substantial upgrade to my hardware.