In The Market For A New Graphics Card

Viewing 15 posts - 16 through 30 (of 69 total)
  • #1931068
    Anonymous
    Inactive
    Rank: Rank-1

    @viai
    I was typing when your post hit. I might have to rethink this. How much better do you feel the 4070 is compared to the 3060? Are there heating issues? Air cooled or liquid?

    #1931071
    DirkDiggler
    Participant
    Rank: Rank-1

    The 4070 TI, not the 4070. It is significantly faster than the 3060. But you can pick up a 3060 nowadays for about $250 USD. The 4070 TI will run around $750+. (Sorry if the pricing seems off, I'm guesstimating from my currency.)

    #1931074
    Anonymous
    Inactive
    Rank: Rank-1

    I'll have to look into this some more. I'm still liking what I'm hearing about that 3060 12GB. Thanks for the insight, it's given me some things to think about.

    #1931080
    VIAI
    Participant
    Rank: Rank-2

    @chupacabre409

    It's noticeably faster in a lot of situations, but overall it was less than I expected. I think DAZ is the bottleneck for me working with Genesis 8.1 more than the card.

    The main benefit with the 4070 TI in DAZ is that the Iray preview is generated fast enough that I can use it as the primary window, so I can avoid the texture shading view until I get really complex with multiple figures. Genesis 9 is slower because of its higher-resolution textures, but still fast enough that it's a non-issue.

    The 3060 TI was also pretty quick on this to be fair, maybe only a couple of seconds longer per change. Those seconds add up during a big project though, and it's a nice feeling not having to wait for loading screens.

    Everything else in the machine: z790 motherboard, i7-12700k CPU, 32 GB DDR5 RAM (link because I know not all RAM is equal but no idea what the numbers mean - also, the RGB lights are really annoying so heads up on that one), water cooled CPU but fans for everything else. I hope that helps!

    #1931081
    queensknight
    Participant
    Rank: Rank Overload

    @viai, the thing limiting your performance might be your system RAM--the 4070 TI has 12GB of VRAM, so DAZ will want a minimum of 48GB (4x VRAM) to fully unpack the scene data. With 32GB of system RAM, DAZ can handle scenes that use up to 8GB of VRAM (out of your 4070TI's 12) efficiently, but if your scene requires more than 8GB of VRAM you'll hit the system RAM paging wall. Upgrade to 48GB of system RAM and you'll be able to use the full 12GB of VRAM at full speed.
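    The arithmetic behind that rule of thumb can be sketched in a few lines. Note the 4x ratio is the poster's figure, not an official DAZ requirement, and the function name is mine:

    ```python
    def usable_vram_gb(system_ram_gb, ratio=4):
        """Rule-of-thumb from the post above: DAZ wants roughly `ratio` times
        the scene's VRAM footprint in system RAM to unpack scene data
        without hitting the paging wall."""
        return system_ram_gb / ratio

    print(usable_vram_gb(32))  # 8.0  -> the 8GB figure quoted above
    print(usable_vram_gb(48))  # 12.0 -> enough for the full 12GB of VRAM
    ```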

    #1931096
    queensknight
    Participant
    Rank: Rank Overload

    @chupacabre409, like @gts6 said, compositing is almost always part of any non-trivial digital artwork.

    Assembling larger artwork in pieces is standard industry practice, for practical reasons. Large tableaus are almost always rendered piecemeal and assembled as composites blended together with Photoshop or Affinity Photo. These sometimes get referred to as "2.5D" apps because they can fake some 3D effects with 2D images. It won't ever be as "perfect" as a true 3D (PBR) rendering would be, but it often crosses the "good enough to fool the average person" threshold, and can save metric tonnes of render time.

    Also look at the Topaz AI suite--particularly Gigapixel AI and Denoise AI, both of which I use daily, as there's simply nothing better on the market at any price. If your rendering hardware limits the (practical) resolution of your artwork, Gigapixel AI can enlarge it up to 6x with incredibly good results. I've used it at 4x enlargement for large banner prints for movie premieres, etc.

    Denoise AI can also be a godsend for underlit scenes, or scenes you don't have enough time to render as completely as you'd like. In DAZ those tend to come out as "grainy" renders that look a lot like underlit photos from digital cameras, which is exactly the type of "noise" Denoise AI excels at removing. So instead of playing the diminishing-returns game and letting DAZ render the scene overnight, stop it after an hour or so and run the output through Denoise AI. Surprisingly often, that's more than enough to produce a high-quality result.

    #1931105
    queensknight
    Participant
    Rank: Rank Overload

    One last point I'll throw in for consideration, before I shut up 🙂

    Don't limit yourself to thinking that you have to do the creating and the rendering on the same machine. Doing it that way is actually pretty inconvenient, since your machine is too busy during the rendering process to let you continue creating. In an ideal world you'd send that render job to another machine and free up your workstation to work on the next scene.

    DAZ Studio supports this workflow in the form of Iray Server, which can run on a spare computer (or three) running Linux. These machines don't need to have the latest and greatest GPUs, but they do need to be NVidia GPUs. The idea is that instead of rendering the scene on your workstation, the render job gets sent to the Iray Server, which distributes the pieces across however many machines are available in your home-based "render farm". Even if the hardware on these servers is less impressive than your workstation, and the renders themselves aren't any faster, the real tangible benefit is that your workstation is freed up the moment you send off the render job.

    If you don't have a spare machine and suitable GPU hanging around, though, there are render farms online that you can rent by the hour (of processor time). Boost for DAZ is the officially-sponsored one, but iRender offers a similar service for DAZ users. In either case, you get to choose the server hardware you want to use for the job at hand, and that might include multiple GPUs--up to 3x A6000s, or 4x 4090s--with a lot more VRAM and system RAM than you might have available yourself.

    Where I find this handy is when I have work deadlines and I need to render a lot of detailed scenes in a matter of hours, not days. I can prepare the scene files, upload them to a cloud rendering service, and continue working. A few hours later I download the finished renders that would have taken me several days to do on my workstation (while not being able to use my workstation for anything else).

    This is also useful for rendering larger scenes than your VRAM and system RAM can handle. Even with my setup I run into scenes that I can't optimize enough to fit into VRAM, so I'm forced to choose between breaking the scene down further (into smaller scenes) or rendering them on hardware with more resources. Sending those to a cloud rendering service is the quickest and easiest solution.

    And of course, if you don't have an NVidia GPU at all (hello Mac users!), being able to have your non-trivial scenes rendered in a reasonable amount of time means farming the rendering work out to a service like this, so that it doesn't have to be rendered using your CPU.

    The cost issue with renting a cloud render farm like this is not as bad as it sounds, especially if you weigh it against the cost of upgrading your own hardware to a comparable level. A few dollars an hour to play with a rendering box that would set you back $10k+ to buy is not a hard sell. I probably use the service once a month for large batches, animation sequences, etc., but it's not a part of my day-to-day workflow.

    #1931112
    Anonymous
    Inactive
    Rank: Rank-1

    @queensknight
    All of that is really good information, thank you man.

    Years ago I used to do digital painting in Photoshop, and the tutorial I was following had me painting 4-6x larger than the final product. After painting it and smoothing everything out, you would reduce the resolution to the final size. The process had a tendency to make the painting cleaner. Is rendering the image, then using Gigapixel to increase the size, manipulating the image in Photoshop/Lightroom, then reducing it a thing that would produce cleaner/better images?

    #1931142
    queensknight
    Participant
    Rank: Rank Overload

    @chupacabre409

    Is rendering the image, then using Gigapixel to increase the size, manipulating the image in Photoshop/Lightroom, then reducing it a thing that would produce cleaner/better images?

    That practice of working with oversized versions of the work and then reducing the resolution was always a kludge that people resorted to because the scaling algorithms of the day were terrible and tended to introduce unwanted artifacts. The only way to avoid those artifacts was to make sure you always used powers of two, which was very limiting.

    The trouble was that when enlarging something you had to introduce new information--you had to figure out what all the new pixels should contain, based on their neighbors (i.e. "interpolation"). There were some basic algorithms that just doubled or quadrupled every adjacent pixel, which is very fast but looks awful. The "fix" was to scale the image back down again, which helped reduce aliasing effects (i.e. "the jaggies").
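    That "double or quadruple every adjacent pixel" approach is nearest-neighbour interpolation. A minimal sketch of the idea (pure Python, no image libraries, function name is mine):

    ```python
    def nearest_neighbor_upscale(img, factor):
        """Enlarge a 2D grid of pixel values by repeating each pixel
        `factor` times horizontally and vertically -- fast, but this is
        exactly what produces blocky edges ("the jaggies")."""
        out = []
        for row in img:
            new_row = []
            for px in row:
                new_row.extend([px] * factor)  # repeat horizontally
            out.extend([new_row[:] for _ in range(factor)])  # repeat vertically
        return out

    img = [[1, 2],
           [3, 4]]
    big = nearest_neighbor_upscale(img, 2)
    # big == [[1, 1, 2, 2],
    #         [1, 1, 2, 2],
    #         [3, 3, 4, 4],
    #         [3, 3, 4, 4]]
    ```

    No new information is invented here, which is why the result looks blocky; smarter interpolators (bilinear, bicubic, and now the "AI" upscalers) try to guess better values for the new pixels instead of copying.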

    Nowadays we have smarter algorithms, which we even have the hubris to call "AI", that do that interpolation and anti-aliasing for us in ways that are more "content-aware". Which is to say that the AI detects that the image contains a face, or a human figure, for example, and it knows a lot about what faces and human bodies look like, so when it has to invent new pixels in an enlargement it has a better idea about what those pixels should look like. Similarly, when reducing the resolution it takes the content into consideration and fixes the jaggies for you.

    All of which is to say that AI-based tools for image scaling are much better than the algorithms of even just ten years ago, so there's less need to do that old-style up-and-down technique to hide the jaggies. Download the Gigapixel AI and Denoise AI trials and play around with them to see what I mean.

    #1931147
    Frank21
    Participant
    Rank: Rank 5

    That doesn't get around the fact that any new clarity or detail in an image is faked with AI, i.e. not contained within the original image but added on. I've never seen any reason to upgrade from my old 2080 as it does whatever I ask of it... provided you know its limitations and what you're doing. No problem with 6 G8s yesterday in a scene with lots of other geometry... 22 minutes.
    Without trying to offend anybody, I would suggest that this obsession with high-end GPUs is down to lack of knowledge of how to optimise DS scenes/lighting or just plain laziness, so GPU brute force and faked AI is seen as an alternative to that.

    #1931163
    SkippyTheMeh
    Participant
    Rank: Rank 3

    To add to what everyone else has said, make sure you have a power supply in your PC that is capable of providing enough power for a 3060 and has a spare PCI-E 8-pin connector.

    #1931224
    gladson1976
    Participant
    Rank: Rank 3

    @frank22 You sure know how to burst a bubble 🙂

    #1931239
    eelgoo
    Moderator
    Rank: Rank 7

    Frank does have a valid point, though.

    Whilst expensive shiny is all very nice, with care & optimisation, you can get good results from more modest, affordable VFM equipment.

    I have 32Gig RAM & a GTX1070 TI

    I have no complaints about its performance for what I wish to use it for.

    🙂

    #1931247
    Anonymous
    Inactive
    Rank: Rank-1

    @queensknight
    Thank you so much for taking the time to explain these things. I assumed that was what was going on and was leaning into operating in that same way. You're saving me some time, very helpful!


    @frank22

    Something I've learned over the years is the disappointment of spending the money to stay on the bleeding edge of tech. I'm now of the mindset of pursuing performance per dollar, if that makes any sense. It's one of the reasons I love this little 1050 Ti so much. It's not the best, but per dollar it's the best I've ever owned. It would be different if my job depended on it; then I'd flesh out a MF-ing monster system lol. I like your approach to this, bud.


    @SkippyTheMeh

    When I built this system I had the 3060 Ti in mind and bought this power supply. I think it comes with an 8-pin, but I don't recall. I'm not pulling the power that I should out of this system, and I've learned that it isn't good to be doing that. It can cause instability with the power supply, but everything is running smooth so far.

    MOBO: ROG Strix X570-E Gaming
    CPU: AMD Ryzen 5 5600X
    RAM: G.Skill Trident Z Neo Series 32GB (2 x 16GB) SDRAM PC4-28800 DDR4 3600
    PS: RMx Series™ RM750x — 750 Watt 80 PLUS® Gold Certified Fully Modular PSU
    AIR Cooler: Noctua NH-D15
    SSD1 (OS): 980 PRO PCIe® 4.0 NVMe SSD 500GB
    SSD2: 980 PRO PCIe 4.0 NVMe™ SSD 2TB
    HDD: Western Digital 6TB WD Blue PC Internal Hard Drive HDD - 5400 RPM, SATA 6 Gb/s, 256 MB Cache

    I'm looking to upgrade my backup drive and leaning towards a 10TB Western Digital Black. I don't know if there is a better option, but it seems good. I didn't think I would ever fill a 6TB, but here I am collecting assets like baseball cards lol.

    If anyone has a recommendation for a better backup, please chime in! I was looking at larger drives, the Western Digital "Red" series, but some reviews mentioned that they have a higher fail rate. I can't help but feel like there's some BS going on in promoting the "Black", but idk.


    @gladson1976

    I'll talk PC shop all day man. Not something I ever get tired of and it's just F-ing cool to hear what some professionals in the digital industry are working with. I couldn't justify spending the money on that but man is it "neat" lol.

    @all
    Thanks for taking the time to explain these things, it's been very helpful!

    #1931251
    Anonymous
    Inactive
    Rank: Rank-1

    @eelgoo
    I like that approach, man: a performance/cost assessment. Like I mentioned, in the past I've tried to stay on the bleeding edge and only ever felt disappointed within 3 months or so. I just can't justify doing that anymore.
