|
Post by Mr Tretton on Aug 24, 2014 23:16:10 GMT -5
Right, and they are far away from where you last were with all of the enemies respawned. It keeps sending me back to that quick travel place whenever I quit and come back.
|
|
|
Post by duplissi on Aug 25, 2014 10:29:42 GMT -5
I just started playing Borderlands 2, it's pretty terrible, well, as a single player game. I get that it's better as co-op. The save point system is horrendous (that's what really pissed me off), and it's optimized horribly. Ridiculous CPU gobbler. Glad I got it for free, but it sucks that I was looking forward to it. Unless I'm missing something with the save system, I'm about to uninstall forever.

CPU gobbler? In my experience the game can run on a calculator. I get well over 100 fps in it.
|
|
|
Post by AuTo on Aug 25, 2014 12:18:51 GMT -5
Future-proofing is a concept I don't buy into. Everything I buy, I mentally plan to resell a few years down the line and replace with newer, better, more power-efficient things. Besides, soon after you buy that, DDR4 will become the standard. Also, DirectX 12 will be supported by existing cards IIRC.

I believe it is Kepler and GCN based cards and up... not entirely sure though. AuTo, why are you going with a 9590? It's an old part (old as in the same architecture as my almost two-year-old CPU) by this point, and IMO, unless you are on a budget, none of AMD's CPUs are worth getting. Note that they are working on a new arch from the ground up that should be out by the end of 2015/early 2016. It is quite obvious you are not on a budget considering the 295X2. I'm waiting on Haswell-E and DDR4. That will be my next upgrade. i7 5820K? 6 cores of Haswell IPC for 340 bucks? Um, yes please. My 8350 has served me well for almost two years now; it's time for a refresh.

It's just a placeholder card, the most expensive behind the Titan. I'm not upgrading until DX12 has been out for a year or two, allowing the high-end cards to come out. I get very little money per month to save, so I plan maybe 6 months to a year in advance for what I want in those price ranges. I bought my 6990 a few gens in and it still works valiantly, even with Watch Dogs. Yeah, there are much more powerful cards, but I don't need any more power when what I have will suffice for another year or two. However, it is starting to show its age, and it's time to start planning for its retirement: by fire, in a server, working nonstop until it dies. A fitting death, I feel, for any graphics card.
|
|
|
Post by duplissi on Aug 25, 2014 13:34:50 GMT -5
It's just a placeholder card, the most expensive behind the Titan. […]

No, the 9590 is what I am curious about. Why that CPU? You clearly have a high budget ceiling, so why limit yourself to the 9590?
|
|
|
Post by AuTo on Aug 25, 2014 14:19:54 GMT -5
No, the 9590 is what I am curious about. Why that CPU? You clearly have a high budget ceiling, so why limit yourself to the 9590?

Because frak Intel. I am not paying to unlock features that should be there from the get-go.
|
|
|
Post by duplissi on Aug 25, 2014 14:22:21 GMT -5
Because frak Intel. I am not paying to unlock features that should be there from the get-go.

I share the sentiment, but right now AMD doesn't offer the performance I'm looking for. Supposedly they will at the end of next year, but I can't wait. So the i7 5820K it is for me.
|
|
|
Post by Mr Tretton on Aug 25, 2014 16:05:29 GMT -5
CPU gobbler? In my experience the game can run on a calculator. I get well over 100 fps in it.

Ah, I forgot that I added some SGSSAA to it back when I downloaded it last year. That's why. It looks great that way, but the fps is variable. FXAA is ew. Anyway, yeah, deleted forever. Horrible save system.
|
|
|
Post by MidnytRain on Aug 25, 2014 18:41:58 GMT -5
Ah, I forgot that I added some SGSSAA to it back when I downloaded it last year. That's why. It looks great that way, but the fps is variable. FXAA is ew. Anyway, yeah, deleted forever. Horrible save system.

I use FXAA. People don't like it because the image gets a bit foggy-looking, but I hate shimmering and weird-looking lines even more. It makes some games look really dated.
|
|
|
Post by duplissi on Aug 25, 2014 21:12:38 GMT -5
I use FXAA. People don't like it because the image gets a bit foggy-looking, but I hate shimmering and weird-looking lines even more. It makes some games look really dated.

MLAA or temporal SMAA for me. I don't use MSAA much because of the performance hit it incurs. Hands down, though, SSAA produces the best image.
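A rough way to picture why supersampling wins on image quality: the frame is rendered at a higher resolution and then filtered back down, so edge pixels get blended instead of stair-stepping. Below is a minimal Python sketch of just the downscale step, assuming the Pillow library is available and a hypothetical frame has already been rendered at 2x resolution:

from PIL import Image  # Pillow, assumed installed: pip install Pillow

def downsample_2x(high_res):
    """Filter a frame rendered at 2x resolution back down to display size (the SSAA idea)."""
    target = (high_res.width // 2, high_res.height // 2)
    return high_res.resize(target, Image.LANCZOS)  # high-quality filter blends edge samples

# Hypothetical usage: frame = downsample_2x(Image.open("frame_2x.png"))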
|
|
|
Post by Eiffel on Aug 26, 2014 14:28:20 GMT -5
Education.
|
|
|
Post by duplissi on Aug 28, 2014 16:42:13 GMT -5
I just started playing Borderlands 2, it's pretty terrible, well, as a single player game. I get that it's better as co-op. The save point system is horrendous (that's what really pissed me off), and it's optimized horribly. […]

The bold is the only complaint of yours I share. It's an amazing game all around, in my opinion. I just fear that the Pre-Sequel will be as poorly optimized. PhysX on High in a full-out brawl will bring the game under 60 fps for me pretty frequently, and this is with 670 SLI and a 4670K. AuTo: If you don't save that much per month, getting that much RAM isn't a good idea. Actually, I wouldn't blow my cash on any of that for 3 grand. I'd just get a TITAN Z.

That's kind of pointless; you get much less with a Titan Z. He is looking to get the 295X2, which is marginally faster than the Titan Z at half the price. Hell, two Titan Blacks are a better buy than a Titan Z...
|
|
|
Post by duplissi on Aug 28, 2014 18:10:29 GMT -5
Not if you want G-Sync, Nvidia fur effects, and just generally Nvidia products, though. I'm off of AMD for the foreseeable future. The part I can't understand is sticking with AMD processors at that price range, and getting that much RAM, personally. Intel Master Race.

I agree with you about the CPU bit. Although "Intel master race"... eh, Intel is a despicable company, right up there with Apple. I'm getting an i7 5820K not because I like them but because I have no other choice if I want something faster. If things were equal between them I would get AMD every time, but they aren't, sadly. But G-Sync? FreeSync. Nvidia fur effects? TressFX. Etc. In the end, though, a Titan Z is a colossal waste of money! Unless you specifically need the double precision and have limited space in your case (even this is suspect since the Titan Z is a triple-slot card) so you cannot fit two Titan Blacks. SLI 780 Ti is faster, SLI Titan Black is faster, Crossfire R9 290X is faster, the R9 295X2 is faster, and all of these options are CHEAPER.
|
|
|
Post by AuTo on Aug 30, 2014 21:38:30 GMT -5
Intel is a horrible company. I absolutely REFUSE to pay for Intel products. I could have gotten an Intel server cheaper than what I paid for my AMD server, but I would have had to pay an additional $300 on top of it just to get RAID5. Seriously: ark.intel.com/products/49593/Intel-RAID-Activation-Key-AXXRAKSW5 (not the one for the server I was looking at).

Intel is not the master race; they nickel-and-dime you for everything. My AMD Phenom II X6 1090T is still very much relevant even all these years later, same with my Radeon 6990, AND the CPU has a lifetime warranty. So what if Intel is minutely better; is it really worth paying 2-3x the price? No. Why? Because I could buy 2-3 additional computers and cluster them to act as one. Oh, you have PCI-e 3.0? No you don't. You have to pay extra to unlock that in the newest Intel processors: ark.intel.com/products/82930/Intel-Core-i7-5960X-Processor-Extreme-Edition-20M-Cache-up-to-3_50-GHz

On top of that, Intel doesn't use physical cores in multi-core products, they use virtual cores, whereas AMD uses physical cores.
|
|
|
Post by bladesfist on Aug 31, 2014 8:41:59 GMT -5
Intel is a horrible company. I absolutely REFUSE to pay for Intel products. […]

So much exaggeration.
|
|
|
Post by duplissi on Aug 31, 2014 9:20:54 GMT -5
Not sure where you got this from. Sure, in the past Intel tried to make you pay to unlock features on your CPU, but that attempt failed. Where in that link does it say that they are doing so now?

Also... right now it is AMD that uses "pseudo cores". They are real cores, but they are different from traditional cores in that each module has two cores that share some hardware and resources; there's more to it, but that is the simplest way to put it. The only way I could understand this statement is if you were referring to Hyper-Threading, and no one equates the number of threads an Intel CPU can handle with the number of cores... Most people understand that each core can handle two threads on a Hyper-Threaded CPU.
|
|
|
Post by AuTo on Sept 2, 2014 8:35:42 GMT -5
AuTo, I don't feel like I was ripped off when I bought my 4670K for about 220 or so some time ago. And yeah, it has 4 physical cores, so I don't know what you're even going on about. I don't think anyone buying a 4-core i7 is being "tricked" into thinking they're buying something with 8 physical cores. Anyway, I'll stick to Intel until and unless AMD starts making better gaming processors, as I do not suffer from brand loyalty.

When was the last time you bought an AMD CPU? I don't suffer from any brand loyalty. I've only worked designing them. Intel = 2 physical cores being shown as 4. AMD = 4 physical cores.
|
|
|
Post by duplissi on Sept 2, 2014 8:46:26 GMT -5
When was the last time you bought an AMD CPU? I don't suffer from any brand loyalty. I've only worked designing them. Intel = 2 physical cores being shown as 4. AMD = 4 physical cores.

No, that only happens if they have Hyper-Threading, and they aren't shown as physical cores but as logical cores. Sure, each logical core gets its own little graph in the performance tab of Task Manager, but it would state 2 physical, 4 logical.
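For anyone who wants to verify the physical/logical split on their own machine, here is a minimal Python sketch (assuming the third-party psutil package is installed); it reports the same two numbers the Task Manager distinction above is describing:

import psutil  # third-party: pip install psutil

physical = psutil.cpu_count(logical=False)  # cores actually present on the die
logical = psutil.cpu_count(logical=True)    # hardware threads the OS schedules on

print(f"{physical} physical cores, {logical} logical cores")
# A Hyper-Threaded dual core reports 2 physical and 4 logical;
# the extra entries are SMT threads, not additional physical cores.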
|
|
|
Post by AuTo on Sept 2, 2014 9:28:14 GMT -5
No, that only happens if they have Hyper-Threading, and they aren't shown as physical cores but as logical cores. Sure, each logical core gets its own little graph in the performance tab of Task Manager, but it would state 2 physical, 4 logical.

Now look at the actual die.
|
|
|
Post by duplissi on Sept 2, 2014 11:45:02 GMT -5
Yeah, you're wrong. The connections between physical cores are different on some AMD and Intel CPUs, but these are not logical cores. Source: am computer engineer; also looked into this years ago when I was deciding between Phenom II and gen-1 i7 (my gen-1 i7 crushed the Phenoms relentlessly). Also, in regards to FreeSync: www.reddit.com/r/pcgaming/comments/2f9hnj/amd_only_certain_new_radeons_will_work_with/ Lol. Doesn't sound so 'free' to me.

... You are missing the point. Free as in no license fees, free as in part of the VESA standard, free as in not proprietary. It's a new tech; the scalers and GPUs need to support the standard. Kind of a duh moment.
|
|
|
Post by AuTo on Sept 2, 2014 17:57:43 GMT -5
Yeah, you're wrong. The connections between physical cores are different on some AMD and Intel CPUs, but these are not logical cores. […]

You didn't answer my question... lol
|
|