Message boards : Graphics cards (GPUs) : GT 640 DDR3
http://www.anandtech.com/show/5911/nvidia-announces-retail-gt-640-ddr3
ID: 25521
Hm, I just bought one. lol
ID: 25723
Should work in the same way as the bigger Kepler cards. Expect ~1/4 the performance of a full GK104 at similar clocks.
ID: 25725
http://www.palit.biz/palit/vgapro.php?id=1894
ID: 25726
"hm, I just bought one. lol"
Well, at least you could. ;) Got that thing up and running?
ID: 25731
Not yet, still waiting for the order confirmation. It's Saturday, you know. :o/
ID: 25732
"12.24 GFlops per watt means 65 * 12.24 = ~800 GFlops, not bad :)"
384 shaders * 2 operations per clock * 900 MHz = 691 GFlops. However, 2 ops/clock per shader is a theoretical maximum, which can hardly be achieved in the real world. But we've been calculating nVidia Flops like this for a long time, so at least it's consistent for the sake of comparison. MrS
____________
Scanning for our furry friends since Jan 2002
ID: 25737
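MrS's back-of-the-envelope formula above (shaders x ops/clock x clock) can be sketched in Python. The card specs are the ones quoted in the post; as noted there, the 2 ops/clock figure is a theoretical peak, not a real-world number:

```python
# Theoretical single-precision throughput, as calculated in the post above.
# The 2 ops/clock (one fused multiply-add) is a theoretical maximum that
# real workloads rarely reach.
def peak_gflops(shaders, clock_mhz, ops_per_clock=2):
    """Peak GFLOPS = shaders * ops/clock * clock (converted from MHz)."""
    return shaders * ops_per_clock * clock_mhz / 1000.0

gt640 = peak_gflops(384, 900)  # ~691 GFLOPS, matching the post
```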
I expect the GT 640 to just about match a GTX 550 Ti in terms of runtime performance, though maybe the 4.2 app will be favourable to it. Obviously the 65 W TDP is much more attractive than the 116 W of the GTX 550 Ti.
ID: 25750
Bad news, for me: the ordered GT 640 was actually a GT 620. A database error by the vendor.
ID: 25770
Very good news!
ID: 25789
A PAOLA 4.2 task is on its way to completion, about 5 hours in so far. Has anybody completed this one, or another, with a GT 640? Estimated completion: about 8 hours in total.
ID: 25793
OK, an IBUCH task took about 6 hours.
ID: 25800
Performance is roughly as expected. It would probably match the GTX 550 Ti if it could run the CUDA 3.1 app, but on the CUDA 4.2 app it's actually matching a GTX 460, due to the GTX 600 series improvements in performance over the GTX 400/500 series. Cool.
ID: 25805
"Performance roughly as expected. It would probably match the GTX 550 Ti, if it could run the CUDA 3.1 app, but on the CUDA 4.2 app it's actually matching a GTX 460, due to the GTX 600 series improvements over the GTX 400/500 series. Cool."
Well, REALLY cool! Considering it's only drawing ~50 W from the wall, time to go green...
ID: 25806
It must be about 65 watts coming out of the wall.
ID: 25807
Why must it draw 65 W? Because the TDP is stated as 65 W by nVidia? Well... no. Cards usually draw less than their TDP at GPU-Grid, and when crunching in general. Consider this: you're probably not even at 100% GPU utilization, let alone using the TMUs etc.
ID: 25809
"but I hope you're not pushing the card to 2x the power consumption ;)"
I hope so too (looking at my electricity bill); for the rest of the month I can only run this card :D. I am almost at €50 for this month, and that's quite enough for me. I have not tried to overvolt this card; I hope others will try it before me, please :D.
ID: 25810
Long-term overvolting is even less advisable on lesser cards.
ID: 25811
Well... overvolting it still wouldn't cost much.
ID: 25818
I think undervolting may be an option.
ID: 25822
MSI Kombustor, perhaps?
ID: 25824
Nope, same results. I'll have to wait until all the software has caught up. :-/
ID: 25825
Apparently the voltage auto-adjusts on reference cards, so you can't set it manually, but it should increase with the GPU clock.
ID: 25827
Hm, in my experience raising the clock further just results in errors; there's no automatic voltage increase on this card. :(
ID: 25830
If it causes errors, then don't increase the clocks.
ID: 25831
Everything's fine now; I think the card needed some "burn-in" time. The voltage settings work now, but with little benefit.
ID: 25836
Which clock speeds do you reach?
ID: 25842
What will happen when the GT 640 starts to run a CUDA 3.1 app?
ID: 25860
Oops! I did it. :(
ID: 25865
Great! You did well. Now I am crunching only 4.2 tasks. Yippee!
ID: 25869
Can't overclock the GT 640, but under Windows 7 I can use all 3 cards at the same time. That saves me 100 watts per hour. Great.
ID: 25886
Side note: "100 watts per hour" -> 100 W. Power is already energy divided by time.
ID: 25899
"Side note: '100 watts per hour' -> 100 W. Power is already energy divided by time."
OK, then I saved 0.1 kW/h. Greetings to the furry ones :)
ID: 25900
Each hour you save 0.1 kWh :)
ID: 25901
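The watts-vs-watt-hours point in the posts above can be made concrete with a two-line sketch: power (W) is a rate, and energy (kWh) is power multiplied by time, so a constant 100 W saving is 0.1 kWh every hour:

```python
# Energy = power x time. A 100 W saving held for one hour is 0.1 kWh;
# held for a full day it is 2.4 kWh.
def energy_kwh(power_watts, hours):
    """Energy in kWh for a constant load of power_watts over hours."""
    return power_watts / 1000.0 * hours

saved_per_hour = energy_kwh(100, 1)   # 0.1 kWh
saved_per_day = energy_kwh(100, 24)   # 2.4 kWh
```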
OK, I've managed my overclocking problem so far: I used "NVIDIA Inspector" to clock the GT 640.
ID: 25902
The "overclocking" doesn't work; in reality there was no overclocking.
ID: 25913
Does anyone have news about runtimes of long tasks with CUDA 4.2 on GT 640 cards?
ID: 26077
Had quite a few task failures. I had overclocked by 250 MHz; now it's only 200 MHz, still experimenting.
ID: 26078
As I am not very good with computers: would the GT 640 work for GPUGRID in a second slot, with a GTX 670 in the first? My motherboard has two slots, and the computer for the GT 640 does not actually exist yet.
ID: 26083
That will work, no problem, I think, but only if the GTX is not a triple-slot graphics card.
ID: 26086
The GT 640 should work in the second slot, though you might want to check your PCIe slot performance: if it's only PCIe x1, then even if it works, performance would be very poor. If both slots are PCIe x16 and stay x16 when the second slot is occupied, that's optimal. If the first slot drops to x8 and the second is only x4, then you would probably still do more work overall, but you would be losing a fair bit of performance through bandwidth limitations.
ID: 26087
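To put numbers on the lane-count advice above, here is a small sketch using the commonly quoted nominal per-lane figures for each PCIe generation (per direction; real-world throughput is somewhat lower):

```python
# Nominal per-lane bandwidth in MB/s, per direction:
# PCIe 1.x: 250, PCIe 2.0: 500, PCIe 3.0: ~985 (after encoding overhead).
PCIE_LANE_MBPS = {1: 250, 2: 500, 3: 985}

def slot_bandwidth_mbps(gen, lanes):
    """Approximate one-direction bandwidth of a PCIe slot."""
    return PCIE_LANE_MBPS[gen] * lanes

# A PCIe 2.0 x16 slot offers 16x the bandwidth of an x1 link,
# which is why an x1 connection can starve a GPU-Grid task.
full = slot_bandwidth_mbps(2, 16)  # 8000 MB/s
narrow = slot_bandwidth_mbps(2, 1) #  500 MB/s
```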
Have the running times changed? A PAOLA long-run WU now takes approximately 21 hours.
ID: 26092
@klepel: the GTX 560 xxxCore Edition is quite fast at GPU-Grid, way faster than the GT 640. But it's also a totally different beast power-consumption-wise. Could your power supply, case cooling, ears and electricity bill stand such a card? IMO that would be the main question. The GT 640 is completely tame in comparison.
ID: 26101
A reference GTX 560 Ti 448 is almost twice as fast as a GT 640 (~1.85x), but the 65 W vs 210 W TDP means the GTX 560 Ti 448 uses ~3.25 times the power of a GT 640.
ID: 26110
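The efficiency argument in the post above can be sketched numerically, using the speedup and TDP figures quoted there (TDP is only a rough proxy for real draw, as discussed earlier in the thread):

```python
# With a ~1.85x speedup but ~3.23x the TDP, the GT 640 does more
# work per watt than the GTX 560 Ti 448, at least on paper.
def perf_per_watt_ratio(speedup, tdp_fast, tdp_slow):
    """How many times more efficient the slower card is, on TDP alone."""
    return (tdp_fast / tdp_slow) / speedup

ratio = perf_per_watt_ratio(1.85, 210, 65)  # ~1.75x in the GT 640's favour
```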
Thanks a lot for your advice! Considering also your comments in the "Lower level 600 release?" thread, I think it will be best to wait for the release of the GTX 660 (Kepler), as I do not need this card urgently and I have a spare GT 8400:
ID: 26116
Waiting sounds about right for you, then. And a quality 650 W PSU won't have any problems driving a CPU, a GT 640 and a GTX 680 :)
ID: 26144
Hello skgiven and ETA, I put two GT 640s in my PCIe 2.0 x16 slots and the memory controller load is at 70%. Does that mean the system is not fully busy (~30% load free), or am I wrong and PCIe 3.0 is required? GPU load is 96-98%. Another strange thing:
First card in its slot: GPU temp 53.0 °C, fan speed 35%, VDDC 0.9870 V
Second card in its slot: GPU temp 59.0 °C, fan speed 47%, VDDC 1.0120 V
Both cards are running at the same speed (core and memory clocks). :/ Greetings
____________
http://www.rechenkraft.net
ID: 26346
96-98% GPU utilization is high.
ID: 26354
Hello fellow volunteers: I have a 1 GB GT 640 running at PCIe x1 (on my laptop, using the PCI Express slot).
ID: 26588
With the current PAOLA WUs and a GT 640 you cannot claim the 24-hour bonus; they take about 27 hours to complete.
ID: 26707
Hey guys,
ID: 33562
"If not the GT640, which would be a good card at about 50W TDP?"
Why TDP and not real wattage? Some data from my total system, an Intel i7 with a GTX 650 Ti (see first test last year):
62 W - idle (EIST off, fixed voltage idle/load 0.996 V) > CPU usage 0%
135 W - GPUgrid (GPU OC +100 MHz on GPU clock and memory clock, standard voltage) > CPU usage 12%
152 W - GPUgrid + Einstein@iGPU (HD4000, OpenCL) > CPU usage 15%
176 W - GPUgrid + Einstein + Docking@CPU (Hyper-Threading on 7 cores) > CPU usage 100%
160 W - Einstein paused for testing > CPU usage 100%
135 W - Docking + Einstein paused for testing > CPU usage 12%
62 W - BOINC paused for testing > CPU usage 0%
ID: 33568
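Differencing the whole-system measurements in the post above gives a rough per-component breakdown (these are wall readings, so PSU losses are baked into each delta):

```python
# Whole-system wall readings from the post above, in watts.
readings = {
    "idle": 62,
    "gpugrid": 135,
    "gpugrid+einstein": 152,
    "gpugrid+einstein+docking": 176,
}

# Each delta attributes the extra draw to the workload that was added.
gpu_delta = readings["gpugrid"] - readings["idle"]                        # ~73 W for the GPU task
igpu_delta = readings["gpugrid+einstein"] - readings["gpugrid"]           # ~17 W for the iGPU
cpu_delta = readings["gpugrid+einstein+docking"] - readings["gpugrid+einstein"]  # ~24 W for the CPU cores
```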
This project favors faster, mid-range to high-end cards.
ID: 33569
I'm using the GTX 570 in my "main" computer, which I only run when I'm at home.
ID: 33571
Your i5 has an 84 W TDP and it's pricey. While it won't consume 84 W, it might use around 60 or 70 W crunching CPU projects. The other components would probably use over 30 W. Even a GeForce GTX 650 Ti would use around 100 W by itself, so you're not going to make 100 W starting from an i5. Around 200 W is doable, but that's still not a good setup.
ID: 33573
0.1 kW * 24 h * 365 days = 876 kWh; at €0.25/kWh (Germany) that is €219 per year, for energy alone. If you crunch one workunit per day with the GTX 570, you get points and contribute to the project, but don't have to pay for new hardware, find a place for it, or spend time on service, backups and so on. Plus you are more flexible when new minimum hardware requirements are defined; maybe today's low-end hardware will be pushed out of the game in a few months. For my part, I suspended all 24/7 crunchers and now concentrate on one single cruncher, which is a compromise between energy consumption (in the long run), noise (low overclocking in summer, higher OC in winter) and performance (overclocking range, undervolting option). Looking back over the last few years, there has been rapid development in hardware releases, programming and new standards (CUDA). Mid-range components are quickly demoted to low-end, high-end to mid-range. But high-end cards consume too much energy, so you are forced to replace them once they become inefficient; not in every generation, but in those with huge efficiency advantages. Best wishes for your decision!
ID: 33574
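The annual-cost arithmetic in the post above generalizes to any constant load; a small sketch, using the €0.25/kWh German price quoted there:

```python
# Yearly electricity cost for a constant load: a steady 100 W at
# EUR 0.25/kWh comes to 876 kWh and EUR 219 per year, as in the post.
def annual_cost_eur(power_watts, eur_per_kwh):
    """Cost of running power_watts continuously for one year."""
    kwh_per_year = power_watts / 1000.0 * 24 * 365
    return kwh_per_year * eur_per_kwh

cost = annual_cost_eur(100, 0.25)  # ~219 EUR
```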
Yeah, it's difficult with our high energy prices... :/
ID: 33576
Is the jump to 2 GB on a 650 Ti a big deal? You are talking about $50 more once you factor in the lack of a rebate.
ID: 33602
"Is the jump to 2 GB on a 650 Ti a big deal? You are talking about $50 more once you factor in the lack of a rebate."
Not much for the speed of the card, but there have been WUs here that take over 1 GB, or close to 1 GB, of memory, which makes them either slow or impossible to run on a 1 GB card.
ID: 33603
As SK said, the GT 640 DDR3 was never recommended for GPU-Grid, for these reasons:
ID: 33680