
06/08/2010

Comments

Jay Tedeschi

I would add that if you have a system which can be configured with more than one physical drive, then you should absolutely add one... and here's why.

In a single-drive system, all drive access is obviously going to take place on that one drive. When running computationally intensive tasks, there are several types of disk access happening at once. First and foremost are the data files themselves and, in conjunction with them, the temp files, undo files, etc. generated by the application working with that data. The OS is also touching several types of files concurrently with this: DLLs, the swap file and its own temp files at a minimum. In a single-drive system all of these operations must run sequentially; if I add a second drive, however, I gain the flexibility to isolate specific operations to specific locations. For example, in a laptop with a single 5400 rpm drive, an added 7200 rpm drive lets me place the most frequently accessed large file types, in this case the swap file and the data, on the second, faster drive.
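A crude way to picture the benefit, with purely made-up timings: on one drive the data-file traffic and the swap/temp traffic have to queue up behind each other, while on two drives they can overlap.

    # Toy model: on one drive, data-file I/O and swap/temp I/O are serialized;
    # on two drives they can overlap. The timings below are illustrative only.
    data_io_s = 10.0   # seconds of data-file reads/writes
    swap_io_s = 6.0    # seconds of swap/temp traffic

    one_drive = data_io_s + swap_io_s       # everything on the same spindle
    two_drives = max(data_io_s, swap_io_s)  # streams overlap (best case)

    print(f"one drive: {one_drive:.0f} s, two drives: {two_drives:.0f} s (best case)")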

If we are working with a desktop system with multiple drive bays, then our configuration can get even more imaginative. I am a big fan of RAID 0 configurations and have been running them for years. The nVidia SLI system I am building right now is going to have a 500 GB, 7200 rpm applications drive and a pair of 1 TB, 10,000 rpm drives striped together into what the system will see as a single 2 TB drive. The advantage of a RAID 0 configuration like this is that disk reads and writes are roughly twice as fast, because each drive handles half of every transfer simultaneously. Even better, when the swap file is located on this volume there is "almost" no disadvantage to hitting virtual memory.
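As a back-of-envelope sketch of why striping helps (the per-drive throughput figure below is a round illustrative number, not a benchmark of the drives above): capacity adds up, and in the ideal case sequential throughput scales with the number of drives.

    # Toy RAID 0 model: capacity is the sum of the member drives, and ideal
    # sequential throughput is the slowest member's rate times the drive count.
    def raid0(drives):
        """drives: list of (capacity_gb, sequential_mb_per_s) tuples."""
        capacity = sum(cap for cap, _ in drives)
        throughput = len(drives) * min(rate for _, rate in drives)
        return capacity, throughput

    # Two 1 TB drives, each assumed to manage ~120 MB/s sequentially.
    cap_gb, mb_s = raid0([(1000, 120), (1000, 120)])
    print(f"RAID 0 volume: {cap_gb} GB, ~{mb_s} MB/s sequential (ideal case)")
    # -> RAID 0 volume: 2000 GB, ~240 MB/s sequential (ideal case)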

I could go on and on with regard to this subject, but I have work to do, so I will cut it short here. However, if anyone has any questions, just ask.

scott

So, in my system specs on the previous post I had an 80 GB SSD. What if that remains? It would be the bootable drive, with Inventor, Office and any other commonly used programs installed on it. Any other programs, say ones that are opened only once every day or two, would be installed on a second 10,000 rpm HDD, and that drive would also house the workspace and project folders for Inventor. Thing is, I think 80 GB will only just squeeze in Win 7, Inventor Series 2011, Office 2007/2010 and the usual internet-related apps, plus several gigs, maybe up to 10 GB, of workspace data. This is considering we have a network that houses all of our work data and we only check out what we need.
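For what it's worth, a rough space budget for that 80 GB boot SSD; all of the install sizes below are ballpark guesses, not measured figures.

    # Rough space budget for an 80 GB boot SSD. All sizes are ballpark
    # estimates in GB, not measured installations.
    ssd_gb = 80
    estimates_gb = {
        "Windows 7 (64-bit)": 20,
        "Inventor Series 2011": 12,
        "Office 2007/2010": 3,
        "Internet/utility apps": 5,
        "Inventor workspace data": 10,
        "Page file / hibernation / temp": 12,
    }

    used = sum(estimates_gb.values())
    print(f"Estimated use: {used} GB of {ssd_gb} GB, {ssd_gb - used} GB free")
    # -> Estimated use: 62 GB of 80 GB, 18 GB free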

scott

Rob,
Thank you for this post; it backs up what I thought all along. I will forward it to my boss and get them to show it to the contracted IT people. I think in our case at SMI I would have put the CPU after the graphics card, but this is interesting. I originally said to my colleagues that I thought a dual-core processor was plenty sufficient, considering everyday Inventor tasks don't take advantage of multiple cores, with the money being better spent on SSDs and RAM.

scott

Jay & Rob,

One other thing I forgot: I actually spec'd ECC RAM on the previous post because it happened to be the cheapest 1066 MHz, 4 GB sticks I could find here in NZ. But that motherboard, and in fact most consumer motherboards, don't support ECC memory. Nearly all workstations ship with the far more expensive ECC memory, and I feel it is being shoved down design businesses' throats. My view is that it is great in servers, or in workstations doing heavy computational runs where data accuracy is of utmost importance, like FEA, CFD, CAE or similar engineering software. For 90% of CAD use it's pointless, and you would benefit more from higher-frequency or higher-density non-ECC memory.

Cheers for your input.

scott

ric

The CPU speed is the most important factor after the RAM if you work with big assemblies.

