SAS vs SATA, 7200 vs 10k

Discussion in 'Hardware' started by Mikeyboy, Dec 28, 2012.

  1. Mikeyboy

    Mikeyboy Kilobyte Poster

    Probably a fairly obvious answer, but I just wanted some real-world experience responses...

    I'm looking to set up a new (test, home lab) server, and I'm wondering what the best options for disks are.

    The server will run ESXi, and I'll probably run the host OS from SD/USB so all the local disks are free for VMs.

    My aim is to use the HP LeftHand VSA thingy and use local disks as my datastores. I've been testing this out and seemed to get pretty good performance, using a 10k SAS drive. If I put this setup in place, I'd probably get a couple of 1 TB drives to use as datastore disks in a mirror, then shove a bunch of other disks I already own into a RAID 5 as data storage.

    Anyway, the main question is: am I likely to see a drop in performance if I use 7200 rpm SATA disks as datastore disks, or would it be negligible? Am I better off going for faster disks, even though I couldn't afford as much space then? I'm thinking of 3.5" disks as these offer larger capacity, but would 2.5" be better to go for?

    Opinions please :)
     
    Certifications: VCP, MCSA, MCP, MCDST, MCITP, MCTS, A+, N+
  2. dmarsh

    dmarsh Petabyte Poster

    10k SAS gives better seek times and general performance than most SATA drives.

    Why do you need lots of space for a home lab?

    If you don't, what's wrong with an SSD?
     
  3. Mikeyboy

    Mikeyboy Kilobyte Poster

    I don't particularly need lots of space, but I currently have roughly 750 GB of space for VMs and don't really want to shrink that. I have about 20 VMs and this is likely to grow. Again, I probably don't need all of them, but I don't really want to scale back, and I can't imagine larger-capacity SSDs are too cheap!
     
  4. dmarsh

    dmarsh Petabyte Poster

    Test VMs can normally be around 15-20 GB, so you could get around 12 on a 256 GB SSD and get far better performance.

    Fast hard drives are likely to cost around £60 each, so two will cost £120; £150 buys a decent 256 GB SSD.

    RAID 5 is not cheap either. I wouldn't bother with it for home use of test VMs.
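    The capacity and cost figures above work out roughly as follows (a back-of-the-envelope sketch only, assuming 20 GB per test VM and the ballpark prices quoted in the post: £60 per fast 1 TB HDD, £150 per 256 GB SSD):

```python
# Back-of-the-envelope VM capacity and cost-per-GB comparison.
# All figures are assumptions taken from the discussion, not vendor quotes.

vm_size_gb = 20              # upper end of the 15-20 GB test-VM estimate
ssd_gb, ssd_cost = 256, 150  # one decent 256 GB SSD
hdd_gb, hdd_cost = 1000, 60  # hypothetical fast 1 TB HDD

# A RAID 1 mirror of two HDDs exposes the capacity of one disk.
mirror_gb, mirror_cost = hdd_gb, 2 * hdd_cost

print(f"VMs on SSD:    {ssd_gb // vm_size_gb} (£{ssd_cost / ssd_gb:.2f}/GB)")
print(f"VMs on mirror: {mirror_gb // vm_size_gb} (£{mirror_cost / mirror_gb:.2f}/GB)")
```

    The mirror is far cheaper per gigabyte, while the SSD wins on performance — which is exactly the trade-off being debated here.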
     
    Last edited: Dec 28, 2012
  5. Mikeyboy

    Mikeyboy Kilobyte Poster

    Hmm, well, ignore the RAID 5 aspect for now. I already have 4 x 1 TB disks in a RAID 5, and the new hardware would come with a RAID card, so I'd be able to set up RAID 5 at no extra cost.

    I don't particularly need the extra space on the datastores; it's just nice to have it there to play with. I should mention I'll be using this hardware for my home media streaming / file storage stuff as well, hence the RAID 5 data storage area.

    I want a RAID 1 for my datastores. Test environment or not, I want some redundancy! So I'm really looking at two disks of undecided size. Currently I use 2 x 7200 rpm 2.5" drives as my datastore (using StarWind iSCSI) and performance is OK. I'm just wondering whether the same sort of setup using the VSA approach would give decent performance, or am I better off getting some lower-capacity 10k SAS drives, or even 7200 rpm SAS drives? Or just suck it and see :)
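    The usable capacity of the arrays mentioned above can be sketched quickly (assuming the existing 4 x 1 TB RAID 5 and a proposed 2 x 1 TB RAID 1 mirror for the datastore):

```python
# Usable-capacity sketch for the arrays discussed above.

def raid5_usable_tb(n_disks: int, disk_tb: float) -> float:
    # RAID 5 sacrifices one disk's worth of space to parity.
    return (n_disks - 1) * disk_tb

def raid1_usable_tb(disk_tb: float) -> float:
    # A two-disk mirror exposes the capacity of a single disk.
    return disk_tb

print(raid5_usable_tb(4, 1.0))  # existing 4 x 1 TB RAID 5 for media/file storage
print(raid1_usable_tb(1.0))     # proposed 2 x 1 TB datastore mirror
```

    So the RAID 5 set gives 3 TB usable, and the mirror gives 1 TB usable for half the raw capacity — the price of the redundancy wanted here.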
     
  6. Shinigami

    Shinigami Megabyte Poster

    Like dmarsh said, SSDs have their advantages, even if they're more expensive for the amount of space you get (I'm happy to see prices dropping at a swift pace). I can add my little bit of experience from having maintained a small lab (just over ten VMs) on my company-issued laptop.

    The laptop has 16 GB of RAM, just enough to run a DC and several other unified communications servers with roles spread out (Exchange, Lync and the like). Installing or patching servers was a pain in the posterior: the single laptop disk (a 500 GB 7.2k rpm drive, in case you were wondering, and probably crippled in some manner, as laptop HDDs are, to account for vibrations and so on) would grind to a halt if I tried to work on more than two installs/patches, never mind the usual 'work' you do on a lab.

    From time to time I do reinstalls, and a few months back I upgraded the disk to a 500 GB SSD. That's still enough space to hold 16+ VMs for 'lab' needs (average disk space in dynamic mode with Hyper-V being 22 GB or so if I give them 1-1.5 GB of RAM). And as I'm on holiday, I'm rebuilding my lab to try some new configs.

    So out with the old, in with a new 'clean' install. What used to take me on average 2 hours to prepare 2 machines (OS install with Service Pack and full patches) has now dropped considerably. While it still takes some time to apply a Service Pack or perform updates, I'm able to work on 8 machines in parallel, meaning that after 2 hours I've already installed 6 machines (in full, SP and updates), I'm about to begin the updates on another 2 machines, and I'm halfway through an SP install on 2 other boxes.

    So it 'seems' like I'm working about 4 times faster with the SSD, and 500 GB is enough for my 'current' needs. That time saved is worth the cash I spent, IMO. It sometimes took several evenings to patch my lab; now I can see it taking half an evening, tops. Previously, just 'patching' an environment could take a week when all the time you had in the world was an hour or two every evening, in between dinner and time spent with partner/friends. Not very constructive when 'patch Tuesdays' take place once every 4 weeks on average...

    Hey, we IT guys need time to relax and play as well :)

    Just my two cents. And of course, taking regular backups of your environment is a must when relying on a single disk.
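    That 'about 4 times faster' impression roughly checks out against the figures above (a sketch only — the partial-progress weights for the half-finished machines are my own assumptions, not numbers from the post):

```python
# Rough HDD-vs-SSD build-throughput comparison from the figures above.

hours = 2.0
hdd_machines = 2.0  # 2 machines fully built in 2 h on the old 7.2k laptop disk

# SSD run after the same 2 h: 6 fully built, 2 about to start updates
# (assumed ~50% done), 2 halfway through a Service Pack (assumed ~25% done).
ssd_machines = 6 + 0.5 * 2 + 0.25 * 2

speedup = (ssd_machines / hours) / (hdd_machines / hours)
print(f"estimated speedup: ~{speedup:.1f}x")
```

    With those assumed weights the estimate lands just under 4x, in line with the felt improvement.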
     
    Last edited: Dec 28, 2012
    Certifications: MCSE, MCITP, MCDST, MOS, CIW, CompTIA
    WIP: Win7/Lync2010/MCM
