What's the deal with Vista?

Discussion in 'Software' started by UCHEEKYMONKEY, Feb 25, 2007.

  1. UCHEEKYMONKEY
    Honorary Member

    UCHEEKYMONKEY R.I.P - gone but never forgotten. Gold Member

    12 months ago MS announced they would be releasing a new Windows platform designed for 64-bit processors only.

    Yet the new Vista platform is designed for both 32-bit and 64-bit processors? MS stated they wanted to move away from the 32-bit platform, and that the 64-bit platform would mean better security.

    Has anyone any feedback on this?

    Has Longhorn been shelved, and does the new Vista offer better security than WinXP Pro?
     
    Certifications: Comptia A+
    WIP: Comptia N+
  2. Mr.Cheeks

    Mr.Cheeks 1st ever Gold Member! Gold Member

    not sure of the exact differences between 32 and 64-bit, but M$ would lose a lot of business if they released Vista only on 64-bit machines...

    Longhorn I believe is the new M$ Server 2k7, and yes it will be a lot more secure than XP - even Vista is more secure than XP.
     
  3. UCHEEKYMONKEY
    Honorary Member

    UCHEEKYMONKEY R.I.P - gone but never forgotten. Gold Member

    Cheeks, have you got Vista?

    And if so, can you use Firefox with Vista?
     
    Certifications: Comptia A+
    WIP: Comptia N+
  4. Mr.Cheeks

    Mr.Cheeks 1st ever Gold Member! Gold Member

    nope, my comp doesn't meet the minimum specs...

    dunno - a quick browse of the FF website shows no mention of Vista on there... someone will give you a definite answer though
     
  5. Raffaz

    Raffaz Kebab Lover Gold Member

    Certifications: A+, MCP, MCDST, AutoCAD
    WIP: Rennovating my house
  6. zimbo
    Honorary Member

    zimbo Petabyte Poster

    Umm, the difference is that the 32 and 64 refer to how many bits wide the address bus is, so 64 bits gives *a lot more* addresses than 32... and don't forget 32-bit software runs on 64-bit platforms, but at 32-bit performance... once you go 64-bit, both hardware and software must run at 64-bit for you to truly get the "64-bit performance".
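    If anyone wants to see this on their own box, here's a rough Python sketch (nothing Vista-specific, just the standard library) that reports the native pointer width and how many byte addresses that width allows:

        import struct

        # Width of a native pointer in bits: 32 on a 32-bit build, 64 on a 64-bit build.
        pointer_bits = struct.calcsize("P") * 8

        # Number of distinct byte addresses that pointer width can express.
        addressable_bytes = 2 ** pointer_bits

        print(f"{pointer_bits}-bit pointers -> {addressable_bytes:,} addressable bytes")
        print(f"That is roughly {addressable_bytes // 2**30:,} GB of address space")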
     
    Certifications: B.Sc, MCDST & MCSA
    WIP: M.Sc - Computer Forensics
  7. Mr.Cheeks

    Mr.Cheeks 1st ever Gold Member! Gold Member

    DUH! Can't believe I never realised that!
     
  8. zimbo
    Honorary Member

    zimbo Petabyte Poster

    Err, don't be so hard on yourself... I only truly understood the practical side of it once I had to build a 64-bit machine for a client...
     
    Certifications: B.Sc, MCDST & MCSA
    WIP: M.Sc - Computer Forensics
  9. Mr.Cheeks

    Mr.Cheeks 1st ever Gold Member! Gold Member

  10. Crito

    Crito Banned

    Even lowly Celerons and Semprons are 64-bit nowadays; most people just don't realize it. From the hardware side there's really no reason to still be living in a 32-bit world. On the software side, the problem for Windows is binary third-party drivers: Microsoft has to either wait for companies to release 64-bit drivers or reverse engineer them all themselves. Because Linux is comprised primarily of open source software, the transition to 64-bit has been a lot easier there. Virtually everything that runs on a 32-bit Linux platform runs on a 64-bit one, the ONE major exception being Flash Player (which is closed-source).
    http://www.petitiononline.com/lin64swf/petition.html
     
    Certifications: A few
    WIP: none
  11. Mathematix

    Mathematix Megabyte Poster

    To the layperson, 64-bit simply means that they can stuff more memory into their hardware. Remember the 4GB main memory limit on 32-bit PCs? Calculate 2^32, which is 4,294,967,296 - that is, 4GB of unique addressable memory locations. With 64-bit we have 2^64 (18,446,744,073,709,551,616).

    Now, 2^40 bytes is equal to one terabyte (1,099,511,627,776 bytes), so:

    2^64 / 2^40 = 18,446,744,073,709,551,616 / 1,099,511,627,776 = 16,777,216 terabytes of address space!

    We are not going to see machines with that amount of main memory in our lifetimes, but it does show the potential for computing power in the future.
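
    If you want to check those figures yourself, here's a quick Python sketch of the same arithmetic (purely illustrative):

        TB = 2 ** 40             # one terabyte, in bytes (1,099,511,627,776)

        space_32 = 2 ** 32       # 32-bit address space: 4,294,967,296 bytes = 4 GB
        space_64 = 2 ** 64       # 64-bit address space: 18,446,744,073,709,551,616 bytes

        print(f"32-bit: {space_32:,} bytes ({space_32 // 2**30} GB)")
        print(f"64-bit: {space_64:,} bytes ({space_64 // TB:,} TB)")
        # -> 16,777,216 TB of addressable space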

    If we are not going to use such massive address spaces any time soon, why do we need 64-bit technologies? Ultimately it's to do with the limitations imposed by the speed of the electron and the thermodynamic properties of the silicon on which current chips are made. The idea is that if we are limited by how fast we can physically push data along a 32-bit path, then we must widen the path. As well as having multicore machines, widening the data path also permits much faster throughput, so a 3GHz 32-bit architecture, irrespective of being single or multicore, will be slower than a comparable 64-bit architecture. Note that the 64-bit one will not be twice as fast, but noticeably faster.

    That is it in a nutshell without moving onto software specifics at the low level.
     
    Certifications: BSc(Hons) Comp Sci, BCS Award of Merit
    WIP: Not doing certs. Computer geek.
  12. Mr.Cheeks

    Mr.Cheeks 1st ever Gold Member! Gold Member

    Cheers for that Mathsman... Looks like you put your computer science degree into use then!
     
  13. Crito

    Crito Banned

    If you could go back in time and ask a Commodore 64 (K) developer what he thought of 32-bit computing, I'm sure he'd tell you we'd all be long dead and buried before 4GB wouldn't be enough. So I wouldn't bet on it if I were you.

    Just when you think you've hit a wall someone comes out with 3D memory cubes:
    http://www.infoworld.com/article/04/05/26/HNreveo_1.html
    And it looks like photons will replace electrons as computing's subatomic particle of choice within our lifetime:
    http://www.trnmag.com/Stories/2003/040903/Fiber_loop_makes_quantum_memory_040903.html
     
    Certifications: A few
    WIP: none
  14. Mathematix

    Mathematix Megabyte Poster

    No probs, mate! Always trying to put it to good use. :biggrin

    Fair point, and with Moore's Law putting the doubling time for computing power at an average of every 1.5 years (and those doubling times seemingly getting shorter and shorter), it is easy to be optimistic about what we can hope for in the future. But we must remember one important fact: with each milestone achieved, the difficulty in achieving the next increases exponentially. The leap from 64KB to 4GB, although not at all easy, is still a lot easier than jumping from 4GB to 16.8 million terabytes!

    How many millions of years did it take us to get from walking to the first steam-powered cars of the 1900s? How long will it take for us to get from the car to the hoverboard as seen in Back to the Future? Not as long as from walking to the car; theories exist as to how it could be done, but there is still no actual hoverboard.

    I'm sure you get my drift. :)
     
    Certifications: BSc(Hons) Comp Sci, BCS Award of Merit
    WIP: Not doing certs. Computer geek.
  15. Crito

    Crito Banned

    When it comes to perceptions of speed and size it's all relative. One of the dumbest questions recruiters frequently ask me is "what's the largest database you've administered?" I want to say "a 500 meg database on a 386 at 25MHz, when HDD throughput was 500KB/sec and the biggest tape drive you could get was 80 megs". But I'm afraid they wouldn't get it.

    Anyway, it's been a geometric progression thus far. So, if we applied Moore's Law to memory (which it doesn't really cover, but I'm going to do it anyway, LOL) and there's a doubling of usage every 1.5 years (starting at 4GB, or 2^32), then it'll only take 48 years before we exhaust the 64-bit address space. :ohmy
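
    That 48-year figure is just a doubling count - here's a quick Python sanity check of the same back-of-the-envelope estimate:

        import math

        # Start at the 32-bit limit (2^32 bytes = 4 GB), double every 1.5 years,
        # and count how long until we pass the 64-bit limit (2^64 bytes).
        doublings = math.log2(2 ** 64 // 2 ** 32)   # 32 doublings from 2^32 to 2^64
        years = doublings * 1.5

        print(f"{doublings:.0f} doublings x 1.5 years = {years:.0f} years")
        # -> 32 doublings x 1.5 years = 48 years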

    Does seem like we've hit a bit of a brick wall with electrons though, but I'm confident we'll overcome that obstacle.
     
    Certifications: A few
    WIP: none
  16. Mathematix

    Mathematix Megabyte Poster

    The more advanced technology becomes, the more we will move away from Moore's Law. It's not only to do with advances in hardware manufacture, as it was when Moore made his predictions, but it is now more weighted towards the demands of more feature-rich, process-intensive software.

    The only thing allowing hardware to marginally keep up is the cost of development of software and rapidly increasing team sizes for current and 'nextgen' applications.
     
    Certifications: BSc(Hons) Comp Sci, BCS Award of Merit
    WIP: Not doing certs. Computer geek.
