Saturday, August 14, 2010

COMPUTER COMPONENTS

Central Processing Unit (CPU) or the processor
             -is the portion of a computer system that carries out the instructions of a computer program, and is the primary element carrying out the computer's functions.
              -refers to the microprocessor chip.
3 Parts of CPU
1. Control Unit - acts as the overall manager.
2. Arithmetic Logic Unit (ALU) - performs mathematical and logical operations.
3. Main Memory -is the primary memory
          RAM(Random Access Memory)
               -memory used by the computer to run programs.
               -a temporary storage memory.
         ROM(Read Only Memory)
               -no matter how many times I turn off the computer, the software stored in it will not be lost.

Motherboard -this is where the core components of your computer reside, which are listed below. The support cards for video, sound, networking, and more are also mounted on this board.
Microprocessor -this is the brain of the computer. It performs commands and instructions and controls the operation of the computer.
Power Supply-comes with the case, but this component is mentioned separately since there are various types of power supplies. (250-500 watts)
Audio Card-is the voice of your system.
                   -is a computer expansion card that facilitates the input and output of audio signals to and from a computer under control of computer programs.
Floppy Drive-is a small disk storage device that today typically holds about 1.44 megabytes of data.
Hard Drive-is a non-volatile storage device for digital data.
Network Card-used to connect to high-speed internet access (e.g., cable or DSL lines).
Video Card-provides the visual image to the monitor.
                   -its additional video memory is used in computer games to provide extra visual effects.
Monitor - This device, which operates like a TV set, lets the user see how the computer is responding to their commands.
Keyboard - This is where the user enters text commands into the computer.
Mouse - A point and click interface for entering commands which works well in graphical environments.
            -is an electronic device that controls the coordinates of the cursor on your computer screen.

Printer -used to print; it prints whatever is on the monitor onto paper.
Video Camera-a webcam, a video capture device connected to a computer.
Uninterruptible Power Supply (UPS)-essentially a battery backup in case of power failure. (280 VA-400 VA)

Friday, August 13, 2010

Computer Concept

Discussion on the film viewing:

INPUT DEVICES
Input devices on a basic computer are used to input data:
- mouse                                                                          
- keyboard                                                                            
- microphone                                                                
- network                                                                      
OUTPUT DEVICES
-monitor
-printer
-speaker
-network
STORAGE - storage devices also act as input and output devices.
                     - they allow us to store data, update data, and keep the instructions for how to actually process the data; in other words, a computer program.

 -flash drive 
 -hard disc  
 -optical disc  
 -network disc
     A basic computer system can be represented by the devices listed above, and this concludes the presentation of the computer concept.


    The server offers a lot of users access to the same information.

    For example:
                    If we have 3 or more workstations, we need a server so that those workstations can access the data; all of the workstations are connected to the server to get information.

     Another is the NIC: all of the workstations and the server are connected to the network. The NIC (Network Interface Card) is a card that you install in the computer for either a wired internet or wireless connection; it is used to connect to the network.


    The computer is the invention that changed the world. There are so many high-tech machines invented, like robots and androids that can act like a human; they can do what people do. The computer is very helpful to our lives; we can also use it to communicate with our relatives and friends in other countries through chatting.

    British young mathematician Alan Turing - a man who accomplished his successes without outside motivation to do so.


    Babbage's Concept - (1850) to create a machine that could change the world.
               Charles Babbage (the father of the computer) - a British mathematics professor and computer scientist who originated the idea of a programmable computer. He invented the Difference Engine, and it actually worked.


    Universal Machine - a machine that could solve mathematical problems.

    Thursday, August 12, 2010

    NUMBER SYSTEM

    A number system is the set of symbols used to express quantities as the basis for counting, determining order, comparing amounts, performing calculations, and representing value. It is the set of characters and mathematical rules that are used to represent a number.
     A number system is a way of counting things. It's a way of identifying the quantity of something.

    1. Decimal has a base of 10, its symbols are 0,1,...,9, and it is used by humans.
    2. Binary has a base of 2, its symbols are 0 and 1, and it is used by computers.
    3. Octal has a base of 8 and its symbols are 0,1,...,7.
    4. Hexadecimal has a base of 16 and its symbols are 0,1,...,9 and A,B,...,F.
    Decimal, binary, octal, and hexadecimal can all be converted to one another.

    • To convert binary to octal, the rule is to group the binary digits into sets of 3 (starting from the right); each group becomes one octal digit.
    • To convert binary to hexadecimal, the rule is to group the binary digits into sets of 4 (starting from the right); each group becomes one hex digit. A small sketch of both rules is shown below.
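
    A minimal Python sketch of both grouping rules (the function names are just for illustration):

    def binary_to_octal(bits):
        # Pad on the left so the length is a multiple of 3, then map each 3-bit group to one octal digit.
        bits = bits.zfill((len(bits) + 2) // 3 * 3)
        return "".join(str(int(bits[i:i + 3], 2)) for i in range(0, len(bits), 3))

    def binary_to_hex(bits):
        # Pad on the left so the length is a multiple of 4, then map each 4-bit group to one hex digit.
        bits = bits.zfill((len(bits) + 3) // 4 * 4)
        return "".join("0123456789ABCDEF"[int(bits[i:i + 4], 2)] for i in range(0, len(bits), 4))

    print(binary_to_octal("11010110"))  # 11 010 110 -> 326
    print(binary_to_hex("11010110"))    # 1101 0110 -> D6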

    FLOW CHART

    Flow Chart Defined

                  A flow chart is a graphical or symbolic representation of a process. Each step in the process is represented by a different symbol and contains a short description of the process step. The flow chart symbols are linked together with arrows showing the process flow direction. 

    3 Alternate Definitions of Flow Chart

    As-Is Flowchart

    The first cool thing about flow charts is that they let you see the process flow at a glance, so my first alternate definition of "Flow Chart" is a Snap Shot of your Business Processes. This is commonly called an As-Is Flowchart. You can tell a lot about the complexity (and often over-complexity) of many business processes just by looking at an as-is flow chart of them - without even reading the text in the symbols. You can easily see the flow of information and materials, branches in the process, opportunities for infinite loops, the number of process steps, inter-departmental operations, and more.

    Process Zoom Lens

    The second cool thing about flow charts is that they let you see the process flow at different levels, so my second alternate definition of "Flow Chart" is a Zoom Lens for your Business Processes. Flow charts are often categorized in 3 levels: high-level (aka, 30,000 ft. level), mid-level and low-level (detailed). A high-level flow chart could be a process defined at the company-wide or large system level. A mid-level flow chart could be a process defined at the department level, and a low-level flow chart could be a process defined at the working level.
    Some flow chart tools (including Microsoft Excel) allow you to add hyperlinks to flow chart symbols. The hyperlinks let you click on a flow chart symbol, drilling down from a high-level process step to a detailed set of process flow steps. This truly gives you the zoom lens capability.

    Process Test Bed

     The third cool thing about flow charts is that they let you perform risk-free experiments, so with that in mind my third and final alternate definition of "Flow Chart" is a Process Test Bed. All process improvements require change, and most changes involve risk, require work, cost money, or instill some level of emotional uncertainty and fear. You can mitigate each of these by creating process flow charts of any proposed business operation changes. Each flow chart can be a "what-if" that helps the involved players more easily see the risks involved. Personally, I do before and after flow charts on all significant process changes.

     

     

    A Note on Flowchart Symbols

    Different flow chart symbols have different meanings. The most common flow chart symbols are:
    • Terminator: An oval flow chart shape indicating the start or end of the process.
    • Process: A rectangular flow chart shape indicating a normal process flow step.
    • Decision: A diamond flow chart shape indicating a branch in the process flow.
    • Connector: A small, labeled, circular flow chart shape used to indicate a jump in the process flow.
    • Data: A parallelogram that indicates data input or output (I/O) for a process.
    • Document: used to indicate a document or report (see image in sample flow chart below).
    [A complete list of flow chart symbols can be found in the Flowchart Symbols Defined article.]

    A really simplistic flow chart showing the flow chart symbols described above can be seen below:
    [Image: simple FlowBreeze flow chart example]
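
    Since the image itself is not reproduced here, below is a rough stand-in: a short Python sketch that draws a similarly simple chart using the third-party graphviz package (the node names and labels are made up for illustration, and the Graphviz tool must be installed for render to work):

    from graphviz import Digraph

    chart = Digraph("simple_flow")
    chart.node("start", "Start", shape="oval")              # Terminator
    chart.node("input", "Get data", shape="parallelogram")  # Data (I/O)
    chart.node("work", "Process data", shape="box")         # Process
    chart.node("check", "Data OK?", shape="diamond")        # Decision
    chart.node("end", "End", shape="oval")                  # Terminator
    chart.edge("start", "input")
    chart.edge("input", "work")
    chart.edge("work", "check")
    chart.edge("check", "end", label="Yes")                 # one branch of the decision
    chart.edge("check", "input", label="No")                # the other branch loops back
    chart.format = "png"
    chart.render("simple_flow")  # writes simple_flow.png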

    Wednesday, August 11, 2010

    HARDWARE AND SOFTWARE

    HARDWARE
              
        Hardware starts functioning once software is loaded.
            
    Definition:Devices required to store and execute (or run) the software.
    Function:  Hardware serves as the delivery system for software solutions. The hardware of a computer is infrequently changed, in comparison with software and data, which are “soft” in the sense that they are readily created, modified, or erased on the computer.
    Examples: CD-ROM, monitor, printer, video card, scanners, label makers, routers, and modems.
    Types: Motherboard, CPU, RAM, BIOS, power supply, video display controller, computer bus, CD-ROM drive, floppy disk, zip drive.





    SOFTWARE

         To deliver its set of instructions, software is installed on hardware.

    Definition:Collection of instructions that enables a user to interact with the computer. Software is a program that enables a computer to perform a specific task, as opposed to the physical components of the system (hardware).
                     :The second essential part of the computer.
    Function:   To perform the specific task you need to complete.
    Examples:QuickBooks, Adobe Acrobat, Winoms-Cs, Internet Explorer, Microsoft Word, Microsoft Excel
    Types:System software, Programming software, and Application software.

    Tuesday, August 10, 2010

    Activity 2

    LIST THE 5 MOST COMMON TYPES OF COMPUTER SYSTEMS.

    Personal computer: A small, single-user computer based on a microprocessor.






      








    Workstation: A powerful, single-user computer. A workstation is like a personal computer, but it has a more powerful microprocessor and, in general, a higher-quality monitor.







    Minicomputer: A multi-user computer capable of supporting up to hundreds of users simultaneously.   









         
         Mainframe: A powerful multi-user computer capable of supporting many hundreds or thousands of users simultaneously.


          

         


         Supercomputer: An extremely fast computer that can perform hundreds of millions of instructions per second.





        IDENTIFY TWO UNIQUE FEATURES OF SUPERCOMPUTERS                 





        1.  Massively Parallel Processing (MPP) chains together thousands of commercially available microprocessors utilizing parallel processing techniques.
         
                   



        2.    Beowulf cluster, or cluster computing, employs large numbers of personal computers interconnected by a local area network and running programs written for parallel processing (a rough single-machine sketch of the idea follows below).
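
        A loose illustration of the parallel-processing idea behind both features, using Python's multiprocessing module on a single machine rather than a real MPP system or cluster (the chunking scheme and function names are just an example): the job is split into pieces, several workers compute the pieces at the same time, and the partial results are combined.

        from multiprocessing import Pool

        def partial_sum(chunk):
            # Each worker plays the role of one processor/node handling one slice of the job.
            return sum(x * x for x in chunk)

        if __name__ == "__main__":
            data = list(range(1_000_000))
            chunks = [data[i::4] for i in range(4)]  # divide the work four ways
            with Pool(processes=4) as pool:
                # Run the four slices in parallel, then combine the partial results.
                print(sum(pool.map(partial_sum, chunks)))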

                      





        DESCRIBE A TYPICAL USE FOR MAINFRAME COMPUTERS
        -A mainframe is usually used for critical applications and bulk data processing (for example, census processing and large company payrolls).



        DIFFERENTIATE WORKSTATIONS FROM PERSONAL COMPUTER
                   
        A workstation is a powerful, single-user computer; it is like a personal computer, but it has a more powerful microprocessor and, in general, a higher-quality monitor, while a personal computer is a small, single-user computer based on a microprocessor.


        IDENTIFY THE FOUR TYPES OF PERSONAL COMPUTER







        1.Desktop computers are used in many places such as home, school and business.







        2.Laptops are the second most common type of personal computer.





         




        3.Notebooks use a variety of techniques, known as flat-panel technologies, to produce a lightweight and non-bulky display screen.









        4.Palmtop computers are also types of personal computers and are so small that they can fit into the pocket of your coat. These computers have slower speed and less working capacity than the previous types of computers. Palmtop computers are mostly used by small businessmen and salesmen.

        Computer History

        The History of the Computer - First PC's and the Future Computer Timeline

        As time marches on, it becomes more and more difficult to recall our world before the invention of the computer. For those who tire over answering an infinite number of daily emails, imagining this world may seem like a pleasant dream. But for the rest of modern society, this is probably not something we'd like to imagine. Whether you are a technophile, or someone who simply requires a computer for your day-to-day activities, you've probably wondered at least once who is responsible for the modern computer.

        The History of the Computer



        First, a necessary digression: Some will argue that the first computer was invented 5000 years ago when the Sumerians developed the abacus. But for those of you who can’t remember World History 101, the abacus was a man-made wooden calculating tool that allowed the user to formulate and keep track of easy math problems. The simple fact is that a distinction has now been drawn between these calculators and modern computers.


        The First Electronic Computer




        So, let's move forward to the commonly accepted definition of the modern digital computer. A German engineer by the name of Konrad Zuse developed the first freely programmable computer. Zuse’s computer required three basic elements: a control, a memory, and a calculator for the arithmetic he needed to process. Zuse continued to build upon his work over the years by developing the first algorithmic programming language, and in 1941 he completed the first fully functioning electro-mechanical computer. Following this progress, Zuse was unable to convince the Nazi government to support his work for a computer based on electronic valves because creating such a machine would have cost hundreds of thousands of dollars. The Nazis thought they were close to winning the war and felt no need to support further research.

        Remarkably, the 1960’s were a decade known for a lot more than the invention of the microchip. However, this invention is arguably one of the most important in the history of modern man. At the time, Jack Kilby and Robert Noyce (founder of Intel) were not partners, they actually didn’t even know each other, but as fate would have it they both invented almost identical integrated circuits (a.k.a. microchip) at nearly the same time. To put it simply, these integrated circuits allowed computers to run more with fewer parts. Most notably, it was the microchip that enabled man to fly into space and to land on the moon. Regardless of running more with less, in the days of vacuum tubes and the early microchip, a computer with less than a megabyte of memory would fill up 1/4 of a soccer field, and cost millions of dollars to produce. And it wasn’t until the 1970’s when the microchip allowed a computer to fit on the top of a desk instead of filling the entire house.



        Another important year for the computer was 1962. This was the year known for the Cuban Missile Crisis, but also the year the first computer game was invented. “Spacewar” was invented by a team of geeks from MIT led by a young computer programmer by the name of Steve Russell. It took the team 200 hours to write the first version of Spacewar. What was significant about the game was that the operating system was the first to allow multiple users to share one computer simultaneously. But it wasn’t until the 1970’s that the computer moved outside of the expensive university and into the living room.


        The First Personal Computer





        For the controversial price of $666.66 a piece, Steve Wozniak and Steve Jobs built by hand the first 50 personal computers in 1976 (answers.com). It has now been over thirty years since that introduction of the first personal computers to the world market in 1977. And since their introduction --The Commodore PET, Apple II, and TRS-80-- the world has been forever changed. This is perhaps most notable in that, during the first three decades of availability of consumer computers, they have dramatically changed the way billions of people now conduct their daily lives. Additionally, we have witnessed the growth of what is now a global, multi-billion dollar industry. The rate at which the technology has grown and the increased availability of products over the years has had a dramatic impact on the affordability and subsequent distribution of personal computers at the consumer level.



        Initially, one of these computers, along with a printer and programs, would cost the consumer in the range of $2,000-$3,000. That might not sound like much, but when adjusted for inflation in 2008, that would be like spending $7,000 to $10,000 for a computer with four to 16 KB of RAM. Considering that the average family income in 1977 was between $13,000 and $16,000, this was obviously not a regular household item. Therefore, if you were to purchase one of these machines in the year it was released, you would be spending more for a computer than the most popular new car at that time, the Ford Pinto, which sold for just under $2,000.

        Apple Makes Personal Computers Affordable




        Skip ahead a couple of years. It’s 1984, and millions of Americans are watching the Raiders pummel the Redskins in Super Bowl XVIII, when, during a commercial break in the third quarter, a one-minute ad is aired for Apple’s new personal computer. This Orwell-inspired advertisement helps Apple bolster sales of its $2500 Macintosh Computer to 50,000 units sold within the first two months on the market. This feat had never before been accomplished in the personal computing industry, and marked a turning point in the market for such devices. Regardless of the turning point in the market for Apple, the average computer cost was still too high for the average consumer. And personal computers were only in 7.9% of American households, of which the majority were households that made over $50,000. Adjusted for inflation in 2008, that would be households that made over $105,000 a year.

        The First PC





        Within a matter of a decade, the percentage of households that owned personal computers would more than quadruple to 36.6%, thanks in large part to one computer company - Dell. Michael Dell began his company, while still in college in the mid-eighties, and by 1997 Dell, Inc. had become the largest seller of PCs and successfully shipped its 10 millionth system. Dell sought to build its business model around the practice of individually assembling each personal computer. Not only did this model set Dell apart from the competition, but so did the company’s consumer-oriented focus which allowed for customers to customize their computers during the ordering process. By the mid-nineties, competition within the industry had driven prices to a more affordable $1000 to $2000, and, as a result, more and more people from diverse backgrounds were able to purchase personal computers.

        Fast-forward to today, where technology has become affordable enough for 62% of the U.S. population to own computers. Therefore, within a twenty-year time frame, the availability of personal computers has increased to the point that nearly 190 million people in the U.S. now reap the technological benefits of computing. That means around 170 million people who couldn’t afford a personal computer twenty years ago now have the means to purchase one on their own. Anyone can look through a catalog in the Sunday newspaper and find PCs that are selling for as little as $300. Adjust that for the rate of inflation and that would be like spending $85 for a PC in 1977. Do the numbers and that comes out to 95% less than the going rate when personal computing first emerged on the market, with exponentially greater computing power on top of that.

        The Future Timeline of the Computer




        Why is this important? Everyone knows that technology is more expensive when it first comes out. When we compare the availability and cost of personal computers in 1977 to 2008, we can begin to see how much cheaper technology is available today. And when we begin to understand this, revolutionary ideas like the $100 laptop come about.

        Taking that even one step further, a team of MIT students has set out to make a very simple computer that would retail for just $12. It is loosely based on the old Apple II, with Nintendo-like controls to perform basic functions. It’s these ideas that begin to shape our world and make it a better place. This effort is to make technology available for all people, from all backgrounds, around the world. Because of the advances in the affordability of technology, it is now becoming a reality, even in the Third World. These revolutionary innovators, who consistently push the envelope of what is thought to be possible, will continue to transform the way we go about our daily lives, and open new realms of opportunity across the globe.

        History of Computers


        A Look Back at Computing

        Computers have become one of the most important parts of modern society. Nearly everything that is modern requires or uses computer-related technology in some way. But how did computers as we know them come to exist? Did someone sitting in his lab just one day say, "Aha! I've got it! The computer!"? Well, no, that is not how this happened. Rather, many years of brilliant ideas and research from many different individuals contributed to modern computing. The field is constantly evolving at a pace unlike anything before it as techniques are polished and new breakthroughs are made.

        The Early days (1,000 B.C. to 1940)

        Ancient Civilizations
            Computers are named so because they make mathematical computations at fast speeds. As a result, the history of computing goes back at least 3,000 years, to when ancient civilizations were making great strides in arithmetic and mathematics. The Greeks, Egyptians, Babylonians, Indians, Chinese, and Persians were all interested in logic and numerical computation. The Greeks focused on geometry and rationality [1], the Egyptians on simple addition and subtraction [2], the Babylonians on multiplication and division [3], the Indians on the base-10 decimal numbering system and concept of zero [4], the Chinese on trigonometry, and the Persians on algorithmic problem solving. [5] These developments carried over into the more modern centuries, fueling advancements in areas like astronomy, chemistry, and medicine.
        Pascal, Leibnitz, and Jacquard
            During the first half of the 17th century there were very important advancements in the automation and simplification of arithmetic computation. John Napier invented logarithms to simplify difficult mathematical computations. [6] The slide rule was introduced in the year 1622 [7], and Blaise Pascal spent most of his life in the 1600's working on a calculator called the Pascaline. [9] The Pascaline was mostly finished by 1672 and was able to do addition and subtraction by way of mechanical cogs and gears. [8] In 1674 the German mathematician Gottfried Leibnitz created a mechanical calculator called the Leibnitz Wheel. [10] This 'wheel' could perform addition, subtraction, multiplication, and division, albeit not very well in all instances.
            Neither the Pascaline nor the Leibnitz Wheel can be categorized as a computer because they did not have memory where information could be stored and because they were not programmable. [5] The first device that did satisfy these requirements was a loom developed in 1801 by Joseph Jacquard. [11] Jacquard built his loom to automate the process of weaving rugs and clothing. It did this using punched cards that told the machine what pattern to weave. Where there was a hole in the card the machine would weave, and where there was no hole the machine would not weave. Jacquard's idea of punched cards was later used by computer companies like IBM to program software.
        Babbage
           Charles Babbage was a mathematics professor at Cambridge University who was interested in automated computation. In 1823 he introduced the Difference Engine, the largest and most sophisticated mechanical calculator of his time. Along with addition, subtraction, multiplication, and division to 6 digits, the Difference Engine could also solve polynomial equations. [12] It was never actually completed because the British Government cut off funding for the project in 1842. [15] After this Babbage began to draw up plans for an Analytical Machine, a general-purpose programmable computing machine. [13] Many people consider this to be the first true computer system even though it only ever existed on paper. The Analytical Machine had all the same basic parts that modern computer systems have. [5] While designing the Analytical Machine, Babbage noticed that he could perfect his Difference Engine by using 8,000 parts rather than 25,000 and could solve up to 20 digits instead of just 6. He drew schematics for a Difference Engine No. 2 between 1847 and 1849.
            After twelve years spent trying to get his Difference Engine No. 2 built, Babbage had to give up. The British Government was not interested in funding the machine, and the technology to build the gears, cogs, and levers for the machine did not exist in that time period. Babbage's plans for the Difference Engine and Difference Engine No. 2 were hidden away after his death, and finally resurfaced around 150 years after they'd each been conceived. In 1991 a team of engineers at the Science Museum in London completed the calculating section of Babbage's Difference Engine. [14] In 2002 the same museum created a full-fledged model of the Difference Engine No. 2 that weighs 5 tons and has 8,000 parts. [16] Miraculously, it worked just as Babbage had envisioned. A duplicate of this engine was built and was sent to the Computer History Museum in Mountain View, CA to be demonstrated and displayed until May 2009.
        Hollerith
            In America during the late 1800's there were many immigrants pouring in from all over the world. Officials at the U.S. Census Bureau estimated that it would take ten to twelve years to do the 1890 census. By the time they finished it would be 1900, and they'd have to do the census all over again! The problem was that all of the calculations for the census were performed manually. To solve their problems the U.S. Census Bureau held a competition that called for proposals outlining a better way to do the census. [17] The winner of the competition was Herman Hollerith, a statistician, who proposed that the use of automation machines would greatly reduce the time needed to do the census. He then designed and built programmable card processing machines that would read, tally, and sort data entered on punch cards. The census data was coded onto cards using a keypunch. Then these cards were taken to a tabulator (counting and tallying) or sorter (ordering alphabetically or numerically). [18]
            Hollerith's machines were not all-purpose computers but they were a step in that direction. They successfully completed the census in just 2 years. The 1880 census had taken 8 years to complete and the population was 30% smaller then, which meant that automated processing was definitely more efficient for large scale operations. [5] Hollerith saw the potential in his tabulating and sorting machines, so he left the U.S. Census Bureau to found the Tabulating Machine Company. His punch-card machines became national bestsellers, and in 1924 Hollerith's company changed its name to IBM after a series of mergers with other similar companies. [19] The computer age was about to begin.


        Birth of Computers (1940-1950)

        WWII   
            World War II brought concerns about how to calculate the logistics of such a large scale battle. The United States needed to calculate ballistics, deploy massive amounts of troops, and crack secret codes. The military started a number of research projects to try and build computers that could help with these tasks and more. In 1931 the U.S. Navy and IBM began working together to build a general-purpose computer called the Mark 1. It was the first computer to use the base-2 binary system, was programmable, and made of vacuum tubes, relays, magnets, and gears. The Mark 1 was completed in 1944. [20] The Mark 1 had a memory for 72 numbers and could perform 23-digit multiplication in 4 seconds. [5] It was operational for 15 years and performed many calculations for the U.S. Navy during WWII.
            The Mark 1 was still a mix of electronic and mechanical. At the same time as the Mark 1, however, there was another project taking place. During WWII the United States Army was building new artillery that required firing tables. These firing tables were created by way of intense mathematical calculation that took a very long time to manually compute. To help make this process quicker, the Army started a project in 1943 to build a completely electronic computing device. [21] J. Presper Eckert and John Mauchly headed the project and eventually created the Electronic Numerical Integrator and Calculator (ENIAC), which was completed in 1946. The ENIAC had 18,000 vacuum tubes and was absolutely gigantic: 100 feet long, 10 feet high, and 30 tons. It was about a thousand times faster than the Mark 1 at multiplying numbers and 300 times faster at addition. [22]
            Another computer designed during WWII was the Colossus, built by British codebreakers at Bletchley Park, where Alan Turing worked. This computer cracked encrypted German codes, helping the Allies win the war against the Nazis. Germany itself was designing a computer much like the ENIAC, code-named the Z1. The Z1 project, headed by Konrad Zuse, was never completed. [23]
        Von Neumann
            Though the computers developed in the Second World War were definitely computers, they were not the kind of computers we are used to in modern times. John Von Neumann helped work on the ENIAC and figured out how to make computers even better. The ENIAC was programmed externally with wires, connectors, and plugs. Von Neumann wanted to make programming something that was internalized. Instead of rerouting wires and plugs, a person could write a different sequence of instructions that changes the way a computer runs. Von Neumann created the idea of the stored computer program, which is still implemented today in computers that use the 'Von Neumann Architecture'. [24]
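
        A tiny Python sketch of the stored-program idea (a toy machine, not any real historical design): instructions and data sit in the same memory, and a fetch-execute loop runs whatever program is loaded, so changing the memory contents changes the machine's behavior with no rewiring.

        def run(memory):
            pc = 0   # program counter: where the next instruction lives in memory
            acc = 0  # accumulator: the machine's single working register
            while True:
                op, arg = memory[pc]  # fetch the instruction stored at address pc
                pc += 1
                if op == "LOAD":
                    acc = memory[arg]      # read a data cell into the accumulator
                elif op == "ADD":
                    acc += memory[arg]
                elif op == "STORE":
                    memory[arg] = acc      # write the accumulator back to memory
                elif op == "HALT":
                    return memory

        # Cells 0-3 hold the program, cells 4-6 hold data; one shared memory.
        memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
        print(run(memory)[6])  # prints 5; loading a different program changes the behavior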


        First Generation (1950 - 1957)

        The first computer to implement Von Neumann's idea was the EDVAC in 1951, developed in a project led by Von Neumann himself. At the same time a computer using stored programs was developed in England, called the EDSAC. [25] The EDVAC was commercialized and called the UNIVAC 1. It was sold to the U.S. Bureau of the Census in March, 1951. This was actually the first computer ever built for sale. [26] The UNIVAC 1 made a famous appearance on CBS in November, 1952 during the presidential election. [27] The television network had rented the computer to boost ratings, planning to have the computer predict who would win the election. The UNIVAC predicted very early on that Eisenhower would beat Stevenson, which was correct. Network executives were skeptical and did not go live with the prediction until they had arrived at the same conclusion using manual methods. The UNIVAC sat right behind CBS staff during the broadcast, and it was the first time that many people had the chance to see this elusive new technology called the computer.
            IBM's first production computer was the IBM 701 Defense Calculator, introduced in April, 1952. [28] The IBM 701 was used mostly for scientific calculation. The EDVAC, EDSAC, UNIVAC 1, and IBM 701 were all large, expensive, slow, and unreliable pieces of technology, like all computers of this time. [29] Some other computers of this time worth mentioning are the Whirlwind, developed at the Massachusetts Institute of Technology, and JOHNNIAC, by the Rand Corporation. The Whirlwind was the first computer to display real-time video and use core memory. [33] The JOHNNIAC was named in honor of John Von Neumann. Computers at this time were usually kept in special locations like government and university research labs or military compounds. Only specially trained personnel were granted access to these computers. Because they used vacuum tubes to calculate and store information, these computers were also very hard to maintain. First generation computers also used punched cards to store symbolic programming languages. [5] Most people were indirectly affected by this first generation of computing machines and knew little of their existence.


        Second Generation (1957 - 1965)

        The second generation of computing took place between 1957 and 1965. Computers were now implementing transistors, which had been invented in 1947 by a group of researchers at Bell Laboratories, instead of vacuum tubes. [30] Because of the transistor and advances in electrical engineering, computers were now smaller, faster, more reliable, and cheaper than ever before. More universities, businesses, and government agencies could actually afford computers now.
            In 1957 the first FORTRAN compiler was released. FORTRAN was the first high-level programming language ever made. [31] It was developed by IBM for scientific and engineering use. In 1959, the COmmon Business-Oriented Language (COBOL) programming language was released. Where FORTRAN was designed for science and engineering, COBOL was designed to serve business environments with their finances and administrative tasks. [32] These two programming languages essentially helped to create the occupation of a programmer. Before these languages, programming computers required electrical engineering knowledge.
            This generation of computers also had an increase in the use of core memory and disks for mass storage. A notable computer to mention from this time period is the IBM System/360, a mainframe computer that is considered one of the important milestones in the industry. It was actually a family of computer models that could be sold to a wide variety of businesses and institutions. [37]


        Third Generation (1965 - 1975)

            The third generation of computing spanned from 1965 to 1975. During this time integrated circuits with transistors, resistors, and capacitors were etched onto a piece of silicon. This reduced the price and size of computers, adding to a general trend in the computer industry of miniaturization. In 1960 the Digital Equipment Corporation introduced the Programmed Data Processor-1 (PDP-1), which can be called the first minicomputer due to its relatively small size. [34] It is classified as a third generation computer because of the way it was built, even though it was made before 1965. The PDP-1 was also the computer that ran the very first video game, called Spacewar (written in 1962). [35]
            The software industry came into existence in the mid 1970's as companies formed to write programs that would satisfy the increasing number of computer users. Computers were being used everywhere in business, government, military, and education environments. Because of their target market, the first software companies mostly offered accounting and statistical programs. [5] This time period also had the first set of computing standards created for compatibility between systems.
            E-mail originated sometime between 1961 and 1966, allowing computer users to send messages to each other as long as they were connected through a network. [38] This is closely tied to the work that was being done on Advanced Research Projects Agency Network (ARPANET), networking technology and innovation that would one day bring the internet. [50]


        Fourth Generation (1975 - 1985)

            The fourth generation of computing spanned from 1975 to 1985. Computer technology had advanced so rapidly that computers could fit in something the size of a typewriter. These were called microcomputers, the first one being the Altair 8800. The Altair 8800 debuted in 1975 as a mail-order hobby kit. Many people acknowledge the Altair 8800 as the computer that sparked the modern computer revolution, especially since Bill Gates and Paul Allen founded Microsoft with a programming language called Altair BASIC, made specifically for the 8800. [36] Now that computers could fit on desks they became much more common.
            A small company called Apple Computer, Inc. was established in 1976 and single-handedly changed the industry forever. Steve Wozniak and Steve Jobs began to sell their Apple 1 computer that same year, and it quickly gained popularity. It came with a keyboard and only required a monitor to be plugged into the back of the system, which was a novel idea for computers at that time. The Apple II was released the next year and was the first mass-produced microcomputer to be commercially sold, and also ushered in the era of personal computing.
            In 1981, Microsoft Disk Operating System (MS-DOS) was released to run on the Intel 8086 microprocessor. [39] Over the next few years MS-DOS became the most popular operating system in the world, eventually leading to Microsoft Windows 1.0 being released in 1985. [40] In 1984 Apple introduced their Mac OS, which was the first operating system to be completely graphical. Both Mac OS and Windows used pull-down menus, icons, and windows to make computing more user-friendly. Computers were now being controlled with a mouse as well as a keyboard. The mouse itself had been developed by Douglas Engelbart in the 1960s and was first shipped commercially by Xerox in 1981. [41]
            Software became much more common and diverse during this period with the development of spreadsheets, databases, and drawing programs. Computer networks and e-mail became much more prevalent as well.
            The first truly portable computer, called the Osborne 1, was released in 1981. [37] Portable computers like the TRS-80 Model 100 / 102 and  IBM 5155 followed afterward. [38]
            Not all the computers of the time were small, of course. Supercomputers were still being built with the aim of being as fast as possible. These supercomputers were sold to companies, universities, and the military. An example of one such supercomputer is the Cray-1, which was released in 1976 by Cray Research. [39] It became one of the best known and most successful supercomputers ever for its unique design and fast speed of 250 MFLOPS.
            This generation was also important for the development of embedded systems. These are special systems, usually very tiny, that have computers inside to control their operation. [42] These embedded systems were put into things like cars, thermostats, microwave ovens, wristwatches, and more.


        Fifth Generation (1985 - Present)

            The changes that have occurred since 1985 are plentiful. Computers have gotten tinier, more reliable, and many times faster. Computers are mostly built using components from many different corporations. For this reason, it is easier to focus on specific component advancements. Intel and AMD are the main computer processor companies in the world today and are constant rivals. [42] There are many different personal computer companies that usually sell their hardware with a Microsoft Windows operating system preinstalled. Apple has a wide line of hardware and software as well. [45] Computer graphics have gotten very powerful and are able to display full three dimensional graphics at high resolution. [41] Nvidia and ATI are two companies in constant battle with one another to be the computer graphics hardware king.
            The software industry has grown a lot as well, offering all kinds of programs for almost anything you can think of. Microsoft Windows still dominates the operating system scene. In 1995 Microsoft released Windows 95, an operating system that catapulted them to a new level of dominance. [46] In 1999 Apple revamped its operating system with the release of Mac OS X. [47] In 1991 Linus Torvalds wrote the Linux kernel that has since spawned countless open source operating systems and open source software. [44]
            Computers have become more and more online-oriented in modern times, especially with the development of the World Wide Web. Popular companies like Google and Yahoo! were started because of the internet. [43]
            In 2008 the IBM Roadrunner was introduced as the fastest computer in the world at 1.026 PFLOPS. [40] Fast supercomputers aid in the production of movie special effects and the making of computer animated movies. [48][49]