Information technology


Information technology (or IT) is a term that encompasses all forms of technology used to create, store, exchange, and use information in its various forms (business data, voice conversations, still images, motion pictures, photos, multimedia presentations, and other forms, including those not yet conceived).[1]

The modern term “Information Technology” was coined by Leavitt and Whisler. Their 1958 Harvard Business Review article noted, “The new technology does not yet have a single established name. We shall call it information technology.”[2]

Information technology covers every stage of information flow, such as data collection, processing, storage, search, transmission, and reception. In the information society it is one of the most essential industries, and it underpins many emerging ones; its importance across industry as a whole will only grow. The development of the internet made it possible for the world to be connected, and IT continues to create promising occupations and to drive science and technology forward.[3]

The term is now also used to refer to other aspects of technology, and it covers many more fields of study than it did in the past.

Four basic periods, each characterized by a principal technology used to solve the input, processing, output, and communication problems of the time:

  1. Premechanical,
  2. Mechanical,
  3. Electromechanical, and
  4. Electronic

A. The Premechanical Age: 3000 B.C. – 1450 A.D.

  1. Writing and Alphabets—communication.
    1. First humans communicated only through speaking and picture drawings.
    2. 3000 B.C., the Sumerians in Mesopotamia (what is today southern Iraq) devised cuneiform.
    3. Around 2000 B.C., Phoenicians created symbols.
    4. The Greeks later adopted the Phoenician alphabet and added vowels; the Romans gave the letters Latin names to create the alphabet we use today.
  2. Paper and Pens—input technologies.
    1. Sumerians' input technology was a stylus that could scratch marks in wet clay.
    2. About 2600 B.C., the Egyptians wrote on the papyrus plant.
    3. Around 100 A.D., the Chinese made paper from rags, on which modern-day papermaking is based.
  3. Books and Libraries: Permanent Storage Devices.
    1. Religious leaders in Mesopotamia kept the earliest "books"
    2. The Egyptians kept scrolls
    3. Around 600 B.C., the Greeks began to fold sheets of papyrus vertically into leaves and bind them together.
  4. The First Numbering Systems.
    1. Egyptian system:
      • The numbers 1-9 as vertical lines, the number 10 as a U or circle, the number 100 as a coiled rope, and the number 1,000 as a lotus blossom.
    2. The first place-value numbering systems similar to those in use today were invented in India between 100 and 200 A.D., using a nine-digit numbering system.
    3. Around 875 A.D., the concept of zero was developed.
  5. The First Calculators: The Abacus.
    • One of the very first information processors.
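The contrast between these early numeral systems can be sketched in a few lines of code (an illustration of ours, not part of the original outline; the symbol names are ASCII stand-ins for the Egyptian signs described above):

```python
# Egyptian numerals were additive: symbols are simply repeated and summed.
# The Hindu-Arabic system is positional: a digit's value depends on its place,
# which is why the zero mentioned above was such an important invention.

# Hypothetical ASCII stand-ins for the Egyptian symbols named above.
EGYPTIAN_SYMBOLS = [
    (1000, "lotus"),   # lotus blossom
    (100, "rope"),     # coiled rope
    (10, "U"),         # U or circle
    (1, "|"),          # vertical line
]

def to_egyptian(n):
    """Write n additively: repeat each symbol as many times as needed."""
    parts = []
    for value, symbol in EGYPTIAN_SYMBOLS:
        count, n = divmod(n, value)
        parts.extend([symbol] * count)
    return " ".join(parts)

def positional_value(digits, base=10):
    """Interpret a digit list positionally: each place is worth base times the next."""
    n = 0
    for d in digits:
        n = n * base + d
    return n

print(to_egyptian(2304))               # lotus lotus rope rope rope | | | |
print(positional_value([2, 3, 0, 4]))  # 2304 (the zero holds the empty tens place)
```

Note that the additive form needs a new symbol for every power of ten, while the positional form reuses the same nine digits plus zero at any scale.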

B. The Mechanical Age: 1450 – 1840

  1. The First Information Explosion.
    1. Johann Gutenberg (Mainz, Germany)
      • Invented the movable metal-type printing process in 1450.
    2. The development of book indexes and the widespread use of page numbers.
  2. The first general purpose "computers"
    • Actually people who held the job title "computer": one who works with numbers.
  3. Slide Rules, the Pascaline and Leibniz's Machine.
    • In the early 1600s, William Oughtred, an English clergyman, invented the slide rule.

C. The Electromechanical Age: 1840 – 1940.


The discovery of ways to harness electricity was the key advance made during this period. Knowledge and information could now be converted into electrical impulses.

  1. The Beginnings of Telecommunication.
    1. Voltaic Battery.
      • Late 18th century.
    2. Telegraph.
      • Early 1800s.
    3. Morse Code.
    4. Telephone and Radio.
      • Alexander Graham Bell invented the telephone in 1876.
      • This was followed by the discovery that electrical waves travel through space and can produce an effect far from the point at which they originated.
      • These two events led to the invention of the radio.
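Morse code's key idea, encoding letters as short and long pulses that a telegraph line can carry, can be sketched as follows (an illustration of ours with an abbreviated code table, not from the source):

```python
# A minimal Morse encoder. Each letter becomes a pattern of dots (short pulses)
# and dashes (long pulses); this table covers only a few letters for illustration.
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "S": "...", "O": "---", "T": "-",
}

def to_morse(text):
    """Encode a message letter by letter; ' / ' separates words."""
    words = text.upper().split()
    return " / ".join(" ".join(MORSE[ch] for ch in word) for word in words)

print(to_morse("SOS"))  # ... --- ...
```

Because the alphabet is reduced to just two pulse lengths plus pauses, any medium that can switch a signal on and off, wire, light, or radio, can carry text.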

2. Electromechanical Computing

  1. Herman Hollerith and IBM.
    • Herman Hollerith (1860–1929) began developing punched-card tabulating machines after working on the 1880 U.S. census; his company later merged into what became IBM.
  2. Mark I
    • Howard Aiken, a Ph.D. student at Harvard University
    • Built the Mark I
    • Completed January 1942
    • 8 feet tall, 51 feet long, 2 feet thick, weighed 5 tons, used about 750,000 parts

D. The Electronic Age: 1940 – Present.

  1. First Tries.
    • Early 1940s
    • Electronic vacuum tubes.
  2. Eckert and Mauchly.
    1. ENIAC used a fixed (wired-in) program, not a stored program.

3. The First Stored-Program Computer(s)

  • Early 1940s, Mauchly and Eckert began to design the EDVAC, the Electronic Discrete Variable Automatic Computer.
    • John von Neumann's influential report in June 1945:
      • "The Report on the EDVAC"
    • British scientists used this report and outpaced the Americans.
      • Max Newman headed up the effort at Manchester University
        • Where the Small-Scale Experimental Machine (the "Manchester Baby", precursor of the Manchester Mark I) ran its first program in June 1948, becoming the first stored-program computer.
      • Maurice Wilkes, a British scientist at Cambridge University, completed the EDSAC (Electronic Delay Storage Automatic Calculator) in 1949—two years before EDVAC was finished.
        • Thus, EDSAC became the first stored-program computer in general use (i.e., not a prototype).
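The stored-program idea that distinguishes these machines from ENIAC can be shown with a toy interpreter (a sketch of ours, not the historical EDSAC or EDVAC design): instructions and data share one memory, so loading different memory contents runs a different program with no rewiring.

```python
# A toy stored-program machine: code and data live in the same memory,
# and a single accumulator register (ACC) holds intermediate results.
def run(memory):
    """Execute (opcode, operand) instructions stored in memory until HALT."""
    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]   # fetch the instruction FROM memory
        pc += 1
        if op == "LOAD":
            acc = memory[arg]      # read a data cell
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc      # write a data cell
        elif op == "HALT":
            return memory

# One memory holds both: cells 0-3 are the program, cells 4-6 are its data.
memory = [
    ("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
    2, 3, 0,
]
run(memory)
print(memory[6])  # 5
```

Swapping in a different list of instructions changes what the machine computes, which is exactly what made stored-program computers general-purpose.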
  4. The First General-Purpose Computer for Commercial Use: Universal Automatic Computer (UNIVAC).
  • Late 1940s, Eckert and Mauchly began the development of a computer called UNIVAC (Universal Automatic Computer)
  • Remington Rand.
  • First UNIVAC delivered to Census Bureau in 1951.
  • But, a machine called LEO (Lyons Electronic Office) went into action a few months before UNIVAC and became the world's first commercial computer.

E. The Four Generations of Digital Computing.

The First Generation (1951–1958).

  1. Vacuum tubes as their main logic elements.
  2. Punch cards to input and externally store data.
  3. Rotating magnetic drums for internal storage of data and programs

The Second Generation (1959–1963).

  1. Vacuum tubes replaced by transistors as main logic element.
    • AT&T's Bell Laboratories, in the 1940s
    • Crystalline mineral materials called semiconductors could be used in the design of a device called a transistor
  2. Magnetic tape and disks began to replace punched cards as external storage devices.
  3. Magnetic cores (very small donut-shaped magnets that could be polarized in one of two directions to represent data) strung on wire within the computer became the primary internal storage technology.
    • High-level programming languages
      • e.g., FORTRAN and COBOL

The Third Generation (1964–1979).

Individual transistors were replaced by integrated circuits.

  • Magnetic tape and disks completely replaced punch cards as external storage devices.
  • Magnetic core internal memories began to give way to a new form, metal oxide semiconductor (MOS) memory, which, like integrated circuits, used silicon-backed chips.
  • Operating systems
  • Advanced programming languages like BASIC developed.
The Fourth Generation (1979–Present).

  1. Large-scale and very large-scale integrated circuits (LSIs and VLSICs)
  2. Microprocessors that contained memory, logic, and control circuits (an entire CPU = Central Processing Unit) on a single chip.
    • Which allowed for home-use personal computers or PCs, like the Apple (II and Mac) and IBM PC.
      • Apple II released to public in 1977, by Steve Wozniak and Steve Jobs.
        • Initially sold for $1,195 (without a monitor); had 16k RAM.
      • First Apple Mac released in 1984.
      • IBM PC introduced in 1981.
        • Debuts with MS-DOS (Microsoft Disk Operating System)
    • Fourth generation language software products
      • e.g., VisiCalc, Lotus 1-2-3, dBase, Microsoft Word, and many others.
      • Graphical User Interfaces (GUI) for PCs arrive in early 1980s
        • Apple's GUI (on the first Mac) debuts in 1984.
        • Microsoft Windows debuts in 1985.[5]
          • Windows would not take off until version 3 was released in 1990.

Field of Study

  • Bachelor of Information Technology (abbreviated BIT, BInfTech, B.Tech(IT), or BE(IT)) is an undergraduate academic degree that generally requires three to five years of study.[6] While the degree has a major focus on computers and technology, it differs from a Computer Science degree in that students are also expected to study management and information science, and there are reduced requirements for mathematics. Some graduates also pursue a two-year MBA in IT[7] to attain managerial roles and advance their careers. A degree in computer science can be expected to concentrate on the scientific aspects of computing, while a degree in information technology can be expected to concentrate on the business and communication applications of computing. There is more emphasis on these two areas in electronic commerce, e-business, and business information technology undergraduate courses. Specific names for the degrees vary across countries, and even between universities within a country.

This is in contrast to a Bachelor of Science in Information Technology which is a bachelor's degree typically conferred after a period of three to four years of an undergraduate course of study in Information Technology (IT).[8] The degree itself is a Bachelor of Science with institutions conferring degrees in the fields of information technology and related fields.

Many employers require software developers or programmers to have a Bachelor of Science in Computer Science degree; however, those seeking to hire for positions such as network administrators or database managers would require a Bachelor of Science in Information Technology or an equivalent degree.[9] Graduates with an information technology background are able to perform technology tasks relating to the processing, storing, and communication of information between computers, mobile phones, and other electronic devices. Information technology as a field emphasizes the secure management of large amounts of diverse information and its accessibility via a wide variety of systems both local and world-wide.[10]


References

  1. Rouse, Margaret. "IT (information technology)." September 2005. http://searchdatacenter.techtarget.com/definition/IT
  2. Leavitt, Harold J.; Whisler, Thomas L. (1958-11-01). "Management in the 1980's". Harvard Business Review. ISSN 0017-8012. Retrieved 2022-10-14.
  3. "2022 Technology Industry Outlook". Deloitte United States. Retrieved 2022-10-14.
  4. Butler, Jeremy G. "A History of Information Technology and Systems." Summer 1997. http://www.tcf.ua.edu/AZ/ITHistoryOutline.htm Archived 2012-08-05 at the Wayback Machine
  5. "From Windows 1 to Windows 10: 29 years of Windows evolution". the Guardian. 2014-10-02. Retrieved 2022-06-20.
  6. "Study a Bachelor of Information Technology". www.jcu.edu.au. Retrieved 2024-07-13.
  7. "List of MBA Courses : Specialisations & Jobs". Learning Routes.
  8. "BSc IT (Information Technology) - Course Details, Syllabus, Subjects, Top Colleges, Scope". Shiksha.com.
  9. School of Computing Homepage. Cis.usouthal.edu. Retrieved on 2013-10-05.
  10. "Network and Computer Systems Administrators". Occupational Outlook Handbook. United States Bureau of Labor Statistics. 2012-03-29. Retrieved 2013-12-01.
