Information technology (IT) refers to the use of computers, typically in an enterprise or other business environment, to store, retrieve, transmit and manipulate data or information. IT is considered a subset of information and communications technology (ICT). An information technology system (IT system) is generally an information system, a communications system or, more specifically, a computer system – including all hardware, software and peripheral equipment – operated by a limited group of users.
Humans have been storing, retrieving, manipulating and disseminating information since the Sumerians of Mesopotamia developed writing in about 3000 BC, but the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review. Authors Harold J. Leavitt and Thomas L. Whisler commented that “the new technology does not yet have a single established name. We shall call it information technology (IT).” Their definition consists of three categories: techniques for processing, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs.
The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information-distribution technologies such as television and telephones. Several products and services within an economy are associated with information technology, including computer hardware, software, electronics, semiconductors, the Internet, telecommunications equipment and e-commerce.
Based on the storage and processing technologies employed, it is possible to distinguish four distinct phases of IT development: pre-mechanical (3000 BC – 1450 AD), mechanical (1450–1840), electromechanical (1840–1940) and electronic (1940–present). This article focuses on the most recent period (electronic), which began in about 1940.
History of Computer Technology:
Devices have been used to aid computation for thousands of years, probably initially in the form of a tally stick. The Antikythera mechanism, dating from about the first century BC, is generally considered the earliest known mechanical analog computer and the earliest known geared mechanism. Comparable geared devices did not emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four basic arithmetical operations was developed.
Electronic computers, using either relays or valves, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world’s first programmable computer and, by modern standards, one of the first machines that could be considered a complete computing machine. Colossus, developed during the Second World War to decrypt German messages, was the first electronic digital computer. Although it was programmable, it was not general-purpose: it was designed to perform only a single task. It also lacked the ability to store its program in memory; programming was carried out using plugs and switches to alter the internal wiring. The first recognizably modern electronic digital stored-program computer was the Manchester Baby, which ran its first program on 21 June 1948.
The development of the transistor at Bell Laboratories in the late 1940s allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4,050 valves and consumed 25 kilowatts of power. By comparison, the first transistorized computer, developed at the University of Manchester and operational by November 1953, consumed only 150 watts in its final version.
Electronic Data Processing:
Early electronic computers such as Colossus made use of punched tape, a long strip of paper on which data was represented by a series of holes, a technology now obsolete. Electronic data storage, which is used in modern computers, dates from World War II, when a form of delay-line memory was developed to remove the clutter from radar signals; the first practical application of this was the mercury delay line. The first random-access digital storage device was the Williams tube, based on a standard cathode ray tube, but the information stored in it and in delay-line memory was volatile: it had to be continuously refreshed, and was therefore lost once power was removed. The earliest form of non-volatile computer storage was the magnetic drum, invented in 1932 and used in the Ferranti Mark 1, the world’s first commercially available general-purpose electronic computer.
IBM introduced the first hard disk drive in 1956, as a component of its 305 RAMAC computer system. Today, most digital data is still stored magnetically on hard disks, or optically on media such as CD-ROMs. Until 2002, most information was stored on analog devices, but that year digital storage capacity exceeded analog for the first time. As of 2007, almost 94% of the data stored worldwide was held digitally: 52% on hard disks, 28% on optical devices and 11% on digital magnetic tape. It has been estimated that the worldwide capacity to store information on electronic devices grew from less than 3 exabytes in 1986 to 295 exabytes in 2007, doubling roughly every three years.
Database management systems (DBMS) emerged in the 1960s to address the problem of storing and retrieving large amounts of data accurately and quickly. An early such system was IBM’s Information Management System (IMS), which is still widely deployed more than 50 years later. IMS stores data hierarchically, but in the 1970s Ted Codd proposed an alternative relational storage model based on set theory and predicate logic and the familiar concepts of tables, rows and columns. In 1981, the first commercially available relational database management system (RDBMS) was released by Oracle.
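The relational model described above can be illustrated with a short sketch using Python’s built-in sqlite3 module. The table and column names here are invented for illustration; the point is only that a relation is a table, each row is a record, and each column is a named attribute, as in Codd’s model.

```python
import sqlite3

# Minimal sketch of the relational model using an in-memory SQLite database.
# Table and column names ("employees", "name", "dept") are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A relation is a table; each row is a tuple of values, each column an attribute.
cur.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
cur.executemany(
    "INSERT INTO employees (id, name, dept) VALUES (?, ?, ?)",
    [(1, "Ada", "Research"), (2, "Grace", "Systems")],
)
conn.commit()

# Unlike a hierarchical store such as IMS, data is addressed by value,
# not by navigating a fixed parent-child path.
rows = cur.execute("SELECT name FROM employees WHERE dept = 'Research'").fetchall()
print(rows)  # [('Ada',)]
```

Note that the schema (the CREATE TABLE statement) is declared separately from the data itself, a property shared by all databases.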
All DBMS consist of components that allow the data they store to be accessed simultaneously by many users while maintaining its integrity. All databases share one trait: the structure of the data they contain is defined and stored separately from the data itself, in a database schema.
In recent years, the Extensible Markup Language (XML) has become a popular format for data representation. Although XML data can be stored in normal file systems, it is commonly held in relational databases to take advantage of their “robust implementation verified by years of both theoretical and practical effort.” As an evolution of the Standard Generalized Markup Language (SGML), XML’s text-based structure offers the advantage of being both machine- and human-readable.
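The dual machine- and human-readability of XML can be seen in a small sketch using Python’s standard xml.etree library; the element names below are invented for illustration. A person can read the raw text directly, while a program can navigate the same structure by element paths.

```python
import xml.etree.ElementTree as ET

# A tiny XML document: readable as plain text by a human.
# Element names ("catalog", "book", "title") are illustrative only.
doc = "<catalog><book id='1'><title>Relational Theory</title></book></catalog>"

# The same text is machine-readable: it parses into a navigable tree.
root = ET.fromstring(doc)
title = root.find("./book/title").text
print(title)  # Relational Theory
```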
The relational database model introduced a programming-language-independent Structured Query Language (SQL), based on relational algebra.
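The correspondence between SQL and relational algebra can be sketched as follows, again using sqlite3 with invented table names: the SELECT list performs projection, the WHERE clause performs selection, and JOIN combines relations.

```python
import sqlite3

# Sketch of how SQL clauses map onto relational-algebra operations.
# Table and column names ("dept", "emp", etc.) are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dept (dept_id INTEGER PRIMARY KEY, dept_name TEXT);
    CREATE TABLE emp  (emp_id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER);
    INSERT INTO dept VALUES (1, 'Research'), (2, 'Systems');
    INSERT INTO emp  VALUES (10, 'Ada', 1), (11, 'Grace', 2);
""")

# SELECT list = projection; WHERE = selection; JOIN = join of two relations.
result = cur.execute("""
    SELECT e.name, d.dept_name
    FROM emp AS e JOIN dept AS d ON e.dept_id = d.dept_id
    WHERE d.dept_name = 'Research'
""").fetchall()
print(result)  # [('Ada', 'Research')]
```

Because these operations are defined on the data model rather than on any host language, the same query can be issued unchanged from Python, Java, C or any other language with a SQL driver.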
The terms “data” and “information” are not synonymous. Anything stored is data, but it only becomes information when it is organized and presented meaningfully. Most of the world’s digital data is unstructured, and is stored in a variety of different physical formats even within a single organization. Data warehouses began to be developed in the 1980s to integrate these disparate stores. They typically contain data extracted from various sources, including external sources such as the Internet, organized in such a way as to facilitate decision support systems (DSS).