Information technology (IT) is concerned with the use of technology to manage and process information. Its main fields are the acquisition, processing, storage, and dissemination of vocal, pictorial, textual, and numerical information by a microelectronics-based combination of computing and telecommunications.
The term in its modern sense first appeared in a 1958 article published in the Harvard Business Review, in which authors Leavitt and Whisler commented that "the new technology does not yet have a single established name. We shall call it information technology (IT)."
Some of the modern and emerging fields of information technology are next-generation web technologies, bioinformatics, cloud computing, global information systems, and large-scale knowledge bases. Advancements in these fields are driven mainly by developments in computer science.
Computer science or computing science (abbreviated CS) is the study of the theoretical foundations of information and computation, together with practical techniques for their implementation and application in computer systems. It is a field closely related to IT. Computer scientists invent algorithmic processes that create, describe, and transform information, and formulate suitable abstractions to design and model complex systems.
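A small sketch can make "algorithmic processes that transform information" concrete. The example below (an illustration chosen for this article, not something described in the source) uses run-length encoding: a simple, reversible transformation that rewrites a string as (character, run length) pairs. The function names and interface are illustrative choices.

```python
# Illustrative example: an algorithmic process that transforms information.
# Run-length encoding rewrites a string as (character, count) pairs, and
# decoding inverts the transformation, recovering the original string.

def run_length_encode(text: str) -> list[tuple[str, int]]:
    """Transform a string into a list of (character, run length) pairs."""
    encoded: list[tuple[str, int]] = []
    for ch in text:
        if encoded and encoded[-1][0] == ch:
            # Extend the current run of repeated characters.
            encoded[-1] = (ch, encoded[-1][1] + 1)
        else:
            # Start a new run.
            encoded.append((ch, 1))
    return encoded

def run_length_decode(pairs: list[tuple[str, int]]) -> str:
    """Invert the encoding, recovering the original string."""
    return "".join(ch * count for ch, count in pairs)
```

For example, `run_length_encode("aaabcc")` yields `[("a", 3), ("b", 1), ("c", 2)]`, and decoding that list returns `"aaabcc"`. The pair of functions also illustrates the idea of a suitable abstraction: the encoded form describes the same information in a different, sometimes more compact, representation.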
The general public sometimes confuses computer scientists with other computer professionals who have careers in information technology, or assumes that computer science relates to their own experience with computers, which typically involves activities such as gaming, web browsing, and word processing. The focus of computer science, however, is on understanding the properties of the programs used to implement software such as games and web browsers, and on using that understanding to create new programs or improve existing ones.