An Introduction to Information Technology
- 15th March 2016
- Posted by: Stacie Jansen van Vuren
- Category: Technology
The most commonly accepted definition of Information Technology is the use of computers and software to manage information. In recent years the focus has shifted from single computers to computer networks, and the main credit for this must be given to the internet. The internet has become the number one choice for communication, whether via email, social media, VoIP calling or instant messaging services. However, to truly appreciate the advances in Information Technology we need to take a look at its history.
The history of Information Technology can be divided into four main ages:
The Premechanical Age: This is the earliest age of Information Technology and is defined as the period from 3000 BC to 1450 AD. This is considered to be the time when humans first began to communicate using simple language and basic drawings known as petroglyphs. Petroglyphs were generally carved into rock and these were the seed from which early alphabets grew. During this period number systems also began to emerge and with the creation of numbers came the need to process them. Thus the abacus was developed and this was the first information processor.
The Mechanical Age: The mechanical age is the time between 1450 AD and 1840 AD, and it was during this period that we begin to see the connections to our current information technology. Interest in information technology blossomed in this era and basic technologies were invented to satisfy the human thirst for calculations. It was also during this time that Charles Babbage designed the first automated computing machine, and Ada Lovelace's work on it is considered to be the first example of computer programming. The machine was unfortunately never completed; however, it was the beginning of the journey that led to modern-day computers.
The Electromechanical Age: This age is when information technology began to take the form that we know today. Between 1840 and 1940 telecommunication began to emerge. Samuel Morse and Alfred Vail developed the electric telegraph and Morse code in the late 1830s, the telephone was brought to us by Alexander Graham Bell in 1876, and in 1895 Guglielmo Marconi demonstrated the first practical radio transmissions. All of these inventions led to massive advances in the field of information technology. To demonstrate this, it must be mentioned that in 1944 Harvard University unveiled the Mark I, the first large-scale automatic digital computer in the United States, which was programmed using punched paper tape. This creation led to the exploration of smaller versions that could be used in businesses and, eventually, homes.
The Electronic Age: This is the information technology age that we are fortunate enough to live in today, beginning around 1940 and spanning everything from the first fully electronic computers to the networked devices we use now.
Now that computers form the main resource for information technology, it is important to ensure that we have qualified professionals who are able to manage this infrastructure. Doing so requires the appropriate training and certifications that enable Information Technology Professionals to perform their roles effectively and efficiently.
The Value of Information Technology Certification
As with any other career path, gaining the relevant certifications will enhance your Information Technology career and increase your employability and earning potential. Employers prefer to hire certified candidates as this poses less of a risk to the organisation. Certified Information Technology Professionals reduce the need for on-the-job training, which saves the company money. Their ability to perform their role effectively and efficiently has been validated, which reassures the employer that the candidate will be ready to hit the ground running. Certifications are beneficial to both the IT professional and the organisation that they work for. There is a wide range of IT certifications available, and one of the most sought after when beginning the IT certification process is the CompTIA A+ qualification.
IT Careers Start with CompTIA A+
Once you have decided to gain an IT certification, the next question is: which one should you choose as your first IT certification? We recommend that you begin with the CompTIA A+ qualification, as this is the best starting point on the journey to IT certification.
CompTIA A+ is an IT certification awarded by CompTIA once a student has studied the relevant training material and passed the associated CompTIA A+ certification exam. The CompTIA A+ training course teaches the knowledge and skills required to install and maintain operating systems, hardware, mobile devices, laptops, printers and basic networking technologies. Upon passing the CompTIA A+ certification exam, a student will be able to configure, upgrade and maintain Windows operating systems, computer workstations and small to medium networks. As you can see, the CompTIA A+ qualification provides everything you need in an IT certification to become a productive IT Professional. If you are already working in an IT environment, the CompTIA A+ IT certification will validate your experience and prove to employers that you are capable and knowledgeable.
CompTIA A+ Exam Description
Now that you have completed your CompTIA A+ course and practiced with the sample exams until you feel confident in the material, it is time to take the CompTIA A+ certification exam. For this IT certification you will need to pass an exam that is broken down into two parts:
- CompTIA A+ 220-801
- CompTIA A+ 220-802
The CompTIA A+ 220-801 IT certification exam, the first of the two parts, will test you on the fundamentals such as hardware installation and configuration, mobile devices, networking, and operational procedures including safety measures and the handling of prohibited content. The second part, the CompTIA A+ 220-802 exam, will test your ability to install and configure computer and mobile device operating systems, along with common functions in the email, security and networking aspects of information technology.
Each part of the CompTIA A+ certification exam consists of a maximum of 90 questions, comprising both multiple-choice and performance-based questions, and you will be given 90 minutes per part. To pass, you must achieve a minimum score of 675 on a scale of 900 on the first part (CompTIA A+ 220-801) and 700 on a scale of 900 on the second part (CompTIA A+ 220-802). Upon passing both parts of the CompTIA A+ exam, you will earn your CompTIA A+ certification, which is internationally recognised. The CompTIA A+ qualification will set you well on your way to becoming a respected Information Technology Professional.