The timeline of digital photography technology starts in the 1950s, when digital images were first used to help professionals in their work in the fields of entertainment and science. The technology was refined for specialized use during the 60s and 70s, and the 80s marked the birth of consumer-grade digital photography. From then on up to the present, the technology has seen both minor and major improvements that make it easier for consumers to use while making ever more sophisticated applications possible.
At the start of the 50s, television signals began to be recorded on magnetic tape. This was done by converting the TV signals into electrical impulses, which were then saved to tape, and the concept was later refined to make use of digital information in photography. In 1957, Russell Kirsch built a device that could scan images and turn them into digital information that could be viewed on a computer. He used it to scan a photograph of his son, producing the first ever digital image.
The concept of producing digital copies of images took a leap forward in the 1960s, when NASA began gathering information about the moon by mapping its surface: the analogue signals captured were converted into digital signals. Spy satellites employed the same technology around the world. Before the decade ended, NASA also initiated the editing of digital photographs, using computers to enhance the quality of its images of the moon. The charge-coupled device, or CCD, was invented before the 60s ended as well. By detecting the intensity of the light falling on it, the device paved the way for digital images, including images in color.
The 1970s began with the concept of an electronic camera being discussed, and by the middle of the decade, 1975 to be exact, the first digital camera prototype was created using CCD technology. It was only a prototype, never intended for the mass market.
The inevitable finally happened in 1981, when Sony produced the first consumer-grade electronic camera, recording images as magnetic impulses on two-inch floppy disks. Other manufacturers, such as Kodak and Fuji, followed with digital cameras of their own. The race between these companies centered on storing images digitally and increasing the number of pixels captured.
Digital photography as we know it today was born in the early 1990s, when Kodak produced the very first DSLR camera. From this point on, images were stored entirely as digital information. Throughout the decade, storage options improved, as did the transfer of data between the camera and the personal computer. Companies steadily increased the number of pixels per image their cameras could capture, reaching at least 2 megapixels by the end of the decade.
Digital photography was initially designed to assist professionals, and in the 90s professional photographers got the benefits of faster data transfer and larger storage space for their photographs. In the 2000s, companies started designing their digital cameras with regular consumers in mind. The number of megapixels a camera could record in one image jumped early in the decade as cameras, like those of Canon, began recording images of at least 6 megapixels. By the middle of the decade, 2004 to be exact, Kodak stopped producing film cameras. A couple of years later, Nikon and Canon did the same.
The timeline of digital photography technology continues as we move beyond the 2000s, with storage space, megapixel counts and the ISO sensitivity of digital cameras continuously being improved.