Advantages and problems of digital photography. About digital photography


Despite the abundance of photographers, many of them self-taught, few can tell the history of photography in any detail. That is what we will do today. After reading the article, you will learn what a camera obscura is, what material became the basis for the first photograph, and how instant photography appeared.

Where did it all begin?

People have known about the chemical properties of sunlight for a very long time. Even in ancient times, anyone could tell you that the sun's rays darken the skin, and people guessed at the effect of light on the taste of beer and the sparkle of precious stones. History holds more than a thousand years of observations of how certain objects behave under ultraviolet radiation (the type of radiation characteristic of the sun).

The first analogue of photography came into real use as early as the 10th century AD.

It took the form of the so-called camera obscura: a completely dark room with a small round hole in one wall that let light through. Thanks to that hole, a projection of the outside scene appeared on the opposite wall, which the artists of the time "finished off" by hand to obtain beautiful drawings.

The image on the wall was upside down, but that did not make it any less beautiful. The phenomenon was described by an Arab scientist from Basra named Alhazen. He spent a long time observing light rays, and he first noticed the camera obscura effect on the darkened white wall of his tent. The scientist used it to observe the darkening of the sun: even then it was understood that looking directly at the sun is very dangerous.

First photo: background and successful attempts.

The main prerequisite was Johann Heinrich Schulze's proof, in 1725, that it is light, and not heat, that causes silver salts to darken. He did it by accident: while trying to create a luminous substance, he mixed chalk with nitric acid containing a small amount of dissolved silver, and noticed that the white solution darkened under sunlight.

This prompted the scientist to another experiment: he tried to obtain images of letters and numbers by cutting them out of paper and applying them to the illuminated side of the vessel. He got the images, but it did not even occur to him to try to preserve them. Building on Schulze's work, the scientist Grotthuss found that the absorption and emission of light is affected by temperature.

Later, in 1822, the world's first image more or less familiar to the modern eye was obtained. It was made by Joseph Nicéphore Niépce, but that frame was not properly preserved. Because of this he continued working with great zeal, and in 1826 he obtained a full-fledged frame called "View from the Window". It went down in history as the first true photograph, although it was still far from the quality we are used to today.

The use of metals is a significant simplification of the process.

A few years later, in 1839, another Frenchman, Louis-Jacques Daguerre, introduced a new material for taking photographs: copper plates coated with silver. The plate was then exposed to iodine vapor, which created a layer of light-sensitive silver iodide. That layer was the key to the future photograph.

After this treatment, the plate was exposed for 30 minutes in a sunlit room. It was then taken into a dark room and treated with mercury vapor, and the frame was fixed with table salt. Daguerre is therefore considered the creator of the first more or less high-quality photograph. His method, although still out of reach for "mere mortals", was already much simpler than the first.

Color photography is a breakthrough of its time.

Many people think that color photography appeared only with the creation of film cameras. This is not true at all. The first color photograph is generally dated to 1861, when James Clerk Maxwell obtained the image later called the "Tartan Ribbon". It was created using the method of three-color photography, also known as the color-separation method.

To obtain this frame, three cameras were used, each fitted with a filter for one of the primary colors: red, green and blue. The result was three images that were combined into one, but the process could hardly be called simple or fast. To simplify it, intensive research into photosensitive materials was carried out.

The first step towards simplification was the identification of sensitizers. They were discovered by Hermann Vogel, a scientist from Germany. After some time he managed to obtain a layer sensitive to the green part of the spectrum. Later his student Adolf Miethe created sensitizers sensitive to the three primary colors: red, green and blue. He demonstrated his discovery in 1902 at a scientific conference in Berlin, together with the first color projector.

Sergei Prokudin-Gorsky, one of the first photochemists in Russia and a student of Miethe, developed a sensitizer more sensitive to the red-orange spectrum, which allowed him to surpass his teacher. He also managed to shorten the exposure time and make pictures easier to reproduce, creating all the preconditions for replicating photographs. Based on the inventions of these scientists, special photographic plates were created which, despite their shortcomings, were in high demand among ordinary consumers.

Instant photography is another step towards speeding up the process.

In general, 1923 is considered the year this type of photography appeared, when a patent was registered for an "instant camera". There was little use for such a device: the combination of a camera and a photo lab was extremely cumbersome and did not greatly reduce the time needed to get a frame. The real problem was understood a little later: it lay in the inconvenience of working from the finished negative.

It was in the 1930s that complex light-sensitive materials first appeared that made it possible to obtain a ready-made positive. Agfa was among the first to work on them, and Polaroid brought them to the mass market. The company's first cameras produced an instant print immediately after the shot was taken.

A little later, similar ideas were tried in the USSR. The "Moment" and "Photon" camera sets were created there, but they never became popular; the main reason was the absence of the special light-sensitive films needed to obtain a positive. Yet it was the principle behind these devices that became one of the most popular at the end of the 20th and the beginning of the 21st century, especially in Europe.

Digital photography is a leap forward in the development of the industry.

This type of photography truly originated quite recently, in 1981, and the Japanese can safely be considered its founders: Sony showed the first device in which a sensor (matrix) replaced the film. Everyone knows how a digital camera differs from a film camera, right? True, it could not be called a high-quality digital camera in the modern sense, but the first step was obvious.

Later, a similar concept was developed by many companies, but the first digital camera as we are used to seeing it was created by Kodak. Serial production began in 1990, and the camera almost immediately became hugely popular.

In 1991, Kodak, in collaboration with Nikon, released the professional digital SLR camera Kodak DCS 100, based on the Nikon F3 body. The device weighed 5 kilograms.

It is worth noting that with the advent of digital technology the range of uses for photography has become much wider. Modern cameras are generally divided into several categories: professional, amateur and mobile. By and large they differ from each other only in sensor size, optics and processing algorithms. Because the differences are so few, the line between amateur and mobile cameras is gradually blurring.

Application of photography

Back in the middle of the last century it was hard to imagine that clear images in newspapers and magazines would become a mandatory attribute. The boom in photography became especially pronounced with the advent of digital cameras. Yes, many will say that film cameras were better and more popular, but it was digital technology that freed the photographic industry from problems such as running out of film or accidentally exposing two frames on top of each other.

Moreover, modern photography is undergoing extremely interesting changes. If earlier, for example, you had to stand in a long queue to get a passport photo, be photographed and then wait a few more days for it to be printed, now it is enough to photograph yourself on a white background with your phone, following a few simple requirements, and print the picture on special paper.

Artistic photography has also come a long way. It used to be difficult to get a highly detailed frame of a mountain landscape, to crop out unnecessary elements or to do high-quality photo processing. Now even mobile photographers get great shots that can easily compete with pocket digital cameras. Of course, smartphones cannot compete with full-fledged cameras such as the Canon 5D, but that is a topic for a separate discussion.

Digital SLR for Beginners 2.0 - for Nikon fans.

My First SLR - for Canon fans.

So, dear reader, now you know a little more about the history of photography. I hope this material will be useful to you. If so, why not subscribe to the blog updates and tell your friends about it? Many more interesting materials are waiting for you that will help you become more literate in photography. Good luck, and thank you for your attention.

Sincerely yours, Timur Mustaev.

Digital photography entered everyday life gradually, step by step. NASA, the US space agency, began using digital signals in the 1960s along with the flights to the Moon (for example, to map the lunar surface): as is well known, analog signals can degrade during transmission, while digital data is much less error-prone. The first high-precision image processing was developed during this period, as the agency used the full power of computer technology to process and enhance space images. The Cold War, with its wide variety of spy satellites and secret imaging systems, also helped accelerate the development of digital photography.

The first filmless electronic camera was patented by Texas Instruments in 1972. The main disadvantage of that system was that photographs could only be viewed on a television. A similar approach was taken by Sony's Mavica, announced in August 1981 as the first commercial electronic camera. The Mavica could already be connected to a color printer. At the same time, it was not a real digital camera: it was more of a video camera that could take and display individual still pictures. The Mavica (Magnetic Video Camera) recorded up to fifty images on two-inch floppy disks using a CCD sensor of 570x490 pixels with a sensitivity corresponding to ISO 200. Three lenses were available: a 25 mm wide-angle, a 50 mm standard, and a 16-65 mm zoom. Today such a system may seem primitive, but do not forget that the Mavica was developed almost 25 years ago!

In 1992, Kodak announced the release of the first professional digital camera, the DCS 100, based on the Nikon F3. The DCS 100 was equipped with a 1.3-megapixel CCD image sensor and a portable hard drive that stored 156 captured images. The disk weighed about 5 kg, the camera itself cost $25,000, and the resulting images were only good enough for printing on newspaper pages. Such photographic equipment therefore made sense only when getting the image quickly was more important than its quality.

The prospects for digital photography became clearer with the introduction of two new digital cameras in 1994. Apple Computer released the Apple QuickTake 100, which had a strange sandwich-like shape and could capture 8 images at a resolution of 640x480 pixels. It was the first mass-market digital camera, sold at a price of $749. Its images were also of poor quality and could not be printed properly, and since the Internet was then at an early stage of development, the camera did not find wide application.

The second camera, released in the same year by Kodak together with the Associated Press news agency, was intended for photojournalists. Its NC2000 and NC200E models combined the look and handling of film cameras with the instant access to images and shooting convenience of digital ones. The NC2000 was widely adopted by many newsrooms, prompting the move from film to digital.

Since the mid-1990s, digital cameras have become more advanced, computers faster and cheaper, and software more capable. In their development, digital cameras have gone from an exotic kind of device dear only to their creators to universal, easy-to-use photographic equipment that is built even into ubiquitous cell phones and offers the same technical characteristics as recent models of full-size (35 mm) digital cameras. And in terms of image quality, such equipment now surpasses film cameras.

The changes that are constantly taking place in digital camera technology are remarkable.

The rapid development of the digital photography industry is evidenced by the growth in camera production, the reduction of film production by all manufacturers, and the departure of the pillars of the photo industry from the market or their complete transition to digital technology. The growth of the photo inkjet printer market also points to the growth of the digital still camera (DSC) market.

A digital photograph is a photograph taken with a digital still or video camera, or a photograph or slide taken with an ordinary camera and digitized with a scanner.

Digital camera

The camera is one of the most amazing human inventions. It preserves many moments of our lives forever.

The modern photographic industry began with Talbot's discovery 160 years ago. Now a new photographic era has begun: the era of digital photography.

A digital camera differs from a conventional one in that, instead of film, it contains a photosensitive sensor (matrix). The sensor converts the image into an electrical signal, which is then processed and stored in digital form in the camera's memory.

The sensor of a digital camera consists of cells, each of which works much like a light meter: an electrical signal is generated in proportion to the intensity of the light falling on it. Different technologies are used to build these sensors, for example the Bayer pattern or the CCD RGBE technology developed by Sony.

With a digital camera, a computer and photo-editing software, the possibilities for realizing your creativity and ideas are almost unlimited. Digital photography also lets you instantly share visual information with people regardless of their geographical location. If the image was taken with a digital camera, Adobe Photoshop CS5 supports a large number of Camera RAW formats.

Open the file with the RAW extension and save it in another format, such as TIFF, since printing services typically require images in that format.
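
As a minimal illustration of that workflow outside Photoshop, here is a Python sketch assuming the third-party rawpy and imageio libraries are installed; the file names are placeholders.

    import rawpy              # LibRaw wrapper for decoding camera RAW files
    import imageio.v3 as iio

    # Hypothetical file names; any supported RAW extension (.nef, .cr2, ...) works.
    with rawpy.imread("photo.nef") as raw:
        rgb = raw.postprocess()        # demosaic and convert to an 8-bit RGB array

    iio.imwrite("photo.tiff", rgb)     # save the developed image as TIFF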

Compact Flash memory card

Compact Flash (CF card, or flash card) is an electronic device designed to store information, in particular the digital images produced by a digital camera.

Precautions when handling CF cards: do not bend them, apply force to them, or subject them to shocks and vibration; do not disassemble or modify the card. Sudden changes in temperature may cause moisture to condense inside the card and make it malfunction. Do not use CF cards in places with excessive dust or sand, or in places with high humidity and high temperature.

Formatting a CF card erases all data, including protected images and other types of files. Formatting is performed both for a new CF card and for deleting all images and data from the CF card.

Principles of operation of a digital camera

A digital camera also creates an image from rays of light, but instead of exposing them onto film it uses a photosensitive sensor, in effect a set of light-sensitive computer chips. Currently there are two varieties of these chips: the CCD (charge-coupled device) and the CMOS (complementary metal-oxide semiconductor).

When rays of light hit these devices, they generate electric charges, which are then analyzed by the camera's processor and converted into digital image information. The more light, the stronger the charge generated by the chip.

Once the electrical impulses have been converted into image information, the data is stored in the camera's memory, which may be a built-in memory chip or a removable memory card or disk.

Typically a camera uses a 1/3-inch CCD consisting of elements that convert light waves into electrical impulses. The number of such elements depends on the camera model.

For example, a 5-megapixel camera has about 5 million of these elements.

To access the image recorded by the camera, it is enough to transfer the data to the computer's memory. Some cameras allow you to display recorded images directly on a TV screen or directly output them to a printer for printing, thus bypassing the stage of editing the received frames on a computer.

How light or dark the resulting frame is depends on the exposure: the amount of light acting on the film or the photosensitive sensor. The more light, the brighter the resulting frame. With too much light the image is overexposed; with too little it is too dark.

The amount of light reaching the film or sensor can be controlled in two ways:

  • by choosing how long the shutter remains open (that is, by changing the shutter speed);

  • by changing the aperture.

The aperture is the size of the opening formed by a set of blades located between the lens and the shutter. Rays of light are directed through this opening by the lens elements and then fall on the film or sensor. So if you want more light to hit the sensor, you make the opening larger (open the aperture); if you need less light, you make it smaller (close the aperture).

Aperture values are indicated by f-numbers, known in the English-language literature as f-stops. The standard values are f/1.4, f/2, f/2.8, f/4, f/5.6, f/8, f/11, f/16 and f/22.

Exposure time, or simply shutter speed, is measured in more familiar units: fractions of a second. For example, a shutter speed of 1/8 means that the shutter opens for one eighth of a second.
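
To make the interplay of these two settings concrete, here is a small Python sketch (an added illustration, not part of the original article) that computes the exposure value EV = log2(N^2 / t) for an f-number N and shutter time t; settings with the same EV pass the same amount of light.

    import math

    def exposure_value(f_number, shutter_seconds):
        # EV = log2(N^2 / t); equal EV means equal exposure at a fixed ISO.
        return math.log2(f_number ** 2 / shutter_seconds)

    print(exposure_value(8, 1 / 125))    # ~12.97
    print(exposure_value(5.6, 1 / 250))  # ~12.94, i.e. almost the same exposure

Opening the aperture by one stop (f/8 to f/5.6) while halving the exposure time (1/125 s to 1/250 s) leaves the total exposure essentially unchanged.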

Advantages

Fast results

The resulting image can be seen much sooner than in the traditional photo process. As a rule, cameras let you view the image on a built-in or attached monitor immediately after shooting (and on non-SLR and some SLR cameras, even before shooting). In addition, the image can be downloaded to a computer quite quickly and examined there in detail.

Fast results mean that fatal errors are caught early (so the shot can be retaken) and make learning easier, which is convenient for beginners, amateurs and professionals alike.

Ready for use on a computer

Digital photography is the fastest and cheapest way to obtain images for later use on a computer: in web design, for uploading images (photos of people and objects) to databases, for creating artwork based on photographs, for measurements, and so on.

For example, when modern passports are issued, the person is photographed with a digital camera; the photo is printed in the passport and entered into a database.

In the traditional photo process, images first have to be digitized before they can be processed on a computer, which requires additional expense.

Economy and simplicity

Digital shooting requires no consumables (film) and no processing materials (for developing the image on film). Unsuccessful shots, labor costs aside, therefore cost the photographer nothing; more precisely, they cost very little, since digital media are largely reusable and can be rewritten many times.

What's more, the entire process from shooting to prints (or previews) can be done at home or in the studio, and all it takes is a computer and a photo printer. The possibilities and quality of the prints (compared with laboratory processing) will then depend only on the capabilities of the equipment and the skill of the operator.

Instant photography studios, consisting of a digital camera, a computer and a digital photo lab, are becoming more common. Photographs taken in such a studio are better in both image quality and durability than the traditional Polaroid-type instant photo.

Some cameras and printers allow you to take prints without a computer (cameras and printers with direct connection capability or printers that print from memory cards), but this option usually excludes the possibility of image correction and has other limitations.

Flexible control of shooting parameters

Digital photography allows you to flexibly control some of the parameters that, in the traditional photo process, are strictly tied to the photographic material of the film - light sensitivity and color balance (also called white balance).

Light sensitivity (in ISO units, by analogy with photographic materials) can be set manually, or automatically determined by the camera, in relation to the scene being shot.

In the traditional photo process, two types of film of different color balance are used (for daylight and electric lighting), and corrective filters.

A digital camera can adjust the color balance very flexibly: it can be chosen to match the lighting, determined automatically by the camera, or fine-tuned against a gray reference card.

Extensive post-processing options

Unlike the traditional photo process, in digital photography there are very wide possibilities for correcting and adding additional effects after shooting.

You can rotate, crop and retouch the image, change its parameters (for the whole frame or a selected area), and correct defects manually or automatically incomparably more easily and effectively than when shooting on film.

Benefits of digital representation

Since the original image in digital photography is an array of numbers, storing it, copying it, or transmitting it over any distance does not change it: every copy is identical to the original. If the data does become corrupted, this is easy to detect, and the entire array or the damaged fragment can simply be copied or transmitted again (or restored from redundant information). A copy made from film, especially after several generations of copying, will always differ from the original.
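
A simple way to see this property in practice is to compare checksums of a file and its copy; identical digests mean a bit-for-bit identical image. The sketch below is a Python illustration with placeholder file names, not something from the original article.

    import hashlib

    def sha256_of(path):
        # Hash the file in chunks so large images need not fit in memory at once.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    print(sha256_of("original.jpg") == sha256_of("copy.jpg"))   # True for a faithful copy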

Of course, digital media can fail, but with proper storage (sufficient redundancy and periodic migration to new media) the information can be kept unchanged for an arbitrarily long time.

Compactness

Most digital cameras are more compact than their film counterparts, as there is no need to allocate space for film and film channel mechanics in their design.

The ability to miniaturize the elements of digital cameras allows you to produce ultra-compact versions of cameras and cameras built into all kinds of devices that were not originally intended for photography - players, etc.

Of course, the reduced physical dimensions (in particular, the dimensions of the optics) leave their mark on the pictures:

  • high depth of field (embedded cameras, as a rule, have no focusing mechanism at all)
  • low optical resolution ("softness") of the image
  • more noise: a small sensor is less sensitive, and the signal from it needs additional amplification, which boosts the background noise along with the signal

Number of frames

Digital cameras generally allow more shots to be taken than film cameras, because (battery capacity aside) they are limited only by the capacity of the digital media, and that capacity spans a far wider range than a roll of film. The actual number of photos that fit on the media depends on the camera's characteristics (image resolution) and the recording format.

In addition, when shooting digitally the number of shots can, if desired or necessary, be increased by lowering the image parameters: resolution, recording format and/or image quality (a rough capacity estimate is sketched after this list).

  • The resolution can usually be reduced by a factor of 2-4 or set to standard sizes (640x480, 1024x768, 1600x1200)
  • Recording formats differ in the amount of information stored, the type of compression, and so on
  • "Quality" customarily means the degree of lossy compression (as a rule, when saving in JPEG format): at low quality the image loses shades but takes up less space
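
The capacity estimate itself is simple arithmetic; the Python sketch below uses invented figures purely for illustration.

    def frames_on_card(card_capacity_mb, avg_frame_mb):
        # Rough estimate; real cameras also reserve some space for the file system.
        return int(card_capacity_mb // avg_frame_mb)

    print(frames_on_card(4096, 2.5))    # ~1638 JPEG frames of ~2.5 MB each
    print(frames_on_card(4096, 12.0))   # ~341 RAW frames of ~12 MB each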

If you have time, you can also delete bad frames from the media to make room for new ones, download frames to a computer or pocket storage devices for large amounts of information.

Of course, you can also use multiple media, but this option is also available for film cameras.

Problems

Image Resolution

In digital photography the image is represented by a discrete array of points (pixels). Image details smaller than one pixel are not preserved. The resolution of the resulting image (the number or dimensions of the pixel matrix) is determined by the base resolution of the camera's sensor as well as its current settings.

At the same time, film also has its own discreteness. The image on the film is formed by black or pigmented domains ("grains") of different sizes, deposited during the photoprocess.

Based on the average grain size of photographic film, a resolution of 12-16 megapixels per frame is considered to be a similar resolution for a digital image. Professional cameras have this or higher resolution.

However, the actual resolution of the resulting image (that is, the degree of visibility of details), in addition to the pixel resolution of the sensor, depends on the optical resolution of the lens and the sensor device.

Optical lens resolution

The resolution of the image cannot be higher than that of the lens. Optical resolution sufficient for a sharp 12-16 megapixel image is provided only by interchangeable semi-professional optics; the lenses of most compact cameras resolve the equivalent of 2-4 (sometimes 6) megapixels.

Compared to film cameras, digital cameras in the same class have the same or smaller lenses (and therefore potentially lower resolution).

SLR cameras use the same lenses, but models with non-full-frame sensors capture only part of the frame, and therefore have a lower resolution relative to the frame size.

Influence of the sensor device

The design of the sensor itself can also limit image resolution (see the section on the color sensor device below).

Digital noise

Digital photographs contain digital noise to varying degrees. The amount of noise depends on the technological features of the sensor (linear pixel size, the CCD/CMOS technology used, and so on).

Noise shows up most strongly in dark areas of the image. It increases with increasing ISO sensitivity and with increasing exposure time.

Digital noise is somewhat equivalent to film grain. Grain increases with film speed, just like digital noise. However, graininess and digital noise are of a different nature and differ in appearance:

Property | Film grain | Digital noise
What it is | the limit of the film's resolving power; each grain follows the shape and size of a crystal of the photosensitive emulsion | deviations introduced by the camera electronics; formed by pixels (or spots of 2-3 pixels when color planes are interpolated) of identical size
How it appears | a non-linear texture of brightness and, to a lesser extent, color; broken lines at sharp brightness and color transitions | a noise texture of brightness and color deviations across the whole image, reducing the visibility of detail and creating non-uniformity in monochromatic areas
What it conveys overall | accurate brightness and color; deviations are positional in nature | brightness and color with a statistical deviation toward gray; chromatic deviations take on colors alien to the subject (which is jarring to the eye); deviations are of an amplitude nature
With increased sensitivity | the maximum grain size increases | the noise level increases
With increased exposure time | does not change | the noise level (degree of deviation) increases
In white areas | | manifests itself weakly
In black areas | practically does not appear | manifests itself most strongly

Unlike digital noise, which varies from camera to camera, the degree of film grain is independent of the camera used - the most expensive professional camera and the cheapest compact camera on the same film will give an image with the same grain.

Digital noise begins to be suppressed as early as sensor readout (by subtracting each pixel's "zero" level from the read-out potential) and continues when the image is processed in the camera (or in a RAW file converter). If necessary, noise can be suppressed further in image-processing programs.
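
The "zero level" subtraction mentioned here is essentially dark-frame subtraction. Below is a minimal numpy sketch of the idea, an illustration on synthetic data rather than a camera's actual firmware algorithm.

    import numpy as np

    def subtract_dark_frame(raw_frame, dark_frame):
        # Remove the per-pixel "zero" level measured with the shutter closed.
        corrected = raw_frame.astype(np.int32) - dark_frame.astype(np.int32)
        return np.clip(corrected, 0, None).astype(raw_frame.dtype)

    rng = np.random.default_rng(0)
    raw = rng.integers(0, 4096, size=(4, 4), dtype=np.uint16)   # fake 12-bit data
    dark = rng.integers(0, 64, size=(4, 4), dtype=np.uint16)    # fake zero levels
    print(subtract_dark_frame(raw, dark))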

Moiré

Digital shooting produces raster (sampled) images, so if the subject contains another raster (textured fabrics, linear patterns, monitor and TV screens) close in pitch to the sensor's raster, the two rasters can beat against each other, forming zones of increased and decreased brightness that merge into lines and textures not present on the subject.

Moiré becomes stronger as the frequencies approach each other and the angle between the rasters decreases. The latter property means that moiré can be reduced by shooting the scene at an angle chosen by trial and error; the normal orientation of the scene can then be restored in a graphics editor (at the cost of losing the edges and some sharpness).

Moiré is greatly weakened by defocusing, including "softening" filters (as used in portrait photography), or by optics of relatively low resolution that cannot focus a point down to the size of the sensor's raster pitch (that is, low-resolution optics or a sensor with small pixels).

Sensors, being rectangular matrices of photosensitive elements, have at least two rasters: a horizontal one formed by the rows of pixels and a vertical one perpendicular to it. Fortunately, in most modern cameras the optical resolution is low enough (or the sensor resolution high enough) that a raster of similar frequency cannot be focused sharply, and the resulting moiré is fairly weak.

Static sensor defects

Individual photosensitive elements of the sensor, as a result of a manufacturing defect, may have abnormal (reduced or increased) sensitivity or may not work at all. During operation, new defective elements may appear.

At the current level of development of sensor technology, it is very difficult to avoid the appearance of defective elements, and sensors containing them in small quantities are not considered defective.

Elements that are stuck white or show excessive sensitivity are called "hot" pixels; elements that are stuck black are called "dead" (or "broken") pixels.

Image defects resulting from sensor anomalies are usually removed by noise reduction filters.

The camera can also be programmed around the peculiarities of its own sensor, so that anomalous elements are ignored during readout and their values are filled in by interpolation. Such programming (remapping) is carried out during quality control; when new defective elements appear, the remapping can be repeated (by the user or at a service center).
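
The interpolation of defective pixels can be sketched roughly as follows: replace each pixel at a known defect coordinate with the median of its neighbours. This Python/numpy snippet only illustrates the principle; real cameras do the equivalent on the raw sensor data in firmware.

    import numpy as np

    def remap_defects(image, defect_coords):
        # Replace each listed defective pixel with the median of its 3x3 neighbourhood.
        fixed = image.copy()
        h, w = image.shape
        for y, x in defect_coords:
            y0, y1 = max(0, y - 1), min(h, y + 2)
            x0, x1 = max(0, x - 1), min(w, x + 2)
            block = image[y0:y1, x0:x1].ravel()
            block = np.delete(block, (y - y0) * (x1 - x0) + (x - x0))  # drop the defect itself
            fixed[y, x] = np.median(block)
        return fixed

    frame = np.full((5, 5), 100, dtype=np.uint16)
    frame[2, 2] = 4095                       # a "hot" pixel
    print(remap_defects(frame, [(2, 2)]))    # the hot pixel is replaced by 100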

Low photographic latitude

The photosensitive sensor has a lower photographic latitude (dynamic range) than traditional photographic film (especially negative film). Therefore, when a scene with a wide range of brightness is shot digitally, "burned-out" highlights and/or blocked-up shadows may appear: a burned-out pixel takes the maximum brightness value, while in blocked-up shadows the brightness approaches the minimum value (and approaches or falls below the digital noise level).

Most amateur cameras can highlight "burned-out" pixels when an image is reviewed, so that the shot can be retaken if necessary.

To combat light burn-in, some sensors have additional photodiodes with reduced sensitivity.

Internal reflections

High power consumption

The entire process of obtaining a digital image, processing it and recording it to media is electronic. Because of this, the vast majority of digital cameras consume more power than their film counterparts. Power consumption is particularly high in compact cameras that use the LCD screen as a viewfinder.

CMOS sensors use less power than CCD sensors.

Because of this power consumption, and in the pursuit of compactness, manufacturers of most digital cameras have abandoned the AA batteries popular in film cameras in favor of more capacious and compact proprietary batteries. Some models accept AA batteries in optional battery packs.

Complex device and high price of digital cameras

Even the simplest digital camera is a complex electronic device, because as a minimum, when shooting, it must:

  • open the shutter for a specified time
  • read information from the sensor
  • write image file to media

A simple film camera, by contrast, only has to open the shutter, and for that (and for handling the film) a few simple mechanical components are enough.

It is this complexity that explains why digital cameras cost 5-10 times more than comparable film models. At the same time, among simple models digital cameras often lose out to film ones in picture quality (mainly in resolution and digital noise).

Among other things, complexity increases the number of possible malfunctions and the cost of repairs.

Color sensor device and its disadvantages

The traditional color photo process uses a multi-layer photographic emulsion with layers sensitive in different ranges.

Most modern color digital cameras use a mosaic Bayer filter or one of its analogues for color separation. In a Bayer filter, each sensor element sits under a filter of one of the three primary colors and perceives only that color. This approach has a number of disadvantages.

Loss of resolution

The full image is obtained by restoring (interpolating) the missing color values at intermediate points in each color plane. Interpolation reduces the resolution (sharpness) of the image.
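
To make the interpolation step concrete, here is a rough bilinear demosaicing sketch for an RGGB Bayer mosaic, written with numpy and scipy purely for illustration; real RAW converters use far more sophisticated, edge-aware algorithms.

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(mosaic):
        # mosaic: 2D float array sampled through an RGGB Bayer filter.
        h, w = mosaic.shape
        r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1   # red sites
        b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1   # blue sites
        g_mask = 1 - r_mask - b_mask                        # green sites

        k_rb = np.array([[0.25, 0.5, 0.25],
                         [0.5,  1.0, 0.5 ],
                         [0.25, 0.5, 0.25]])
        k_g  = np.array([[0.0,  0.25, 0.0 ],
                         [0.25, 1.0,  0.25],
                         [0.0,  0.25, 0.0 ]])

        r = convolve(mosaic * r_mask, k_rb)   # fill missing reds from neighbours
        g = convolve(mosaic * g_mask, k_g)    # fill missing greens
        b = convolve(mosaic * b_mask, k_rb)   # fill missing blues
        return np.dstack([r, g, b])

Each missing value is simply the average of its nearest same-colour neighbours, which is exactly where the loss of sharpness described above comes from.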

The decrease in resolution is partly compensated by the "unsharp mask" method: increasing the contrast at the brightness transitions of the image. In camera documentation this operation is called "increasing sharpness" or simply "sharpening". Overuse of the unsharp mask produces halos along edges.

Often the sharpening is done by the camera itself, but automatic sharpening frequently has too low a sensitivity threshold and amplifies digital noise. In amateur-level cameras, the unsharp mask can be disabled in order to make the necessary corrections on a computer (in a RAW converter or graphics editor) with the parameters best suited to each image, and to apply them in the required order.
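
As a hedged example of applying the unsharp mask on the computer instead, here is a small sketch using the Pillow library; the file names and parameter values are arbitrary starting points, not recommendations.

    from PIL import Image, ImageFilter

    img = Image.open("photo.jpg")
    # radius: size of the blur used to find edges; percent: strength of the effect;
    # threshold: minimum brightness difference, raised to avoid amplifying noise.
    sharpened = img.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
    sharpened.save("photo_sharpened.jpg")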

Color artifacts

Interpolation can produce wrong colors on borders and on image details comparable in size to a pixel. Color artifacts can also form moiré patterns (see the Moiré section above).

Improved interpolation algorithms that track color transitions are designed to prevent edge distortion. To suppress color artifacts in the finished image, a "low-pass filter" algorithm is used; however, it makes fine details of the image less contrasty and less sharp.

RAW file converters and photo processing programs deal with the prevention and suppression of color artifacts and moiré. High-end cameras have built-in algorithms for this.

Alternative color schemes

The disadvantages of the Bayer filter force developers to look for alternative solutions. Here are the most popular ones.

Three-sensor circuits

These schemes use three sensors and a prism that separates the light flux into component colors.

The main problem of the three-sensor system is the combination of the three resulting images into one. But this does not prevent its use in systems with a relatively low resolution, such as video cameras.

Multilayer sensors

The idea of a multilayer sensor, similar to modern color film with its multilayer emulsion, had long occupied the minds of electronics developers, but until recently there was no practical way to implement it.

Foveon's developers exploited the property of silicon to absorb light of different wavelengths (colors) at different depths of the crystal, placing the primary-color sensors one beneath another at different levels of the chip. The sensors announced in 2005 became the implementation of this technology.

The X3 sensors read the full set of colors at every pixel, so they do not suffer from the problems associated with color-plane interpolation. They have problems of their own, such as a propensity to noise and interlayer crosstalk, but the technology is still being actively developed.

Resolution, as applied to X3 sensors, has several interpretations depending on which technical aspect is considered. For the top model, the Foveon "X3 10.2 MP":

  • The final image has a pixel resolution of 3.4 megapixels. This is how the user understands megapixels.
  • The sensor contains 10.2 million photosensors (3.4 x 3). This figure is used by the company for marketing purposes (it appears in the model markings and specifications).
  • The sensor provides an image resolution (in the general sense) corresponding to a 7-megapixel sensor with a Bayer filter (according to Foveon's calculations), since it requires no interpolation and therefore gives a sharper image.

Comparative Features

Performance

Digital and film cameras generally have similar speeds in terms of the delays before a picture is taken in various modes, although certain types of digital camera can be slower than film ones.

Shutter lag

At the same time, most compact and budget digital cameras use slow but accurate contrast-detection autofocus (not applicable to film cameras). Film cameras of the same class use faster but less accurate focusing systems that rely on a large depth of field. SLR cameras (both digital and film) use the same phase-detection focusing system, with minimal delays.

To reduce the effect of autofocus on shutter lag (in digital and in some types of film cameras), preliminary focusing is used (including predictive focusing for moving subjects), activated by the middle position of the shutter button (a half-press).

Viewfinder delay

The non-optical viewfinders used in non-SLR digital cameras (the LCD screen or an electronic viewfinder, i.e. an eyepiece with a small CRT or LCD screen) may show the image with a delay, which, like shutter lag, can delay the shot.

Ready time

Time to readiness is a concept that applies to electronic cameras and to cameras with retractable elements. Most mechanical cameras are always ready to shoot, but none of them are digital: all digital cameras and digital backs are electronic.

The readiness time of an electronic camera is determined by its initialization time. For digital cameras the initialization time can be longer, but it is still quite short, around 100-200 milliseconds.

Compact cameras with retractable lenses take significantly longer to become ready, but retractable lenses are found on both digital and film cameras.

Delay in continuous shooting

The delay in continuous shooting is due to processing the current frame and preparing to shoot the next one, which takes some time. For a film camera, this "processing" amounts to advancing the film to the next frame.

The digital camera must:

  • Read data from the sensor;
  • Process image - make a file of the required format and size with the necessary corrections;
  • Write the file to digital media.

The slowest of these operations is writing to the media (the flash card). To optimize it, the file is first written to a buffer (a cache, i.e. an area of RAM), and the data is then written from the buffer to the slow media in parallel with other operations.
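
This buffering scheme is essentially a producer/consumer pipeline. The toy Python sketch below illustrates the idea with invented timings and sizes (a real camera does this in firmware): shooting keeps its own pace while a background thread drains the buffer to the slow "card".

    import queue
    import threading
    import time

    buffer = queue.Queue(maxsize=8)        # RAM buffer able to hold 8 frames

    def card_writer():
        while True:
            frame = buffer.get()           # take the next frame from the buffer
            time.sleep(0.5)                # pretend the flash card is slow
            buffer.task_done()

    threading.Thread(target=card_writer, daemon=True).start()

    for i in range(10):                    # a burst of 10 shots
        buffer.put(f"frame-{i}")           # blocks only once the buffer is full
        time.sleep(0.1)                    # shooting interval

    buffer.join()                          # wait until every frame reaches the "card"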

Processing includes a large number of operations: reconstructing and correcting the image, scaling it to the required size, and packing it into a file of the desired format. To increase performance, manufacturers not only raise the clock frequency of the camera's processing unit but also improve its efficiency by developing specialized processors with hardware implementations of image-processing algorithms.

Sensor readout speed usually becomes the performance bottleneck only in top-end professional cameras with high-resolution sensors, where manufacturers have eliminated all the other kinds of delay. As a rule, the maximum readout speed of a given sensor is limited by physical factors that cause a sharp drop in image quality at higher speeds; new types of sensor are being developed to achieve higher performance.

Also, the preparation time for the next shot (both digital and conventional) is affected by the time required to charge the flash, if used.

Maximum number of shots in continuous shooting

Caching writes to slow media sooner or later fills the buffer, and performance drops to the real (sustained) level. Depending on the camera firmware, shooting may then:

  • stop;
  • continue at a lower speed as images are written out;
  • or continue at the same speed, overwriting images in the buffer that were captured but not yet recorded.

Therefore, for continuous shooting, in addition to the frames-per-second figure, a camera has a "maximum number of frames" parameter: the number of shots it can take before the write cache overflows (a rough worked example follows the list). This number depends on:

  • The size of the RAM and the sensor resolution (factory specifications) of the camera;
  • Selected by the user:
    • file format (if the camera allows it);
    • image size (if the format allows it);
    • image quality (if the format allows it).
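
A rough worked example, with all figures assumed purely for illustration: if the buffer RAM holds about 8 frames and the camera shoots 3 frames per second while the card sustains 1 frame per second, the buffer fills at a net rate of 2 frames per second, so it is full after about 8 / 2 = 4 seconds, i.e. after roughly 3 x 4 = 12 shots, after which the camera slows to the card's pace.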

Film cameras, by virtue of their design, always work at their real sustained speed, and the maximum number of frames is limited only by the number of frames on the roll of film.

Shooting in infrared

Most digital cameras allow shooting, to some extent, in the invisible infrared range (photographing thermal radiation, or shooting with infrared illumination), because the photosensor is able to perceive the upper part of this range. Visible light, if necessary, can be blocked with a special filter.

In classical photography, infrared shooting requires special film which, unlike photosensors, can capture most of the infrared range.

When they hear the words "digital photography", most people picture a compact digital point-and-shoot and the pictures from it on a monitor screen. But what exactly is "digital photography"?

Over the past 10 years there has been a dramatic rise in the photography industry, driven by the development of digital photography and the worldwide fall in the price of digital cameras. Let's dive a little into the history of digital photography. It began back in the early 1980s, at a conference in Tokyo on August 25, 1981, where Sony introduced its prototype, the Mavica (Magnetic Video Camera). It recorded images on a two-inch floppy disk that Sony called "Mavipak", holding 50 color images at a resolution of 570x490 pixels. At the time this was considered the maximum resolution of the television sets on which the photos were viewed. But the Mavica was less a digital camera than a video camera capable of taking still pictures. The device had only one shutter speed, 1/60 of a second, and its sensitivity, rated by the International Organization for Standardization (ISO), was 200 units.

The revolution came in 1990 when the first consumer camera, the Dycam Model 1 or Logitech FotoMan, went on sale. The camera had a CCD matrix with a resolution of 376x240 pixels and the ability to obtain black and white images with 256 shades of gray. The device was equipped with a built-in memory of 1 megabyte, which allowed you to save up to 32 pictures and transfer them to a personal computer. But the camera had one very serious drawback - if the batteries that power the camera ran out, all the pictures from it disappeared.

A year later, Kodak introduced the professional DCS-100 camera, based on the Nikon F3. Inside was a sensor with a resolution of 1.3 megapixels (cell phones today carry sensors with three times that resolution). Images were stored on an external hard drive with a capacity of 200 MB. The whole kit weighed almost 25 kg and cost about $30,000.

Now it is time to consider the difference between traditional and digital photography. The fundamental difference lies in how the image is recorded and stored. In classical photography the image is captured in analog form: light passing through the lens is fixed on a special film coated with layers of silver-based emulsion. To obtain the final result of the shoot, a printed image, the film undergoes chemical processing: developing, fixing, washing and drying. In traditional photography the film is an intermediate storage medium. After developing, the image on the film becomes visible, but it is negative (white becomes black and vice versa) and mirrored. Using an enlarger or a contact printer, the negative image is projected onto the surface of light-sensitive photographic paper. The exposed paper is then developed, fixed, washed and dried, yielding the final result: the finished photograph.

In digital photography, the rays of light passing through the lens fall on a converter sensor (the camera's so-called matrix), which consists of several million pixel sensors sensitive to green, red and blue. The image is created by interpolation, and the sensitive pixels give the photo its thousands of shades. The signal from the sensor is then processed by the camera's processor and recorded to a memory card or to the camera's built-in flash memory.

There are several formats for recording received images:
- JPEG (Joint Photographic Experts Group) - created in 1990 by a joint group of experts in the field of photography, and today the most popular image compression format. It owes its popularity to its optimal size-to-quality ratio: for example, a 15-megabyte file can be compressed to 1.2 megabytes with virtually no visible quality loss, i.e. only a trained eye will notice the difference, and then only at 100% magnification (the size/quality trade-off is sketched in the example after this list). The final entropy-coding stage uses the Huffman algorithm.
- TIFF (Tagged Image File Format) - released in 1986 by Aldus Corporation and introduced as a standard format for storing images created by page-layout packages and scanners. Its extensibility, which allows raster images of any color depth to be recorded, makes the format very promising for storing and processing graphic information and for wide use in printing. TIFF supports several compression options:
– no compression;
– the simple PackBits scheme;
– T3 and T4 compression (an algorithm also used in fax transmission);
– several additional methods, including LZW and JPEG.
- RAW (from the English "raw") - a format containing the data obtained directly from the camera's sensor, without processing. RAW data has 12 or 14 bits per pixel (JPEG has 8) and contains much more information about the image. The format is often called a "digital negative", and, as with film in the analog world, special software exists to "develop" the raw file into a JPEG that any user can work with.
RAW format extensions for some cameras:
- .bay - Casio
- .arw, .srf, .sr2 - Sony
- .crw, .cr2 - Canon
- .dcr, .kdc - Kodak
- .erf - Epson
- .mrw - Minolta
- .nef - Nikon
- .raf - Fujifilm
- .orf - Olympus
- .ptx, .pef - Pentax
- .x3f - Sigma.
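
To illustrate the size/quality trade-off mentioned for JPEG above, here is a small sketch using the Pillow library; the file names and quality values are placeholders chosen only for the example.

    import os
    from PIL import Image

    img = Image.open("photo.png").convert("RGB")    # JPEG stores no alpha channel
    for quality in (95, 75, 40):
        out = f"photo_q{quality}.jpg"
        img.save(out, "JPEG", quality=quality)      # lower quality = stronger compression
        print(quality, os.path.getsize(out), "bytes")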

Special attention should be paid to DNG (Digital Negative Specification), an image format known as the digital negative. It was developed by Adobe and announced in 2004 to standardize digital negative formats. The DNG specification is provided by the company free of charge, so any manufacturer of digital photographic equipment can add support for it. Currently Leica, Pentax, Hasselblad, Ricoh and Sinar include DNG support in their new cameras alongside their own RAW files. DNG also requires "developing" and is easily converted to other formats using, for example, the Adobe DNG Converter.

With the advent of digital photography, the procedure for getting a finished image onto photographic paper has become noticeably simpler. You no longer need to "conjure" in a dark room under the red light of a lamp with chemical solutions; it is enough to connect the camera to a personal photo printer and press "Print" under the picture you like. The cost of consumables has also fallen: a 36-frame film costs about 100 rubles, while a 4 GB SD card costs about 400 rubles, yet unlike the film it holds about 1,500 shots from a 5-megapixel camera. Considering that the card can be used for many years, the savings are obvious! And how many rolls of film would you have to take on vacation? With a digital camera, even if you run out of space on the memory card, you can immediately delete the less interesting shots and keep shooting new scenes. With film, you only see the result after returning from vacation and developing the rolls, whereas digital lets inexperienced photographers experiment more and progress faster. These and many other factors that simplified the photographer's life with the arrival of digital photography have fueled a mass enthusiasm for photography among today's youth and made life much easier for professionals.

Digital photography today has practically supplanted its "film" predecessor and keeps developing. Every month new digital cameras are announced; the resolution of some has already crossed the 20-megapixel mark, and the realism of the resulting picture already matches the best film SLRs. For some, digital photography is a way to capture the joyful moments in the lives of family and friends; for others it is a means of self-realization and an opportunity to translate their most incredible ideas into the world of ones and zeros.

Anatoly Shishkin ©
