Posted March 18, 2019 at 2:46 pm by by56014904

The role of different parameters in the optical lens

The camera is a key component in a machine vision system; its essential function is to convert an optical signal into an ordered electrical signal. Choosing the right camera is therefore an important part of machine vision system design: the choice not only directly determines image resolution and image quality, but also shapes the operating mode of the whole system.

The main parameters of the camera:

Resolution

Resolution refers to the number of pixels in each captured image, usually expressed as width * height. The "megapixel" ratings of cameras are calculated from resolution. For example, a resolution of 1280 pixels * 1024 pixels gives 1280 * 1024 = 1,310,720 pixels, i.e. a 1.3 megapixel camera. In a certain sense, resolution determines the accuracy that a machine vision system can achieve.
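The megapixel calculation above can be sketched as a small helper (the function name is illustrative, not from any camera API):

```python
# Illustrative helper: convert a resolution to a megapixel rating,
# as in the 1280 x 1024 example above.
def megapixels(width_px, height_px):
    """Total pixel count expressed in megapixels (1 MP = 1,000,000 pixels)."""
    return width_px * height_px / 1_000_000

print(megapixels(1280, 1024))  # 1280 * 1024 = 1,310,720 -> 1.31072 MP
```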

Pixel Depth

The number of bits used to represent each pixel, generally 8 bits; digital cameras may also offer 10-bit, 12-bit, 14-bit depths and so on.
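Pixel depth determines how many distinct intensity levels a pixel can record: each extra bit doubles the count. A quick sketch of that relationship:

```python
# Number of distinct intensity levels for a given pixel depth (bits per pixel).
def gray_levels(bit_depth):
    return 2 ** bit_depth

for bits in (8, 10, 12, 14):
    print(bits, "bits ->", gray_levels(bits), "levels")
# 8 -> 256, 10 -> 1024, 12 -> 4096, 14 -> 16384
```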
Frame Rate / Line Rate

The rate at which the camera captures and transmits images: for an area-array camera, this is the number of frames acquired per second (frames/s); for a line-scan camera, it is the number of lines acquired per second (lines/s).
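Resolution, pixel depth, and frame rate together set the raw data rate the system must handle. A hedged sketch of that arithmetic (the function is illustrative, and the result ignores protocol overhead):

```python
# Raw video data rate in megabits per second (1 Mbit = 1e6 bits).
# Ignores interface/protocol overhead; for rough sizing only.
def data_rate_mbps(width_px, height_px, bits_per_px, frames_per_s):
    return width_px * height_px * bits_per_px * frames_per_s / 1e6

# 1280 x 1024 pixels, 8-bit depth, 30 frames/s:
print(data_rate_mbps(1280, 1024, 8, 30))  # 314.5728 Mbit/s
```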
Exposure and shutter speed (Shutter)

Line-scan cameras are exposed line by line; a fixed line frequency or external trigger synchronization can be selected, and the exposure time can either match the line period or be set to a fixed value. Area-array cameras commonly offer frame exposure, field exposure, and rolling exposure, and digital cameras generally provide external triggering. Shutter speeds typically reach 10 microseconds, and high-speed cameras can be faster.
Pixel Size

The pixel size and the number of pixels (resolution) together determine the size of the camera's sensor (target surface). Digital camera pixel sizes typically range from 3 μm to 10 μm. Generally, the smaller the pixel, the more difficult the sensor is to manufacture and the harder it is to maintain image quality.
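The relationship between pixel size, pixel count, and sensor size can be sketched as follows (square pixels assumed; the numbers are an illustrative example, not a specific camera):

```python
# Sensor (target surface) dimensions in mm, assuming square pixels.
def sensor_size_mm(width_px, height_px, pixel_um):
    return (width_px * pixel_um / 1000.0, height_px * pixel_um / 1000.0)

# Example: 1280 x 1024 pixels at 5 um per pixel -> 6.4 mm x 5.12 mm sensor.
print(sensor_size_mm(1280, 1024, 5))
```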
Spectral Range Characteristics

This refers to the sensitivity of the sensor pixels to light of different wavelengths. The typical response range is 350 nm to 1000 nm. Some cameras add a filter in front of the sensor to block infrared light; if the system needs to be sensitive to infrared, the filter can be removed.
Interface Type

Common interfaces include Camera Link, Ethernet (GigE), IEEE 1394, and USB; a newer option is the CoaXPress interface.
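Interface choice ultimately comes down to whether its bandwidth can carry the camera's raw data rate. A rough, hedged sanity check (the bandwidth figures are approximate nominal signaling rates chosen for illustration; real sustained throughput is lower):

```python
# Approximate nominal bandwidths in Gbit/s (illustrative round figures;
# actual usable throughput is lower due to protocol overhead).
NOMINAL_GBPS = {
    "GigE": 1.0,
    "USB 3.0": 5.0,
    "Camera Link (base)": 2.04,
    "CoaXPress (CXP-6, per link)": 6.25,
}

def required_gbps(width_px, height_px, bits_per_px, fps):
    """Raw data rate of an uncompressed video stream in Gbit/s."""
    return width_px * height_px * bits_per_px * fps / 1e9

need = required_gbps(1280, 1024, 8, 100)  # ~1.05 Gbit/s
for name, cap in NOMINAL_GBPS.items():
    print(name, "OK" if cap > need else "too slow")
```

With this example stream, plain GigE falls just short while the faster interfaces have headroom, which is why high-resolution, high-speed systems tend toward USB 3.0, Camera Link, or CoaXPress.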
The imaging component of a machine vision system is the camera. Compared with traditional consumer cameras, it offers high image stability, high transmission capability, and strong resistance to interference. Most cameras on the market are based on either a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor chip.

The CCD is a commonly used image sensor for machine vision. It integrates photoelectric conversion, charge storage, charge transfer, and signal readout, making it a typical solid-state imaging device. The distinguishing feature of the CCD is that it uses charge as the signal, unlike devices that use current or voltage. The imaging device forms charge packets by photoelectric conversion, then transfers and amplifies them into an output image signal under the action of driving pulses. A typical CCD camera consists of the optical lens, a timing and sync signal generator, a vertical driver, and analog/digital signal processing circuitry. As a functional device, the CCD has the advantages over vacuum tubes of no burn-in, no lag, low-voltage operation, and low power consumption.

The development of CMOS image sensors began in the early 1970s. In the early 1990s, with advances in VLSI manufacturing processes, CMOS image sensors developed rapidly. A CMOS image sensor integrates the photosensitive element array, image signal amplifier, signal readout circuit, analog-to-digital converter, image signal processor, and controller on a single chip, and supports programmed random access to local pixel regions. Thanks to their good integration, low power consumption, high-speed transmission, and wide dynamic range, CMOS image sensors are widely used in high-resolution and high-speed applications.
