The weather outside is frightful, but your computer vision application is still delightful.
What is it?
We are approaching the 21st of December: the shortest day in the Northern Hemisphere. Through indoor agriculture we are still able to extend the growing season, by controlling the climate and using artificial lighting. Be aware, though, that this has some drastic implications for the use of sensor technologies.
Sensor technology as a commodity
As we head towards 2023, we can cautiously conclude that computer vision technologies are becoming a commodity in Controlled Environment Agriculture (CEA). These innovations are not limited to the highest-tech locations, but are taking their place across the whole spectrum of low-, mid- and high-tech food production facilities, and at any scale.
Sensor technology has long assisted greenhouse growers in monitoring their climate and steering their climate control actions. A big driver of sensor advancement is the ever-demanding consumer market. A common smartphone already carries two or three cameras. In 2007, the original iPhone launched with a 2 MP camera; today the Google Pixel 6, a mid-to-high-end device, has multiple cameras, with a 50 MP main sensor. Cars carry arrays of optical sensors for autonomous driving. Consumer drones and action cameras keep pushing the capabilities of compact, high-resolution devices. All at a massive scale, pushing performance while driving down sensor cost.
Applications of computer vision in horticulture
Computer vision technologies are already being used in a wide spectrum of applications and technology variations. Plant nurseries are automating the scoring of germination and early growth uniformity with imaging systems, such as technologies from ADI or Corvus Drones. In production greenhouses, plant yield, health and performance are estimated via integrated camera systems from IUNU or the Plantalyzer. And computer vision plays an essential role in harvest automation, with technologies from Polariks or Octiva for fast detection of harvest-ready crops.
What is the relevance?
Good-quality input data is important for a computer vision AI model to work accurately. With the current short and overcast days, illumination becomes a major factor in the quality of imaging data: not only the lack and varying intensity of daylight, but also the addition of SON-T or LED grow lights. This supplemental lighting can alter image quality considerably. Spotting discoloration of leaves is very different in normal white light than it is in a purple gloom.
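As a rough illustration of how strongly grow lights shift the colour balance, the sketch below compares per-channel means of an image to flag a purple LED cast. The file names, the threshold and the is_grow_light_image() helper are assumptions for illustration only, not part of any specific product.

```python
# Minimal sketch: flag images that were likely captured under purple
# LED grow lights by looking at the colour balance. Threshold is a
# rough, illustrative value.
import cv2

def is_grow_light_image(path: str, ratio_threshold: float = 1.4) -> bool:
    """Return True when red/blue clearly dominate green (a purple cast)."""
    img = cv2.imread(path)              # OpenCV loads images as BGR
    if img is None:
        raise FileNotFoundError(path)
    b, g, r = cv2.mean(img)[:3]         # mean value per channel
    purple = (r + b) / 2.0
    return purple > ratio_threshold * max(g, 1e-6)

if __name__ == "__main__":
    for name in ["daylight.jpg", "under_led.jpg"]:  # example file names
        print(name, "grow-light cast:", is_grow_light_image(name))
```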
There are different approaches to this problem. Ideally, you develop and train the AI model for your application under a wide variety of lighting conditions. However, there is a lot of variation in greenhouse conditions and lighting combinations, which brings the risk of having to adapt the solution for each individual environment.
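One way to approximate that variety without collecting data in every single greenhouse is to simulate lighting variation during training. The sketch below assumes a PyTorch/torchvision pipeline and a hypothetical data/train folder with one sub-folder per class; the jitter values are illustrative, not a recommendation.

```python
# Minimal sketch of training-time colour augmentation, assuming a
# PyTorch/torchvision image-classification pipeline.
import torch
from torchvision import datasets, transforms

train_transform = transforms.Compose([
    transforms.Resize((224, 224)),
    # Randomly shift brightness, contrast, saturation and hue so the model
    # does not latch onto one specific greenhouse lighting setup
    # (white daylight, warm SON-T, purple LED).
    transforms.ColorJitter(brightness=0.4, contrast=0.4,
                           saturation=0.4, hue=0.1),
    transforms.ToTensor(),
])

# Hypothetical folder layout: one sub-folder per class (e.g. healthy/diseased).
train_set = datasets.ImageFolder("data/train", transform=train_transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=32,
                                           shuffle=True)
```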
Simply timing the data acquisition correctly can overcome a lot of issues. Capturing an image at the brightest time of day, when no grow lights are on, is the simplest way of working around this problem. Or, if possible, capturing data at night eliminates all daylight variation; in that case the only light comes from an integrated light source.
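In practice this can be as simple as only triggering the camera outside the grow-light hours. The sketch below is a minimal example; the schedule times and the capture_image() callback are assumptions, which in a real installation would come from the climate computer or the grower's lighting schedule.

```python
# Minimal sketch of timing image capture around a grow-light schedule.
from datetime import datetime, time

LIGHTS_ON = time(4, 0)    # example: SON-T/LED on from 04:00 ...
LIGHTS_OFF = time(20, 0)  # ... until 20:00

def grow_lights_active(now: datetime) -> bool:
    return LIGHTS_ON <= now.time() < LIGHTS_OFF

def maybe_capture(now: datetime, capture_image) -> None:
    """Only trigger the camera while the grow lights are off."""
    if not grow_lights_active(now):
        capture_image()

if __name__ == "__main__":
    maybe_capture(datetime.now(), capture_image=lambda: print("capture!"))
```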
An example of this can be found in our ScoutCam solution, a computer vision system for sticky card monitoring. These images are from the same sticky card, six hours apart. Even though our sensors have an integrated light source, the SON-T lighting still alters the image massively. The discoloration of objects and background results in less accurate object recognition. To ensure accurate pest monitoring, we align the time of data acquisition with our clients' lighting schedules.