
Who Needs Building Sensors When We Have Computer Vision?


Buildings everywhere are equipping themselves with a variety of new sensors. The global smart sensors market is expected to grow from $45.8 billion in 2022 to $104.5 billion by 2027. There are plenty of obvious reasons to want to install sensors in commercial buildings: to enhance security, to make them easier to use, or to gain some insight into what is going on inside them. But some of these sensors might be rendered unnecessary, like the payphones of my youth, thanks to advances in computer vision technology.

Maybe all of these sensors that we are painstakingly installing and wiring throughout our built world aren’t the most efficient way to achieve our goals. Maybe we could use the extensive networks of cameras we already have in place. Or maybe, as Karen Burns, co-founder of computer vision technology firm Fyma, put it, “There is no reason to install this junk, we already have cameras everywhere.”


She walked me through a number of projects they are working on now with municipalities, real estate developers, and building operators to track everything from headcount to foot traffic to activity levels using their existing camera networks. The real-time nature of the technology can even alert staff to possible security and safety risks. “We can train algorithms to identify almost anything if we can get enough examples of it for training data,” Burns said. She said e-scooters were something one client wanted to track, but in order to give the algorithm enough glimpses of them in action, her team had to bring a few back to the office and film themselves riding around the building. I’m sure most of the developers were happy to be models.
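To make the training-data point concrete, here is a minimal sketch of how a team might fine-tune an off-the-shelf detector on a small batch of self-collected footage, in the spirit of the e-scooter story. It assumes the open-source Ultralytics YOLO package and a hypothetical dataset file named escooters.yaml; it illustrates the general approach, not Fyma’s actual tooling.

# Hypothetical sketch: fine-tune a pretrained detector on self-collected
# e-scooter footage. "escooters.yaml" (image folders plus the class list)
# is a placeholder, as are the hyperparameters.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # small pretrained model, so modest data suffices

# Train on the custom footage the team filmed themselves.
model.train(data="escooters.yaml", epochs=50, imgsz=640)

# Point the fine-tuned model at a live camera (index 0 or an RTSP URL).
for result in model.predict(source=0, stream=True):
    for box in result.boxes:
        print("detected class", int(box.cls), "confidence", float(box.conf))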

Understanding where people are coming from and going to is only the first step in what computer vision can do. Burns says it is possible to identify gender with relatively high accuracy based on a person’s profile. That accuracy decreases in the winter, of course, when we all dress like padded blobs. Currently, Burns is working on identifying whether people are running, walking, sitting down, or lying on the ground. Imagine how useful it would be to know not just how many people are in a space but what they are doing while they are there. Retailers could know how often people look at their phones in a store, event managers could see where people congregate to talk versus to eat, and security personnel could be notified if potentially dangerous activity is taking place.
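As a rough illustration of the occupancy half of that promise, the snippet below counts person-shaped silhouettes in frames pulled from an ordinary camera feed using OpenCV’s built-in pedestrian detector. The stream URL is a placeholder, and the approach is far simpler than anything a vendor like Fyma would deploy; it only shows that counting people requires no hardware beyond the camera itself.

# Minimal occupancy counting from an existing camera feed using OpenCV's
# stock HOG pedestrian detector. The RTSP URL is a placeholder.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture("rtsp://example.local/lobby-camera")
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Each returned box is one detected person in the current frame.
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    print("people currently in view:", len(boxes))
cap.release()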


Despite the seemingly invasive nature of having a computer watch our every move, Burns says the way the technology is set up makes it seem a lot less troublesome. “The software is programmed to not see faces and we don’t store any of the footage itself,” she said. By not processing any personal data and following strict rules about privacy and security, her firm has been able to meet Europe’s stringent GDPR privacy requirements.
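For readers curious what “programmed to not see faces” can look like in practice, here is a hedged sketch of one privacy-by-design step: detect and blur faces before any analysis runs, and never write the raw frame to disk. The Haar cascade file ships with the opencv-python package; the camera URL and the surrounding pipeline are assumptions, not a description of Fyma’s system.

# Illustrative privacy step: blur faces in a frame before analysis and
# discard the raw footage, keeping only derived counts. The camera URL is
# a placeholder; this is not Fyma's implementation.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def anonymize(frame):
    # Blur every detected face region in place and return the frame.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)
    return frame

cap = cv2.VideoCapture("rtsp://example.local/entrance-camera")
ok, frame = cap.read()
if ok:
    frame = anonymize(frame)
    # ...run counting or activity detection on the anonymized frame...
    # Nothing is written to disk; only aggregate numbers would be stored.
cap.release()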

There are certain jobs that will always be best done by sensors. Electricity usage, temperature, air quality, and acoustics are all examples of data that are best recorded with dedicated measuring equipment. But much of what we currently install sensors for, things like occupancy, movement, activity, light levels, and traffic patterns, can be handled by software using the camera infrastructure that already exists. Buildings will continue to be outfitted with all kinds of sensors, for good reason. But some of those sensors might one day find themselves obsolete, replaced by a camera and a computer program smart enough to “see” for us.