Last session of the conference: a roundtable discussion on video analytics. The panelists:
Eric Eaton, BRS Labs
Doug Marman, VideoIQ
David McGuinness, ObjectVideo
Carolyn Ramsey, Honeywell
Moti Shabtai, NICE Systems
John Whiteman, ioimage
Sam Pfeifle, Security Systems News (moderator)
Important to understanding video analytics is understanding customers' needs.
Sam: Is this just motion detection with video?
Eric: "Analytics" describes a broad array of ways of looking at how pixels are changing. Other approaches do object tracking and try to establish rules and learning on top of it. They're all different, and it's easy to slap a label on them; sometimes we try to distance ourselves from the term "video analytics" because it's not specific about what we're trying to do.
Sam: What does "algorithms" mean here, and what are the different approaches?
Eric: There's a misunderstanding about what algorithms are. An algorithm is the mathematical machinery under the hood, built into something user-friendly like a tripwire you draw on screen. What's the difference between vendors? Different scientific approaches within computer science.
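To make Eric's point concrete: a "tripwire" a user draws on screen boils down to a small piece of geometry under the hood. This is a minimal illustrative sketch, not any vendor's actual implementation; the function names and coordinates are made up.

```python
# Sketch: a "tripwire" rule reduces to a segment-intersection test between
# a tracked object's motion (previous position -> current position) and the
# line the user drew on screen. Names here are illustrative only.

def _cross(o, a, b):
    """Signed area of triangle o-a-b (orientation test)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def crosses_tripwire(prev_pos, curr_pos, wire_a, wire_b):
    """True if the move from prev_pos to curr_pos crosses the wire segment."""
    d1 = _cross(wire_a, wire_b, prev_pos)
    d2 = _cross(wire_a, wire_b, curr_pos)
    d3 = _cross(prev_pos, curr_pos, wire_a)
    d4 = _cross(prev_pos, curr_pos, wire_b)
    # The two endpoints must lie on opposite sides of each other's segment.
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

# An object moving left-to-right across a vertical wire at x=5:
print(crosses_tripwire((2, 3), (8, 3), (5, 0), (5, 10)))  # True
```

The user-facing "draw a line" interaction hides exactly this kind of per-frame math, which is the misunderstanding Eric describes.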
Sam: If it's difficult to accomplish, why the proliferation of companies, and what does that say to the market?
Moti: It has to do with the difference between implementing in the lab and in the field. Analytics is great value for the customer, but doing it in the lab is quite easy.
Sam: What is metadata, and how does it contribute to analytics?
Carolyn: I think you can consider them separate and apart. Analytics is about trying to get information out of video. There are things we can predict in real time and set up a rule to look for, but there are also things we can't predict yet would still like to know about. Metadata is a description of what's happening in the video at any point in time, whether you've told the system to watch for it or not, or just want to file it away for later review. Many end users don't have time to look at video in real time; metadata enables them to sift through the video and select what's interesting instead of digging.
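Carolyn's "file it away for later review" workflow can be sketched in a few lines. The record fields and sample events below are hypothetical, chosen only to show the filter-instead-of-scrub idea.

```python
# Illustrative sketch of metadata-driven review: the system logs a small
# description of every event, whether or not a live rule asked for it, and
# an operator later filters records instead of scrubbing raw video.
# Field names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class EventMetadata:
    timestamp: float   # seconds into the recording
    camera: str
    object_type: str   # e.g. "person", "vehicle"
    zone: str          # where in the scene it happened

log = [
    EventMetadata(12.0, "cam1", "vehicle", "loading-dock"),
    EventMetadata(93.5, "cam1", "person", "perimeter"),
    EventMetadata(410.2, "cam2", "person", "lobby"),
]

# "Select what's interesting instead of digging": people near the perimeter.
hits = [e for e in log if e.object_type == "person" and e.zone == "perimeter"]
print([e.timestamp for e in hits])  # [93.5]
```

The operator jumps straight to the matching timestamps rather than reviewing hours of footage in real time, which is the value Carolyn describes.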
Sam: How do you apply these things? It seems like there are sweet spots where people should expect good results rather than something experimental. Where does analytics work well?
Carolyn: Perimeter protection with designated time rules. At a high level it's about predictability: it's easy for analytics to definitively say this rule has been broken.
Sam: Is there a baseline for the quality of a system?
David: Typically 7 frames/second, and enough pixels on target to differentiate images. For indoor applications it gets back to resolution and figuring out how many cameras I need for a space. Those are some considerations. It also gets back to understanding what the customer wants to accomplish.
Carolyn: I think it comes down to the application. How much you need to know about what that object is doing determines how many pixels you need.
Sam: How do you compare different vendors? What data should we have?
Eric: People are already asking for lab results in a controlled environment. The real question is how effective the technology is in the field. More than lab tests, we need field test results, and the analysis needs to be done by folks in the security industry.
John: If we want to compare technologies, I would suggest comparing on probability of detection and false alarm rate. I think it's important to go into the real world, compare side by side, and see what each system accomplishes. In our world that means detecting vehicles, people, or objects, and determining which performs better.
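John's two metrics are standard and easy to compute once you have ground truth from a field trial. The event IDs and numbers below are made up for illustration; only the formulas reflect his point.

```python
# Sketch of John's suggested comparison metrics: probability of detection
# (fraction of ground-truth events the system caught) and false alarm rate
# (spurious alerts per hour of video). Data below is invented.

def probability_of_detection(true_events, alerts):
    """Fraction of real events the system alerted on."""
    return len(true_events & alerts) / len(true_events)

def false_alarm_rate(alerts, true_events, hours):
    """Alerts matching no real event, per hour of video."""
    return len(alerts - true_events) / hours

truth = {"e1", "e2", "e3", "e4"}         # ground-truth intrusions
alerts = {"e1", "e2", "e3", "f1", "f2"}  # what the system fired on

print(probability_of_detection(truth, alerts))    # 0.75
print(false_alarm_rate(alerts, truth, hours=10))  # 0.2
```

Running two vendors' systems side by side on the same footage and comparing these two numbers is exactly the apples-to-apples field comparison John proposes.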
Moti: I think it's about the way you sell the system. Our company has a policy of setting expectations. Once you understand, with enough detection experience, what the customer can achieve, you can set expectations right, avoid disappointment, and deliver value from the system. It matters whether a company is willing to do that exercise and analyze the problem before the customer buys the system.
Carolyn: I think it's a difficult discussion because every customer wants it plain and simple. The issue is: what are you trying to do? What are the next best alternatives?
Sam: "Set up and go, zero configuration" sounds good to integrators. Are installation and configuration questions that should be asked?
Doug: Our system doesn't need configuration; it learns on its own. Another benefit is that if the environment changes, it doesn't matter.
Sam: What do you get with more sophistication? What's better, and where's the value?
Eric: What do you get back for the set-up time you're putting in? Our system learns on its own. We're able to alert on the scene and help you find things you didn't know to look for.
Sam: What about configuration?
John: Each of the 40 manufacturers takes a different approach. We've looked at cutting that up-front time with automated learning in the past. From an ROI standpoint, what's most easily measured is the rich data you're generating that you didn't have access to before.
Sam: I hear a lot of "I could do this." It seems like more theory than practicality?
Carolyn: Our experience is that people don't like to be out of their comfort zone. Take a salesperson who sells successfully on cost and makes quota, and ask him to start talking about ROI. I think the challenge, as offerings become more complex and customers engage in critical thinking about where they're spending dollars, is that we have to help customers help themselves.
Sam: Do end users have the communication channels within their organizations to convey this to the marketing guy?
Moti: Sometimes we meet champions who have a wider scope and can open the gate; it could be operations, where the ROI case has been successful. The more the system fits into IT, the more willingness we see to approach others in the business.
Doug: Perimeter detection has a huge ROI, so it's powerful; remote guarding is similar, and doing it remotely is far less expensive.
Sam: What's the impact of standards? And which are most important to you?
Dave: The first area of focus we're working on is events: making sure there are common outputs that are understandable from system to system. Also, the PSIA has pulled in the industry so that integrators participate, and we're getting input from different groups.
Moti: It would help if the industry found a way to measure performance objectively. In the UK, they've put a framework in place that tests companies against different scenarios; I'd like to see something like that going forward.
Carolyn: I don't think the tests they put the product through serve 80% of customers. What's absolutely critical is end user engagement: customers saying "these are the scenarios I face every day." If the industry had that, it would help end users make better, more educated decisions.
Sam: How will consolidation impact the market?
John: A challenging question. We're embarking on a cooperation strategy: on some level we compete, and in other ways we bring added value. The DVTel acquisition came about because there are no standards. We had embarked on an open partnership strategy, but none of the partners embraced all the abilities of the technology.
Doug: I like the idea of open systems; it's a really big step. But what's getting lost is that dumb cameras get attached and you lose the added value of intelligent cameras.
Sam: There's a perception that analytics companies are looking for exit strategies. How do you deal with that?
David: We just came out of horrific economic times. I think video analytics is starting to become strategic, and people are able to take positions today. We're comfortable with fewer players.
Sam: People say analytics should be a feature, not a business on its own. Can it be a business?
Eric: If you look at the expertise involved, it takes specialized knowledge to analyze the content of video. Retaining that talent and innovating is critically important, as is always having the capability to advance the technology.
Sam: Looking to the future, make some predictions about what's coming down the pipeline that we don't know about now. Also, price: there's a perception that analytics is expensive. What are the pressures on pricing?
Carolyn: Price is an interesting one. I think people are always going to fight about price if there's no clear expectation of value. My feeling is that in the last two years we've heard fewer complaints about price, which is interesting in a down economy. I attribute that to increasing awareness of value, and to helping the sales voice speak to specific applications; once you've made the case, the question becomes: what's the next best alternative?
Eric: The value proposition is the key piece: how much more effective are you by putting the technology in place? Component costs at the price point keep falling and we continue to improve the algorithms. It's a balancing act; I'd say price stays the same over 3 to 5 years.
Moti: More verticalized solutions solving specific customer problems. Trends in price: simple, low-cost installations, but also customers not compromising on solutions. Within the cost of a whole security system, analytics isn't that large a part.
John: The driving factors, I think, are cost effectiveness, ease of use, interoperability and plug-and-play, and liability. The product has to meet expectations; it can't be labware, it has to be proven.
Doug: We take the approach of looking at a future where analytics becomes free: you have cameras, and it's just there. Instead of seeing analytics as an add-on, we're just trying to make a smarter camera. The market has to shake out the technologies that aren't adequate; people installing those systems and having problems think it's a reflection on analytics as a whole.
John: For us, part of the future revolves around the work we're doing at SafeCity: an early deployment picking up intrusions, plus the capability of the next generation of algorithms to detect crowds, do tracking, etc.