Ready for Drone AI and Security AI? This webcast is presented by Security Systems News and the Security Industry Association, featuring Asylon, CEVA Logistics and Chooch AI. Joe Goodings, Director of Standards at Security Systems News, moderates the panel. The presenters are Jim McDonald, VP of Global Security at CEVA Logistics; Logan Selby, Vice President of Operations at Asylon; and Michael Liou, Vice President of Strategy and Growth at Chooch AI.
Joe Goodings:
Good day everyone, and welcome to today's webcast, Next Generation SOC Analyst Tools: Drones and Robots with Computer Vision. This webcast is presented by Security Systems News and the Security Industry Association, and is made possible by tremendous industry sponsors Asylon and Chooch AI. I'm Joe Goodings, Director of Standards, and it's my pleasure to be part of this session today. Just a few housekeeping items before I introduce our amazing round table panel. This is a webcast, but what makes these special is discussion.
Joe Goodings:
Please don't hesitate to submit questions in the Q&A field provided, and we'll try to get to as many questions live as possible. We'll follow up with answers to any questions we don't have time to address. This webcast is being recorded and archived and will be accessible on the SSN and SIA websites. Please look out for many more of these emerging technology webinars throughout the year. And please take this opportunity to download the handouts in the Handouts tab to learn more about our wonderful sponsors, Asylon and Chooch AI.
Joe Goodings:
I'm extremely excited to introduce you to our presenters today. It's going to be a good one, I can already tell, and I can't wait for their perspectives. We have Jim McDonald, VP of Global Security at CEVA Logistics; Logan Selby, Vice President of Operations at Asylon; and Michael Liou, Vice President of Strategy and Growth at Chooch AI. I'll give these folks an opportunity to introduce themselves. Jim? I think Jim's muted.
Jim McDonald:
Hey, good afternoon. Thank you for the opportunity to speak. My name is Jim McDonald. I'm the VP of Global Security for CEVA Logistics. Before that, I was the director of corporate security for Saia LTL Freight. Again, I appreciate the opportunity to be on here and to speak.
Logan Selby:
Hey, good afternoon, everybody. Logan Selby, Vice President of Operations at Asylon. I've spent most of my career in the DOD and intelligence community. Prior to coming on board at Asylon, I was the executive director for Simon Property Group's operational intelligence center in Indianapolis, Indiana. I'm still an active reservist, currently assigned to the 75th Innovation Command, where I'm one of the leaders of the robotics and autonomy group under Army Futures, and I'm currently a PhD candidate in autonomous systems and robotics.
Michael Liou:
Hi, everyone. Good morning and good afternoon. My name is Michael Liou, I'm the VP of Strategy and Growth at Chooch AI, and it's great to be here. I oversee our partnership network, and it's great to have partners like Asylon and CEVA. We are a broad, horizontal computer vision platform deployed across multiple industries, including retail, manufacturing and security, and I'm excited to be on this panel today. Prior to this role, I had a background in financial services, having worked for Goldman Sachs and Citi Private Bank, and I have been an active venture investor for the past decade. Great to be here.
Joe Goodings:
Awesome. So let's set the stage with a poll question for the audience, just to get an idea: what percentage of SOC analysts say they're not equipped with the necessary tools to properly respond to security threats? This question comes from industry research, and we just want to level set what we hear and see from the audience against the actual answer. Let's keep this open for about 10 seconds. Get your votes in. All right, let's check it out.
Joe Goodings:
So the leading answers were 40% and 50%; there seems to be a tie there. The actual answer is more than 50%, and that's the reason we're doing this session here today. Just to kick us off with a little more background: Logan, could you set the scene for us? What are the current data pain points that you're seeing in a typical SOC today?
Logan Selby:
Sure, absolutely. So taking from my experience managing a SOC operation for a Fortune 100 company, we had a SOC there delivering services with around 170 analysts. Analysts are extremely inundated with tasks, whether it be monitoring CCTV, answering phones in a call-center kind of environment, paging doors open and closed, or doing assessment checks for different items called in from different properties across the US. And we were looking for ways to make that operation more efficient, both on the physical security side in the field for the security officers, and on the SOC side for the monitoring aspect.
Logan Selby:
So moving to this slide, we talk about the technology shift, and why we're here today: how to couple computer vision with drones and robotics. In the traditional guarding sense, there are some issues that continuously arise; for folks in the audience who are in the security industry, these won't be a surprise to you. But one I'd like to call out is high training costs, which is coupled with the high turnover rate.
Logan Selby:
So in the security industry alone, when it comes to guarding or SOC analysts in general, there's a high turnover rate, which goes hand in hand with high training costs. You've got that continuous cycle of people turning over with [inaudible 00:05:42] training costs, which increases your liability costs, because you have new people coming in that you have to retrain on a continuous basis, especially when you have security officers in the field protecting assets in public-facing positions.
Logan Selby:
Everybody today has some kind of smart device on at all times, with video and picture capability. So the security officer who is representing and protecting your asset could be caught at the wrong time, and that could damage your brand, which is a huge cost and liability. Those are some of the pain points I'd call out. And then there's just the cost nowadays: you're breaching the $200k-plus mark for a 24/7, 168-hour-a-week guard position, which is astronomical. So the technologies we're going to talk about today, drones, robotics and computer vision, are ways security practitioners can apply technology in their ecosystem to create ROI and reduce costs.
Logan Selby:
I'm glad we're joined by Jim today from CEVA Logistics. Jim, you mentioned you're new to CEVA, but we'd like to get your input on coming into a new environment. What security challenges are you looking to assess and address head on?
Jim McDonald:
Yeah, I mean, I think you made several good points already. For me as the quote-unquote end user, it's very difficult to quantify the return on investment with the physical guard spend. There are obviously operational challenges with that, from the low wage rate to inflation, to minimum wage hikes, to increased claims. It all kind of comes together and it's really difficult. So even with the $200,000 yearly price that you mentioned, I think that's probably just for a normal site, meaning that if you're in a market like Memphis, or Chicago, or Fontana, your costs are going to be even more expensive.
Jim McDonald:
The difficulty in this is, again, making sure that you have the proper infrastructure, but also, on the opposite side, being able to quantify the return on investment with what you can do with technology as opposed to the guards. Because what we're seeing with the guards is that anytime you do have someone who shows a lot of promise, typically we're spending the money to get them up to speed in training, and then they're leaving for another job, even if it's just for a dollar an hour more.
Jim McDonald:
So you're constantly moving pieces and parts, and with a lot of these guards working second jobs, being distracted with smartphones, being on the internet, and not doing the traditional guard tours, it just makes it very difficult, not only for the return on investment, but for protecting both your people and your assets. And again, everyone is pulling from the same labor pool, if you will. So it's a battle of rising costs while still taking on a lot of risk.
Michael Liou:
Hey Jim, it's Michael here. In your role at CEVA, how and where are you seeing this technology impact operations, and how does it disrupt, or perhaps fit into, traditional security operations? And why do you think we're seeing this trend?
Jim McDonald:
I think you're seeing the trend because, good, bad or indifferent, the workforce is changing. Technology has greatly advanced: what we saw 10 years ago on, say, CSI is actually possible now. So for me, it comes down to segmenting everything. You're using automation, you're using technology, but part of the sell to the C-suite is that this type of technology is not just for security. When you've got multiple stakeholders, say safety, HR, maintenance, there are many different applications you can use the technology for.
Jim McDonald:
And again, it continues to quantify that return on investment. I'll give you a case in point: in my previous job, dealing with the Memphis market, we were paying $7,500 a week in guards, and we were still taking on multiple claims and increased costs. So we did a $500,000 capex, which will pay for itself in less than two years if I've done the math right, and we went from $7,500 a week to less than $7,500 a month. And I would argue that not only did we get a better type of application, but it further protects our employees and also our assets.
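Jim's payback math can be sketched quickly; the figures below are taken straight from his example, with the $7,500/month treated as an upper bound, so this is a rough illustration rather than CEVA's actual accounting:

```python
# Back-of-the-envelope payback calculation using the figures quoted above.
weekly_guard_cost = 7_500          # $ per week before the change
monthly_tech_cost = 7_500          # $ per month after the change (upper bound)
capex = 500_000                    # up-front technology investment

annual_before = weekly_guard_cost * 52    # 390,000 per year on guards
annual_after = monthly_tech_cost * 12     # 90,000 per year on technology
annual_savings = annual_before - annual_after

payback_years = capex / annual_savings
print(round(payback_years, 2))     # -> 1.67, i.e. under two years
```

Which bears out the "less than two years" claim, before even counting the reduction in claims.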
Jim McDonald:
Because the case in point I'd give with technology is this: technology doesn't sleep at 2:00 a.m. on a Saturday morning. I can't tell you the number of guards I have personally caught sleeping on the job, and while in theory it sounds great that they're going to do tours, a lot of the time that's not done, as opposed to using AI technology and self-learning analytics to essentially cover or geofence your whole facility. To me, that's a game changer; a guard is never going to be able to replicate what AI and self-learning analytics can do. To me, it's just not even possible.
Jim McDonald:
And really, I think the disconnect with security managers and security directors is being able to talk the talk to the C-suite, because for executives it really comes down to money and return on investment. It's one thing to go up to the C-suite and say, "I need a million dollars, and I need to do this and that," without really presenting a business case. It's another to say, "Provide me with a cash infusion on the capex side, and we'll pay for it in two years. We'll cut the cost down year over year and drive that cost down. It'll be paid for within a couple of years, and after that you'll make some enhancements every five years, and then you're done, as opposed to what I mentioned earlier, spending $7,500 a week."
Jim McDonald:
Again, before, I couldn't quantify any type of return on investment because we were still taking on claims and guards exhibiting bad behavior. On the claims side, what I can say with the technology is that we've been able to reduce claims year over year. So even if, hypothetically speaking, a couple of claims hit after you've made the investment, your monthly recurring cost is a lot lower, so you have some wiggle room to absorb some claims and still keep your overall cost lower year over year.
Joe Goodings:
Excellent points there, Jim. So let's dive into a few of these technologies. I know we're here to talk about robotic perimeter security and computer vision. So Logan, can you give us a rundown of the robotic perimeter security side of it?
Logan Selby:
Sure, absolutely Joe, I appreciate it. So at Asylon, we built our robotic perimeter security on three pillars. When a client approaches us about how to apply robotic perimeter security to their asset, these are the three items we look at on deployment. The first is, obviously, deploying the assets: our ground asset, known as DroneDog, and our aerial asset, known as DroneSentry. That pillar really creates a multi-domain operation, that third dimension in your security depth, the additional layer from an aerial perspective.
Logan Selby:
The second pillar is the easily programmable controls and the mission sets that we can establish pretty much instantaneously on deployment of our assets at a client site. And the third is really the intelligence software that we deploy with our vehicles, known as DroneIQ. I'll get into a little bit of this on the next slide, but it allows us to remotely monitor our capabilities and then respond and report to the client on anything that's happening.
Logan Selby:
Whether that be intrusions, or video from any kind of [inaudible 00:14:50] that happened on site. On this slide, we've included some gifs. In the top left-hand corner, you can see a gif of our DroneSentry taking off from our DroneHome, which is our station. DroneHome is really the proprietary asset in our DroneCore system. You can see it demonstrating our proprietary battery-swap capability, which keeps our asset in the air around 15 minutes every hour.
Logan Selby:
You can see there, the drone autonomously lands on our station at [inaudible 00:15:20], and then receives a new battery whenever it runs low. On the bottom left-hand side of the screen, you can see, from a person's perspective, DroneDog patrolling around one of our test sites in Philadelphia, just to give you an understanding of what the footprint of that vehicle walking around on site looks like. And it all ties into what I talked about before, which is DroneIQ.
Logan Selby:
We consider DroneIQ the central nervous system, or brain, of our DroneCore platform. That's where all the magic happens. Everything is translated into DroneIQ; it allows us to control all the vehicles, monitor them, set up waypoints, patrols, or prescriptive patrols, as well as review video. We can do it all from that single SaaS platform.
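Conceptually, defining a prescriptive patrol in a platform like DroneIQ amounts to describing a vehicle, a schedule, waypoints and actions. The sketch below is purely illustrative; every key and value is invented for the example, and this is not Asylon's actual interface:

```python
# A hypothetical mission definition for a prescriptive patrol.
# Field names and values are made up for illustration only.
mission = {
    "name": "overnight-perimeter",
    "vehicle": "dronedog-01",
    "schedule": {"start": "22:00", "repeat_every_minutes": 60},
    "waypoints": [
        {"id": "gate-a", "dwell_seconds": 15, "actions": ["capture_video"]},
        {"id": "fence-ne", "dwell_seconds": 10, "actions": ["thermal_scan"]},
        {"id": "dock-3", "dwell_seconds": 20, "actions": ["capture_video"]},
    ],
    "on_alarm": "divert_and_investigate",   # real-time course-correct behavior
}

def total_patrol_dwell(mission):
    """Rough planning helper: total time spent stationary at waypoints."""
    return sum(wp["dwell_seconds"] for wp in mission["waypoints"])

print(total_patrol_dwell(mission))   # -> 45 seconds of dwell per loop
```

The point is that once patrols are data rather than guard habit, they can be scheduled, audited and changed instantly.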
Joe Goodings:
That's awesome. Michael, on the computer vision side, can you provide an overview of how computer vision works? And how do you see this technology improving operational efficiencies?
Michael Liou:
Yeah, thanks Joe. What you see on the right-hand side of the slide here is a bunch of images, and the basics of computer vision require a model to be developed. What we say to our clients is that if we can replicate human subject matter expertise, using your eyes and interpreting a video or an image, and you give us enough images that represent the particular object or feature you're looking for, we can then replicate that expertise in a model.
Michael Liou:
We can ingest not only images, but video as well. Now, what we've done at Chooch AI is actually solve three important, distinct bottlenecks or pain points involved with edge AI and computer vision. One is that in order to generate these models, you need thousands and thousands of images, and they need to be annotated, which is often done by hand. It's a painstaking, manual, laborious process, and also prone to error. We have essentially automated that, so that is one huge time saver.
Michael Liou:
The second is that we've been able to create a no-code platform to develop our computer vision models based upon these images. And if you concatenate or add these timeframes together, between what we call dataset generation and model training, that is typically north of nine months in the industry. We've been able to compress that to easily within a week, depending upon the data. And lastly, we have developed expertise at Chooch AI to take these models that have now been generated and deploy them to the edge.
Michael Liou:
And what do I mean by edge? Next slide. All I'm saying is that we can take in any type of video: the camera can be affixed on a tripod or overhead of a door, or it can come from a drone or a robot dog, which we have done. We then take that video and feed it into some sort of server that's running these models. Those servers can be either cloud based or edge based, and 99% of the inquiries we're receiving these days are edge based. And why edge? Why edge today? Well, number one, the devices in the field these days are actually quite powerful, so we can fit multiple models that can inference and detect objects within milliseconds for our clients.
Michael Liou:
Second, reliability and uptime are critical, five nines; you don't want any network or cloud latency or broadband issues, so you want on-site, on-premise inferencing. And lastly, a lot of the information that people gather is sensitive, right? Some of it is PII, or in the healthcare field it needs to be HIPAA compliant, so we need to ensure that the data is kept private, and we leave those decisions in the hands of our clients. Once we've detected these anomalies or objects, whether it be a hard hat, an intruder, or someone who's loitering, we send an alert out, and we deliver that data in the form of what we call a JSON flat file and a JPEG, and allow the client to store and manipulate it.
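To make the "JSON flat file and a JPEG" idea concrete, here is a sketch of what such an alert payload and a simple client-side filter might look like. All field names are illustrative assumptions, not Chooch AI's actual schema:

```python
import json

# Hypothetical example of the JSON "flat file" an edge inference server
# might emit alongside a JPEG frame. Every field name is invented here.
alert = {
    "event": "intruder_detected",
    "camera_id": "perimeter-cam-04",
    "timestamp": "2022-03-01T02:14:37Z",
    "detections": [
        {"label": "person", "confidence": 0.94,
         "bbox": [412, 188, 530, 402]},   # x1, y1, x2, y2 in pixels
    ],
    "frame_image": "perimeter-cam-04_021437.jpg",
}

def should_notify(alert, min_confidence=0.85):
    """Client-side rule: only forward high-confidence detections to the SOC."""
    return any(d["confidence"] >= min_confidence for d in alert["detections"])

payload = json.dumps(alert)       # what travels over the wire
received = json.loads(payload)    # what the SOC tooling stores and queries
print(should_notify(received))    # -> True
```

Because the payload is plain JSON, the client is free to pipe it into whatever case-management or analytics system they already run.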
Michael Liou:
So we have wide-ranging capabilities. We can ingest not only traditional electro-optical feeds of video or imagery; we can do thermal, infrared, CT scan, X-ray, MRI, LIDAR, radar. We've done multispectral satellite, and we've done synthetic aperture radar. And again, as long as we can replicate the subject matter expertise of an expert in an area and take representative images, we can build that dataset, build that model and deploy it to the edge very rapidly.
Michael Liou:
As far as how efficient we can make things, to echo Jim's point, there's a lot to be done. I think it's very, very early days. I'll bring up an example, Joe. We actually spoke to a consultant about a Middle East airport security deployment, and we walked through the benefits of computer vision. And the consultant said, "Well, I can hire 150 people for $5 a day to do this. So what type of efficiencies are you going to be able to capture?"
Michael Liou:
And I said, "Well, first, you need to train them, and deal with the turnover issues that both Logan and Jim addressed. Second, you need to make sure they're looking the right way and not texting their friends. They have to take lunch breaks; they've got to go to the bathroom. What happens if they now need to detect the Interpol 200 most wanted? How are you going to train them up? Give them a bunch of photos?" The AI doesn't get bored and doesn't get tired. Its efficiency does not go down toward the end of the shift; it can't be compromised, can't be bribed, and doesn't have human bias.
Michael Liou:
And then if you're aggregating all this data across six terminals 24/7, how can 200 guards aggregate all that data for actionable analysis, pattern detection and other downstream analysis? This is where the operational efficiencies of AI really come into play.
Joe Goodings:
Right. So it sounds like these technologies save money, save time, and add operational efficiency to staff. So what's the process for trying to put this in place? Is there a strategy? Do you just call Logan and Michael? Jim, how did you go through the process of implementing this?
Jim McDonald:
Yeah, I think first and foremost, everything touches IT, so it's crucial that the end user has a great relationship with IT, because as we continue to evolve as a society, everything requires bandwidth. So IT infrastructure is the first thing; you have to make sure you have enough, because to Mike's point, if all this technology is going to be running off bandwidth, it could create a logjam if you don't have the right infrastructure. For me, it's having to remain agile. And to Mike's point, you've literally taken what I would have seen in 2012, where you're painstakingly going through hours of video to try to find whatever particular incident you're investigating.
Jim McDonald:
And now you're essentially using the analytics to say, "Hey, I'm looking for this, within this specific time frame, in this particular area." Then there's also the proactive side, where it's constantly looking for exceptions, as opposed to the old-school, manual process. And of course, part of that manual process is expecting guards to do their tours and catch the bad guys. In the 10 years I've been in the corporate security space, I can think of maybe one time a guard has caught a perp; I can't tell you how many times we've put people in jail using either thermal fence-line cameras with analytics and AI technology, or our cameras that geofence the particular facility grounds.
Jim McDonald:
Again, for me, it comes back to return on investment. I think the key takeaway here is that the world is evolving right now, and risk is definitely more prevalent. And I'd be remiss not to mention that we live in a 24-hour news cycle and social media world, where it just takes one particular event and it's all the way around the world, whether it's true or not. So we've got to step up our game as security practitioners, and it really comes down to a layered approach, meaning there might be certain markets where you need guards.
Jim McDonald:
And there are definitely certain markets where you can quantify a return on investment for guards, maybe checking somebody in at the facility or this or that. But for me, as far as providing those layers, personally, I want to have as many layers as I can to protect both folks and assets. Ultimately, criminals are inherently lazy; they want the easy stuff. It's very rare, at least in my 10 years, that you get the bad guys doing the MacGyver thing: dressed all in black, going through the sewer system, Mission Impossible playing in the background, doing counter-surveillance. Usually it's all internal, or there's an internal connection.
Jim McDonald:
So for me, it comes down to layers. Going back to Michael's point, what we thought was being done 10 years ago, we're finding out now, from the data, wasn't actually being done. So as Ronald Reagan used to say, "Trust, but verify." And by having this type of technology key off your SOPs, number one, you're eliminating those false alerts that are inundating your SOC or GSOC staff, and you're really nailing down what's actually a true theft event or a true incident, as opposed to somebody just not following the rules.
Jim McDonald:
So again, it's the ability to be agile. This is cutting-edge technology, and with cutting-edge technology there's a learning curve. But for me, and I'd say I'm a bridge between a millennial and kind of old school, it all comes down to this: the more data I can have to make decisions in real time, that's what I want. I want to be proactive. I don't want to be finding out stuff that everybody else knows and then having to play catch-up. To me, that's where the value is.
Joe Goodings:
So yeah, I heard collaboration and layering as part of the strategy, and every facility is going to be different. From a perimeter security perspective, Logan, you've been on both the end-user and the technology-provider side. How can the two groups most effectively collaborate in the new technology landscape we're seeing?
Logan Selby:
Sure. So Jim kind of hit the nail on the head: keeping an open mind, understanding this cutting-edge or emerging technology, understanding how to implement it, and then changing the culture once you're on the ground, getting the old-school mentality to accept this new technology and use it to its advantage as a tool, rather than seeing it as a roadblock. That's step number one. But like you mentioned, every site is going to be different, from a logistics site, to a mall, to an auto manufacturer. As you can see in these videos here, every site is going to be different.
Logan Selby:
So the end user has to understand their priorities. Initially, when we deploy, we do a thorough assessment so we can understand the priorities and the gaps the end user is bringing us in to fill. That's extremely important. Then there's the data collection. As Jim mentioned as well, a saying in the DOD now is that data is the new gunpowder; data is as big an asset as any line item on a financial statement. So doing that data collection, and continuing it once we're deployed, is extremely important for us to understand the client asset and get a baseline, so we can then move forward and create the prescriptive controls for our assets to patrol a client site.
Logan Selby:
That takes us to the second stage, which is what we mentioned before: automated SOPs with our autonomous systems, and understanding what that is. This slide gives you a six-step process walking you through an example from our DroneSentry standpoint, which is very similar for our DroneDog as well. As that continuous data collection occurs, we're able to further automate missions and change them based on anomalies, alarms and different changes in the environment.
Logan Selby:
I don't think we mentioned it earlier, but our devices are able to integrate with any kind of legacy systems a client might have, and any type of new IoT sensors available on site, and that allows us to automate a lot of these patrols. So as you can see: takeoff, transiting out, performing a mission. As that mission is being performed, we're monitoring it remotely from our SOC, our Robotic Security Operations Center, in Philadelphia. And with the integration with all those systems, whether it be a legacy system or an IoT sensor, we have the ability to immediately respond during an automated patrol.
Logan Selby:
So you have proactive, prescriptive patrols that are based on historical data. However, in real time, if an alarm-triggered event or some type of incident occurs, our vehicles are able to respond and course-correct from the current mission. Then we return to base, what we call RTL, or return to launch: the drone or the robot returns to its station and performs an automated battery swap, so it's ready to go on an additional mission or respond to any type of alarm event or occurrence.
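The patrol-then-interrupt behavior described above can be sketched as a simple loop: fly the prescriptive route, break off if an alarm fires mid-mission, then RTL and swap batteries. Class and method names here are illustrative assumptions, not Asylon's actual API:

```python
# A minimal sketch of prescriptive patrol with real-time course correction.
# All names are hypothetical; this is not Asylon's actual control software.
class PatrolDrone:
    def __init__(self, waypoints, battery=100):
        self.waypoints = waypoints
        self.battery = battery
        self.log = []                            # mission event history

    def fly_to(self, point):
        self.battery -= 10                       # each leg costs battery
        self.log.append(("visit", point))

    def run_patrol(self, alarm_queue):
        for wp in self.waypoints:
            if alarm_queue:                      # real-time event: course-correct
                alarm = alarm_queue.pop(0)
                self.log.append(("respond", alarm))
                break                            # abandon the prescriptive route
            self.fly_to(wp)
        self.return_to_launch()

    def return_to_launch(self):
        self.log.append(("rtl", "station"))
        self.battery = 100                       # automated battery swap
        self.log.append(("battery_swap", self.battery))

drone = PatrolDrone(["gate", "fence_ne", "fence_sw"])
drone.run_patrol(alarm_queue=[])                 # quiet night: full route, then RTL
drone.run_patrol(alarm_queue=[{"sensor": "door-12", "type": "intrusion"}])
```

The second run shows the course-correct case: the alarm preempts the route, and the vehicle still ends each mission docked with a fresh battery.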
Joe Goodings:
So in terms of-
Logan Selby:
Oh, sorry. So as you can see here, this is an example of our DroneDog patrolling, from the POV of the dog, utilizing Chooch AI's computer vision technology, to give you an understanding of what that looks like. So I'll pose another question to Jim. There are tons of different applications you can apply computer vision to. Can you describe some of the things you're looking at, as a security practitioner of 10 years and a corporate security executive?
Jim McDonald:
Yeah, I mean, I think what's really cool about this technology is the ability to really build it around your business. I've been able to see the drone in action with the onboard analytics, and to see the robot dog in action. And for me, it all comes back to data collection. So the previous slide that you showed with the vehicles, it's not just for security; you're talking yard management: where are your high-value assets if you misplace one? Because, of course, that never happens.
Jim McDonald:
With this particular slide, there's the ability to collect vehicle tag information with the LPRs, with the robot dog able to handle that, and the ability to use thermal technology at night. Again, there are not a lot of things that get me excited in the technology field, but this right here is cutting edge. This is a game changer for the industry, there's so much return on investment, and we've really just reached the forefront of this. Give us another three to five years and the sky's the limit.
Jim McDonald:
With this particular slide, instead of a manual process, for example, a guard going to check on a door that might be propped open, because, of course, that never happens: with the availability of the APIs and the data, you can inform not only the SOP but have all these systems talking and integrated with one another. You're taking that variable away from the guard, and again, trust but verify, making sure it's actually done. Because let's be honest, the one thing that keeps me up at night as a security practitioner is these active shooter situations.
Jim McDonald:
And usually, when we go through these unfortunate scenarios after they happen, what we find is that there was a breakdown in security, and it could be something as simple as a door being unlocked, and a person knowing that that door had been unlocked for a certain period of time. So to me, it's about further automating that. And from a more practical standpoint, there's the ability to archive an event. We live in a very litigious society now, and from the safety, compliance and risk management perspective, it's crucial to protect your brand and your company against litigation. What better way than to say, "Hey, here's the video, and here's the analytics. It's doing everything it's supposed to be doing, and here's why we believe we're right about this particular incident."
Joe Goodings:
I just want to elaborate on some of those real-world scenarios. I know you've already discussed some of the major aspects of how computer vision works, Michael, but can you speak a little more about how it plays out in a real-world environment, some considerations and controls, and threshold confidence levels? Really, I'm talking about false alarms. How can you ensure that SOC analysts aren't bombarded by false alarms with this type of technology?
Michael Liou:
Thank you, it's a great question. Look, the real world has rainy days, right? It has low-light conditions, it's got crappy cameras, and it's got fast-moving objects. So not everything is going to be a picture-perfect scenario with great lighting conditions. Within the computer vision world, we need to ensure that we have the right type of gear, properly situated at the right angles and in the right places. Whether that's x number of camera deployments, an upgrade of existing cameras from 720p to 1080p or a higher frame rate, or maybe, for every five visual cameras, one thermal camera for night-vision surveillance operations.
Michael Liou:
But to put it another way, what we're trying to do is take the world of visual data, which is currently unstructured, and put structure into it by identifying all the different objects in the scene that are of interest to folks. Now, to answer your point here: AI in some ways is kind of dumb. We can detect objects, but we need rules on top of these detection models in order to make sense of them. So if I told you I can do weapons detection, and you're monitoring security within a Walmart, and you see people buying guns near the gun counter, alerting on that doesn't make a lot of sense.
Michael Liou:
On the other hand, if we have perimeter security in the parking lot, and you see someone getting out of their car 50 yards away from the entrance of the Walmart with a long gun, then we have an issue. So we need some logic on top of that. In other situations, say you want to ensure that people are wearing hard hats on a construction site for their safety. What happens if someone takes off a hard hat for eight seconds to wipe their brow because it's a hot day and then puts it back on? Should they be terminated? Should their manager get an alert saying there was a safety violation?
Michael Liou:
Or take an example of security in an airport, and you’re detecting unattended luggage. Well, you have to run a couple of different models. You have to make sure you know what luggage looks like. You have to know what a person looks like to make sure it’s attended. And what’s the definition of attended? Is it three inches away? Is it touching it? Is it a foot away? And then when they start walking away, how far away do they need to be before it’s unattended? And if it’s unattended, how long do they need to be away? Is it 10 seconds? Is it 30 seconds? Is it a minute?
Michael Liou:
And then what happens if their friend comes and takes the luggage and brings it to the gate because the owner is stuck in line getting coffee? So there are other considerations, rules and logic that need to be applied on top of AI. And this is what we and our partners do to actually make the AI effective. So in the world of anomaly detection, whether we're detecting open doors or intruders or people loitering, we need to ensure that we impose some additional logic on top of that so that we don't get overwhelmed.
Michael Liou:
If someone fell down, do you need a constant alert saying that someone has fallen down and isn't moving? Hopefully the AI notices that once, an alert goes out, and EMS is sent along the way. Same thing with fire and smoke detection. If someone lights a lighter, that's technically fire, but if it's just someone lighting a cigarette, well, maybe we don't call 911 for that. Or, as another example, if there's a small fire and it's snuffed out by someone with a blanket in 15 seconds, maybe we don't have to call the authorities right away.
Michael Liou:
So again, I do emphasize that we need to have some logic on there to prevent a lot of spurious data from overwhelming our analysts. As far as your last question regarding accuracy rates: here at Chooch AI, as I mentioned earlier, we develop models quite rapidly, but we do not release them unless they have at least 90% accuracy. Otherwise, we don't feel confident that they're going to be effective in the real world.
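The kind of rule layering Michael describes, for instance the unattended-luggage thresholds, can be sketched as a small state machine sitting on top of raw detections. Everything below (the labels, the 3-meter and 30-second thresholds, the detection format) is hypothetical and for illustration only, not Chooch AI's actual implementation:

```python
from dataclasses import dataclass

# Hypothetical thresholds -- real deployments tune these per site.
UNATTENDED_DISTANCE_M = 3.0   # owner farther than this counts as "away"
UNATTENDED_SECONDS = 30.0     # how long the owner must be away before alerting

@dataclass
class Detection:
    label: str
    x: float
    y: float

def nearest_person_distance(bag: Detection, people: list[Detection]) -> float:
    """Distance from a bag to the closest detected person (inf if none)."""
    return min(
        (((p.x - bag.x) ** 2 + (p.y - bag.y) ** 2) ** 0.5 for p in people),
        default=float("inf"),
    )

class UnattendedBagRule:
    """Raise an alert only after a bag has had no person nearby for long enough."""

    def __init__(self):
        self.away_since = None  # timestamp when the bag first became unattended

    def update(self, t: float, bag: Detection, people: list[Detection]) -> bool:
        if nearest_person_distance(bag, people) <= UNATTENDED_DISTANCE_M:
            self.away_since = None  # someone (owner or friend) is attending it
            return False
        if self.away_since is None:
            self.away_since = t
        return t - self.away_since >= UNATTENDED_SECONDS

rule = UnattendedBagRule()
bag = Detection("luggage", 10.0, 10.0)
owner = Detection("person", 10.5, 10.0)
print(rule.update(0.0, bag, [owner]))   # attended -> False
print(rule.update(5.0, bag, []))        # owner just walked away -> False
print(rule.update(40.0, bag, []))       # away for 35 s -> True
```

Note how the friend-picks-up-the-bag case falls out naturally: any nearby person resets the timer, which is exactly the sort of logic that keeps the analyst from drowning in spurious alerts.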
Joe Goodings:
Great. So the last part is: all this time you're detecting things, running missions and routes, but once you put it all together and receive the alert, obviously there's some human interaction to proactively respond. So can you take turns going through this last step of what happens after an alert is received?
Logan Selby:
Sure, I can take that. So after an alert is received, from a SOC analyst's perspective, depending on prioritization, we have protocols in place, whether those are automated protocols or protocols that the analysts follow themselves. That could be something like a POC script that allows them to understand what the next step is to take. So we're taking the guesswork out of the equation for the analyst as well. An alert comes in, and a POC script is generated to tell them what to do next.
Logan Selby:
So whether a certain threshold means contacting local law enforcement, or a certain threshold tells them to contact a call tree for the client they're overwatching, it really just becomes a decision tree for the analyst once that alert comes in. That's why creating more efficiencies and reducing the amount of overabundant alarms for the analyst is extremely important to the problem we're solving.
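The scripted decision tree Logan describes, where an incoming alert maps to a prescribed next action, can be illustrated with a simple lookup table. The alert types, priorities, and actions below are hypothetical examples, not Asylon's actual protocols:

```python
# Hypothetical alert-routing table -- real SOCs tailor these per client and site.
PLAYBOOK = {
    ("weapon", "high"): "Contact local law enforcement immediately",
    ("intruder", "high"): "Dispatch drone/robot for eyes-on, then call client call tree",
    ("intruder", "low"): "Issue voice challenge over IP speaker",
    ("open_door", "low"): "Log event and notify site contact on next shift",
}

def next_step(alert_type: str, priority: str) -> str:
    """Return the scripted next action for an analyst, so nothing is guesswork."""
    return PLAYBOOK.get((alert_type, priority), "Escalate to SOC supervisor")

print(next_step("weapon", "high"))       # Contact local law enforcement immediately
print(next_step("badge_clone", "high"))  # unknown type falls back to escalation
```

The fallback entry matters: an alert type the playbook doesn't recognize should still land on a human supervisor rather than being silently dropped.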
Jim McDonald:
Yeah, and just to add to that, every market is unique in itself. So obviously Fontana, California and Dallas, Texas are probably going to be different from Dothan, Alabama. The good news is this is all customizable. What is really good about this is the fact that you are providing real-time data to your SOC analysts so they can make the best possible decision, as opposed to the old-school approach of 10 years ago, which was just staring at a computer screen for 20 minutes until you've checked out.
Jim McDonald:
So I think that's where the return on investment is: tweaking it to be market specific. For example, in my previous job, for me the goal wasn't always to put the bad guys in jail; the goal was to protect the employees. And so if it meant that the analyst finds a true identifiable threat and then uses an IP speaker system to issue voice challenges to get them off the property, to me that's still a win, because trust me, the bad guys are letting their friends know that x, y, z place is not the place to be trying to steal stuff. So I think that's what makes it so unique: it's customizable.
Joe Goodings:
Right, Michael, anything to add?
Michael Liou:
No, I mean, if you think about a guard's role, you think about surveillance and enforcement. When you put in this technology, you're kind of shifting more into the surveillance mode. There are some tasks that AI and robots really can't perform, but at least from a surveillance perspective, you lighten that load a little bit and make that traditional aspect of a [inaudible 00:41:54] a lot more automated, a lot more efficient, and much more scalable too. If you have some really good practices that have now implemented computer vision models, you can scale that across thousands of cameras and get a degree of consistency that you may not get with a very varied guard force.
Joe Goodings:
Thanks for that. As the Senior Director of Standards, I'd be remiss if I didn't talk about how you put this together technically. I know there are probably some clear standards here, so how do we put these technologies together effectively and have them work the way they're supposed to?
Michael Liou:
I'll kick it off, Joe. Really briefly: when we detect an anomaly or an object of interest, that data is accessible as a JSON and a JPEG file. And that's easily accessible via API or an MQTT broker, or we can just send out an email or text alert, so the integration is very flexible. All of our partners have found it actually quite easy to integrate. Sorry Jim, go ahead.
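A downstream consumer of the JSON alerts Michael mentions might triage them along these lines. The payload fields and label names here are assumptions for illustration; the actual Chooch AI schema may differ:

```python
import json

# Hypothetical payload shape -- field names are illustrative, not an actual vendor schema.
sample_payload = """
{
  "camera_id": "lot-cam-07",
  "timestamp": "2022-03-15T14:02:11Z",
  "detections": [
    {"label": "person", "confidence": 0.97},
    {"label": "long_gun", "confidence": 0.91}
  ],
  "frame": "frame_000123.jpg"
}
"""

# Labels that should page an analyst immediately (assumed set, tuned per site).
CRITICAL_LABELS = {"long_gun", "handgun", "fire"}

def triage(payload: str, min_confidence: float = 0.90) -> list[str]:
    """Pull out high-confidence critical detections from one alert message."""
    event = json.loads(payload)
    return [
        d["label"]
        for d in event["detections"]
        if d["label"] in CRITICAL_LABELS and d["confidence"] >= min_confidence
    ]

print(triage(sample_payload))  # ['long_gun']
```

Whether the message arrives over an HTTP API or from an MQTT broker, the parsing step looks the same, which is part of why integrators find this pattern easy to wire up.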
Jim McDonald:
No, you're fine. So the thing I think is most important here, if you're a security manager, a director or a VP, is being able to understand that Rome wasn't built in a day. In my previous job, it took me close to nine years to get us up to the point of the type of technology that we're talking about today. A lot of that starts with funding, and just because your C-suite says no one day doesn't mean they're not going to say yes the next.
Jim McDonald:
And it's being able to create that business case: it's not just that we want some toys to play with here; it's that we really want to find a return on investment, drive down overall cost, and provide a better product to both our employees and our customers. So I think it starts with that, and being able to talk the talk. And again, as I mentioned earlier, being able to go and show the business case and how the numbers make sense.
Jim McDonald:
Because even with the case in point that I gave about the Memphis market earlier, yeah, $500,000 seems like a lot of money. But after you've taken on two massive claims and you're paying $7,500 a week, it's really a no-brainer what the decision needs to be. So for me, this is an easy sell. It's just planning it out strategically, understanding that it's not going to happen overnight, and realizing that it's not just one stakeholder that benefits. For example, you can use the drones for yard management on top of looking for security and safety exceptions. So it's really getting the buy-in of all your constituents, if you will, and making that business case.
Logan Selby:
And just to echo those two statements: as far as Asylon goes, our platforms are open as well, so we're accepting of other technology platforms, and we can integrate with a lot of legacy platforms. And to echo Jim's statement, the culture side of things, I can understand from experience, can be extremely challenging as well. So getting that buy-in is extremely important, especially when you're dealing with drones and robotics. Putting robots on a site can be intimidating to not only the security practitioners but the staff in general. So having that narrative, and being able to explain to folks what the mission is and what they're being utilized for, is extremely important as well to get the entirety of the buy-in from the organization.
Joe Goodings:
So let's take some time now and do a second poll. In addition to security, which areas do you see drones and robotics with computer vision providing significant value in? And Jim, you had mentioned using the drone for yard management as one example. Does anything else come to mind for how you see this going?
Jim McDonald:
I mean, the main thing that comes to mind is that this is how you make your business case; this is how you get your funding, by involving other verticals of the business or other stakeholders. So for me, it's an easy answer: it's all of the above. Because again, technology does not sleep. It's constantly looking for exceptions. There's no other type of product right now that is going to give you the type of coverage you're going to get with AI and self-learning analytics.
Joe Goodings:
Yeah, and it really does seem that the audience agrees with you on all of the above. So let's shift gears a little bit. Let's do a bit of a lightning round on the security workforce of the future. I'll go around, and it'll also give the audience some time to submit more questions. When you look ahead at the security workforce of the future, who is it? What are they doing? When are they doing it? Where? Why? Thirty seconds each. So Logan, how about you?
Logan Selby:
Sure. So when it comes to the robotics industry in general, people see robots as replacements for humans, and I definitely don't see that as the case. I feel this is adding opportunity for current security practitioners to take on a different role in the administration aspect of drones and robotics, or the monitoring aspect of drones and robotics. So I see the security workforce of the future as individuals in the workforce now who want to gain additional skills and be able to manage and apply these assets.
Joe Goodings:
Michael, how about you?
Michael Liou:
I kind of concur with Logan's comments. I think it's going to be a happy coexistence. There are some things that robots and computer vision can do just better and faster than humans. If there are 300 people in the crowd, by the time you count to four, I've already got all 300 counted, of which 60% are women and 30% are wearing red T-shirts. I think there are some functions you should leave to the technology. But there are other, more sophisticated functions of SOC personnel that, frankly, I don't really foresee robotics or computer vision overtaking anytime soon. They're just too complex at the current state of technology. So I do see a nice complement of both the technology and the best of what the traditional SOC analysts and guard force can provide.
Joe Goodings:
And Jim, what does your team have to look forward to?
Jim McDonald:
I mean, just to keep it stupid simple: I don't know if it's the logistics space as a whole, but anytime you're comfortable, that's when you're not growing. And so it's easy to use the old "this is the way we've done it for 20 years, so why should we change" message. For me, when I came into this space from law enforcement, I made the decision that I was going to be a change agent and that I was going to take calculated risks to better enhance our security platform. Because for me, at the end of the day, everybody that comes into work, I want them to go home safe and secure.
Jim McDonald:
And so although the guards do serve a purpose, for me the best product and application for what we want to do here now at CEVA is to use cutting-edge technology, and to be frank, our customers are asking for this more and more. So I'd rather be on the forefront, with our customers feeling like they're getting a best-in-class security offering and that we are doing everything possible to mitigate risk and protect the brand of both CEVA and our customers.
Jim McDonald:
And just to add on to something Michael said, and Mike, I'm kind of curious what you think about this, but I really think this type of technology is going to significantly reduce risks when it comes to active shooters. It might be a little early, but I really think the layered approach, along with this type of technology specifically being able to point out weapons and items that can hurt people, is going to be a game changer, I really do. What do you think?
Michael Liou:
I actually do agree with that point, Jim. I mean, think about the ability of a high-definition camera to detect objects at a distance that perhaps can't be resolved by the human eye, or to watch a large field of view where it might take three or four or five people constantly looking without blinking. Thirty seconds is a long time. Thirty seconds is a long time for an EMS response. And if I can detect one person coming out with a handgun in a parking lot versus two people with a long gun and body armor, you're going to have different types of responses, and you're going to have different scenarios for casualty rates, potentially, as well. Which means you need to get the right type of enforcement out there ASAP.
Michael Liou:
If you now couple that with either an aerial or automated deployment of video via robot dog or drone, it's going to make that whole coverage even more effective. So I do agree. And I know 30 seconds may not sound like a lot, but in these types of scenarios, every second does count, especially when people's lives are at stake.
Jim McDonald:
Absolutely.
Logan Selby:
And to protect the first responders too. You're lengthening the chain of events. If a drone responds or a dog responds with the ability for two-way audio, that's going to be a distraction or deterrent for that person, and maybe make them think twice about committing that violent act as well. So from the first responder aspect, you're reducing the potential for a violent act occurring.
Joe Goodings:
I think we have some time for Q&A. I have one that's really focused on false alarms, so let me try to condense it a little. A CCTV camera with a certain algorithm is only seeing what it's seeing; otherwise, too many false alerts occur. So no AI system is 100% foolproof, no matter how good the algorithm. Environments change globally, and all considerations and simulations need to be run, which never happens. So what are your thoughts on that in terms of those simulations? And what's done to reduce those false alarms?
Michael Liou:
Well, that's a good point. And it's better to have false alarms than the ones that you totally miss, especially in mission-critical situations. To have a false alarm, or a false positive if you will, and then have someone just double-check, that's not the worst-case scenario: okay, nothing really bad happened, right? What would be worse is if you had a false negative and you did miss something, and the door was open, or someone did have a gun. So you probably want to err more on the false positive side than the false negative side.
Michael Liou:
At Chooch AI, we try to take as much real-world data as is available. And we use a fair number of interesting techniques to actually simulate edge cases and increase the variety and diversity of the data set. That means we'll grayscale our data set, we blanch images, we cut them out, we reverse images; we try to create a multitude of different images to actually help build that model.
Michael Liou:
We then potentially augment that with real-world data, and then test it on real-world data. So once a model is built out, we'll actually run it using a crappy 720p CCTV camera and see how it goes. And if we're getting a multitude of false positives and negatives, we take that information, feed it back into the data set, and actually try to improve the model. And whoever asked that question is correct: it is not going to be 100%. But neither are humans. Humans make mistakes all the time as well.
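The augmentation techniques Michael lists (grayscaling, cutouts, mirrored images) can be sketched in a few lines on a toy image. This is illustrative only; a production pipeline would use an augmentation library rather than hand-rolled loops:

```python
# Toy augmentation pass over a tiny grayscale "image" (nested lists of 0-255 ints).
# A sketch of the idea, not any vendor's actual pipeline.

def horizontal_flip(img):
    """Mirror the image left-to-right (the 'reverse image' augmentation)."""
    return [row[::-1] for row in img]

def cutout(img, top, left, size, fill=0):
    """Blank out a square patch so the model learns to handle occlusion."""
    out = [row[:] for row in img]
    for r in range(top, min(top + size, len(out))):
        for c in range(left, min(left + size, len(out[0]))):
            out[r][c] = fill
    return out

def augment(img):
    """Generate extra training variants from one source image."""
    return [horizontal_flip(img), cutout(img, 0, 0, 2)]

img = [
    [10, 20, 30],
    [40, 50, 60],
    [70, 80, 90],
]
flipped, cut = augment(img)
print(flipped[0])  # [30, 20, 10]
print(cut[0])      # [0, 0, 30]
```

Each variant keeps the original label, so one captured frame yields several training examples, which is how a model trained on limited footage can still hold up against rainy days and low-quality cameras.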
Joe Goodings:
Right. And here’s one probably more targeted to Logan, but we would like everyone’s opinion too. What does the group think about the emergence of airspace security, drone detection and mitigation and its impact on SOC operations?
Logan Selby:
Sure, that's a question we hear regularly at Asylon. It's definitely a threat. As you can see in recent news overseas, it's becoming a larger and larger issue, and it's probably only a matter of time before it becomes a larger issue in the US. I mean, there are airspace incursion events that happen on a regular basis with small private UAS systems. So I would say yes, it is an issue. We at Asylon are currently partnering with other organizations that are looking at combating those issues along with our platform. So it is something we hear about on a regular basis, but I'll open it up to the other gentlemen for their thoughts.
Jim McDonald:
Yeah, so for me, I think there's obviously inherent risk with anything that you do. A lot of it is just going to be governmental control, partnering, as Logan suggested, networking, all of the above. I just think that, as it stands right now, this type of technology is not going away. So the governance part of it is what needs to be worked on. But I would make the case that here in the US it's a lot different compared to overseas with the type of problems that they're having with drones.
Jim McDonald:
So for me, I think it's just inevitable. With the Amazons of the world doing deliveries and Domino's Pizza using self-driving vehicles, kind of like how we laugh when we watch The Jetsons, what you're seeing is the world transforming into that.
Joe Goodings:
We have time for a quick one, and it's kind of related. How do the drone-flying regulatory requirements at the local, state and federal levels, and the regulatory headwinds, impact the feasibility and viability of the related AI technologies when you combine them?
Logan Selby:
So we're combating that on a daily basis. We're currently applying for our beyond visual line of sight waivers with the FAA, actually submitting our first waivers in the next couple of weeks. It's a continuous process. We have a pretty strong partnership with the FAA; a lot of our advisory board members come directly from the FAA, so we have a direct partnership right now to work through the regulations. We do have safety pilots that reside on site, but we still control all of our flight operations remotely from Philadelphia. And we have a Part 107 FAA certificate holder on every site that we're deployed to currently.
Joe Goodings:
And Michael, your platform is kind of dispatch agnostic, correct?
Michael Liou:
It is. So if we ingest video, whether it's a real-time stream from a drone, or it's downloaded, or we wait for the drone to dock and then download the video, it's all somewhat similar. When it comes to exportation of geospatial AI capability outside the US, that is certainly regulated. And of course, anytime you talk about AI in surveillance, the privacy issue always comes up, which is a whole separate topic. But these are things that we're certainly sensitive to here at Chooch AI.
Joe Goodings:
Thank you so much, Michael; thank you so much, Logan; thank you so much, Jim. I think we have to leave it there today. Here's the contact information for anyone who has questions for them directly. Once again, thank you so much for the engaging audience participation. We really do appreciate that, and we'll reach out to answer any questions that were missed. And thank you once again to our sponsors, Asylon and Chooch AI, for making this event happen. That concludes today's presentation. Everyone be well, and see you next time.