Scott Keplinger is the CEO of Intellisee, the real-time artificial intelligence (AI) risk mitigation platform working to detect and mitigate threats before an incident occurs by using a company’s existing passive surveillance cameras.
Scott shares with me how his passion for businesses that solve real needs has informed his work, his thoughts on how AI will change our lives the same way cell phones did, and how his team is working to use technology to create a safer world. Scott also discusses a number of real use cases of how their company uses AI to save other companies money and reduce risk, how their unique partnership with the University of Iowa will continue to grow, and how his team is setting ethical boundaries to make sure that the Terminator film never becomes reality.
Sponsored by MidWestOne Bank, this is the latest edition of the CBJ’s new podcast feature with Nate Kaeding and notable Iowa business and cultural leaders, available first to CBJ members. Listen to this episode below, and subscribe on Spotify, iTunes, Google Play or Stitcher.
NATE KAEDING: Artificial Intelligence is a hot topic right now, but let’s start with your personal story. Can you tell us a bit about how you got started in this line of work?
Scott Keplinger: We’re doing some really amazing things, and the future’s incredibly bright for some of these emerging technologies. I’m actually an Iowa native, so I was born and raised here. Like a lot of people, I moved out of the state for about a decade, then started having kids and moved back. Even in corporate roles where I was ultimately managing large teams, I’ve always had an entrepreneurial spirit. I think part of that is because I’m highly analytical, but also creative.
My business partner happens to be my neighbor and a very good friend of mine. We actually started The Asymmetria Group a decade ago after successfully doing the same kind of things for ourselves as investors. We saw the opportunity to turn that into a business that then gave us plenty of opportunity to see all kinds of deals, ranging from venture capital to distressed debt and other more traditional investments.
But what’s fascinating to me is the structure of businesses. Are they solving a real need? If they’re solving a real need and then executing against that, then you have a tremendous opportunity. I feel that we’re doing that here. We’re kind of running that same playbook by asking ourselves what problems we’re solving and how we’re solving them. The rest are a lot of Harvard Business Review strategies implemented with a lot of blood, sweat, and tears.
What is Intellisee, and what problem are you trying to solve?
So the company’s true name is Malum Terminus Technologies, and the name comes from a Department of Defense project done with the University of Iowa. It’s Latin for “Stopping Bad.” Every organization faces all kinds of risks and threats. We’re in the safety and risk industry, and that’s actually our mission. Our mission is to create a safer world. We have the audacity to think that we can create a safer world. We’re using amazing technology to do that.
The catastrophic — I mean, you cannot read the news today without seeing the latest shooting. It’s heartbreaking. That’s what motivates us. It keeps us awake at night, but also gets us up in the morning. But those aren’t the only things that organizations face. In fact, the odds are almost zero that there’ll be a shooting in an organization. What every organization faces are things like trespassing, slips and falls, solo workers, vehicles not being where they’re supposed to be, all those normal day-to-day things. Those are the problems we’re trying to solve at the high level.
Everybody has surveillance cameras today, but nobody’s using them. They’re using them for after-the-fact forensics or video capture. We’re trying to flip that on its head and actually turn them into proactive risk mitigation tools. We’re automating the active monitoring of live surveillance through our technology.
What are some examples of this for retail?
If I’m a convenience store operator, I’m facing all kinds of challenges. At the top of that list is labor. It’s very hard to attract and retain staff. In many cases, convenience retailers are open late into the evening and very early in the morning. In some of those cases, they have solo workers. That is one of the most dangerous issues that every organization faces. It’s called risk exposure, and it’s increasing. What happens if the solo worker abandons their post? If you have a camera facing that area, we can alert you to say, “Hey, where is Joe? Where’s the person? No one has been behind the cash register for 10 minutes. Something’s going on.”
Another use case is what happens when somebody falls, and that is the most common incident. We have a lot of relationships with insurance carriers as well as insurance brokers. They like us because if we prevent just one of those, that saves them $50,000 right there, and if you’re a self-insured organization, that comes right out of your bottom line. Can you prevent the fall by addressing the cause of the fall first? So we can detect slip risks. Is there a spill on the ground? Is there something that somebody would fall on? We can also detect falls, and people fall for a variety of reasons. If the fall happened, whether it’s because they slipped or tripped or had a medical issue, you need to get help there immediately. That’s what we call mitigating the severity of the claim.
We have horror stories. We have several of the largest school districts using our system, but one (local district) shared with us that they had a solo janitor in an elementary school who had a medical issue, collapsed on the ground, and lay there for eight hours. What happens when you can’t get help yourself? Forgive me if I get emotional today, but again, it drives us. We can literally save lives. The University of Iowa had that unfortunate situation during the polar vortex in 2019 where a student froze to death. We could have saved that kid’s life.
You mentioned to me that you had a story about your partnership with the University of Iowa.
Yes. I consider us a part of the university, and they’ve been tremendous partners. They were our first beta site. The very first alert we sent out was a trespass situation at Kinnick. For the Big Ten, they have to do a security sweep before every game, and those security sweeps would occur on Wednesdays. Then they would staff with seven overtime guards up until game day. So once we put Intellisee in, not only did they get 24/7/365 coverage, they were also able to reduce the number of guards to one, and that savings alone was over $150,000 a year. That trespasser was caught at 2 a.m. on an off-season Monday. So that’s an example of the cheesy line I like to use: Safe organizations save. We have probably 50 different examples of how the university has benefited from our services.
Can you share a bit about the origin story of Intellisee?
Yeah. So as I mentioned, all of us are very concerned about the school shootings, and that was ultimately the inspiration for this. When you look at what the University of Iowa is doing with K-12 education and mental health, there are tremendous synergies there. Some brilliant folks got together and said, “OK, what can we do about this?” Then they reached out to some business folks like us and said, “Hey, we’ve got this idea. What do you guys think?” And we were so enthralled with the idea that we decided to seed the firm. We also said, “Hey, we’re going to put professional management in,” and that’s very important because each of these amazing universities is an engine for research and insight and breakthrough, but as an investor, an idea is worthless if you don’t have a way to execute it.
Then, in 2020, we built a proof of concept, and the technology that we built is fantastic. We then, in 2021, did live betas. Then, in 2022 we started commercializing and we actually built out a distribution network. We’ve secured additional capital. You may have seen the Iowa Economic Development Authority is a big fan of ours, so they’re tremendous partners. We were just awarded half a million dollars there because they understand our mission and like what we’re doing as well.
Where are you at in your startup journey?
Like a lot of startups, we’re now emerging into what’s exciting — the growth phase. When you look at startups overall, they’re fraught with risk because you’re trying new things. We’ve gotten through probably the riskiest period, because you have to spend a lot of money to actually create a technology that can be shown, like a proof of concept. Well, all of those are sunk costs because you’re not going to get a single dollar of revenue until it’s actually implemented and people are excited enough to actually pay you.
We’re in a bridge round right now because, for those in your audience who are familiar, the VC markets softened significantly. Last summer in our space, we were going to get a premium on our valuation because we’re in AI. Our business model is software as a service (SaaS), which translates to annual recurring revenue, where you get premiums on those as well. So we’re actually pushing off a Series A probably until 2024 or 2025, when we can get that valuation as high as possible based on annual recurring revenue. The other thing in our business will be renewal rates. We’re at 100% right now, which is fantastic, but that’s based on an extremely low client base.
How many clients do you have right now?
We have just over 20 installations right now, and these installations range from very large organizations to small ones. Everybody has surveillance cameras now. Fifteen years ago, that might not have been the case. So our product is ubiquitous. Everybody should use it — everybody needs it — which is fantastic for addressable market but horrible for focus.
So what we’ve done is two things. One is we built out a distribution network of systems integrators. There’s a whole industry around selling the fire alarms, the access controls, even the cameras and the systems themselves. This channel actually makes them all talk together — that’s why they’re called systems integrators. That’s how a jail came to use our system. We weren’t going out and targeting jails, but one of our distributors had a relationship and sold it into a jail.
Our first priorities are really around education. Several hospitals are using our system, and then we have an emerging segment of retail right behind that because we have a lot of relationships with insurance. That’s the culture we’re trying to drive. On one side, you’ve got those systems integrators talking with the head of safety, the IT groups, et cetera. On the other side, we’re coming in through risk where we have a growing network of insurance brokers.
What are your thoughts on the popularity of artificial intelligence right now, and what do you think the future of AI looks like?
Well, first and foremost, from my perspective, like anything, think of it as fire. Fire can heat your home, cook your food, or burn down your home. All of these amazing, powerful tools create a better world when you apply them in the right ways. What’s fascinating about artificial intelligence is that it’s learning, and there are all kinds of versions of this. The most advanced is what’s called deep learning, and that’s what we do. That’s the ChatGPT kind of stuff.
What I do want to remind folks of is that the math has been there for quite a while. The screens you and I are looking at right now are millions of pixels. But the processing power to actually analyze all those in real time is what has caught up.
On one end of the spectrum is what I’d call “rules-based” or “machine” learning. If this happens, then do this. On the other end of the spectrum is this deep learning where instead you’re saying, “Here’s a bunch of stuff. Now, in some of this data, we’re going to tell you what it is. So that’s a cell phone, that’s a handgun, that’s a spill on the ground, right? That’s a car. Now this other big data set, we’re not going to tell you that. You’re going to figure it out.” So you’re training the AI against a validation data set. Then what it does is checks, “Hey, did I get it right? Oh, I didn’t get it right. OK, let me try again.” It does that over and over and over again until it gets it right. It continues to get better. The cheesy line I use is “It’s not like a car engine where the more you drive a car engine, the more likely it’s going to break down.” With an AI engine, particularly ours, the more you drive it, the smarter and better it gets.
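The loop Scott describes — guess, check against labeled data, adjust, repeat — can be sketched in a few lines of Python. This is a deliberately tiny, hypothetical illustration of supervised training in general (a one-parameter model learning a decision threshold from labeled examples), not Intellisee’s actual deep-learning system; the data, model, and learning rate here are all invented for the example.

```python
import random

# Toy labeled data: (feature, label) pairs, where the label is 1 when the
# feature exceeds 0.5. In a real system the "features" would be camera
# frames and the labels annotations like "handgun" or "spill on the ground".
random.seed(0)
data = [(x, 1 if x > 0.5 else 0) for x in (random.random() for _ in range(200))]
train, validation = data[:150], data[150:]

def accuracy(threshold, dataset):
    """Fraction of examples the one-parameter model classifies correctly."""
    correct = sum(1 for x, y in dataset if (1 if x > threshold else 0) == y)
    return correct / len(dataset)

# One-parameter "model": predict 1 when the feature exceeds a learned
# threshold. Training: guess, check, nudge the parameter, repeat.
threshold = 0.0
learning_rate = 0.01
for epoch in range(50):
    for x, y in train:
        prediction = 1 if x > threshold else 0
        # Wrong guess? Move the threshold slightly in the correcting direction.
        threshold += learning_rate * (prediction - y)

print(f"learned threshold: {threshold:.2f}")
print(f"validation accuracy: {accuracy(threshold, validation):.2f}")
```

After enough passes the threshold settles near 0.5, and accuracy on the held-out validation set — data the model was never trained on — is how you confirm it actually learned rather than memorized, which is the “did I get it right?” check Scott mentions.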
What do you say to those who are concerned about the threat behind AI?
So I was enthralled with the first Terminator movie, but that’s not going to happen. You want to make sure you have the proper guidelines, but as I mentioned, it’s like fire, where we want to use it to heat our house. That’s where the ethical boundaries come in. As an example, we have chosen not to do facial recognition because of the ethical concerns, but also the pragmatic concerns. You don’t need it for what we’re doing. Secondly, there are regulatory issues, like in schools, which in essence come down to privacy around the students. So our approach is that you don’t need facial recognition, and we’re not going to do facial recognition. The horror stories are real out there if it’s used inappropriately and not regulated.
Now, there are ethical uses of facial recognition as an example. Some schools, if you walk in, you can actually put your face by a camera for the door to unlock it because it recognizes you as an employee or there’s a database behind it that says, “I’m sorry, but that guy is on the bad list. Do not let that person in.” To me, that’s an ethical use of it.
How do you define success?
I think we’re already successful. I personally think I’m already successful because we took the risk. We have the audacity to think we can make the world safer, but that takes risk. That takes jumping off the cliff and not knowing what’s at the bottom. It’s not the jumping but the landing that hurts you. Every day is elation down to stress and depression. It is that incredible rollercoaster ride, but my goodness, what a ride. To me, that’s personal success.