
AI has made personal relationships in business more important than ever

Aug. 6, 2024
In this new world of AI and deepfakes, personal relationships with customers are essential.

We have always viewed personal relationships as foundational to our customers' success. Building these relationships takes time, but it's well worth the effort, especially as globalization and remote work make in-person meetings more difficult. Building and maintaining those relationships also builds trust at a time when trust has become an increasingly rare commodity (and AI plays a role in this).

There’s no question that AI has been tremendously helpful to businesses in areas like improving customer experience, developing creative marketing, and creating training and simulation videos. But as with any technology, bad actors find ways to turn it to their advantage, and AI is no exception. What used to be possible only in “Mission: Impossible” movies is now being used by cybercriminals to gain access to your data and your bank accounts.

The danger of deepfakes

Do you really know who you’re talking to? Is it who you think it is, or is this a deepfake? A LinkedIn article from early this year describes deepfakes as “manipulated videos and audios created using advanced artificial intelligence algorithms.”

What makes this so dangerous is that you don’t have to be a computer expert to accomplish this. As an article in Security Intelligence noted, “Tools allowing the creation of deepfakes are cheaper and more accessible than ever, giving even users with no technical know-how the chance to engineer sophisticated AI-fueled fraud campaigns.”

Voice cloning can replicate a person’s voice down to its tone, pronunciation, and even intonation. All it takes is three to 30 seconds of audio of the person being copied, and a bad actor can then clone that voice to fool others. The voice can be captured from videos people post to social media, from voicemail messages, or even from an actual phone call that the bad actor records.

Voice cloning itself is not a crime; in fact, a quick Google search shows it’s a growing business with tools that anyone can learn to use. One CBS News reporter was able to clone a voice in just a few simple steps. A cybercriminal can then simply type in whatever they want the AI-generated voice to say and, voila, you’ve been fooled.

See also: Clark: AI offers both benefits and challenges to fleets

You might think that having face-to-face conversations with someone on Teams, Zoom, or FaceTime would keep you safe ... and you’d be wrong. Fraudsters have become adept at face-swap attacks, using AI tools and apps to alter a person’s face or body to make them look like someone else.

Early this year, an employee of a multinational company thought he was on a call with his CFO and other people in the company. In fact, he was talking to criminals and ended up transferring $25 million to multiple accounts they controlled.

How do you protect your company from deepfakes?

As technology keeps advancing, there are certain steps you can take. Obviously, the most important is training your employees to watch for things that seem odd.

For instance, a request for an emergency transfer of funds, or for a password that should never be given out, should raise caution. Multi-factor authentication, such as one-time passwords, is another step worth considering.
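To make the one-time-password idea concrete, here is a minimal sketch of how time-based one-time passwords (the TOTP scheme standardized in RFC 6238, used by most authenticator apps) are computed, using only the Python standard library. The secret and parameters shown are illustrative, not from any particular product.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1).

    secret_b32: the shared secret, Base32-encoded (as shown in QR setup codes).
    for_time:   Unix timestamp to compute the code for (defaults to now).
    """
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)  # counter as 8-byte big-endian integer
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Example with the RFC 6238 test secret ("12345678901234567890" in Base32):
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))  # prints "94287082"
```

Because the code changes every 30 seconds and depends on a secret the attacker never sees, a cloned voice or swapped face alone is not enough to authorize a transaction protected this way.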

There are monitoring and detection tools that can analyze video and audio content in real time, and companies need to have strict verification processes in place, especially when it comes to financial transactions or access to sensitive data.

Even the best and most security-minded individuals can be fooled, especially as technology advances.

This is where personal relationships can make a difference

When you develop a relationship with a customer, it usually goes beyond business alone. Over time, you get to know a customer’s likes and dislikes, their favorite teams, what’s going on with their family, and so much more. And they do the same with you.

It’s true that a criminal can find out that same information through Facebook, Instagram, LinkedIn, etc. But if your customer really knows you, a request that seems to come out of left field will more likely be met with skepticism and lead to further questions that a criminal may not be able to answer.

The reality is that there’s no silver bullet for this problem. Companies will need to keep developing new ways to combat attacks that will continue to come. A personal business relationship won’t catch every deepfake, but it is an additional line of defense.

About the Author

Brad Hewitt

Brad Hewitt is VP of national/strategic accounts for Corcentric Capital Equipment. He previously held management positions with Ryder Truck Systems, Penske Truck Leasing, and Element Fleet Management. Brad holds a Certified Transportation Professional designation from the National Private Truck Council.
