Techno-Crime Institute

Driving and Inspiring The Evolution of Investigations


Could Deepfake Technology Fool You?

September 21, 2020 by Walt Manning


Every day, most of us use multiple screens to get our news, keep in touch with other people, learn something new, or be entertained.

How many screens do you look at during a day, and for how long?

Now for the critical question:

How much of the information that you see on these screens can you believe?

You’ve probably heard about deepfakes or “fake news” recently. Still, have you seen or heard what this technology can actually do?

Imagine viewing a video posted online of a Fortune 100 company’s CEO announcing significant layoffs or a disastrous financial report.

What would that do to the employees’ morale and productivity?

If it were a publicly traded company, what would happen to the stock price?

For a different example, think about the potential reaction if a video of Iran’s leader threatening immediate nuclear war were published on the Internet.

What could the possible outcomes be?

Later, you discover that both of these examples were fakes using artificial intelligence and that the videos’ messages weren’t real.

How much damage could be done from the moment when the video is published until it’s found to be fake?

Now let me give you another example.

Imagine a scenario where an employee receives a call from her boss or even the CFO.

She recognizes the voice, which gives her detailed wiring instructions and tells her to immediately wire funds to an important client.

Except the caller isn’t actually the person they claim to be, and by the time the truth is discovered, the funds have disappeared.

All of these examples are possible today with what is known as deepfake technology.

What Is It?

The data you see on screens every day is already manipulated more than you know, but new technologies have taken the potential for crime and even more dangerous uses to a new level.

Some of the first deepfake videos were pornography, in which a celebrity’s face replaced the original actor’s.

There have been several deepfake videos of politicians, and I’ll show you some examples.

There are many new ways that this technology can be integrated with artificial intelligence to do other things. One example is to create a digital clone of your voice and then edit “your” voice to make it say almost anything.

Artificial intelligence is being used to create human-like digital “people.” This technology can simulate customer service agents, teachers, or digital spokespersons.

Several of these systems can use your device’s camera and microphone to interact with you directly and change their response based on your facial expression, voice tone, and soon, perhaps even your emotions.

But showing you what’s already possible is better than telling you, so let’s get started.

To watch the full video for each example below, just click on the image.

Face2Face


Face2Face is a research program from the Technical University of Munich. Using only a consumer webcam, researchers can manipulate the facial expressions of a target speaking in a YouTube video. The software then renders a newly synthesized face onto the target and displays the altered video.

The lab is also working on algorithms to help detect fake videos.

You can find more information on the lab’s website at https://www.tum.de/nc/en/about-tum/news/press-releases/details/35502/.

Talking Head Models from Still Photos


Other research, published on the arXiv preprint server, uses artificial intelligence to create realistic “talking heads” from still photos. The video demonstrates several examples of videos generated from a variety of images, and even brings the Mona Lisa painting “to life.”

See more details about their research at https://arxiv.org/abs/1905.08233v1

Text-based Editing of Talking-head Video


This project shows examples of editing the transcript of a video to alter the words that appear to be spoken by the subject in the video.

For more information, go to the project’s webpage at https://www.ohadf.com/projects/text-based-editing/

Lyrebird AI


Lyrebird, a division of Descript, uses artificial intelligence to build a digital clone of a person’s voice from a small audio clip. Once the clone exists, anyone with a text editor can create a statement that sounds like the target’s voice.

There is also a “do-it-yourself” demo to change the sample text and hear the altered output.

Learn more about Lyrebird and see the demonstrations on Lyrebird’s website at https://www.descript.com/lyrebird.

CereProc


CereProc is a very advanced text-to-speech service that allows users to select from a wide variety of voices or to clone their own. The website has a fascinating demo where you can type in your text and choose the voice you wish to hear speaking.

Another of the company’s products named “CereVoiceMe” uses artificial intelligence to create a clone of any person’s voice that can then be used to convert text into speech that sounds just like the original subject. One example on the website is a former radio host who suffered the loss of his speech due to illness. CereProc’s technology allowed him to clone his voice.
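Lyrebird and CereProc are commercial services, but the basic text-to-speech step they build on is easy to see in code. Here is a minimal sketch using the open-source pyttsx3 library as a stand-in; it speaks with a generic installed voice rather than a cloned one, and the library, voice selection, and output file name are my own illustrative choices, not anything taken from either company.

```python
# Minimal text-to-speech sketch (pip install pyttsx3).
# A generic offline voice is used here; services like Lyrebird or CereVoice Me
# effectively replace this voice with a model trained on a specific person.
import pyttsx3

engine = pyttsx3.init()                        # start the local TTS engine
engine.setProperty("rate", 160)                # speaking speed (words per minute)

voices = engine.getProperty("voices")          # voices installed on this machine
if voices:
    engine.setProperty("voice", voices[0].id)  # a cloning service would supply a custom voice here

text = "This statement was never actually spoken by anyone."
engine.say(text)                               # queue the phrase for playback
engine.save_to_file(text, "synthetic.wav")     # also queue it for saving to an audio file
engine.runAndWait()                            # run the queued commands
```

Swapping in a cloned voice is exactly what makes the output dangerous: the pipeline stays this simple, and only the voice model changes.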

Jordan Peele Manipulates Video of Former President Obama


A YouTube clip from a Good Morning America broadcast in 2018 demonstrates how artificial intelligence can be misused. In the clip, comedian Jordan Peele manipulates the voice of former President Barack Obama and provides a warning about this technology.

This Person Does Not Exist


This website displays a different realistic photo of a non-existent person every time you visit the site. Artificial intelligence takes features from its image database and combines them to create new faces of any age, gender, or ethnicity. Visit the website five times, and you will see five different faces…of people who don’t exist.
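If you would rather script it than keep refreshing your browser, the short sketch below downloads a handful of these generated faces with Python. It assumes the site still returns a freshly generated JPEG directly from its root URL, which could change at any time.

```python
# Download several AI-generated faces (pip install requests).
# Assumption: the site serves a brand-new JPEG on every request to its root URL.
import requests

URL = "https://thispersondoesnotexist.com"

for i in range(5):
    resp = requests.get(URL, headers={"User-Agent": "Mozilla/5.0"}, timeout=15)
    resp.raise_for_status()
    with open(f"face_{i}.jpg", "wb") as f:
        f.write(resp.content)  # each file should show a different person who does not exist
    print(f"saved face_{i}.jpg ({len(resp.content)} bytes)")
```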

Soul Machines

Soul Machines - Sam

Soul Machines, a New Zealand firm, has created what they describe as a “Digital Brain,” which uses “Embodied Cognitive User Experience” to create a “Digital Person.”

To demo their product, they’ll ask your permission to access your computer’s camera and microphone. You then have a choice to speak to either “Sam” or “Roman,” and provide your email address. You’ll then receive an invitation to carry on a conversation with whichever digital person you choose.

The AI is capable of using your device’s camera and microphone to not only converse with you but also to read your facial expressions and interpret your tone of voice in order to change how it responds.

Yes, there are times when the voice still sounds too much like a machine. You can throw the AI off with random questions or statements. Still, I think you’ll be surprised at how capable the technology already is.

The image above is from one of my previous conversations with “Sam.”

What’s Coming Next?

So now you can see how technology can be used to change reality…or at least what you perceive the truth to be.

Think about how each of these examples might be used to change your perception of what actually happened.

What evidence can you believe?

If someone alters a video or audio record, how would you know?

For example, could this technology be used to create a fake emergency alert warning?

What about destroying someone’s marriage or reputation with a fake sex video?

Can you imagine the possible scandal over a fake video or audio recording of a political candidate shortly before an election?

When you consider how something like this could quickly go viral on social media, you can appreciate how dangerous this technology can be.

You’ve all heard the old saying that “Seeing is believing,” but the truth is that believing is seeing.

Studies have shown that humans tend to seek out information that supports what we want to believe and to ignore the rest.

With audio, it’s an even more difficult problem.

Research says that our brains have a hard time detecting the differences between real and artificial voices.

It’s easier for us to pick up on a fake image than to recognize an artificial voice.

Hacking that human tendency gives malicious people, criminals, and even nation-states a lot of power to control what we believe to be true.

What Can We Do?

First of all, we need to ask whether deepfakes are legal.

Certainly, if someone uses deepfake technology to commit a crime, then existing law can apply. But what about just the act of creating a deepfake?

It’s an interesting and problematic question currently unresolved in law, at least in the United States.

We need to consider the First Amendment of the U.S. Constitution, intellectual property law, privacy law, and the revenge-porn statutes that many states have recently enacted.

Now complicate the possibilities with deepfake “diplomacy manipulation.”

Think about the damage that could be done to a nation’s foreign policy or ongoing trade negotiations.

Many organizations are working on ways to detect deepfake videos, photos, and audio.

Some of these involve a highly technical analysis of the data. Still, it might be challenging to conduct this type of analysis in real time.

And, as we mentioned before, a lot can go wrong during the time between the production of a deepfake and the time when it’s found to be false.

Another possible answer to countering deepfakes would be to use blockchain technology to validate the original photograph, video, or audio recording.
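Whatever ledger is used, the heart of that kind of validation is a cryptographic fingerprint of the original file. The sketch below shows only that fingerprinting step in Python; registering the hash on a blockchain, with a timestamp and the publisher’s signature, would be the extra step that makes the record tamper-evident. The file names are placeholders for illustration.

```python
# Compute and compare cryptographic fingerprints of media files (standard library only).
# In a blockchain-based scheme, the hash of the original would be recorded on a ledger
# at publication time; anyone could later re-hash a copy and compare it to that record.
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks to handle large videos."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

original_hash = sha256_of_file("ceo_statement_original.mp4")   # recorded when the video is published
suspect_hash = sha256_of_file("ceo_statement_downloaded.mp4")  # computed on the copy you received

if suspect_hash == original_hash:
    print("Hashes match: the copy is bit-for-bit identical to the registered original.")
else:
    print("Hashes differ: the copy has been altered, re-encoded, or is a different recording.")
```

Note that even re-encoding an unaltered video changes its hash, which is one reason this approach works better for validating official originals than for judging every copy circulating on social media.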

But what about a live-streaming video, a live telephone call, or the person you think you see on a screen?

Justin Hendrix, the executive director of NYC Media Lab, says: “In the next two, three, four years we’re going to have to plan for hobbyist propagandists who can make a fortune by creating highly realistic, photo-realistic simulations. And should those attempts work, and people come to suspect that there’s no underlying reality to media artifacts of any kind, then we’re in a really difficult place. It’ll only take a couple of big hoaxes to really convince the public that nothing’s real.”

Could you be fooled?


Deepfakes — Were You Fooled By This One?

June 17, 2020 by Walt Manning

 

Our world is under a lot of pressure today.

  • COVID-19 and the uncertainty of a second wave.
  • Tragic deaths and widespread protests.
  • Great tension and debate over law enforcement practices.
  • Unemployment and wondering whether businesses will be able to come back.
  • Volatility in the financial markets.
  • Politics that may be more polarized than I can ever remember.

Unfortunately there seem to be lots of people who are doing anything and everything to make these problems worse instead of talking about how we can work together to make the world better.

A couple of days ago, some people sent a video out to everyone on their mailing list that supposedly showed an interview with Joe Biden on the television show The View.

This caused a lot of reactions, and many of the people who received the video forwarded it along to others, who probably forwarded it to still more people.

And shortly thereafter, someone else responded with a link to the real and unedited interview on YouTube.

I’ve given talks about the capabilities of deepfake technology and how dangerous it can be.

It’s time for everyone to be very careful, especially with an upcoming election in the United States.

First, here’s the heavily edited and controversial video. As you watch it, please notice the watermark in the upper right-hand corner…

https://technocrime.com/wp-content/uploads/2020/06/VIDEO-2020-05-28-12-31-34.mp4

Were you fooled?

Did you see the text “Daily Caller News Foundation Comedy”?

Apparently many people didn’t, and even if they did, they didn’t take the time to check the source.

Here is the real interview on YouTube:

It’s not my intention to be political, or to take one side over the other.

But we all need to understand the dangers that can come when it’s so easy to edit video, audio, and photos.

Please think before you react.

Please verify before you post.

Let’s all do what we can to lower the pressures and tensions we all feel right now, and do the right things for the world.


Are You Planning for Post-Pandemic Investigations?

May 22, 2020 by Walt Manning


Have you given any thought to what the “new normal” might be after the pandemic has run its course?

We’ve all heard predictions by experts who all seem to have different opinions about how things will play out in the coming months and years.

The bottom line is that nobody knows what will happen.

Will the gradual re-opening of society and business cause a new wave of infections and force restrictions to be re-imposed on the economy?

Or have we lived through the worst, with the possibility of long-lasting changes to our business and personal lives that may never return to what they were before the coronavirus?

Has the Pandemic Created New Crime Trends?

Several organizations believe that the pandemic has increased opportunities for criminals:

Coronavirus Pandemic Is a Perfect Storm for Fraud

Mitigating Fraud and Corruption Risks During the COVID-19 Crisis

How cybercriminals are taking advantage of COVID-19: Scams, fraud, and misinformation

COVID-19 Operating in the ‘new normal’ – A backdoor to increased fraud risk?

Other side-effects that are being discussed include:

  • Increased domestic & child abuse
  • Higher use of alcohol & drugs
  • A decrease in home burglaries, but an increase in break-ins at shuttered businesses

What Trends Might Survive the Pandemic?

I don’t claim to know the future, but let me suggest a few trends that may change your investigations in the foreseeable future.

Will Business Priorities Change?

Cutbacks to IT security, investigations, and auditing staff could mean fewer resources to prevent, detect, and investigate fraud and other matters.

I have read about companies that are temporarily transferring staff to other operational roles to fill in for employees who have been furloughed or laid off.

Security staff is feeling the pressure to support more remote employees while ensuring that their home offices provide adequate security. Due to these increased demands, they may not be able to keep up with software patches, upgrades, and reviews of digital logs that might indicate hacking or system abuse.

Investigators may find it challenging to identify, collect, preserve, and analyze the digital evidence in their investigations.

Audit staff may find it harder to ensure compliance and may need to pay more attention to technology use and activities related to cloud-based platforms.

When investigations become more time-consuming or expensive to conduct, criminals gain an advantage and shift the odds of getting caught in their favor.

Virtual meetings vs. in-person

The number of virtual online meetings using services such as Zoom and Skype has soared in the past two months.

The trend of working from home has exploded, and many organizations and their employees may be reluctant to return to the office environments that currently exist.

As this situation develops, how could this impact your investigations?

Will you conduct virtual interviews in your investigations?

Will the quality of your interview suffer due to a lack of body-language cues or a reduced ability to assess the mental or physical state of your subject?

In-person interviews can frequently provide micro-expression cues that experienced and intuitive investigators can sense. Could losing those cues be a disadvantage in a video or telephone interview?

Does the interview environment make a difference? Would an interview subject feel and react differently in a location where they feel comfortable instead of in an office environment?

Bring Your Own Device & Cloud Computing

With so many people working from home, the use of employee-owned devices is already complicating the responsibilities of IT teams in organizations of all sizes.

Employees who are not used to maintaining a higher level of security for their devices in a home environment may be increasing the chances of data breaches and theft of proprietary information.

At the same time, pandemic restrictions may not allow time to deploy new security guidelines and training to remote employees.

Securing home and small-office routers, properly selecting and using virtual private networks (VPNs), deploying consistent anti-malware solutions, and even using encryption can all help minimize the risks for work-at-home employees.

How many organizations are deploying resources to address these issues?

From an investigative perspective, the subjects involved in your investigations may continue to be widely scattered geographically, making your efforts more time-consuming and costly.

If the employee needs access to their devices and home office networks to do their jobs, how will you be able to identify potential digital evidence, and then collect and preserve that data?

The use of cloud-based services has also exploded in the past two months. Even if you can arrange to collect data from employee-owned devices, how will you deal with applications and data stored in the cloud?

Shifting Workforce

There have been lots of recent articles predicting how the pandemic will change how companies view their workforce.

Many people who have previously lived in densely-populated cities may be rethinking that lifestyle after living through the pandemic restrictions. Polls show that they are thinking about relocating to different countries, smaller towns, or rural areas that may be less risky.

This migration could contribute to a more fragmented and scattered workforce, which could change how you conduct investigations.

If organizations are now seeing the benefits of working remotely, they may also consider changes in how they recruit future employees.

Instead of requiring new employees to move close to an office, organizations might think about recruiting the best talent regardless of where prospective employees live.

With a workforce now scattered even more geographically, new policies and procedures may be necessary regarding investigative access to electronic data.

Investigators may also need to be more aware of the different privacy laws that could impact their ability to access and analyze data, depending on the jurisdiction where the data is stored.

Have You Given This Enough Thought?

Should you be planning for post-pandemic changes in how you conduct investigations?

Can the security, investigations, and auditing staff keep up with a more widely-dispersed workforce that may be using insecure devices and cloud-based services?

Are your employees being given the training and resources to improve their security? If not, could this increase the number of investigations required in the future?

We all hope for a complete return to our “normal” world, with all of the associated activities.

But we may need to think about a “new normal” and what that could mean to your investigations.

Are there any other trends that you think may continue after the pandemic?

Please let me know your thoughts so I can share them in future messages and blog posts.

The more we can educate people about these issues, the better we’ll be prepared for the post-pandemic world, if and when that day arrives.


Mobile Devices Might Be Your Biggest Technology Security Risk Today

March 26, 2020 by Walt Manning


Introduction

People throughout the world have had their lives changed in the past few weeks. I am sad for the individuals and families who have lost loved ones or friends to the global coronavirus pandemic.

Many people are now “sheltering in place,” working from home, or are experiencing other restrictions that have now become part of our “new normal.”

But our use of mobile devices doesn’t come without risk, and I want you to at least start thinking about that today.

Mobile devices do more for us each day, and most people can’t imagine living or working without them.

Nobody can doubt that mobile devices provide excellent features and convenience, as well as entertainment and increased business productivity.

But it’s easy to forget that mobile devices are pretty powerful computers, and you need to think about the security risks that come from using them.

Do you use your mobile device for anything you’d like to stay confidential or keep private?

Is there data transmitted by or stored on a mobile device that you want to keep secure?

The use of mobile devices is growing at unbelievable rates.

First, here are a few statistics:

  • There are now more connected mobile devices than there are people on earth.
  • Currently, in the U.S., there are approximately eight networked devices per person, a number expected to climb to 13.6 per person by 2022.
  • Nearly three-quarters of the world will use just their smartphones to access the internet by 2025.
  • In the U.S., roughly one-third of people (31 percent) use mobile banking more than any other app on their smartphone.

But the following will give you some background about the associated risks.

A Forbes magazine article, referencing Verizon’s Mobile Security Index (MSI) 2020 Report, revealed that:

  • 54% of companies were less confident about the security of their mobile devices than that of their other systems.
  • 21% of organizations that were compromised said that a rogue or unapproved application had contributed to the incident.

A more in-depth review of the full Verizon report adds more thought-provoking information:

  • 83% of organizations were concerned about device loss or theft, and 20% of those felt that their defenses were inadequate.
  • Device operating systems are also a concern and often out of date. Almost half (49%) of enterprise devices are being used without any managed update policy.
  • According to Wandera, employees connect to an average of 24 Wi-Fi hotspots per week, and Netmotion found that the average device connects to two or three insecure Wi-Fi hotspots per day.

From a study of mobile device vulnerabilities produced by Aite at the request of Arxan:

  • “There is no shortage of anecdotal evidence that hackers are actively seeking to leverage those vulnerabilities, such as the recent discovery in the wild of mobile malware that leveraged Androids’ accessibility features to copy the finger taps required to send money out of an individual’s PayPal account. The malware was posted on a third-party app store disguised as a battery optimization app. This mobile banking trojan was designed to wire US$1,000 out of an individual’s PayPal account within three seconds, despite PayPal’s additional layer of security using multifactor authentication.”
  • The study found three app categories with the highest number of vulnerabilities: retail banking, retail brokerages, and auto insurance.

Whether you are working remotely or not, let’s take a few steps to improve your mobile device security.

Mobile Device Threats

Are You Taking Risks Using Unsecured Wi-Fi?

Mobile device users routinely connect to the nearest or strongest available Wi-Fi network signal, and some may not even be aware of the significant security risks.

I have already written two blog posts about the use of Wi-Fi, so I won’t repeat that information in this one. Here are links to the previous posts, for your convenience:

This Is Possibly Your Biggest Techno-Crime Risk: What You Need to Know

How to Stay Safe on Public Wi-Fi

In addition to public Wi-Fi, if you now depend on your home router, you’ll need to make sure that it is also secure. Here’s a link with more information:

Do You Know Whether Somebody Has Already Hacked Your Home Network?

Should You Be Worried About Mobile Device Malware?

Mobile devices can be infected with malware, just like any computer.

A report from Check Point states that attacks against mobile devices in the first half of 2019 increased by 50% compared to 2018, with mobile banking apps being one of the primary targets.

Reports from multiple security companies document that the overwhelming majority of mobile malware targets Android devices. Still, anti-malware protection should be installed on every mobile device.

If your device is infected, mobile malware can:

  • Allow the attacker to wipe the device or alter data
  • Track your physical location in real-time
  • Surreptitiously turn on the device camera or microphone
  • Allow the developer complete access to all data stored on or transmitted by the device
  • Allow the developer to send text messages or make calls on the device
  • See text messages sent as part of 2-factor authentication systems
  • Change settings on the device
  • Convert the device into a node on a criminal botnet
  • Masquerade as an app from a legitimate financial institution to steal your financial data, including your login
  • Manipulate the screen so that it continues to show your valid transaction and expected balance, but not the real data
  • Recognize when you dial a financial institution 800 number and reroute the call to one of the attacker’s call centers
  • Connect to the company network, raising the possibility of infection on other machines

Do You Know All the Data You’re Giving Away to Your Mobile Apps?

Users who install apps on their mobile devices seldom read the Terms of Service agreement that comes with the app or the developer’s Privacy Statement.

But a significant number of apps use these agreements to grant themselves permission to do many things with your device and the data it contains.

Most users are completely unaware that they have given away these rights.

The app developers are then free to sell the collected data to advertisers or any other interested party.

Apps may be allowed to collect and transmit:

  • The device manufacturer and model
  • The device serial number and IMSI number
  • Geolocation data
  • Browsing and search history
  • Demographic data
  • All contacts stored on the device

The Terms of Service may also give an app permission to do things such as the following (a short code sketch after this list shows one way to check an app’s declared permissions directly):

  • Record audio through device microphone
  • Have full Internet network access via the device
  • Take photos or videos
  • Modify or delete the contents of data storage
  • Create accounts and set passwords
  • Send text messages
  • Read phone status and identity (includes call logs, phone signal, carrier, device ID, and phone number)
  • Connect and disconnect from Wi-Fi networks
  • Retrieve information about current and recently running apps on the device
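For investigators, one practical way to see exactly what an Android app is allowed to do is to read the permissions declared inside the APK itself rather than relying on the Terms of Service. Here is a minimal sketch using the open-source androguard library; the import path reflects androguard 3.x, and the APK file name is a placeholder.

```python
# List the permissions an Android app declares (pip install androguard).
# Import path shown is for androguard 3.x; newer releases moved APK to androguard.core.apk.
from androguard.core.bytecodes.apk import APK

apk = APK("suspect_app.apk")  # placeholder file name

print("Package:", apk.get_package())
print("Declared permissions:")
for perm in sorted(apk.get_permissions()):
    # e.g. android.permission.RECORD_AUDIO, READ_SMS, ACCESS_FINE_LOCATION, ...
    print("   ", perm)
```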

Did You Already Know These Mobile Device Security Tips?

What can we do to secure our mobile devices against these problems and threats?

Here are a few essential recommendations:

  • Use a secure passcode on every mobile device
  • Configure the screen lock to engage after a minimal time with no activity
  • Use anti-malware and a firewall on every mobile device
  • Only connect with websites using HTTPS, but even that does not guarantee a secure connection (a short code sketch after this list shows what the certificate check looks like)
  • Use a VPN on every mobile device
  • Download and install apps only from approved app stores
  • Do not let sensitive apps remember your login user ID or password
  • Consider using encryption to protect any sensitive data stored on the device
  • Consider the use of an encrypted app to send text messages or make voice calls
  • Make sure to keep the operating system and all apps up to date at all times
  • Read the Terms of Service agreement for all apps and any associated Privacy Policy so you will know what permissions the apps require
  • Use a secure password manager
  • Do not use your fingerprint to access a mobile device
  • If not already available, consider the use of an app to locate your device if it is lost or stolen
  • When you sell or trade in an old device, make sure that your data is securely erased
  • Be very selective in choosing which apps can use location services
  • Turn off all unnecessary system services
  • Allow text, video, and audio messages to expire rather than storing them forever
  • Limit what diagnostic data is sent to the manufacturer, app developer or carrier whenever possible
  • Be careful what you sync with services such as iCloud, Dropbox, etc.
  • Control what notifications are displayed on your locked screen
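On the HTTPS point above: what the padlock really tells you is that the site’s certificate validated. The sketch below shows what that check looks like in Python using the requests library; the URL is a placeholder, and a passing check still says nothing about whether the site itself is trustworthy.

```python
# Simple certificate-validation check (pip install requests).
# requests verifies the server's TLS certificate by default; an invalid or
# self-signed certificate raises SSLError instead of silently connecting.
import requests

url = "https://example.com"  # placeholder

try:
    resp = requests.get(url, timeout=10)  # verify=True is the default
    print(f"Connected over HTTPS, status {resp.status_code}, certificate validated.")
except requests.exceptions.SSLError:
    print("Certificate validation failed; do not send anything sensitive to this site.")
except requests.exceptions.RequestException as exc:
    print(f"Connection problem: {exc}")
```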

I hope this post will help you understand some of the risks from your mobile devices, and help you begin to improve your security.

If you would like more details, go to the Free Content Library on our website. We’ve compiled a “Smart Home and Mobile Device Security Checklist,” which we are gladly providing at no cost.

This checklist doesn’t address all of the security issues you may face with mobile devices, but at least it gives you an excellent place to start.

Please share this information with co-workers, family, and friends to help everyone improve their security!


Is Your New Smart Home Assistant Really Listening?

December 26, 2019 by Walt Manning


The short answer is, “probably, but you don’t know when.”

Smart home assistants like Amazon’s Echo/Alexa, Google Home, or the Apple HomePod were some of the most popular holiday gifts for the second year in a row.

All of these devices function by listening so they can respond to your questions or commands. For the Amazon products using Alexa, there are thousands of different “skills” available, which include everything from playing music to online banking.

Be aware that all of these devices collect data, even when the device’s “wake word” has not been used. Some of this data is reviewed by humans to “improve the quality” of service.

The convenience of these smart assistants is appealing and entertaining, and there have been both funny and dangerous incidents associated with their use.

Data from these devices has already been subpoenaed in litigation and criminal investigations, so be careful what you let them hear!

But today you are still relaxing and enjoying the holidays, so I’ll save those more specific privacy and security topics for a post later next year.

If you have any of these devices, here are some brief suggestions that you can put to use immediately:

  • Think about where you place the device and the conversations that might be heard there.
  • Change the device’s “wake word” from the default options, so the assistant won’t activate when you don’t mean for it to be in use.
  • Disable the microphone, especially if you are about to talk about anything you wouldn’t want someone else to hear.
  • Delete your old recordings. Almost all of the companies allow you to do this, but they won’t do it for you. Also, be aware that just because you have deleted the recorded audio, it doesn’t necessarily mean that you have removed all the data that has been collected by the device.
  • Make sure that your wireless network is secure. These devices can be hacked, just like almost all Internet of Things (IoT) devices.
  • Check the permissions you are allowing for any function or “skill” that you add to the device. For example, your street address and telephone number might be shared if using a skill for Uber or another ride-sharing or food delivery service.
  • Be careful adding functions from third-party developers that haven’t been approved by the device manufacturer. Connecting the device to another IoT device or “skill” might give someone the ability to listen continuously and collect your data.

Enjoy the holidays, and best wishes to all for a great 2020!





2020 Techno-Crime Institute