‘Deepfake’: an emerging risk for the Banking & Financial Sector

A resident of Kozhikode in Kerala lost ₹40,000: cybercriminals defrauded the Kerala resident over a WhatsApp video call by posing as his former colleague and asking for money for a sister-in-law’s surgery.

The man first received a call from an unknown number, not in his contact list, which he ignored. He was later surprised to see several missed calls and WhatsApp messages from the same number, in which the person on the other side introduced himself as a former colleague. When the caller rang again, he picked up. The caller trapped him by claiming he was boarding a flight from Dubai and needed an urgent financial favour, as his sister-in-law was scheduled for emergency surgery at a hospital in Mumbai. He requested a UPI transfer of funds to someone attending her at the Mumbai hospital.

The fraudster even switched to a video call for a few seconds to convince the man that he really was his colleague. To his surprise, the face on the screen, the only part clearly visible, looked exactly like his former colleague, and the lip and eye movements appeared entirely natural as they spoke in English. The fraudster then deliberately disconnected, pretending the call had dropped. He came back on a voice call, urged the man to understand the urgency of the situation, and promised to return the money upon landing in India. Now doubly sure, the man immediately transferred the money. When he was soon asked for more, suspicion grew in his mind, and the episode was subsequently discovered to be a fraud.

The above incident is nothing but a deepfake fraud, which uses fake video or images of one person to impersonate them and deceive another.

A possible case of fraud: now imagine a banker, busy with a customer, who suddenly receives a video call from one of his reputed customers. On the screen appears the very familiar face of an HNI customer, who requests an urgent transfer of funds to a client or vendor because he is in a meeting, boarding a flight, or otherwise unavailable.

He also mails the branch a written request for the transfer of funds, using a fake email ID that is nearly identical to the one registered with the Bank. The poor Manager falls prey to the fake video call, which was nothing but a deepfake, and transfers the funds.

KYC Fraud using Deepfake: Deepfake KYC fraud is a growing concern in India, where fraudsters leverage AI-generated deepfake videos and images to bypass Know-Your-Customer (KYC) verification processes. This has significant implications for financial institutions, fintech companies, and other organizations that rely on KYC to verify the identity of their customers.

In the recent past, many deepfake videos and images of celebrities and political leaders have surfaced, with the fabricated material allegedly used in election campaigns, advertisements, and public forums.

In this article, we will understand what a deepfake is, its impact on the banking industry, and suggested measures to prevent and mitigate such fraud.

Deepfake

A fake or artificial image, series of images, or video generated by “deep” learning (a specialised type of machine learning) is known as a ‘Deepfake’. Simply put, it is nothing but a “fake” image or video generated by “deep” learning technology. Imagine someone taking a video of you and replacing your face with someone else’s so convincingly that it looks like you are saying or doing things you never did.

That’s a deepfake.

Deepfake technology is a method of creating realistic-looking but fake media (videos, images, or audio) using artificial intelligence (AI) and deep learning algorithms. The process involves manipulating or replacing a person’s voice, appearance, or actions in existing media to make it appear as if they are saying or doing something they never did.

Deepfakes can mimic a person’s voice and facial features using artificial intelligence. The technology can reproduce a person’s facial movements from videos, or even from just a single picture of their face, and combine them with an audio recording of the person’s voice to produce a realistic-looking video of things the person may never have done or said.

How deepfake works:

  • Gathering Data: The technology behind deepfakes, called “deep learning,” needs a large number of pictures or videos of the person to be faked. The more data it has, the better it can learn to mimic that person’s appearance and voice.
  • Training the Model: This data is used to train a computer program to understand how the person’s face looks from different angles, how they talk, and how they move. It is like teaching the computer to become an impersonator.
  • Creating the Fake: Once the computer program has learned enough, it can create new videos or images that look like the real person but are completely fabricated.
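The three steps above can be sketched with a deliberately tiny toy analogue. This is illustrative only: a real deepfake system trains deep neural networks (autoencoders or GANs) on thousands of frames, whereas here the “model” is just an average per-pixel offset learned from paired frames of person A and person B. The gather-train-generate flow, however, is the same.

```python
# Toy analogue of the gather -> train -> generate pipeline.
# Real deepfakes use deep neural networks; this per-pixel offset
# "model" exists purely to illustrate the three steps.

def train(frames_a, frames_b):
    """Learn, per pixel, the average difference between B and A frames."""
    n_pixels = len(frames_a[0])
    offsets = []
    for p in range(n_pixels):
        diffs = [b[p] - a[p] for a, b in zip(frames_a, frames_b)]
        offsets.append(sum(diffs) / len(diffs))
    return offsets

def generate_fake(frame_a, offsets):
    """Apply the learned offsets: turn a new A frame into a fake B frame."""
    return [pixel + off for pixel, off in zip(frame_a, offsets)]

# Step 1: gather data -- paired "frames" (here, 4-pixel grayscale rows)
frames_a = [[10, 20, 30, 40], [12, 22, 28, 42]]
frames_b = [[20, 25, 60, 40], [22, 27, 58, 42]]

# Step 2: train the model
offsets = train(frames_a, frames_b)        # [10.0, 5.0, 30.0, 0.0]

# Step 3: create the fake from a frame never seen in training
fake = generate_fake([11, 21, 29, 41], offsets)
print(fake)                                # [21.0, 26.0, 59.0, 41.0]
```

The point the toy makes is the one in the first bullet: with more paired examples, the learned mapping (here, the offsets; in reality, millions of network weights) becomes a better impersonation of the target.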

Generally, identifying deepfake videos or images can be challenging as the technology is constantly evolving. However, there are several clues and techniques which can be used to spot or identify them:

  • Unnatural Movements: Pay close attention to facial expressions, eye movements, and lip-syncing. Deepfakes often struggle to perfectly replicate natural human movements, leading to inconsistencies.
  • Inconsistent Lighting and Shadows: Look for mismatches in how light and shadow fall on the face compared to the surroundings. Deepfakes often fail to accurately mimic natural lighting conditions.
  • Blurring and Artifacts: Examine the edges of the face, especially around the hairline and jawline. Deepfakes may exhibit blurring or unnatural smoothness due to the manipulation process.
  • Skin Tone and Texture: Check if the skin tone matches the rest of the body and if the texture appears too smooth or artificial.
  • Blink Rate: Deepfakes often have irregular or infrequent blinking.
  • Audio-Visual Mismatch: Deepfakes sometimes struggle to synchronize audio and video. Hence, paying attention to whether the lip movements match the audio can help in spotting a deepfake video.
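One of the clues above, unnatural blurring around manipulated regions, can be checked automatically. A common heuristic is the variance of the Laplacian: a sharp image patch has high variance in its second derivative, while an over-smoothed (possibly manipulated) patch scores low. A minimal pure-Python sketch follows; a real system would apply this (e.g. via OpenCV) to actual face regions, and the threshold here is arbitrary, not calibrated:

```python
# Blur check via variance of the Laplacian: low variance suggests an
# unnaturally smooth patch, one possible sign of manipulation.
# The threshold is illustrative only.

def laplacian_variance(patch):
    """patch: 2-D list of grayscale values. Returns the variance of the
    4-neighbour Laplacian computed over interior pixels."""
    lap = []
    for i in range(1, len(patch) - 1):
        for j in range(1, len(patch[0]) - 1):
            lap.append(patch[i-1][j] + patch[i+1][j] +
                       patch[i][j-1] + patch[i][j+1] - 4 * patch[i][j])
    mean = sum(lap) / len(lap)
    return sum((v - mean) ** 2 for v in lap) / len(lap)

def looks_oversmoothed(patch, threshold=10.0):
    return laplacian_variance(patch) < threshold

sharp  = [[0, 255, 0, 255],
          [255, 0, 255, 0],
          [0, 255, 0, 255],
          [255, 0, 255, 0]]        # strong local contrast
smooth = [[100, 101, 102, 103],
          [101, 102, 103, 104],
          [102, 103, 104, 105],
          [103, 104, 105, 106]]    # almost flat gradient

print(looks_oversmoothed(sharp))   # False
print(looks_oversmoothed(smooth))  # True
```

Such heuristics are only one signal; production deepfake detectors combine many of the clues listed above, typically with trained neural classifiers.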

AI and Deep Fake Technology and its impact in the Banking Industry

The impact of AI and deepfake technology is both transformative and potentially devastating for the banking and financial sector. AI may be considered a boon for the banking industry: it can perform tasks that normally require human intelligence, such as speech recognition, image analysis, decision-making, and natural language processing.

AI and deepfake technology can be used for both good and bad purposes in the banking industry. On the one hand, they help banks enhance their customer experience, efficiency, and profitability. For example, some banks use AI-powered chatbots or voice assistants to provide 24/7 customer service, answer queries, offer expert advice, and execute transactions. Banks use artificial intelligence and machine learning to analyse customer data, behaviour, and preferences, and thereby offer personalized and tailored products and services. AI and deepfake technology are also used to train bank employees, using virtual or augmented reality to simulate real-life scenarios. The technology is likewise used to create realistic and engaging marketing campaigns, such as those featuring celebrities or influencers endorsing products or services.

On the adverse side, deepfake technology can give rise to significant risks and challenges to the industry such as increasing cyberattacks, attack on data privacy & data protection, breach of trust and other severe unethical issues.

Fraudsters are using AI and deepfake technology for malicious and fraudulent purposes in the banking industry. For instance, some cybercriminals create fake identities, documents, or biometrics using AI and deepfake technology and misuse them to open accounts, apply for loans, or gain unauthorised access to funds.

Using AI and deepfake technology, cybercriminals create fake websites, emails, or phone calls to lure customers or employees into revealing their personal or financial information or transferring money. On many occasions they manipulate markets, influence public opinion, or damage the reputation of banks or their competitors by creating fake news, reviews, or social media posts using deepfake technology.

Deep fake threat/fraud in Banking sector

In recent times, deepfakes, which can produce highly realistic yet fabricated videos and audio, have emerged as a significant threat to the banking industry. While they hold the potential for creative applications, their misuse has already led to substantial financial losses and has raised serious concerns about the security of digital financial systems.

  • Fraudulent Transactions: One of the most concerning aspects of deepfakes is their potential to facilitate fraudulent transactions. By impersonating customers through convincing video or voice calls, fraudsters can deceive bank employees into authorizing unauthorized transfers or payments.
  • Identity Theft: Deepfakes can be used to create fake identities, making it easier for criminals to open accounts, apply for loans, or access sensitive financial information.
  • Reputation Damage: Using fake videos and audio of a bank’s executives or employees, deepfakes can cause significant reputational harm, leading to a loss of trust among customers and stakeholders.
  • Market Manipulation: The spread of false information through deepfakes can manipulate stock prices and create instability in financial markets.
  • Compromised KYC Processes: Know Your Customer (KYC) procedures, crucial for verifying customer identities, can be bypassed using deepfake videos or voice recordings.

KYC Fraud using Deepfake technology:

  • Video KYC: Fraudsters use deepfake videos to impersonate legitimate customers during video verification calls.
  • Image Manipulation: Deepfake technology is used to alter or create realistic but fake images for identity documents.
  • Synthetic Identity Fraud: Deepfakes can be combined with other stolen or fabricated information to create entirely synthetic identities.

Potential Solutions to counter deepfake KYC frauds

  • Advanced Biometrics: Implementing liveness detection and other advanced biometric solutions can help distinguish between real and fake inputs.
  • AI-Powered Detection Tools: Leveraging AI to detect deepfakes in real time during KYC verification.
  • Multi-Layered Security: Combining multiple verification methods, such as document verification, facial recognition, and biometrics, can provide a stronger defence.
  • Collaboration: Public-private partnerships and collaboration between industry stakeholders can help to share information, develop best practices, and stay ahead of emerging threats.
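The “multi-layered security” idea above can be sketched as a simple decision policy: run several independent checks, approve only if every mandatory check passes and enough checks pass overall, and route borderline cases to a human. The check names, the mandatory set, and the threshold below are hypothetical, chosen purely for illustration:

```python
# Sketch of a multi-layered KYC decision. Each layer is an independent
# check; onboarding is approved only if all mandatory checks pass and a
# minimum number of checks pass overall. Names/thresholds are hypothetical.

MANDATORY = {"liveness"}   # a deepfake feed should fail liveness detection
MIN_PASSES = 3

def kyc_decision(results):
    """results: dict mapping check name -> bool (True if the check passed)."""
    if not all(results.get(name, False) for name in MANDATORY):
        return "REJECT"
    passes = sum(1 for ok in results.values() if ok)
    return "APPROVE" if passes >= MIN_PASSES else "MANUAL_REVIEW"

# A deepfake video may fool the face match yet fail liveness detection:
print(kyc_decision({"document": True, "face_match": True,
                    "liveness": False, "device": True}))   # REJECT

# A genuine customer on an unrecognised device still clears the bar:
print(kyc_decision({"document": True, "face_match": True,
                    "liveness": True, "device": False}))   # APPROVE
```

The design point is that no single signal, face match included, is trusted on its own; an attacker must defeat several independent layers simultaneously.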

Deepfake: a growing concern

Deepfake financial fraud is a growing concern due to the increasing sophistication of AI-generated videos and audio.

However, to mitigate the risk, several steps can be implemented such as:

Risk mitigating steps for Banks & Financial Institutions:

  • Employee Training: Educate employees on a regular basis about the existence and threat of deepfakes. Provide them with resources to identify potential deepfake attempts and establish clear protocols for verifying the authenticity of communication, especially when dealing with sensitive transactions.
  • Multi-Factor Authentication (MFA): Implement strong MFA for all financial transactions and account access. This can include one-time codes, biometric verification, or hardware tokens, making it much harder for deepfakes to bypass security controls.
  • AI-Powered Detection Tools: Invest in advanced AI-based systems specifically designed to detect deepfakes. These tools can analyse minute inconsistencies in videos and audio that human eyes and ears cannot detect.
  • Investing in advanced and robust cybersecurity systems and tools, such as firewalls, antivirus, encryption, authentication, verification, and blockchain, to protect data, networks, and systems from unauthorized access, manipulation, or theft.
  • Implementing strict and clear guidelines, policies, procedures, and standards, such as data governance, privacy protection, ethical guidelines, and a code of conduct, to regulate the use and management of AI and deepfake technology and to ensure compliance with relevant laws and regulations.
  • Collaborating and co-operating with other banks, industry associations, government agencies, law enforcement, and civil society. Develop a framework to monitor and detect emerging threats and trends, and to coordinate and support each other in preventing and mitigating AI and deepfake fraud. Share best practices, experiences, and insights among stakeholders in order to develop a common standard.
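The MFA recommendation above can be illustrated with a minimal time-based one-time password generator (TOTP, RFC 6238), the mechanism behind most authenticator apps, using only the Python standard library. A one-time code is something a deepfake video call cannot supply. Production systems would use a vetted library and secure secret storage; this is only a sketch:

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6):
    """Minimal RFC 6238 TOTP using HMAC-SHA1. Illustrative only."""
    if for_time is None:
        for_time = time.time()
    counter = int(for_time) // step                   # 30-second time window
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"          # RFC 6238 test-vector secret
print(totp(secret, for_time=59))          # 287082 (matches the RFC test vector)
```

Because the code changes every 30 seconds and is derived from a secret the fraudster does not hold, even a perfectly convincing deepfake voice or video cannot complete a transaction gated by such a factor.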

Risk mitigating steps for Customers:

  • Customer education and awareness: educate customers about the benefits and risks of AI and deepfake technology, and on how to identify, report, and respond to potential AI and deepfake fraud, such as by verifying the source and content of communications and using trusted, reliable channels.
  • Stay informed about deepfake technology and common scam tactics. Be cautious of unsolicited requests, especially those involving urgent financial transactions or confidential information.
  • Be extra cautious about what you share on social media and other online platforms, and set your profiles to “friends and family” only, as scammers can use publicly available information against you.
  • Verify Communication: Independently verify the authenticity of any communication that appears suspicious. Contact the bank or individual through official channels only to confirm the request.
  • Strong Passwords and MFA: Use strong, unique passwords for your online banking accounts and enable MFA whenever possible.
  • Report Suspicious Activity: Immediately report any suspicious activity or communication to the respective banks/FIs.
  • In case of any cyber fraud, contact and report to the Cyber Crime Helpline on 1930.

Additional Considerations:

  • Collaboration: Financial institutions should collaborate with technology providers, law enforcement agencies, and other industry stakeholders to share information and best practices in combating deepfake fraud.
  • Legislation: Governments should consider enacting legislation to address the specific challenges created by deepfake technology in the context of financial services.

Conclusion:

Deepfake technology presents a significant and evolving threat to the banking sector. While it offers potential benefits in areas like customer service and personalization, its malicious applications pose serious risks. The growing number of deepfake frauds undermines customers’ trust in financial institutions and disrupts the integrity of financial systems.

To mitigate these risks, banks and FIs need to invest in robust security measures, including advanced detection technologies and employee training to identify deepfake content. Collaboration with regulators and industry partners to establish effective countermeasures is essential. Additionally, educating customers about the dangers of deepfakes and promoting vigilance against fraudulent activities can help protect both individuals and institutions.

The battle against deepfakes is an ongoing one, as the technology continues to advance rapidly. Banks must remain vigilant and adaptable, continuously updating their security protocols and leveraging the latest innovations in detection and prevention. By taking proactive measures and fostering a culture of training & awareness, the banking sector can navigate the challenges posed by deepfakes and safeguard the trust and stability of financial systems.
