Banking sector faces data governance hurdles in AI transition
As Vietnam’s banking sector reaches a turning point, with fragmented data systems evolving into a foundation for AI adoption, Dao Hong Giang, executive vice president and director of Finance–Banking at FPT IS, speaks with VIR’s Nhue Man about new institutional and infrastructure breakthroughs.

Why has an industry-wide interconnected data structure yet to take shape, and what systemic issues are hindering the banking sector’s data transformation?
Vietnam’s banking sector stands at a historic inflection point. Two powerful forces are converging: the internal need to better understand customers in the digital era, and strong policy momentum from the National Digital Transformation Strategy. Together, they create a rare opportunity to fundamentally reshape the industry.
Banks increasingly recognise data as a core competitive asset that underpins intelligent, targeted financial services. However, progress remains constrained by both internal and external barriers.
At the commercial bank level, fragmentation is the most common impediment. Customer data is scattered across core banking, card systems, lending platforms, and digital channels, making a 360-degree customer view difficult to achieve. Many banks still rely on legacy core systems that are rigid, difficult to integrate, and limited in real-time data extraction. This slows analytics and weakens data-driven decision-making.
Data quality presents another major challenge. Inconsistent standards, incomplete records, and inaccuracies undermine the effectiveness of analytical models. In addition, analytics remain largely reactive, focused on historical reporting rather than predictive insights to support forward-looking decisions.
At the industry level, these challenges become more pronounced. The lack of standardised data models means each bank effectively speaks its own data language. While the State Bank of Vietnam (SBV) has issued circulars on Open APIs to promote common formats and data security, implementation still requires stronger enforcement. Interoperability across banks’ IT systems remains limited, as most were not designed for seamless or secure integration.
Organisational and competitive factors further complicate the picture. Customer data has long been treated as a proprietary asset, creating reluctance to share. This is reinforced by the absence of a trusted intermediary to coordinate fair, transparent, and secure data exchange.
Legal and compliance constraints add another layer of complexity. Regulations on personal data protection impose strict requirements on data collection, processing, and sharing, particularly around customer consent. The Law on Credit Institutions and related confidentiality rules, while necessary, also restrict cross-institutional data sharing without a competent public authority to coordinate it at the sector level.
As data standardisation and sharing grow more urgent, what role should the SBV play in shaping an industry-wide data architecture for the AI era?
Data fragmentation remains the core challenge. Each bank sees only a partial view of the customer within its own data silos, resulting in incomplete risk management, fragmented customer experiences, and inefficient allocation of social capital. Nevertheless, the banking sector now has an unprecedented opportunity to address this issue.
That opportunity stems from strategic national initiatives creating both infrastructure and legal momentum. Government policies are gradually forming a golden data foundation for large-scale connectivity and sharing.
A cornerstone is Project 06, built on the National Population Database. Together, these provide a single, verified source of digital identity: they address foundational e-verification challenges, strengthen fraud prevention, help banks cleanse existing customer data, and enable a unified customer ID linked to the national personal identification number.
In parallel, the National Data Centre is envisioned as an integrated hub aggregating data from population, enterprise, insurance, and tax databases. Subject to customer consent, banks connecting to this centre could access verified non-financial information such as income or insurance history, significantly enriching customer profiles.
Strategic resolutions and plans, including Resolutions 57, 171, 68, and Plan 02-KH/BCDTW, further define digital infrastructure roadmaps and data-sharing requirements, laying the groundwork for a unified national data ecosystem.
Within this framework, the SBV’s role is pivotal. Guided by Resolution 57’s emphasis on innovation, the SBV is not merely a regulator but also an architect and coordinator. By developing sectoral data strategies and shared industry databases, it can lead the formation of a banking data architecture robust enough for the AI era.
Which data-sharing model would best support faster AI adoption in the banking sector?
In the European Union, PSD2 and Open Banking rely on regulatory mandates, requiring banks to open APIs to third parties with customer consent, under strict GDPR privacy protections. In contrast, the United States follows a market-driven approach, where intermediaries such as Plaid facilitate data exchange without mandatory regulatory frameworks.
India and Singapore offer a third path: public-private or government-led utility models. India’s Aadhaar and UPI systems function as public digital rails, enabling private-sector innovation. The Monetary Authority of Singapore sponsors SGFinDex, which allows individuals to aggregate financial information from multiple institutions through a consent-based platform that coordinates queries without storing data.
For Vietnam, a Coordinated Utility Model appears most suitable. The country shares similarities with India and Singapore in terms of strong government leadership, commitment to national digital identity infrastructure, and the SBV’s institutional authority. A purely market-driven approach, or a heavily mandated one, would likely serve Vietnam less well than a coordinated, balanced model.
Under such a model, the government and SBV would establish the legal framework, API standards, and consent mechanisms, while providing foundational infrastructure such as the National Population Database. A neutral intermediary, potentially under SBV sponsorship or through an expanded role for Napas or the Credit Information Centre, could act as a coordination utility similar to SGFinDex. This entity would provide standardised API gateways, manage consent, and route encrypted, authorised data queries without copying or storing data.
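To make the mechanism concrete, the sketch below shows in Python how such a coordination utility, in the spirit of SGFinDex, might work in principle: it keeps only consent metadata, checks that a requesting institution holds valid, unexpired customer consent for a given data scope, and then relays the query to the data holder’s endpoint without copying or storing the payload. This is a minimal illustration under stated assumptions; the names used (CoordinationUtility, ConsentRecord, the stubbed bank endpoint) are hypothetical and are not part of any SBV, Napas, or SGFinDex specification.

```python
"""Illustrative, hypothetical sketch of a consent-gated coordination utility.

The utility validates customer consent and routes a query to the data-holding
institution, returning the response to the requester without persisting it.
"""
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable


@dataclass
class ConsentRecord:
    customer_id: str      # unified customer ID, e.g. linked to the national ID
    data_holder: str      # institution that holds the data
    requester: str        # institution asking for the data
    scope: str            # e.g. "income_history", "insurance_history"
    expires_at: datetime  # consent is time-bound

    def covers(self, requester: str, scope: str, now: datetime) -> bool:
        return (self.requester == requester
                and self.scope == scope
                and now < self.expires_at)


class CoordinationUtility:
    """Routes authorised queries; stores consent metadata only, never payloads."""

    def __init__(self) -> None:
        # Registered institution endpoints: in practice these would be
        # standardised Open API gateways; here they are plain callables.
        self._endpoints: dict[str, Callable[[str, str], dict]] = {}
        self._consents: list[ConsentRecord] = []

    def register_institution(self, name: str,
                             endpoint: Callable[[str, str], dict]) -> None:
        self._endpoints[name] = endpoint

    def record_consent(self, consent: ConsentRecord) -> None:
        self._consents.append(consent)

    def route_query(self, requester: str, data_holder: str,
                    customer_id: str, scope: str) -> dict:
        now = datetime.now(timezone.utc)
        authorised = any(
            c.customer_id == customer_id and c.data_holder == data_holder
            and c.covers(requester, scope, now)
            for c in self._consents
        )
        if not authorised:
            raise PermissionError("No valid customer consent for this query")
        # Forward the query and return the response directly: the utility
        # relays the data but does not copy or store it.
        return self._endpoints[data_holder](customer_id, scope)


def bank_a_endpoint(customer_id: str, scope: str) -> dict:
    # Stand-in for a bank's Open API endpoint; a real deployment would use
    # mutual TLS and end-to-end encrypted payloads.
    return {"customer_id": customer_id, "scope": scope, "value": "verified-sample"}


if __name__ == "__main__":
    utility = CoordinationUtility()
    utility.register_institution("bank_a", bank_a_endpoint)
    utility.record_consent(ConsentRecord(
        customer_id="VN-0001", data_holder="bank_a", requester="bank_b",
        scope="income_history",
        expires_at=datetime(2030, 1, 1, tzinfo=timezone.utc),
    ))
    print(utility.route_query("bank_b", "bank_a", "VN-0001", "income_history"))
```

The design point the sketch is meant to surface is that the intermediary persists consent records but never the relayed data, which is what would allow a neutral body to coordinate exchange without itself becoming a data aggregator.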
If data fragmentation persists over the next three to five years, what risks could arise? What actions are urgently needed to avoid missing the transformation window?
The most significant risk is that AI adoption in banking remains superficial. Without integrated data, AI will struggle to move beyond demonstration use cases. Risk management will remain incomplete, customer journeys fragmented, and capital allocation inefficient. More critically, the banking sector could fall behind fast-evolving fintech ecosystems built on interoperable data foundations.
Avoiding this scenario requires a coordinated and realistic roadmap aligned with national policy momentum.
For the State Bank of Vietnam, the immediate priority is to finalise the legal framework for data sharing. This includes clearly defining which data categories can be shared, outlining the responsibilities of participating parties, and establishing strong consent governance mechanisms aligned with personal data protection regulations. While a decree introducing a regulatory sandbox for open APIs has been issued, additional implementing guidance and policy incentives are needed to support effective, large-scale deployment.
The SBV’s standardisation leadership must also continue. Circular 64/2024/TT-NHNN and the industry-wide Open API standards provide a common technical language, but these standards require ongoing review and refinement to reflect operational realities and technological innovation.
For commercial banks, modernising core systems is essential. Effective connectivity and data sharing cannot occur on outdated core infrastructure. Banks must upgrade or replace legacy cores, build API-first data platforms, and ensure readiness for controlled data openness. Equally important is investment in human capital and data governance processes so that data becomes a strategic capability rather than a stored asset.