In the era of the digital economy, how can antitrust enforcement be effectively strengthened and the disorderly expansion of capital prevented?

On October 24, Zhang Xiaohui, Dean of the Wudaokou School of Finance at Tsinghua University, stated at the 3rd Bund Finance Summit that algorithm supervision must establish the principle of openness and transparency. To ensure that users are treated fairly, risk or impact assessments must be conducted in advance for automated decision-making so as to avoid the risks caused by algorithm abuse. In the future, consideration should also be given to bringing algorithms within the scope of antitrust supervision.

Zhang Xiaohui pointed out that the digital economy relies mainly on algorithms to enhance economic efficiency and improve customer experience. In practice, the algorithms of large technology companies have shaped users' consumption behavior to a large extent. The complexity of algorithms, together with deliberate concealment by the companies that deploy them, makes it impossible for most people to understand how the algorithms work. As a result, large technology companies that deploy algorithms, especially platform companies holding data on almost every aspect of personal life, occupy a de facto dominant position, forming an "algorithm hegemony" that seriously harms the legitimate rights and interests of the algorithms' counterparties, namely consumers.

Zhang Xiaohui further analyzed that algorithms have become the main tool by which large technology companies control the market. In the name of protecting competitive advantages and trade secrets, algorithms create gray space for large technology companies to deliberately hide rules, manipulate consumers, and discriminate. First, unfair rankings bias results toward the companies' own products or their business partners. Second is algorithm discrimination, including price discrimination and discrimination by identity, gender, and education; "big data kills the familiar," charging existing customers more, is a form of price discrimination that offers different users different prices for the same goods. Third, algorithms induce consumers to over-consume and take on risk through inducing information and concealed risks; smart algorithms easily mask the complexity of financial risks, which not only encourages excessive consumption and borrowing but may also mislead investors in the field of financial investment. Fourth, the convergence of large technology companies' business models and algorithms can easily trigger herd effects, producing sharp swings in the market. In particular, because large technology companies mostly serve members of the public with weak financial expertise and limited ability to recognize risk, such swings are more likely to trigger mass social incidents and may even lead to systemic financial risks.

Zhang Xiaohui suggested that external supervision of the main algorithms of large technology companies is needed, along with greater transparency. Given that the algorithm "black box" makes the operation and decision-making of recommendation algorithms opaque, the relevant departments require algorithm recommendation services to increase transparency: formulating and publishing rules for algorithm recommendation services; improving the transparency and interpretability of rules for retrieval, ranking, selection, push, and display; informing users prominently that algorithm recommendation services are being provided; and publicizing, in an appropriate manner, the basic principles, purposes, and operating mechanisms of those services. Article 25 of the new credit reporting regulations issued by the People's Bank of China requires credit reporting companies to disclose their personal credit scoring algorithm models. The "Internet Information Service Algorithm Recommendation Management Regulations (Draft for Comment)" issued by the Cyberspace Administration of China also addresses issues such as "big data kills the familiar" and algorithm discrimination, requiring practitioners to improve algorithm management systems, optimize algorithm recommendations, regularly review and evaluate algorithm models, strengthen content management, and promote the use of algorithms for good.

The following is the full text of the speech:

The digital economy is integrating into every field and every stage of economic and social development. Its speed of development, breadth of reach, and depth of influence are unprecedented, and it is becoming a key force in reorganizing global factor resources, reshaping the global economic structure, and changing the global competitive landscape.

Against the backdrop of the digital economy, how to give full play to the role of financial technology, represented by digital finance and smart finance, in accelerating the transformation and upgrading of the financial industry and better serving the real economy is a challenge that China's financial industry must face head-on. Judging from the recent development of financial technology in China, several issues may need close attention:

1. Balance privacy protection and fair use in data governance


The digital economy can certainly improve economic efficiency, but the premise is that data governance is done well, especially the protection of data privacy and the fair use of data. At present, China still faces major challenges in data governance. On the one hand, large technology companies excessively collect customer data and mix data across product lines, infringing on customers' data privacy. To obtain financial services from platform companies, Chinese consumers often have to hand over personal information, and excessive data collection abounds; during the period of rapid growth of "cash loans" in 2016-2017, borrower information was even bought and sold. Some technology companies arbitrarily mix user data across different product lines, making privacy protection even harder; this would not be allowed even in developed countries. Imagine if companies such as Google, Microsoft, and Amazon could freely use personal information to conduct financial business: these institutions might already have become the largest lenders in the global financial market. On the other hand, large technology companies' data openness and utilization are insufficient. Some large technology companies obstruct the migration of customer data to competitors, limiting users' free choice among different platforms. Moreover, because the data is not open for use, some merchants cannot open the data they generate on the platform to commercial banks; commercial banks therefore cannot directly grant credit and lend to them and can only rely on joint lending or loan assistance with large technology companies.

The European Union is relatively advanced in digital governance, and many of its concepts and methods have been borrowed by other countries and regions, including the General Data Protection Regulation and the more recently formulated Digital Services Act and Digital Markets Act. In summary: first, apply the most stringent protection to data privacy and strengthen users' control over their data. The General Data Protection Regulation sets out principles for processing personal data, such as fairness and transparency, purpose limitation, data minimization, accuracy, storage limitation, and integrity and confidentiality, and strengthens users' control over personal data, including the right to be informed, the right to object, the right to restrict processing, the right to be forgotten, and the right to portability. Second, large technology companies must not consolidate personal data on the basis of "core platform services": they must not combine the personal data they obtain with personal data obtained from other channels or from third-party services, and must instead establish a data firewall in which each service module protects the personal data it obtains. Third, large technology companies must not use data generated by commercial users to compete with those users. The data that business customers provide or generate on a platform are generally trade secrets; if a large technology company runs a pure platform business this does not distort competition, but it is plainly unfair for the company to launch its own competing business after gaining insight into those customers' operations. Fourth, large technology companies must facilitate data use by commercial users or by third parties authorized by them. This is intended to break large technology companies' monopoly over commercial-user data, similar to applying the "open banking principle" to the data market. Fifth, the data that commercial users and end users provide to or generate on large technology companies' platforms are portable. The General Data Protection Regulation introduced the concept of portability, and the Digital Markets Act extends this measure from natural persons to legal persons, facilitating the free switching of commercial users and end users between platforms.

China should also learn from the EU's practices and strive to balance privacy protection and fair use in data governance. It is necessary both to clearly distinguish the boundaries between data as a private good, a quasi-public good, and a public good, and to clarify the multiple attributes of various types of data and the systemic risks and social side effects that may arise from cross-domain mixing of data; to strictly implement the Data Security Law, the Personal Information Protection Law, and other laws and regulations; to strive to improve supervisory capabilities; to adhere to the two-pronged approach of institutional norms and technical safeguards; to strictly prevent data misuse and abuse; and to effectively protect the security of financial data and personal privacy.

2. Implement open and transparent supervision of algorithms

The digital economy relies mainly on algorithms to enhance economic efficiency and improve customer experience. In practice, the algorithms of large technology companies have shaped users' consumption behavior to a large extent. The complexity of algorithms, together with deliberate concealment by the companies that deploy them, makes it impossible for most people to understand how the algorithms work. As a result, large technology companies that deploy algorithms, especially platform companies holding data on almost every aspect of personal life, occupy a de facto dominant position, forming an "algorithm hegemony" that seriously harms the legitimate rights and interests of the algorithms' counterparties, namely consumers. It should be said that algorithms have become the main tool by which large technology companies control the market. In the name of protecting competitive advantages and trade secrets, algorithms create gray space for large technology companies to deliberately hide rules, manipulate consumers, and discriminate. First, unfair rankings bias results toward the companies' own products or their business partners. For example, some fintech companies' asset allocation recommendations are skewed toward products closely tied to their own interests, and some platforms use specific algorithms to filter out poor-quality products while exempting their own products or services. Second is algorithm discrimination, including price discrimination and discrimination by identity, gender, and education. "Big data kills the familiar," charging existing customers more, is a form of price discrimination that offers different users different prices for the same goods. Compared with traditional discrimination, algorithmic discrimination is often harder to restrain; discriminatory pricing can only be practiced by firms with monopoly power and would not exist in a fully competitive market, so it constitutes a new type of monopolistic behavior. Third, algorithms induce consumers to over-consume and take on risk through inducing information and concealed risks. Smart algorithms easily mask the complexity of financial risks, which not only encourages excessive consumption and borrowing but may also mislead investors in the field of financial investment. In addition, the convergence of large technology companies' business models and algorithms can easily trigger herd effects, producing sharp swings in the market. In particular, because large technology companies mostly serve members of the public with weak financial expertise and limited ability to recognize risk, such swings are more likely to trigger mass social incidents and may even lead to systemic financial risks.
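To make the mechanism of "big data kills the familiar" concrete, the following is a minimal, purely hypothetical sketch in Python of how a platform's pricing logic could quote different prices to different users for the same good. The profile fields, thresholds, and markups are invented for illustration only and do not describe any real company's system.

```python
# Hypothetical illustration of personalized ("big data kills the familiar") pricing.
# All profile fields and weights are invented for illustration; they do not
# reflect any real platform's algorithm.

BASE_PRICE = 100.0  # list price of the same good for every user

def personalized_price(profile: dict) -> float:
    """Quote a price for one user based on an assumed estimate of willingness to pay."""
    price = BASE_PRICE
    # Loyal, frequent buyers are assumed less likely to comparison-shop,
    # so the sketch quietly charges them more.
    if profile.get("orders_last_year", 0) > 50:
        price *= 1.15
    # Users who have not installed a competing app get a further markup.
    if not profile.get("installed_competitor_app", False):
        price *= 1.05
    # New or price-sensitive users receive a discount to win them over.
    if profile.get("is_new_user", False):
        price *= 0.85
    return round(price, 2)

if __name__ == "__main__":
    loyal_user = {"orders_last_year": 80, "installed_competitor_app": False}
    new_user = {"is_new_user": True}
    print(personalized_price(loyal_user))  # 120.75 -- above the list price
    print(personalized_price(new_user))    # 85.0   -- below the list price
```

The point of the sketch is that the two users never see each other's quotes, which is why this kind of differential pricing is hard for consumers to detect and why the speech argues for external transparency requirements.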

In light of this, external supervision of the main algorithms of large technology companies is needed, along with greater transparency. If an algorithm is unfavorable to its users, that is, the platform operators, they will certainly change it immediately; but if it is unfavorable to consumers, it is likely to be corrected only when it is exposed or when openness and transparency are forced. The regulatory authorities have fully recognized this point and taken corresponding measures. Article 25 of the new credit reporting regulations issued by the People's Bank of China requires credit reporting companies to disclose their personal credit scoring algorithm models. The "Internet Information Service Algorithm Recommendation Management Regulations (Draft for Comment)" issued by the Cyberspace Administration of China also addresses issues such as "big data kills the familiar" and algorithm discrimination, requiring practitioners to improve algorithm management systems, optimize algorithm recommendations, regularly review and evaluate algorithm models, strengthen content management, and promote the use of algorithms for good. Given that the algorithm "black box" makes the operation and decision-making of recommendation algorithms opaque, the relevant departments require algorithm recommendation services to increase transparency: formulating and publishing rules for algorithm recommendation services; improving the transparency and interpretability of rules for retrieval, ranking, selection, push, and display; informing users prominently that algorithm recommendation services are being provided; and publicizing, in an appropriate manner, the basic principles, purposes, and operating mechanisms of those services.

In short, algorithm supervision must establish the principle of openness and transparency to ensure that users are treated fairly. For automated decision-making, risk or impact assessments should be conducted in advance to avoid the risks of algorithm abuse. In the future, consideration should also be given to bringing algorithms within the scope of antitrust supervision.

3. Subject Internet consumer credit to supervision consistent with that of traditional financial services

The rapid development of Internet consumer credit has objectively improved the convenience of financial services, reduced financing costs, and helped remote areas, small and medium-sized enterprises, and ordinary households gain access to more financial services. However, because financial technology has partly changed the form of traditional financial services, regulatory gaps and regulatory arbitrage have emerged, giving rise to certain risks. On the one hand, large technology companies engage in financial services in disguise without holding licenses. They not only provide credit card-like services, lending services, and asset management services, but also use the Internet and other information technologies to break through restrictions on cross-industry and cross-regional operations and to absorb public deposits in disguise. In addition, some Internet companies, under the banner of inclusive finance, recommend customer resources to licensed financial institutions that provide the loan funds, and engage in credit information and loan facilitation services without a credit reporting license. On the other hand, Internet consumer credit distorts financial values and inadequately protects financial consumers. As is well known, an important cause of the US subprime mortgage crisis was insufficient protection of financial consumers: families who should never have obtained loans took on heavy burdens under the lobbying of commercial banks, accumulating substantial risks. Generally speaking, financial values advocate that "you reap what you sow" and oppose the hedonism of getting something for nothing, over-borrowing, and consuming ahead of one's means. In recent years, however, some fintech companies, without adequately assessing their customers, have over-marketed financial products such as online consumer loans through various consumption scenarios, extending large loans to students and other groups with low actual income and weak repayment ability but a preference for borrowing to consume in advance, thereby misleading users. It is therefore necessary to apply license-based management to fintech companies in accordance with the principle of "same business, same supervision" to prevent regulatory arbitrage. Frankly speaking, Chinese fintech companies enjoyed a "regulatory dividend" in their early stages of development: some institutions previously engaged in deposit and loan businesses similar to those of banks without having to meet regulatory requirements such as the capital adequacy and leverage ratios under the Basel Accords. This not only created unfair competition between fintech companies and traditional financial institutions and magnified the operating pressure on traditional financial institutions, especially small and medium-sized ones, but also distorted incentives, prompting fintech companies to pursue regulatory arbitrage excessively and weakening their motivation to improve financial services to the real economy through their own technological innovation.

4. Prevent vicious competition and cross-industry controlling investments by large technology companies

Engaging in vicious competition motivated by "winner takes all" is a common tactic of large technology companies. Because of network effects, the financial technology field usually develops toward a "winner takes all" outcome. According to research by the Bank for International Settlements, the three elements of data, network effects, and interwoven business activities reinforce one another and form a mutually driving loop. Once a large technology company becomes the winner, it captures most of the gains of the entire sub-industry. To pursue "winner takes all," many large technology companies resort to unfair competition. After entering a new industry, some use the profits from industries where they hold a monopoly to fight price wars, seizing market share through cash burning, cross-subsidies, and other unfair practices, finally forming new industry monopolies, and the game continues. Once they gain market dominance, they engage in exclusionary competition and harm consumers' interests: having subsidized heavily in the early stage, large technology companies often "harvest" in reverse through price increases and high commissions in order to recover costs, and use their market dominance to force users to "choose one of two" so as to block potential competitors. It is therefore necessary to strengthen antitrust supervision in the financial technology field to prevent vicious competition. On the one hand, the market access system, the fair competition review system, and the fair competition supervision system should be improved, and a comprehensive, multi-level, three-dimensional supervision system should be established to achieve full-chain, all-field supervision before, during, and after the event. On the other hand, behaviors and practices that harm consumer interests and hinder fair competition should be corrected and regulated in a timely manner, platform monopoly and the disorderly expansion of capital should be prevented, and in particular platform companies' cross-industry controlling investments should be controlled, so as to maintain fair competition in the industry and protect the legitimate rights and interests of consumers. At present, some domestic super-platforms with hundreds of millions of users control banks and insurance companies while also controlling fintech companies that serve small and medium-sized banks; they participate in smart-city projects at all levels of government and also control powerful cloud computing companies that provide computing and storage services for financial institutions. The potential conflicts of interest and data risks should not be underestimated. Given that Google, Amazon, and Microsoft now host more and more banking, insurance, and market businesses on their cloud computing platforms, the U.S. Treasury, the European Union, the Bank of England, and the Bank of France have all proposed strengthening the review of technology companies' cloud technologies to reduce the risk of banks and other financial industries becoming "overly" dependent on cloud computing platforms. China's central bank and the China Banking and Insurance Regulatory Commission are also discussing the establishment of cloud computing and data centers for the financial industry.
The China Banking and Insurance Regulatory Commission also requires large state-owned commercial banks to provide financial technology services to city commercial banks and rural commercial banks for free.

Supervising platform monopoly and the disorderly expansion of capital is very important. In the future, in addition to clarifying the principal responsibilities and obligations of platform enterprises, industry self-discipline mechanisms, social supervision, media supervision, and public supervision should be built into a combined supervisory force. In addition, it may be necessary to establish a negative list to prevent and respond to such risks. (This transcript has not been reviewed by the speaker.)