SHAP - Holiday rental home fraud

Responsible organisation: City of Amsterdam (Governmental)

In the city of Amsterdam, a pilot is being carried out that uses AI and algorithms to track down people illegally renting out their homes via platforms such as Airbnb. According to the Court of Audit of Amsterdam, housing fraud is a growing problem in the city, causing harm to neighbourhoods. The algorithm includes information about previous housing fraud, information from the city's population registries, and building data from the city's registry of addresses and buildings. While civil servants could analyse this information themselves, the AI system is expected to speed up the process significantly and enable more accurate responses, as it can connect multiple factors in ways civil servants would not be able to. According to the AI registry of the city of Amsterdam, once a new report is received in which a rental is suspected of illegal holiday rental, a random forest regression generates a probability, based on a combination of different values, resulting in either a high or low suspicion of fraud. If the probability of housing fraud produced by the algorithm is higher than 50 percent, an employee can prioritize the address for a field investigation. The city notes, however, that tackling fraud remains human work: the algorithm does not make decisions, merely recommendations. Civil servants must decide whether to investigate an address further by combining their expertise with the algorithm's predictions. The AI system provides a visualization of the risk assessment, showing which data play a role in the recommendation and which do not. Furthermore, the system does not ultimately decide whether fraud has taken place, as this remains the responsibility of the department. The AI system thus acts merely as a first recommendation in the process of determining whether housing fraud is taking place. 
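The scoring step described above can be sketched in a few lines. This is a hypothetical illustration only: the feature names and data are invented, and the actual model, inputs and threshold handling used by the city (and the SHAP-based explanation the title refers to) are not public in this level of detail. It shows the general shape of the pipeline: a random forest produces a fraud probability for a reported address, and probabilities above 50 percent flag the address for possible prioritisation by a human.

```python
# Hypothetical sketch of the pilot's scoring step, assuming a random forest
# classifier over registry-style features. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic training data: each row is an address with illustrative features
# (e.g. prior fraud reports, registered occupants, dwelling size).
X_train = rng.random((500, 3))
# Synthetic label: 1 = fraud confirmed in a past investigation.
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0.9).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# A new report arrives: score the address.
new_report = rng.random((1, 3))
p_fraud = model.predict_proba(new_report)[0, 1]

# Above the 50 percent threshold, an employee can prioritize a field
# investigation; the decision itself stays with the civil servant.
flag_for_investigation = p_fraud > 0.5
print(round(p_fraud, 2), flag_for_investigation)
```

The key design point the case description stresses is reflected in the last lines: the model output is a recommendation signal, not a decision.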
The use of the AI system is expected to optimize the city's limited inspection capacity in the most efficient and effective manner, so that field investigations are conducted at premises where fraud is indeed more likely. However, there are many risks of public value destruction, of which the city is already aware. Therefore, personal data such as date of birth, nationality and others are not included in the algorithm, limiting prejudice against a particular demographic and reducing privacy concerns. Even when personal information is not used, however, the AI may indirectly lead to certain neighbourhoods being targeted more often, potentially resulting in discriminatory recommendations. The city of Amsterdam is therefore conducting ongoing research during the pilot to tackle this form of unintended algorithmic bias, highlighting that gains in efficiency may lead to more discriminatory services, as the calculations are based on correlations and not causations.
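One simple check of the kind such bias research could involve is comparing how often addresses are flagged across neighbourhoods. The sketch below is purely illustrative (the neighbourhood names, flag rates and ratio metric are assumptions, not the city's actual methodology): it computes per-neighbourhood selection rates, where a large gap would suggest the model indirectly targets some areas even though personal data is excluded from its inputs.

```python
# Hypothetical bias check: compare flagging rates across neighbourhoods.
# Neighbourhood labels and flags here are synthetic, not city data.
import numpy as np

rng = np.random.default_rng(1)
neighbourhood = rng.choice(["Centrum", "Noord", "West"], size=1000)
# Stand-in for "probability above 50 percent" flags from the model.
flagged = rng.random(1000) < 0.2

# Selection rate per neighbourhood: systematic gaps would warrant review,
# since correlations in registry data can proxy for protected attributes.
rates = {n: flagged[neighbourhood == n].mean()
         for n in ("Centrum", "Noord", "West")}
ratio = min(rates.values()) / max(rates.values())
print(rates, round(ratio, 2))
```

A ratio well below 1 between the least- and most-flagged neighbourhood would be one signal, among others, that the recommendations skew geographically.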

Additional information

Source AI Watch - Artificial Intelligence in public services. Overview of the use and impact of AI in public services in the EU
Website https://algoritmeregister.amsterdam.nl/vakantieverhuur-woningfraude/
Start/end date 2021/01/01 -
Still active? Unknown

Related AI cases