Letting agents are being urged to avoid using or offering services based on so-called ‘black box’ AI – opaque systems that make decisions without clear reasoning or accountability.
Reapit, the PropTech firm making the call, says its own services only use what it describes as “explainable, ethical AI.”
Recent analyses from computing giants such as IBM suggest that many AI systems are so complex that even their creators cannot fully explain how they reach decisions. Reapit claims that this lack of transparency can create legal, ethical, and operational risks for letting agencies.
With AI adoption accelerating, agencies are increasingly relying on tools to automate valuations, generate listings, and manage tenant interactions.
But Reapit claims that allowing ‘black box’ AI to make decisions without oversight could breach the Data Use and Access Act 2025 and existing consumer protection laws requiring businesses, including agencies, to disclose when AI is used in decision-making.
According to law firm Freshfields, the law gives individuals “rights to make representations, obtain human review, and contest significant decisions”. In the property sector, this could cover tenant applications, vendor enquiries, maintenance requests and more.
In addition, misleading property descriptions, altered images, or automated valuations without oversight could breach the Digital Markets, Competition and Consumers Act 2024 if AI usage is not disclosed.
“The question isn’t whether agents will use AI, it’s whether they’ll use the right AI,” says Matt McGown, chief product officer at Reapit. “Generic tools might save time, but they can also introduce risk. If your AI can’t show how it reached a decision, how much it edited a photo, what information it used to draft a property description, or why it approved a tenant or prospective buyer for a viewing, you’re risking fines and your hard-won reputation.”
A Reapit survey of 624 UK property professionals, conducted in July this year, found that 62% believe AI can make decisions and learn independently; 29% see AI primarily as automation; and 79% have encountered tools marketed as AI that were in fact basic scripting.
Reapit says its approach to AI is grounded in transparency, compliance, and control. Unlike generic models trained on public internet data, Reapit AI (RAI) is platform AI powered by each agency’s own data, already housed within Reapit.
“We are building Reapit AI (RAI) with transparency and security at its core. RAI is set to help agencies move faster without compromising trust: agency staff give the first approval on a recommendation, and after that the system handles similar cases automatically with a full audit trail. You stay in control, and the business moves faster without cutting corners,” McGown continues.
“Agents don’t need another shiny tool; they need fewer late nights writing ads and less time buried in admin, without losing their unique approach to property. That’s what Reapit AI is designed to deliver.”