The National Lottery Community Fund is urging applicants to use Artificial Intelligence (AI) tools with caution in funding applications

Tom Anstey | Planet Attractions | 24 Jan 2025

While a powerful tool, AI can also cause problems when used to write funding applications
The National Lottery Community Fund (NLCF) has urged caution when using artificial intelligence tools in the process of writing grant funding applications to the organisation.
As AI technology becomes more integrated into daily life, with tools like ChatGPT, Gemini, Copilot, and more making waves in both personal and professional spheres, the NLCF has said that it recognises the growing interest among communities in using these tools to enhance their funding applications.
In a statement, the Fund acknowledged the benefits of AI, particularly for applicants who may not have English as their first language or are new to the funding application process. However, the organisation also emphasised the importance of using AI with caution.
“While AI can provide a useful starting point, what it generates is often not as strong as it may seem,” said the NLCF. The Fund also cautioned that AI-supported applications may fail to capture the unique qualities of a community or the distinctive goals of a project, making applications too generic and potentially less compelling.
Key tips for applicants include focusing on community impact, making the content personal, and being specific in outlining project details.
“AI tools often produce content that lacks the depth and individuality needed to tell the true story of your community,” the Fund advised. “Be sure to include insights and feedback directly from the people you’re aiming to serve, and demonstrate how your project will make a tangible difference.”
The statement also touched on the importance of careful budgeting. While AI can generate budget suggestions, applicants were urged to ensure their budgets align with their plans, follow the programme’s funding rules, and reflect value for money. The Fund stressed the need to verify AI-generated content for accuracy, as AI can sometimes produce misleading or incorrect information.
Additionally, applicants were reminded to be mindful of privacy concerns. Free AI tools can store data input by users, which may pose a risk to confidentiality. The Fund encouraged applicants to comply with data protection regulations and seek advice from the Information Commissioner’s Office if needed.
Finally, the Fund noted the environmental impact of AI tools, which require significant energy and water to run data centres. The organisation urged applicants to use AI “mindfully” and only where it would clearly improve the quality of their applications.
NLCF’s guidance aims to help communities leverage AI in ways that enhance their chances of securing funding while maintaining the integrity of their applications and aligning with best practices in data security and environmental responsibility.