AI In Grant Applications: Streamlining Or Skewing The Process?

Created By Orr Group’s AI Taskforce: Terry Cangelosi, Bobby Hunter, Jack Little, Katie Nickels, CJ Orr, Jace Prokupek, and Rebecca Voulgarakis

December 6, 2023

The rapid adoption of Artificial Intelligence (AI) in our daily lives has heightened discussions surrounding its ethical use and detection. HR executives are weighing the pros and cons of screening cover letters for AI-generated content, and academic institutions are investing heavily in AI-detection software. Proponents of AI writing tools, however, ask: why should we penalize AI-generated content and the use of AI to boost productivity? What if AI can help secure grants to fund nonprofit missions?

Well, we are at a point in our technological evolution where those questions are front and center.

In the nonprofit sector, AI's use is on the rise, particularly in funding proposals (e.g., to foundations, corporations, and government agencies) and grant reporting. The rapid uptake of tools like Grantable has significantly accelerated the creation of customized grant applications and reports, tasks typically known for their meticulous and repetitive nature. Yet many seasoned grant writers would argue there is an “art” to crafting compelling grants, ensuring they not only align with funders’ objectives but also resonate with the nonprofit’s core activities. This nuance, they argue, cannot be outsourced to a computer program.

A 2023 Good Grants article reported growing concern among funders, with some contemplating clauses in their grant contracts that would prohibit the use of AI-generated content. In response to these emerging issues, Orr Group’s AI Taskforce has taken the initiative to explore the various perspectives on AI-generated content. This process has involved examining the viewpoints of nonprofits, which advocate for AI to alleviate their workload and streamline grant processes, and of funders, who oppose the use of AI in grant proposals and reports.

Position in Favor of AI-Generated Content
  • Saves Valuable Resources: On average, completing a foundation grant application takes 20-30 hours, or roughly 60% of a full work week for a single fundraiser, and such applications typically succeed in securing funding about 50% of the time. Reducing the time required to complete a grant by just 10-20% would therefore yield substantial savings over the course of a year. For instance, if a fundraiser is expected to spend 250 hours (just over six full work weeks) submitting 10 grants, a 15% reduction in grant writing time would free up roughly one full work week annually (see the back-of-the-envelope sketch at the end of this list). That additional time would allow the fundraiser to craft more grant applications, ultimately enabling the nonprofit to secure more funding for its programs.
  • Levels the Playing Field: Smaller nonprofits often struggle to approach high-end funders. In contrast, larger and more established nonprofits, equipped with expert grant writers, advanced impact measurement methods, and support from various professionals, tend to craft more persuasive grant proposals. With the assistance of AI tools, even smaller nonprofits can now produce content that rivals the quality of these larger, more sophisticated organizations, essentially leveling the playing field. Beyond writing support, AI-generated design eliminates the need for a costly designer to create a visually persuasive proposal, and AI tools offer more cost-effective options for analyzing data to demonstrate impact.
  • Enhances Consistency in Tone and Style: When working on grant applications, consistency in tone and style is crucial to conveying a unified and professional image. AI can be instrumental in achieving this, especially when multiple collaborators with varying writing styles are involved. For instance, when a grant application is composed by staff across departments such as operations, finance, development, and programs, each contributor brings a specific viewpoint and approach to the funding pitch. AI can integrate sections from these contributors into a cohesive, fluid narrative, and this uniformity in tone enhances readability and strengthens the proposal’s overall impact. By providing tools for style and tone analysis, AI helps the final document reflect a singular, compelling voice despite multiple contributors. AI can also flag discrepancies within a grant application, such as inconsistencies in promised impact numbers or total costs, which are particularly common in complex applications involving multiple contributors. Cohesiveness is essential in making a strong, persuasive case to funders who value clarity in grant applications.
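
For readers who want to check the time-savings math, below is a minimal back-of-the-envelope sketch in Python. The inputs are hypothetical and simply mirror the example in the first bullet (10 grants per year, roughly 25 hours per grant, an assumed 15% reduction in writing time, and a 40-hour work week); they are illustrative assumptions, not benchmarks.

    # Back-of-the-envelope sketch of the time savings described above.
    # All figures are hypothetical and mirror the example in the first bullet.
    grants_per_year = 10
    hours_per_grant = 25        # midpoint of the 20-30 hour range
    time_reduction = 0.15       # assumed 15% reduction in writing time from AI tools
    hours_per_work_week = 40

    total_hours = grants_per_year * hours_per_grant   # 250 hours per year
    hours_saved = total_hours * time_reduction         # 37.5 hours
    weeks_saved = hours_saved / hours_per_work_week    # ~0.9 work weeks

    print(f"Hours spent on grants per year: {total_hours}")
    print(f"Hours saved with AI assistance: {hours_saved}")
    print(f"Work weeks freed up annually:   {weeks_saved:.2f}")

Running the sketch yields about 37.5 hours saved, or just under one full work week per year, consistent with the estimate above.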
Position Against AI-Generated Content

  • Compromises on Quality: Critics have voiced concerns about ChatGPT’s potential to spread misinformation. They highlight instances where the platform may present misleading information with a high degree of confidence, which could deceive the average user into thinking it is credible. Such misinformation can be particularly challenging for funders, who rely on accurate data and information to make informed decisions. The risk here is that they may base their funding decisions on incorrect or biased information, leading to the potential misallocation of resources.
  • Hinders Collaborative Deliberations: Many articles emphasize the benefits of a thorough grant application process, highlighting its role in giving organizations a platform to explicitly outline their goals. The process also serves as a tool to ensure that the entire team aligns with the organization’s vision and objectives, promoting better planning, coordination, and execution of projects. Critics argue that replacing the grant writing process with an auto-generated proposal could diminish this collaboration and internal deliberation, both crucial for ensuring organizations can fulfill the commitments made in large grant requests.
  • Potential for Plagiarism and Lack of Originality: One of the primary concerns associated with AI-generated content is the risk of plagiarism. Critics contend that AI, while capable of quickly processing and synthesizing vast amounts of information, may inadvertently reproduce existing content without proper attribution. In the context of grant funding applications and reporting, where originality and credibility are crucial to building trust with funders, this issue becomes particularly critical. The reliance on AI to create content raises questions about the authenticity and originality of the ideas presented. These factors play a vital role in demonstrating the relevance of a project and building a strong reputation with funders.
Conclusion

While both sides of the argument have been explored, the Orr Group AI Taskforce supports the ethical, legal, and responsible integration of AI in grant applications and other time-consuming processes. We believe that AI’s ability to save time is a significant advantage for nonprofits, allowing fundraisers more time to focus on securing additional funding and furthering their missions. We recognize, however, that the use of AI is not without its complexities and potential ethical considerations.

Our commitment extends to ensuring that AI adoption in grantmaking, and beyond, is carried out thoughtfully and responsibly. We emphasize the importance of transparency, fairness, and equity in AI-driven processes, and we therefore encourage the adoption of policies and guidelines across the philanthropic landscape that promote the responsible use of AI, with a focus on mitigating potential risks and maximizing the benefits it can bring to the nonprofit sector.

Explore more tools and resources from Orr Group’s AI Taskforce in our Knowledge Center!


Orr Group’s AI Taskforce aims to address AI’s shifting and expanding landscape and ensure the firm is up to date and adapting accordingly to better serve our team and our clients. Comprised of team members across the firm’s departments, including Terry Cangelosi, Bobby Hunter, Jack Little, Katie Nickels, CJ Orr, Jace Prokupek, and Rebecca Voulgarakis, the Taskforce was formed in May of 2023 to guide AI exploration, initiatives, and strategy development.