Emerging Generative Artificial Intelligence Governance in Community Associations

On July 29, 2024, the American Bar Association issued Formal Opinion 512, its first formal ethics guidance on the use of generative artificial intelligence (referred to as “GAI”). While addressed to attorneys, these ethical guideposts affect directors and managers when employing GAI in the daily operations of their community associations.

Formal Opinion 512 highlights the duties of competence and confidentiality under Rules 1.1 and 1.6 of the Model Rules of Professional Conduct. Directors and managers hold similar fiduciary duties, which were discussed in the previously posted Best Practices article. The key takeaway on competency is that directors and managers need to understand that GAI can “hallucinate” – that is, produce false or misleading content. GAI should therefore be treated as a first-draft assistant, not as a final editor. Directors and managers can build GAI literacy through training focused on input parameters and output verification to ensure accurate work product.

As for confidentiality, directors and managers are custodians of association records. For every GAI tool an association utilizes, it is important to verify the vendor’s policies on data encryption, storage location, document retention, and user accessibility. It is also important to know whether data input by users is used to train large language models. These verifications ensure data security and privacy compliance as GAI becomes integrated into associations’ standard operating procedures, especially for sensitive programs such as the Safe at Home Program under Civil Code section 5216.

Additionally, Formal Opinion 512 highlights the importance of transparent communication under Rule 1.4 of the Model Rules of Professional Conduct. Just as attorneys must disclose substantive GAI use to clients, directors and managers have a similar obligation: a duty to disclose to the community whenever GAI is used to help perform association duties, such as GAI-assisted dictation software that produces meeting transcripts. Disclosing such use would create an expectation that all meetings, as well as hearings, be transcribed. In turn, transcripts would be responsive to formal records requests under Civil Code section 5200 et seq. and become discoverable in litigation. While GAI-assisted transcription may document exactly what members say at meetings, its persistent use may burden the association more than help it.

Further, Formal Opinion 512 impacts attorneys’ fees under Rule 1.5 of the Model Rules of Professional Conduct, stating that attorneys should bill only for time spent reviewing GAI output, not for merely using the tool. Directors and managers would be wise to review their engagement agreements and applicable vendor contracts for any terms requiring GAI disclosure and verification. Going forward, more attorneys will likely be required to disclose GAI use in fee agreements and to specify whether time spent learning to use GAI is billable.

Though GAI implicates several fiduciary duties, it also presents an opportunity for community associations to enhance their standard operating procedures. Implementation can start with straightforward, low-sensitivity tasks, such as a resident newsletter or a welcome flyer, to build user confidence. Over time, directors and managers can develop prompting skills without simply pasting content into GAI software and inadvertently exposing sensitive information (e.g., “draft a concise, understandable [output] for residents about [topic]”; “change the tone to courteous but firm”; “create a seasonal maintenance checklist for a community association with [type of amenities] and [X] as the budget”).
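The bracketed prompts above can be kept as reusable templates so that staff fill in blanks rather than pasting association documents into a GAI tool. A minimal sketch, with hypothetical template names:

```python
# Hypothetical reusable prompt templates mirroring the bracketed
# examples above; placeholders are filled in before the prompt is
# pasted into a GAI tool, so no association document is uploaded.
NEWSLETTER_PROMPT = (
    "Draft a concise, understandable {output} for residents about {topic}."
)
CHECKLIST_PROMPT = (
    "Create a seasonal maintenance checklist for a community association "
    "with {amenities} and {budget} as the budget."
)

# Fill in the placeholders to produce the finished prompt text.
prompt = NEWSLETTER_PROMPT.format(
    output="newsletter blurb", topic="upcoming pool maintenance"
)
```

Keeping an approved template list also gives a board a simple way to standardize which kinds of prompts staff may use.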

There are free and low-cost options for GAI software, many of which include templates for infographics, FAQs, or meeting slides. As usage grows, associations may need to budget for upgraded versions, so long as the cost is justified by the work product. At a higher budget tier, and with guidance from legal counsel and an artificial intelligence governance professional, chatbots can be designed and integrated to field routine matters, such as maintenance requests, when managers are off the clock. Using predictive analytics, directors or managers with business backgrounds may be able to utilize GAI to forecast maintenance and other financial needs when completing reserve studies.

Whether drafting documents, managing communications, or analyzing data, GAI offers ever more accessible ways to streamline tasks. With the right knowledge and discipline, directors and managers can engage with GAI appropriately to make protocols more efficient, creative, and tailored to their members. Despite all that GAI can do, only humans can build communities.

Best Practices for Community Association Managers and Board Members Using GenAI

AI has become so prevalent that the California Legislature codified the term “Generative Artificial Intelligence” (“GenAI” for short) last year. Section 22757.1 of the California Business and Professions Code defines GenAI as artificial intelligence that can generate derived synthetic content, such as text, images, audio, or video. More than a dozen widely used software applications utilize GenAI, each available to consumers for free or at an affordable rate. Although a convenient time-saver, particularly for non-sensitive and non-substantive tasks, managers and board members of community associations should exercise the utmost discretion when using GenAI to ensure that fiduciary duties are maintained, especially the duties of confidentiality and care.

First is the duty of confidentiality. In part, this duty requires managers and board members to protect personally identifiable information (“PII” for short) from being compromised. Software applications that utilize GenAI can retain data and metadata, meaning whatever a user inputs may be stored and transmitted. For example, uploading an owner’s vehicle registration document containing a resident’s name, address, birthdate, and driver’s license number into ChatGPT would be enough for a third party to identify that resident. If that information were extracted from OpenAI’s database, whether through an inadvertent or intentional data breach, or if OpenAI were to change its terms of service and sell uploaded information to third-party services, either scenario could be deemed a breach of confidentiality on the part of the party that uploaded the data. This could, in turn, lead to violations of laws and regulations such as the California Consumer Privacy Act. That is why it remains a best practice to never share PII with vendors that offer GenAI. Taking the time to educate team members on what constitutes PII, and how to avoid sharing it, helps an association demonstrate compliance with federal and state privacy laws.

In practice, use only vendors that post clear privacy policies. Avoid copying and pasting an email or document into a GenAI prompt; if you must, redact all PII, including the metadata, before uploading. Alternatively, use generic prompts when asking GenAI for help, such as “draft a concise letter to the Association’s Board President about X” or “provide three different ways to tell Resident A that they are violating a rule about Y.” Literally type “Resident A” into the prompt; then use Find & Replace to substitute the owner’s actual name for “Resident A” before sending the letter. You can also tailor follow-ups to initial prompts, such as “revise the second paragraph in simpler terms,” to engage with GenAI rather than merely copying its initial output.
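The placeholder workflow above can be sketched in a few lines. This is a minimal illustration with hypothetical helper names and a made-up resident, not a substitute for proper redaction of all PII:

```python
# Hypothetical sketch of the "Resident A" workflow: swap the real name
# for a neutral placeholder before prompting, then Find & Replace it
# back into the final letter before it is sent.
PLACEHOLDER = "Resident A"

def redact(text: str, real_name: str) -> str:
    """Replace the resident's name before text goes into a GenAI prompt."""
    return text.replace(real_name, PLACEHOLDER)

def restore(draft: str, real_name: str) -> str:
    """Find & Replace step: put the real name back before sending."""
    return draft.replace(PLACEHOLDER, real_name)

# The prompt never contains the real name...
prompt_body = redact("Jane Doe left a trailer in guest parking.", "Jane Doe")
# ...and the GenAI draft is personalized only at the very end.
final_letter = restore("Dear Resident A, please remove the trailer.", "Jane Doe")
```

Note that a name swap alone does not remove other identifiers (address, unit number, license plate), so a human review for remaining PII is still required.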

Second is the duty of care, which requires, in part, that fiduciaries use reasonable effort to make informed, competent decisions and act in good faith. In this context, reasonable effort means that GenAI outputs are verified for accuracy before any subsequent decision-making step, such as posting notices to members, tenants, and guests. A manager or board member can demonstrate competency with GenAI by taking the time to understand its capabilities, limitations, and appropriate uses. For instance, substantive tasks such as amending a governing document require complex decision-making better suited to the association’s legal counsel than to GenAI. Reviewing the substance of any GenAI-created final product is crucial before it is distributed to board members, management employees, or anyone else.

Acting in good faith requires diligence and transparency, which are especially important when work product is assisted by GenAI. Diligence can be demonstrated by disclosing how GenAI was utilized and the steps taken afterward. However trustworthy GenAI may seem, it is wise to treat its output as a first draft and to document how a human verified it for accuracy. To demonstrate transparency, managers and board members should implement internal policies that track all daily operations assisted by GenAI, no matter how small or mundane the task may seem. Peers can hold each other accountable whenever they spot undisclosed use of GenAI; doing so ensures compliance, especially if they are ever asked to disclose such information in a Civil Code section 5200 records request.
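The tracking policy described above can be as simple as a running log with one row per GenAI-assisted task. A minimal sketch, with hypothetical field and helper names:

```python
import csv
import datetime
import io

# Hypothetical internal GenAI-usage log: one row per assisted task,
# recording the tool used, the task, and the human who verified the
# output, so the association can answer a records request if asked.
FIELDS = ["date", "user", "tool", "task", "verified_by"]

def log_genai_use(log_file, user, tool, task, verified_by):
    """Append one GenAI-usage entry to an open CSV file object."""
    csv.writer(log_file).writerow(
        [datetime.date.today().isoformat(), user, tool, task, verified_by]
    )

# In practice the file would live with the association's records;
# an in-memory buffer stands in here for illustration.
log = io.StringIO()
log_genai_use(log, "Manager Smith", "ChatGPT", "Drafted pool notice", "Board President")
```

Requiring the “verified_by” column to be filled in before distribution is one simple way to make the human-review step auditable.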

In summary, all fiduciaries should utilize GenAI with privacy and care in mind. Otherwise, careless use of GenAI may expose the manager, the board, or the entire community association to fines, liability, or disciplinary action. Please feel free to reach out to your association’s legal counsel with any questions regarding best practices for using GenAI. Consulting a certified information privacy professional would also be beneficial. A little caution goes a long way in safeguarding the data and integrity of our community associations.

List of widely used software applications that utilize GenAI, in alphabetical order, with the company that developed each, the year it first became available, and a noteworthy detail:

ChatGPT, OpenAI, 2022, a conversational chatbot, similar to Alexa, used for customer service on websites;
Claude, Anthropic, 2023, reportedly named after Claude E. Shannon; employs a constitutional approach to AI;
CoCounsel, Thomson Reuters, 2023, linked to sanctions where attorneys submitted briefs with hallucinated citations;
Copilot, Microsoft, 2023, integrated with various Office software applications;
DeepSeek, High-Flyer, 2023, known for its cost-efficient model training;
Flux AI, Black Forest Labs, 2024, became a leader in AI image generation;
Gemini, Google, 2023, temporarily suspended image generation after producing historically inaccurate and offensive images;
GPT-4, OpenAI, 2023, reportedly passed the Uniform Bar Exam, scoring in the 90th percentile among test takers;
Hippocratic AI, founded by Munjal Shah, 2023, recognized as a leader in generative AI for healthcare;
LLaMA, Meta, 2023, designed as an open base model for running a “local ChatGPT” on a PC;
Midjourney, Midjourney, Inc., 2022, utilized the Discord app to create award-winning AI-generated images;
Operator, OpenAI, 2025, designed as an AI agent that can automate online tasks as a person would;
Perplexity, Perplexity AI (co-founded by Aravind Srinivas), 2022, designed as a conversational, LLM-powered answer engine;
Protégé, LexisNexis, 2024, a personalized AI legal assistant with generative and agentic AI capabilities;
Speechify, Tyler Weitzman, 2022, originated as a text-to-speech platform and is now an AI voice generator;
Veo, Google DeepMind, 2024, released as a multimodal generative video model.