Ownership of AI Predictive Models for Flood Management
📌 1. Key Ownership Issues in AI Flood Models
When AI is used to create predictive models for flood management, legal questions arise:
- Who owns the AI‑generated model?
  - The developer of the AI system?
  - The user or entity that directed its creation?
  - No one, if AI is considered a non‑human creator?
- Are the outputs (predictions, code, visualizations) protected by copyright, patent, or trade secrets?
- Who owns derivative works if the AI builds on pre-existing datasets?
- How do contracts and public interest obligations affect ownership?
- Can the AI itself be considered an inventor for patent purposes?
🧠 2. AI Cannot Be a Legal Author or Inventor
Case 1 — Thaler v. Perlmutter (D.D.C. 2023, aff'd D.C. Cir. 2025)
Facts: Stephen Thaler sought copyright registration for a work generated autonomously by his AI system, the "Creativity Machine."
Holding: Copyright law requires a human author. Works autonomously generated by AI without meaningful human input cannot be copyrighted.
Implication for Flood Models:
- If an AI independently develops a flood prediction model, the AI alone cannot own rights. Ownership depends on human involvement.
Case 2 — Thaler v. Vidal (Fed. Cir., 2022)
Facts: Thaler attempted to name his AI system, DABUS, as the inventor on patent applications.
Holding: An inventor must be a natural person. AI cannot be listed as an inventor under U.S. patent law.
Implication:
- If the AI discovers novel flood prediction algorithms, a human must have sufficient creative contribution to be named as inventor.
Case 3 — Naruto v. Slater (9th Cir., 2018)
Facts: A macaque took a photo; the question was whether the animal could hold copyright.
Holding: The Copyright Act does not extend standing to animals; non-human creators cannot hold copyright.
Parallel: AI systems, like non-human animals, cannot legally own rights in their creations.
✍️ 3. Human Contribution Determines Ownership
Case 4 — Aalmuhammed v. Lee (9th Cir., 2000)
Facts: A consultant who contributed to a film claimed joint authorship.
Holding: Joint authorship requires independently copyrightable creative input, control over the work, and mutual intent to be co-authors.
Application to Flood Models:
- If humans select training datasets, choose modeling parameters, interpret outputs, or integrate results, they can be authors of the predictive model or its report.
Case 5 — Community for Creative Non‑Violence v. Reid (U.S. Supreme Court, 1989)
Facts: A sculptor created a work commissioned by a nonprofit organization; the parties disputed ownership.
Holding: Because the sculptor was an independent contractor rather than an employee, the work was not a work made for hire; employee status turns on agency-law factors such as control over the work, the skill required, and who supplied the tools.
Implication for AI Models:
- Contracts and supervision matter. For instance, if a city hires a firm to create AI flood models:
  - Who owns the model may be determined more by contractual terms than by default IP law.
  - Courts examine who controlled the AI, who made creative decisions, and who supervised outputs.
📊 4. Ownership of Data and Derivative Works
Flood models rely on datasets (rainfall, river flow, topography). Ownership may be complex if AI uses proprietary datasets.
Case 6 — Authors Guild v. Google, Inc. (2d Cir., 2015)
Facts: Google digitized millions of books for its search service; authors claimed copyright infringement.
Holding: Google's digitization and snippet display were transformative and constituted fair use.
Implication:
- If the AI uses third-party environmental datasets, transformative use of those datasets may weigh in favor of fair use, but excessive copying could still infringe the rights holders' interests.
- Ownership and permitted use may depend on the license agreements covering the datasets.
Case 7 — British Horseracing Board v. William Hill (ECJ, 2004)
Facts: Dispute over the EU sui generis database right in a database of horseracing information.
Holding: The database right protects substantial investment in obtaining, verifying, or presenting existing data, not investment in creating the data itself; BHB's racing lists failed that test.
Implication:
- Agencies that invest substantially in collecting and verifying hydrological or meteorological data may hold database rights (notably in the EU), even if an AI generates the predictive models.
- Use of these datasets by AI may require permissions or licenses.
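In practice, a modeling team would audit dataset licenses before training. The sketch below is a minimal, hypothetical illustration of such a clearance check; the field names (`allows_model_training`, `allows_redistribution`, `attribution_required`) are assumptions for demonstration, not a standard licensing schema, and real clearance requires legal review of each agreement.

```python
from dataclasses import dataclass

# Hypothetical license metadata for a training dataset.
# Field names are illustrative assumptions, not a standard schema.
@dataclass
class DatasetLicense:
    name: str
    allows_model_training: bool
    allows_redistribution: bool
    attribution_required: bool

def clearance_issues(datasets: list[DatasetLicense]) -> list[str]:
    """Flag datasets whose license terms could restrict the resulting model."""
    issues = []
    for d in datasets:
        if not d.allows_model_training:
            issues.append(f"{d.name}: license does not permit model training")
        if not d.allows_redistribution:
            issues.append(f"{d.name}: derived outputs may not be shareable")
        if d.attribution_required:
            issues.append(f"{d.name}: attribution required in published models")
    return issues

# Illustrative datasets: one permissive public dataset, one restrictive
# proprietary dataset (both names are hypothetical).
rain = DatasetLicense("city_rainfall_2020", True, True, True)
river = DatasetLicense("private_river_gauge", False, False, False)
for issue in clearance_issues([rain, river]):
    print(issue)
```

A real audit would also capture license version, expiry, and jurisdiction; the point is that the restrictions travel with the data into the model.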
🖥️ 5. Software Licensing and AI Tools
Ownership also depends on licensing agreements of the AI software.
Case 8 — Google LLC v. Oracle America, Inc. (Fed. Cir.; U.S. Supreme Court, 2021)
Principle: The Supreme Court held that Google's reuse of the Java API declarations was fair use; more broadly, the litigation shows that software interfaces and their licensing terms shape rights in code and derivative outputs.
Implication:
- The AI software license may determine who owns models and derivative outputs.
- If a consulting firm develops AI models for a government agency, the license may dictate whether the agency or the firm owns the predictive model.
🌊 6. Public Interest and Regulatory Obligations
Flood management models are often funded or used by public agencies. Legal obligations may affect ownership:
- Many jurisdictions require public disclosure of risk assessment tools.
- Even if a firm holds copyright or trade secret rights, regulatory law may mandate access to AI predictive models for public safety.
🧩 7. Practical Ownership Framework
| Scenario | Likely Ownership | Notes |
|---|---|---|
| AI autonomously builds model | No copyright/patent | Model may enter public domain |
| Human curated or directed AI | Human or entity with contractual rights | Human contribution anchors authorship |
| Proprietary datasets used | Dataset owner may claim rights | Licenses and agreements critical |
| Agency-funded AI | Ownership may default to agency | Often dictated by contracts/funding agreements |
| Model used for public safety | Statutory obligations may override exclusivity | Transparency and liability rules |
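The table above can be encoded as a simple triage function. This is an illustrative sketch only: the scenario keys and outcome strings are assumptions made for demonstration, and actual outcomes depend on jurisdiction and contract terms, not on a lookup table.

```python
# Illustrative mapping of the scenarios in the table above to likely
# ownership outcomes. Keys and outcome strings are assumptions for
# demonstration; this is not legal advice.
OWNERSHIP_FRAMEWORK = {
    "autonomous_ai": "no copyright/patent; effectively public domain",
    "human_directed": "human or employer, subject to contractual rights",
    "proprietary_data": "dataset owner may claim rights; check licenses",
    "agency_funded": "often the agency, per contract/funding agreement",
    "public_safety": "statutory transparency rules may override exclusivity",
}

def likely_ownership(scenario: str) -> str:
    """Return the table's default outcome, or flag the case for review."""
    return OWNERSHIP_FRAMEWORK.get(scenario, "unresolved; seek legal review")

print(likely_ownership("human_directed"))
```

The useful design point is the fallback: any scenario outside the enumerated categories is routed to human legal review rather than given a default answer.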
🧪 8. Hypothetical Cases for Flood Models
Case A — Fully Autonomous AI Model
- AI trained on public flood data generates a predictive model with no human input.
- Outcome: No copyright; model is effectively public domain.
- Government and private entities may freely use it.
Case B — Human-Directed AI Model
- Engineers select datasets, set parameters, and analyze AI outputs.
- Outcome: Humans (or their employer) may hold copyright and trade secret rights.
- Ownership determined by contract and employment agreements.
Case C — Third-Party Proprietary Data Used
- AI incorporates private hydrological datasets.
- Outcome: Must check licensing. Rights holders may restrict use.
- Possible liability if the model is shared without permission.
Case D — Public Agency Requirement
- City commissions flood prediction AI model.
- Even if firm owns IP, public access may be required due to statutory obligations.
🏁 9. Key Legal Takeaways
- AI cannot be a legal author or inventor; human contribution is required for ownership.
- Contracts define ownership when AI models are developed for clients or agencies.
- Dataset and database rights can impact ownership of outputs.
- Public interest obligations may limit exclusivity, especially for safety-critical AI models.
- Careful documentation of human input supports clear attribution of authorship or inventorship.
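The documentation point in the takeaways can be made concrete with a provenance log that records who selected datasets, tuned parameters, and reviewed outputs. The sketch below is a hypothetical illustration: the class and field names are assumptions, not a standard provenance format (real projects might adapt a scheme such as W3C PROV).

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical record of one human contribution to an AI-built flood model.
# Field names are illustrative, not a standard schema.
@dataclass
class Contribution:
    person: str
    role: str          # e.g. "dataset selection", "parameter tuning"
    description: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class ProvenanceLog:
    """Append-only log of human contributions to a named model."""

    def __init__(self, model_name: str):
        self.model_name = model_name
        self.entries: list[Contribution] = []

    def record(self, person: str, role: str, description: str) -> None:
        self.entries.append(Contribution(person, role, description))

    def contributors(self) -> set[str]:
        return {e.person for e in self.entries}

# Hypothetical usage: two humans contribute distinct creative steps.
log = ProvenanceLog("riverine-flood-v1")
log.record("A. Engineer", "dataset selection", "chose 2010-2020 gauge data")
log.record("B. Analyst", "output review", "validated 100-year flood maps")
print(sorted(log.contributors()))
```

Timestamped, per-person entries like these are the kind of evidence of "meaningful human input" that the cases above suggest courts look for.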
