Navigating the Legal Boundaries of User-Generated Content

We often come across stories and posts in which a friend, or perhaps a favorite Instagram or YouTube influencer, shares a photo or video about a brand or product and describes how happy they are with it, whether it is a pair of Air Jordans or a lipstick from Huda Beauty. This is termed user-generated content, or UGC. Such content is unsponsored and unpaid; it is therefore authentic, credible, and invaluable. Because it comes from real users rather than paid actors, it feels like getting a recommendation from someone you know.[i] UGC differs from influencer marketing: in influencer marketing, influencers are paid to promote the brand, which makes the credibility of the review questionable. Many customers rely on UGC to stay up to date on products entering the market, and such content also eases the burden on marketers who have exhausted their pool of ideas. In this digital age, it becomes essential to balance the rights of creators with those of trademark and copyright owners, so that creators' activities do not result in defamation or trademark and copyright infringement. This blog explores such potential infringements and their consequences, and concludes with suggestions that creators should keep in mind before posting on these platforms.

INTELLECTUAL PROPERTY CONCERNS

Trademark Concerns:

UGC is generally content that creators make for their followers and for entertainment. Its purpose is usually commentary, review, or artistic expression, which falls within the category of non-commercial use. In the absence of a commercial purpose, such content falls within the scope of fair use under Section 30 of the Trade Marks Act, 1999.[ii] For instance, if an influencer posts a review video using a particular facewash, say Dot&Key, and says, “I love this facewash as it is gentle on my face,” there is no commercial purpose; the use therefore does not amount to trademark infringement and qualifies as fair use.

Copyright Concerns:

Different platforms have different terms and conditions for creators who want to incorporate music into their content. Music is the original work of its composers, lyricists, and performers and therefore requires authorization before use. Social media platforms such as Instagram have agreements with music rights holders, so a creator who incorporates music into UGC while adhering to the platform's usage guidelines is protected under the copyright regime, as the use falls under fair dealing.[iii] On the other hand, a creator posting a video on YouTube will require the permission of the copyright owner of that music before uploading it.

Additionally, UGC is itself a copyrighted work. Creators who upload videos (reviews), photos, posts, etc., are, by default, the rightful owners of the works they create.[iv]

Defamation Concerns:

A false statement of fact that harms the reputation of an individual or an entity can be termed defamation. It must be noted that mere opinion, for instance in review videos, cannot per se be said to be defamatory; the question is what a reasonable viewer would perceive on coming across such a video. Section 356 of the BNS, 2023[v] lays down the grounds for defamation along with the punishment, which may include imprisonment, a fine, both, or community service. A statement that is not defamatory may nevertheless violate the guidelines of a social media platform. A false and defamatory statement can cause huge losses to a business, because defamation on social media is viral, potent, and instantaneous: a customer who was unaware of a product becomes aware of it as soon as someone makes a negative statement about it.[vi] Consumers tend to lose trust in the brand, leading to a massive loss of goodwill and reputation.

PLATFORM LIABILITY AND FREE SPEECH

Liability of platforms hosting UGC and Safe Harbor provisions under the IT Act:

Safe harbor protections are designed to shield from liability “social media users”, “user-generated content-based sites”, and even “search engines that receive third-party content” on their platforms.[vii] To qualify for such protection, intermediaries are normally required to comply with several requirements, including the following:

  • Serving merely as a passive channel that does not alter or edit the content posted by users.
  • Operating a “notice and takedown” system that allows the removal of infringing content upon receipt of a legal notice or complaint.
  • Maintaining enforceable policies and systems against copyright infringement and other illegal or actionable content.

Section 79 of the IT Act, 2000[viii] extends safe harbor coverage to intermediaries in India that do not initiate, select, or modify the content they transmit and that remove unlawful content expeditiously upon receiving actual knowledge of it. This position was significantly shaped by Shreya Singhal v. Union of India (2015),[ix] which read down Section 79(3)(b) to hold that an intermediary is obliged to take down content only upon receiving actual knowledge through a court order or a notification from the appropriate government agency.

In the United States and other relevant jurisdictions, YouTube operates under the DMCA,[x] which it applies both as its content policy and as the basis of its global copyright framework. At the core of this lies the Content ID system, which enables copyright owners to track and control the use of their content.

Even though Content ID enforcement goes beyond the classic notice-and-takedown model, YouTube remains obliged to comply with safe harbor rules by swiftly implementing takedown procedures where applicable.

Meta's Instagram enjoys safe harbor protection under the DMCA in the United States and under the Indian IT Act. It provides mechanisms for users to report suspected copyright violations, bullying, and other types of abuse.

Freedom of Expression Versus the Right to Restrict Free Speech:

The tension between the right to free speech and the regulation of user-generated content (UGC) is another intriguing issue. Freedom of speech is prone to abuse, so laws are needed to deal with the negative consequences of harmful content, false information, and even illegal activity. However, excessive regulation distorts free speech, affecting users' ability to create, debate, and innovate, which is detrimental to the health of the digital space. Striking the balance between restriction and expression requires a more intelligent approach that protects individuals while also serving the interests of society.

PROBLEMS ENCOUNTERED WHEN REGULATING UGC

  1. Overregulation: Broad or poorly defined laws risk stifling legitimate expression. For instance, regulations on hate speech or misinformation may sometimes stretch to cover political satire, art, or criticism of the government.
  2. Ambiguity in the Vocabulary Used: Terms such as “harmful” and “inappropriate” are usually left undefined, resulting in arbitrary enforcement and discriminatory moderation.
  3. New and Emerging Technology: The use of AI in content moderation poses a real danger of algorithmic bias. Automated tools can over-censor, and their limited grasp of context often produces inadequately contextualized decisions.

Role of the Platforms

With regard to user-generated content, platforms are an essential piece of the puzzle as they strive to strike a balance between “self-regulation, freedom of expression, and intellectual property (IP) protection.” Their responsibilities include:

  1. Well-defined Guidelines: Platforms should outline policies that are reasonable, understandable, and contestable with regard to content moderation, as well as IP boundaries like copyright, trademarks, and fair use. Reports detailing the removal of content, especially on the basis of alleged IP infringement, can provide insight and foster trust.
  2. Moderation Without Bias: Overreaching moderation, such as the aggressive takedown of parody, criticism, or transformative works deemed harmful or potentially infringing, can silence dissenting perspectives and undermine users' rights.
  3. Improved Methods for Resolving Conflicts: Higher standards for handling content disputes can minimize the unfounded removal of content, the misuse of IP infringement claims, and restrictions on creators' freedom of expression. Rightful owners of IP must be identified and protected, but creativity must not be excessively regulated.

This blog has examined the most salient legal issues related to UGC, including copyright infringement, defamation, intermediary liability, and privacy concerns. Although legal frameworks exist for some of these issues, they are not comprehensive, which calls for reform. Keeping pace with the changing realities of UGC requires lawmakers, platforms, and users to work together. Legal measures must strike a delicate balance: the protection of rights holders should not override freedom of expression through UGC, which has become a vital means of creative expression.

Author: Anavi Jain. In case of any queries, please contact/write back to us via email at chhavi@khuranaandkhurana.com or at Khurana & Khurana, Advocates and IP Attorneys.

[i]Shahid, K. (2024) User-generated content (UGC): What it is and why it matters for your brand, Sprout Social. Available at: https://sproutsocial.com/insights/user-generated-content-guide/ (Accessed: 24 January 2025).

[ii] Trade Marks Act, § 30, No. 47 of 1999, India Code (1999).

[iii] Copyright Act, § 52, No. 14 of 1957, India Code (1957).

[iv]  Vaidehi, S. (2024) IP and usage rights for user-generated content (UGC), Fashion Law Journal. Available at: https://fashionlawjournal.com/ip-and-usage-rights-for-user-generated-content-ugc/  (Accessed: 24 January 2025).

[v] Bharatiya Nyaya Sanhita, § 356, No. 11 of 2023, India Code (2023).

[vi] Writankar, M. (2011) Social media defamation rules: People have to be careful about what they post on social media websites, The Economic Times. Available at: https://economictimes.indiatimes.com/tech/internet/social-media-defamation-rules-people-have-to-be-careful-about-what-they-post-on-social-media-websites/articleshow/10686349.cms?from=mdr (Accessed: 24 January 2025).

[vii] Vasudev, D. (2024) Conceptualising India’s Safe Harbour in the Era of Platform Governance, SSRN.  Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4727481 (Accessed: 24 January 2025).

[viii] Information Technology Act, § 79, No. 21 of 2000, India Code (2000).

[ix] Shreya Singhal v. Union of India, AIR 2015 SUPREME COURT 1523.

[x] Digital Millennium Copyright Act, 17 U.S.C. § 512 (1998).
