Amid the growing debate over the propriety of such likenesses in the age of AI and deepfakes, Tom Graham, CEO of generative AI firm Metaphysic (the company best known for the viral TikTok account spoofing Tom Cruise) has submitted a copyright registration for his AI likeness with the U.S. Copyright Office. The filing seeks to demonstrate how copyright law can be used to protect against the unauthorized creation and dissemination of content in which a person's likeness is duplicated by AI.
As Hollywood grapples with the ethical and legal issues surrounding use of the technology, the U.S. Copyright Office said in guidance issued in March that most works generated by AI are not copyrightable, though AI-assisted works may, in some instances, be eligible for protection.
Speaking on a panel at CES in January, SAG-AFTRA national executive director and chief negotiator Duncan Crabtree-Ireland emphasized the importance of addressing the issue. He said the guild is working on strategies to create opportunities for members rather than take them away, while protecting members from having their likenesses exploited.
In the United States, copyright law does not protect works created solely by machines. Courts have repeatedly held that copyright can be granted only to works created by humans. In an application, AI-generated material can support a claim only if the human contribution to the work is sufficient, as when a person "selected or arranged" the material in a sufficiently creative way that the resulting work as a whole constitutes an original work of authorship.
In his copyright application, Graham lists himself as the author of the work. This is a crucial distinction from the copyright application of Stephen Thaler, CEO of neural network firm Imagination Engines, who listed his AI system as the creator of an artwork titled A Recent Entrance to Paradise. The Copyright Office denied registration, finding that the work "lacks the human authorship necessary to support a copyright claim" and that "the nexus between the human mind and creative expression" is a vital element of protection. Thaler later sued the office, challenging the rejection.
Unlike Thaler's application and others seeking protection for AI-assisted works, Graham emphasizes his contributions in creating the video for which he is seeking a copyright. He says he took care in curating the dataset used to train the model and provided other creative input that enabled Metaphysic's AI tool to produce a realistic image of himself. There was also human work in the editing and compositing process, he notes.
"All of these steps are very deliberate, human-driven actions that I believe can be distinguished from any Photoshop or VFX tools," Graham says. "You're using AI, but there's a high level of human craft and effort involved. It's human-driven from top to bottom."

Perhaps best known for powering the @DeepTomCruise account, Metaphysic is developing generative AI tools and services for talent, including through a partnership with CAA.
The availability of copyright protection largely depends on how the AI tool works and how it was used to create the final material. If the AI technology merely receives a prompt and produces the work, for example, the work is not eligible for protection because the traditional elements of human authorship are determined and executed by the technology.
Copyright protection standards for such works are still up in the air. Intellectual property attorney Suzanne Hengl, who does not represent Metaphysic, doubts Graham's application will satisfy the office's current standards for human authorship. "As long as you're training the AI tool to do the work and produce the output, it's not going to get over the hump," she says.
Hengl, a partner at the law firm Baker Botts, notes that protection may be available if a person does creative work on the video after it is produced. She points to an artist who uses Photoshop to edit an image, whose modified image qualifies for protection.
Whatever the outcome of Graham's application, it highlights the lack of protection against the unauthorized dissemination of images and videos in which a person's likeness is duplicated by AI. There are no copyright laws specifically designed to combat deepfakes; in some cases, the law may even permit them.
Some instances of deepfakes in Hollywood fall under the "fair use" exception to copyright infringement. The doctrine allows the unlicensed use of copyrighted works in certain circumstances, such as commentary, criticism and news reporting. Among the four factors weighed is the purpose and character of the use, which extends protection to transformative works, meaning those that repurpose copyrighted material to create something with a new meaning or message.
The fair use exception is why Kendrick Lamar was likely on firm legal ground when he transformed himself via deepfakes into Will Smith, Jussie Smollett and Kobe Bryant in "The Heart Part 5" music video. They did not consent to appearing in the video, but the deepfakes were likely a transformative use exploring Black representation in Hollywood, among other themes in the video. There is also the category of parody, which is likewise protected under copyright law.
Among the reasons Graham says he moved to register the AI-created video of himself is to forge a path for people to defend their identities, likenesses and brands against unlawful content made with AI tools, such as deepfake pornography. Under the Digital Millennium Copyright Act, a copyright holder can quickly force the takedown of material that infringes on their property.
"I want to underline the value of using the DMCA as a means to take down videos within 24 hours," says Jordan Manekin, a lawyer at Metaphysic. "That remedy is much more readily available to most people than suing and litigating for years."
But even if protection is granted, Hengl says it's unlikely Graham could use his registration to take down AI-generated photos and videos of him made by others. "In terms of enforcing a right to likeness or privacy rights, it's trying to fit a square peg into a round hole," she suggests. "You can only protect against works that are substantially similar. It has to be more than the image of that person; the pose, the look on the face, the setting they're in, all of that has to be similar."
Hengl points to the traditional route of taking action against deepfakes and similar technology: claims for violations of publicity and privacy rights.
As SAG-AFTRA's Crabtree-Ireland summarized at CES, "As these democratizing technologies become more accessible and available, this is going to be an issue for more of the public."