Adobe Backtracks on Terms of Use After Mass Exodus Threat
Adobe updated its Terms of Use, but unclear wording sparked a user panic: many feared their work would be used for AI training or claimed by the company. Adobe has now clarified that the changes relate only to content moderation, not AI training, and that users can opt out of data sharing.
Adobe is updating its Terms of Use after ambiguous language raised user concerns about the privacy and ownership of their work. In a blog post on Monday, the company, known for creative software tools like Photoshop, Premiere, and InDesign, announced that updated terms would roll out by June 18, 2024.
Executive Vice Presidents Scott Belsky, who oversees product, and Dana Rao, who oversees legal and policy, reiterated Adobe's commitment to its customers and to responsible innovation.
Belsky and Rao clarified that Adobe has “never trained generative AI on customer content” or taken ownership of unpublished work, and that the recent Terms of Use update does not signal an intention to do so. They also noted that Adobe’s Firefly generative AI models are trained on Adobe’s stock library and public domain data, which are kept separate from content users create for personal or professional use.
Adobe clarifies Terms of Use after user backlash
Belsky and Rao acknowledged the need to evolve Adobe’s Terms of Use to better reflect their commitments to the community. This decision follows a PR crisis last week when users reacted negatively to notifications about the updated Terms of Use.
Due to unclear explanations, many users feared the changes implied Adobe could use their unpublished work for training its Firefly AI models and potentially take ownership of in-progress work. The lack of clarity and transparency sparked swift backlash, with some users vowing to leave the platform.
However, Adobe clarified that the updated policy granting access to user content was intended only for screening content for illegal activity or terms-of-service violations, not for AI training or taking control of user work.
Belsky and Rao emphasized that users can opt out of Adobe’s product improvement program, which involves sharing content for model training, and that the licenses Adobe takes to user content are limited to activities like scanning for illegal material. Additionally, Adobe does not scan content stored locally on users’ computers.
The situation could likely have been avoided with clearer communication, though some reputational damage has already been done. “We recognize that trust must be earned,” Belsky and Rao stated, closing their post.