Adobe started the week by confirming that Firefly will be integrated into its image and video editing tools. This means that software like Premiere Pro and Photoshop will be able to create effects or certain types of content using generative artificial intelligence.

Tools like DALL-E, Stable Diffusion and Midjourney generate images from prompts, that is, from typed instructions. But the purpose of Adobe Firefly is not to do all the work for the user; the idea is to make tasks easier and faster.
A video released on Adobe's blog shows, as an example, Firefly generating soundtracks, lighting adjustments, subtitles and visual effects from user-typed instructions.
Presumably it will be up to the user to make the final adjustments. In any case, Firefly should reduce the time spent on these tasks and help users who have little experience with editing tools.
Adobe Firefly: A Creative Co-Pilot
Adobe Firefly was introduced in March as a standalone generative artificial intelligence tool. But Adobe’s intention is to make it a “creative co-pilot” for programs like Photoshop (image editing), Illustrator (illustration) and Premiere Pro (video editing).
The initial phase is focused on tasks such as:
- Music and sound effects: generates custom soundtracks and audio effects without the user having to worry about royalties;
- Color and lighting adjustments: the user only needs to type an instruction to change the color scheme or adjust the lighting of an image. For example, a sunny scene can be transformed into one with a cloudy sky;
- Text effects: subtitles, logos and titles can receive graphic effects and even contextualized animations;
- Video production workflows: Firefly can be used to optimize production workflows (including pre- and post-production) by creating storyboards, automatic previews and B-roll recommendations (alternate clips).
More features will be introduced in later stages.
When and how?
Adobe Firefly is currently a standalone tool in beta. Those interested can register on the official website to test it. Integration with Photoshop, Illustrator, Premiere Pro and other Adobe tools will happen progressively over the next few months.
Premiere Pro should be one of the tools that benefits most from artificial intelligence. Editing video is painstaking work, so efforts to make this type of activity easier or more streamlined tend to be welcome.
Along these lines, the tool will receive a text-based video editing feature starting in May. With it, dialogue in clips can be automatically transcribed to streamline the production flow.
However, text-based editing is built on Adobe Sensei technology, which also uses artificial intelligence to optimize tasks. The first features based on Adobe Firefly are expected to be released by the end of 2023.