Stable Diffusion Animation SDK For Python
Written by Nikos Vaggalis   
Monday, 19 June 2023

Stability AI has released its Stable Animation SDK, a toolkit that lets artists and developers use the Stability models to generate animations.


In general, the SDK allows users to generate animations from text, images, or video combined with text. More specifically, the SDK offers three ways to create animations:

  • Text to animation: Users input a text prompt (as with Stable Diffusion) and tweak various parameters to produce an animation.

  • Text input + initial image input: Users provide an initial image that acts as the starting point of their animation. A text prompt is used in conjunction with the image to produce the final output animation.

  • Input video + text input: Users provide an initial video to base their animation on. By tweaking various parameters, they arrive at a final output animation that is additionally guided by a text prompt.

After installing the SDK but before using it, you have to create a Stability DreamStudio account in order to get an API key, since creating animations with the SDK requires a connection to the Stability servers.
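A minimal sketch of that setup, assuming the API key is kept in an environment variable rather than hard-coded (the host name below is the gRPC endpoint used in Stability's own examples, and the `Context` usage in the trailing comment reflects those examples):

```python
import os

# Assumed default gRPC endpoint from Stability's animation examples
STABILITY_HOST = "grpc.stability.ai:443"

def get_api_key(env_var="STABILITY_KEY"):
    """Read the DreamStudio API key from the environment
    rather than hard-coding it in source."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} to your DreamStudio API key")
    return key

# The host and key are then passed to the SDK's API context, e.g.:
# from stability_sdk.api import Context
# context = Context(STABILITY_HOST, get_api_key())
```

Keeping the key in an environment variable means it never ends up in version control.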

The installation is pretty straightforward. Just run

pip install stability-sdk

Now you are ready to use the SDK from your code.

Alternatively, there's a UI that runs as a local server and is accessed through the browser. To install it, include the [anim_ui] extra:

pip install stability-sdk[anim_ui]

Then you just run the UI with

python3 -m stability_sdk animate --gui

From there on, you can experiment with the models and generate animations without having to write any code.

Both in the UI and in code using the SDK, developers can tweak many parameters to adjust their animations.
For instance, the following Python code does just that:

from stability_sdk.animation import AnimationArgs, Animator

# Configure the animation
args = AnimationArgs()
args.interpolate_prompts = True
args.locked_seed = True
args.max_frames = 48
args.seed = 42
args.strength_curve = "0:(0)"
args.diffusion_cadence_curve = "0:(4)"
args.cadence_interp = "film"

# Keyframed prompts: the animation starts as a cat at frame 0
# and morphs into a dog by frame 24
animation_prompts = {
    0: "a photo of a cute cat",
    24: "a photo of a cute dog",
}

# With an API context connected to the Stability servers, the frames
# can then be rendered via Animator(api_context, args=args,
# animation_prompts=animation_prompts) and its render() generator.

To see how these parameters affect the resulting animation, check the animated previews in the Animation Handbook, a Google Doc with many examples.

The available Diffusion models are:

stable-diffusion-v1
stable-diffusion-v1-5
stable-diffusion-512-v2-0
stable-diffusion-768-v2-0
stable-diffusion-512-v2-1
stable-diffusion-768-v2-1
stable-diffusion-xl-beta-v2-2-2
stable-diffusion-depth-v2-0

The SDK is open source but, since it has to call into the Stability servers, usage is charged in credits.

There are two parts to the credit usage: one part is for still image generation, and the second is for running animation operations.

At the default settings (512x512, 30 steps) using the Stable Diffusion v1.5 model, an animation consisting of 100 frames (around 8s) will use 37.5 credits.

Charges vary depending on the parameters used; there's a detailed pricing chart here. When you open an account you also get some free credits to try the API out.
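Treating the figure above as a flat per-frame rate of 0.375 credits (37.5 credits / 100 frames at the default 512x512, 30-step settings) gives a back-of-the-envelope estimator. This is an assumption for illustration only; real pricing varies with resolution, steps, and model:

```python
# Illustrative estimate only: assumes the quoted 37.5 credits per
# 100 frames scales linearly, which real pricing may not.
CREDITS_PER_FRAME_DEFAULT = 37.5 / 100  # 0.375 credits per frame

def estimate_credits(frames: int,
                     credits_per_frame: float = CREDITS_PER_FRAME_DEFAULT) -> float:
    """Rough credit estimate for an animation of `frames` frames."""
    return frames * credits_per_frame

# The article's example: 100 frames -> 37.5 credits
# The 48-frame example above would come to roughly 18 credits
```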

In conclusion, the Stability Animation SDK opens up the world of generative AI to your Python code.


More Information

Stability Animation

Stability-AI on GitHub

Related Articles

Take Harvard's CS50 Introduction to Artificial Intelligence with Python For Free  
