Stable Diffusion Animation SDK For Python
Written by Nikos Vaggalis   
Monday, 19 June 2023

Stability AI has released its Stable Animation SDK, a toolkit designed for artists and developers to implement the Stability models in generating their animations.


In general, the SDK allows users to generate animations from text, images, or video combined with text. More specifically, the SDK offers three ways to create animations:

  • Text to animation: Users input a text prompt (as with Stable Diffusion) and tweak various parameters to produce an animation.

  • Text input + initial image input: Users provide an initial image that acts as the starting point of their animation. A text prompt is used in conjunction with the image to produce the final output animation.

  • Input video + text input: Users provide an initial video to base their animation on. By tweaking various parameters, they arrive at a final output animation that is additionally guided by a text prompt.

Before using it, you have to create a Stability DreamStudio account in order to get an API key, since creating animations with the SDK requires a connection to the Stability servers.

The installation is pretty straightforward. Just run:

pip install stability_sdk

Now you are ready to use the SDK from your code.
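Since every call goes through the Stability servers, your code needs the API key at hand. A common pattern is to keep the key out of source code in an environment variable; the variable name `STABILITY_API_KEY` below is just a convention, not something the SDK mandates:

```python
import os

# Hypothetical setup: read the DreamStudio API key from an environment
# variable instead of hard-coding it in source.
STABILITY_HOST = "grpc.stability.ai:443"  # DreamStudio gRPC endpoint
STABILITY_KEY = os.environ.get("STABILITY_API_KEY", "")

if not STABILITY_KEY:
    print("Warning: STABILITY_API_KEY is not set; API calls will fail")
```

The host and key are then passed to the SDK when opening a connection.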

Alternatively, there's also a UI which runs as a local server and which you can access through the browser. To get it, install the package with the [anim_ui] extra:

pip install "stability-sdk[anim_ui]"

Then you just run the UI with

python3 -m stability_sdk animate --gui

From there on, you can experiment with the models and generate animations without having to write any code.

Both in the UI and from code using the SDK, developers can use many parameters to adjust their animations.
For instance, the following Python code does just that:

from stability_sdk.animation import AnimationArgs, Animator

# Configure the animation
args = AnimationArgs()
args.interpolate_prompts = True
args.locked_seed = True
args.max_frames = 48
args.seed = 42
args.strength_curve = "0:(0)"
args.diffusion_cadence_curve = "0:(4)"
args.cadence_interp = "film"

animation_prompts = {
    0: "a photo of a cute cat",
    24: "a photo of a cute dog",
}
To see how these parameters affect the resulting animation, you can check the animated previews in the Animation Handbook, a Google Doc with many examples.
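The curve parameters in the code above (strength_curve, diffusion_cadence_curve) use a "frame:(value)" keyframe syntax, where each entry pins a value at a given frame. As a rough illustration of the format only (this is not the SDK's actual parser, which also interpolates between keyframes), a minimal sketch:

```python
import re

def parse_curve(curve: str) -> dict:
    """Parse a "frame:(value)" keyframe string such as "0:(0), 24:(1)"
    into a {frame: value} mapping. Illustrative only."""
    keyframes = {}
    for match in re.finditer(r"(\d+)\s*:\s*\(([^)]+)\)", curve):
        keyframes[int(match.group(1))] = float(match.group(2))
    return keyframes

parse_curve("0:(0), 24:(1)")  # {0: 0.0, 24: 1.0}
```

So "0:(4)" for diffusion_cadence_curve simply means a constant value of 4 from frame 0 onwards.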

The SDK offers a choice of Diffusion models, including Stable Diffusion v1.5.

The SDK is open source but, since it has to call into the Stability servers, there is a charge based on credits, depending on usage.

There are two parts to the credit usage: one part is for still image generation, and the second part is for running animation operations.

At the default of (512x512, 30 steps) using the Stable Diffusion v1.5 model, an animation consisting of 100 frames (around 8s) will use 37.5 credits.
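Those figures imply a per-frame cost you can use for rough budgeting. A back-of-the-envelope calculation, based only on the numbers quoted above:

```python
# Figures quoted above: 100 frames ~= 8 s of animation at the defaults
# (512x512, 30 steps, Stable Diffusion v1.5), costing 37.5 credits.
frames = 100
seconds = 8
credits = 37.5

fps = frames / seconds                # ~12.5 frames per second
credits_per_frame = credits / frames  # 0.375 credits per frame

# Hypothetical helper: estimate the cost of a longer clip at the same settings.
def estimate_credits(duration_s, fps=12.5, per_frame=0.375):
    return duration_s * fps * per_frame

estimate_credits(30)  # a 30 s clip -> 140.625 credits
```

Note that actual charges vary with resolution, step count and model, so this only holds at the stated defaults.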

Depending on the parameters used, the charge varies; there's a detailed chart of charges here. However, when you open an account you'll also get some free credits to try the API out.

In conclusion, the Stability Animation SDK opens up the world of generative AI to your Python code.


More Information

Stability Animation

Stability-AI on GitHub

