
Creatability

Exploring how creative tools can be made more accessible for everyone.
[Image: a man uses a laptop with his webcam on; on screen, a colorful musical keyboard is played with a cursor mapped to his nose.]
What is Creatability?

Creatability is a set of experiments made in collaboration with creators and allies in the accessibility community. They explore how creative tools – drawing, music, and more – can be made more accessible using web and AI technology. They’re just a start. We’re sharing open-source code and tutorials for others to make their own projects.

Experiments

Keyboard

by Use All Five & Google Creative Lab
A simple musical keyboard you can play with your face, body, mouse, or keys.

Sound Canvas

by Kearney-Volpe / Miele / Phillips / Pereira
A simple drawing tool that works through both sight and sound.

Body Synth

by Use All Five & Google Creative Lab
Make music just by moving your body.

Seeing Music

by Jay Alan Zimmerman
Experience music visually.

Clarion Lite

by OpenUp Music & Use All Five
An expressive, adaptable musical instrument in your web browser.

Sampler

by Use All Five & Google Creative Lab
A simple musical sampler you can play with your face, body, mouse, or keys.

Word Synth

by Google Creative Lab
A fun way to play with speech and music.

Made something you'd like to see featured here?

Submit your experiment here.

Stories

Whether you’re a teacher, student, researcher, or artist, you can use these experiments in all kinds of ways. Here are links to stories of how they're being used. (Want to share your story? Drop us a line at creatability-support@google.com)

October 25, 2018

Drawing with Sound
Claire Kearney-Volpe & Josh Miele walk through different ways to use the Sound Canvas tool.

October 25, 2018

Making Your Own Musical Instruments with P5.js, Arduino, and WebMIDI
Luisa Pereira shows how to combine body tracking and Arduino to make your own instruments.

Build Your Own

Here are some links to get started making your own experiments.
Github: Creatability Components
We're sharing open-source code and components from these experiments in this repo on Github.

Tutorial: Nose theremins and light painters in P5.js
A tutorial for making body-controlled interfaces with P5.js and PoseNet, with links to P5.js code you can edit right in your browser.

MORE INFO


Who made these?

These experiments were a collaborative effort by Jay Alan Zimmerman, Claire Kearney-Volpe, Kyle Philips, Yotam Mann, Luisa Pereira, Use All Five, and Google Creative Lab. Special thanks to Chancey Fleet, Josh Miele, James Maxson, and friends at Henry Viscardi School at The Viscardi Center, Tech Kids Unlimited, and ADAPT Community Center.


What can you do with them?

The experiments explore a diverse set of inputs, including mouse, keyboard, body, wrist, nose, and voice. You can make music by moving your face, draw using sight or sound, experience music visually, and more.


How does the body tracking work?

The body tracking feature was made using PoseNet, a machine learning model that can detect key body joints in images and videos. This lets you control the experiments with your webcam, simply by moving your body.
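To give a sense of how a detected joint can drive an interface, here is a minimal sketch of mapping a PoseNet-style nose keypoint to an on-screen cursor. The helper `mapKeypointToCursor` is hypothetical (not part of the Creatability codebase); it mirrors the x axis, since webcam video is typically flipped, and scales video coordinates to the canvas:

```javascript
// Map a PoseNet-style keypoint ({ part, score, position: { x, y } })
// from video coordinates to canvas coordinates.
// NOTE: mapKeypointToCursor is an illustrative helper, not Creatability API.
function mapKeypointToCursor(keypoint, videoSize, canvasSize) {
  // Ignore low-confidence detections so the cursor doesn't jitter.
  if (keypoint.score < 0.5) return null;
  const normX = 1 - keypoint.position.x / videoSize.width; // mirror x
  const normY = keypoint.position.y / videoSize.height;
  return {
    x: normX * canvasSize.width,
    y: normY * canvasSize.height,
  };
}

// With the real library (@tensorflow-models/posenet), the keypoint
// would come from something like:
//   const net = await posenet.load();
//   const pose = await net.estimateSinglePose(videoElement);
//   const nose = pose.keypoints.find(k => k.part === 'nose');
// Here we use a stand-in value to show the mapping:
const nose = { part: 'nose', score: 0.9, position: { x: 160, y: 120 } };
const cursor = mapKeypointToCursor(
  nose,
  { width: 640, height: 480 },
  { width: 800, height: 600 }
);
console.log(cursor); // { x: 600, y: 150 }
```

In the real experiments this runs once per video frame, so smoothing (e.g. averaging the last few positions) is usually added on top of a raw mapping like this.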


When I turn on my webcam, are my images being sent to Google servers?

No. Your images are not being sent to any Google servers while you’re using body tracking mode in these experiments. All the image recognition is happening locally in your browser.


What devices does it work best on?

For the best experience, use Chrome on a desktop PC or Mac. Most features also work on tablets, but have not been tested on all devices. Phones are not fully supported. 


Do they work with screen readers?

We’ve worked to make the experiments work with many screen readers across different platforms, but we’re still working to improve compatibility. If you have feedback, drop us a line at creatability-support@google.com.


Why is the experience slow on my computer?

The body tracking feature works best on newer computers with faster GPUs. If you have an older computer, you can still try it, but it might just feel a little slower and less accurate.


How can I build projects like this?

We’re sharing open-source code and components in this Github repository. We’re also creating guides as starting points, like this tutorial on using body tracking with P5.js.

