Every year at its MAX user conference, Adobe shows off a number of research projects that may or may not end up in its Creative Cloud apps over time. One new project that I hope we’ll soon see in its video apps is Project Sharp Shots, which will make its debut later today during the MAX Sneaks event. Powered by Adobe’s Sensei AI platform, Sharp Shots is a research project that uses AI to deblur videos.
Shubhi Gupta, the Adobe engineer behind the project, told me the idea here is to deblur a video — whether the blur comes from a shaky camera or fast movement — with a single click. In the demos she showed me, the effect was sometimes relatively subtle, as in a video of her playing ukulele, and sometimes quite dramatic, as in the example of a fast-moving motorcycle below.
“With Project Sharp Shots, there’s no parameter tuning and adjustment like we used to do in our traditional methods,” she told me. “This one is just a one-click thing. It’s not magic. This is simple deep learning and AI working in the background, extracting each frame, deblurring it and producing high-quality deblurred photos and videos.”
Gupta tells me the team looked at existing research on deblurring still images, optimized that process for moving images, and then optimized it again for lower memory usage and speed.
It’s worth noting that After Effects already offers some of these capabilities for deblurring and removing camera shake, but that’s a very different algorithm with its own set of limitations.
This new system works best when the algorithm has access to multiple related frames before and after, but it can do its job with just a handful of frames in a video.
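Adobe hasn’t published the details of Sharp Shots’ model, but the classical baseline it improves on is easy to illustrate. A toy sketch of single-frame deblurring is Wiener deconvolution: if you know (or can estimate) the blur kernel, you can invert it in the frequency domain, damped by a noise term. Everything here — the frame, the kernel, the `noise_power` value — is a made-up illustration, not Adobe’s method, which replaces this hand-tuned step with a learned model:

```python
import numpy as np

def wiener_deblur(blurred, kernel, noise_power=1e-3):
    """Classical Wiener deconvolution: invert a known blur kernel in the
    frequency domain, damped by an assumed noise-to-signal ratio."""
    K = np.fft.fft2(kernel, s=blurred.shape)
    B = np.fft.fft2(blurred)
    # Wiener filter: conj(K) / (|K|^2 + noise_power)
    W = np.conj(K) / (np.abs(K) ** 2 + noise_power)
    return np.real(np.fft.ifft2(B * W))

# Synthetic example: a sharp frame and a 7-pixel horizontal motion-blur kernel.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
kernel = np.zeros((64, 64))
kernel[0, :7] = 1 / 7
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(kernel)))
restored = wiener_deblur(blurred, kernel)
# The restored frame should sit much closer to the sharp original
# than the blurred input does.
```

The catch with this classical approach is exactly what Gupta describes: it needs a known kernel and tuned parameters, whereas a learned system can estimate the blur per frame and use neighboring frames as extra signal.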
Adobe is betting big on its Sensei AI platform, so it’s probably no surprise that the company also continues to build more AI-powered features into its flagship Photoshop application. At its MAX conference, Adobe today announced a handful of new AI features for Photoshop, with Sky Replacement being the most obvious example. Other new AI-driven features include so-called “Neural Filters,” which are essentially the next generation of Photoshop filters, and new and improved tools for selecting parts of images, in addition to other tools that improve existing features or simplify the photo-editing workflow.
Photoshop isn’t the first tool to offer a Sky Replacement feature. Luminar, for example, has offered that for more than a year already, but it looks like Adobe took its time to get this one right. The idea itself is pretty straightforward: Photoshop can now automatically recognize the sky in your images and then replace it with a sky of your choosing. Because the colors of the sky also influence the overall scene, that would obviously result in a rather strange image, so Adobe’s AI also adjusts the colors of the rest of the image accordingly.
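Adobe hasn’t described its harmonization model, but the two-step idea — segment and composite the sky, then pull the foreground’s colors toward the new sky — can be sketched with a crude stand-in. The mask, the per-channel mean shift and the `harmonize` blend factor below are all hypothetical simplifications of what is, in Photoshop, a learned adjustment:

```python
import numpy as np

def replace_sky(image, sky_mask, new_sky, harmonize=0.3):
    """Toy sky replacement: paste the new sky where the mask is set, then
    nudge the foreground's per-channel mean toward the new sky's mean so
    the lighting roughly matches."""
    out = image.astype(float).copy()
    out[sky_mask] = new_sky[sky_mask]
    fg = ~sky_mask
    # Per-channel mean shift, blended by `harmonize` in [0, 1].
    shift = new_sky[sky_mask].mean(axis=0) - out[fg].mean(axis=0)
    out[fg] += harmonize * shift
    return np.clip(out, 0, 255)

# Tiny example: top half of a 4x4 RGB image is "sky".
mask = np.zeros((4, 4), dtype=bool)
mask[:2] = True
img = np.full((4, 4, 3), 100.0)   # dull foreground and old sky
sky = np.full((4, 4, 3), 200.0)   # brighter replacement sky
result = replace_sky(img, mask, sky)
```

In the real feature, the segmentation is automatic and the color adjustment is far more sophisticated than a mean shift, but the structure of the problem is the same.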
How well all of this works probably depends a bit on the images, too. We haven’t been able to give it a try ourselves, and Adobe’s demos obviously worked flawlessly.
Photoshop will ship with 25 sky replacements, but you can also bring in your own.
Neural Filters are the other highlight of this release. They provide you with new artistic and restorative filters for improving portraits, for example, or quickly replacing the background color of an image. The portrait feature will likely get the most immediate use, given that it allows you to change where people are looking, change the angle of the light source and “change hair thickness, the intensity of a smile, or add surprise, anger, or make someone older or younger.” Some of these are a bit more gimmicky than others, and Adobe says they work best for making subtle changes, but either way — making those changes would typically be a lot of manual labor, and now it’s just a click or two.
Among the other fun new filters are a style transfer tool and a filter that helps you colorize black and white images. The more useful new filters include the ability to remove JPEG artifacts.
As Adobe noted, it collaborated with Nvidia on these Neural Filters, and, while they will work on all devices running Photoshop 22.0, there’s a real performance benefit to using them on machines with built-in graphics acceleration. No surprise there, given how computationally intensive a lot of these are.
While improved object selection may not be quite as flashy as Sky Replacement and the new filters, “intelligent refine edge,” as Adobe calls it, may just save a few photo editors’ sanity. If you’ve ever tried to use Photoshop’s current tools to select a person or animal with complex hair — especially against a complex backdrop — you know how much manual intervention the current crop of tools still needs. Now, with the new “Refine Hair” and “Object Aware Refine Mode,” a lot of that manual work should become unnecessary.
Other new Photoshop features include a new tool for creating patterns, a new Discover panel with improved search, help and contextual actions, faster plugins and more.
Also new is a plugin marketplace for all Creative Cloud apps that makes it easier for developers to sell their plugins.
Adobe today launched the first public version of its Illustrator vector graphics app on the iPad. That’s no surprise, given that it was already available for pre-order and as a private beta, but a lot of Illustrator users were looking forward to this day.
In addition, the company also today announced that its Fresco drawing and painting app is now available on Apple’s iPhone, too. Previously, you needed either a Windows machine or an iPad to use it.
Illustrator on the iPad supports Apple Pencil — no surprise there either — and should offer a pretty intuitive experience for existing users. As with Photoshop, the team adapted the user interface for a smaller screen and promises a more streamlined experience.
“While on the surface it may seem simple, more capabilities reveal themselves as you work. After a while you develop a natural rhythm where the app fades into the background, freeing you to express your creativity,” the company says.
Over time, the company plans to bring more effects, brushes and AI-powered features to Illustrator in general — including on the iPad.
As for Fresco, it’ll be interesting to see what that user experience will look like on a small screen. Since it uses Adobe’s Creative Cloud libraries, you can always start sketching on an iPhone and then move to another platform to finish your work. It’s worth noting that the iPhone version will feature the same interface, brushes and capabilities you’d expect on the other platforms.
The company also today launched version 2.0 of Fresco, with new smudge brushes, support for personalized brushes from Adobe Capture and more.
Vectary, a design platform for 3D and Augmented Reality (AR), has raised a $7.3 million round led by European fund EQT Ventures. Existing investor BlueYard (Berlin) also participated.
Vectary makes high-quality 3D design more accessible to consumers; it has garnered over one million creators worldwide and counts more than a thousand digital agencies and creative studios among its users.
With the coronavirus pandemic shifting more people online, Vectary says it has seen a 300% increase in AR views as more businesses start showcasing their products in 3D and AR.
Vectary was founded in 2014 by Michal Koor (CEO) and Pavol Sovis (CTO), who were both from the design and technology worlds.
The complexity of using and sharing content created by traditional 3D design tools has been a barrier to the adoption of 3D, which is what Vectary addresses.
Although Microsoft, Facebook and Apple are making it easier for consumers, the creative tools remain lacking. Vectary believes that seamless 3D/AR content creation and sharing will be key to mainstream adoption.
Designers and creatives can use Vectary to apply 2D design on a 3D object in Figma or Sketch; create 3D customizers in Webflow with Embed API; and add 3D interactivity to decks.
Impact Creative Systems (formerly Imagine Impact) is bringing a startup accelerator-style approach to finding fresh creative talent, and it announced this morning that, with funding from venture capital firm Benchmark, it’s spinning out from Imagine Entertainment — the production company founded by director Ron Howard and producer Brian Grazer.
Right after the news broke, the accelerator’s founders — Howard, Grazer and CEO Tyler Mitchell — joined us at TechCrunch’s Disrupt conference to discuss their vision. Grazer (whose films with Howard include “Apollo 13,” “A Beautiful Mind” and the upcoming “Hillbilly Elegy” for Netflix) recalled the Hollywood of 25 years ago, which he described as an “opaque” system where original writers often struggled to break in, and he felt that Impact could “democratize access to Hollywood.”
“How can we create opportunity to have access to epicenter of employment in the media business, which is Hollywood?” he said.
For starters, Mitchell described what he claimed is a scalable system for evaluating 2,000 script submissions every week.
“We were able to build a system that leverages both technology as well as expert systems evaluating not just the writers, but the readers — almost like financial analysts — and try to come up with metrics in world where there aren’t stats,” he said.
Mitchell also noted that in Impact’s first cohort of 87 writers, 39% were BIPOC, 10% were LGBTQ and it was split 50-50 between men and women, with 11 different countries represented.
“If you try to find the most talented writers in the world, they’re going to look like the world,” he said.
Howard made a similar point, saying that this diversity results from an interest in “fresh new voices” with “no statistical goals or agendas in mind — it’s just happening in a really honest way.”
Asked whether they’re interested in finding new talent from social media, Howard pointed to Grazer as the one who’s always encouraging him to “know what’s going on up north” (a.k.a. in Silicon Valley).
“Right now we’re in a creative renaissance with podcasts and Instagrams … finding their way into the center of the narrative,” Howard said.
Grazer said he often looks at YouTube in particular. At the same time, he cautioned that creating content for these online platforms requires a different skillset than writing movies or TV.
“It doesn’t reduce the likelihood of their success necessarily, but it’s a different art form,” he said. “Because writing a teleplay or a screenplay, even the greatest playwrights can’t do that particular thing — you have to be trained.”
Still, Imagine found at least one idea in an Instagram Story, developing a comedic show around an actor (Grazer didn’t want to say who it was, but it’s probably Arnold Schwarzenegger) with a donkey named Lulu and a miniature horse named Whiskey. Apparently the show has attracted multiple bidders, and as for where it will end up, Grazer said, “It sort of seems like Amazon. I’ll let you know tomorrow.”
Spark is one of those products in Adobe’s Creative Cloud lineup that doesn’t always get a lot of attention. But the company’s tool for creating social media posts (which you can try for free) has plenty of fans, and maybe that’s no surprise, given that its mission is to help small business owners and agencies create engaging social media posts without having to learn a lot about design. Today, Adobe added one of the most requested features to Spark on mobile and the web: animations.
“At Adobe, we have this rich history with After Effects,” Spark product manager Lisa Boghosian told me. “We wanted to bring professional motion design to non-professionals, because what solopreneur or small business owner knows what keyframes are, or how to build pre-comps with five layers? It’s just not where they’re spending their time and they shouldn’t have to. That’s really what Spark is for: you focus on your business and building that. We’ll help guide you into expressing that.”
Guiding users is what Spark does across its features, be that designing the flow of your text, adding images or, now, animations. It does that by providing a vast number of templates — including a set of animated templates — as well as easy access to free images, Adobe Stock and icons from the Noun Project (on top of your own imagery, of course).
The team also decided to do away with a lot of the accouterments of movie editors, including timelines. Instead, the team pre-built the templates and the logic behind how new designs display those animations based on best practices. “Instead of exposing a timeline to a user and asking them to put things on a timeline and adjusting the speed — and guessing — we’ve taken on that role because we want to guide you to that best experience.”
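The design choice here — preset motion curves instead of a user-facing timeline — can be sketched in a few lines. The easing function and frame sampling below are generic motion-design staples, not Spark’s actual implementation; the point is that once a template author picks the curve, the user never touches a keyframe:

```python
def ease_out_cubic(t):
    """Standard ease-out curve: fast start, gentle settle."""
    return 1 - (1 - t) ** 3

def animate(start, end, frames, ease=ease_out_cubic):
    """Sample an eased motion path. The template bakes in the easing
    preset and frame count, so no timeline is exposed to the user."""
    return [start + (end - start) * ease(i / (frames - 1))
            for i in range(frames)]

# Slide an element from x=0 to x=100 over five frames.
positions = animate(0.0, 100.0, frames=5)
```

Because the curve is ease-out, the element covers most of the distance early and settles gently — the kind of “best practice” motion Spark’s templates pre-bake so users don’t have to guess at speeds.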
In addition to the new animations feature, Spark is also getting improved tools for sharing assets across the Creative Cloud suite thanks to support for Creative Cloud Libraries. That makes it far easier for somebody to move images from Lightroom or Photoshop to Spark, but since Spark is also getting quite popular with agencies, it’ll make collaborating easier as well. The service already has tools for organizing assets today, but this makes it far easier to work across the various Creative Cloud tools.
Boghosian tells me the team had long had animations on its roadmap, but it took a while to bring it to market, in part because Adobe wanted to get the performance right. “We had to make sure that performance was up to par with what we wanted to deliver,” she said. “And so the experience of exporting a project — we didn’t want it to take a significant amount of time because we really didn’t want the user sitting there waiting for it. So we had to bring up the backend to really support the experience we wanted.” She also noted that the team wanted to have the Creative Cloud Libraries integration ready before launching animations.
Once you’ve created your animation, Spark lets you export it as an MP4 video file or as a static image; GIF export isn’t supported.