At its Worldwide Developer Conference, Apple announced a significant update to RealityKit, its suite of technologies that allow developers to get started building AR (augmented reality) experiences. With the launch of RealityKit 2, Apple says developers will have more visual, audio, and animation control when working on their AR experiences. But the most notable part of the update is how Apple’s new Object Capture technology will allow developers to create 3D models in minutes using only an iPhone.
Apple noted during its developer address that one of the most difficult parts of making great AR apps is creating the 3D models, a process that can take hours and cost thousands of dollars.
With Apple’s new tools, developers will be able to take a series of pictures using just an iPhone (or an iPad or DSLR, if they prefer) to capture 2D images of an object from all angles, including the bottom.
Then, using the Object Capture API on macOS Monterey, it only takes a few lines of code to generate the 3D model, Apple explained.
Image Credits: Apple
To begin, developers would start a new photogrammetry session in RealityKit that points to the folder where they’ve captured the images. Then, they would call the process function to generate the 3D model at the desired level of detail. Object Capture allows developers to generate the USDZ files optimized for AR Quick Look — the system that lets developers add virtual, 3D objects in apps or websites on iPhone and iPad. The 3D models can also be added to AR scenes in Reality Composer in Xcode.
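Based on Apple’s description, that session-then-process flow can be sketched in Swift roughly like this (the folder path, output file, and detail level are illustrative placeholders, not Apple’s sample code):

```swift
import Foundation
import RealityKit

// Folder of 2D captures and the USDZ file we want out (placeholder paths).
let imagesFolder = URL(fileURLWithPath: "/tmp/captures", isDirectory: true)
let modelURL = URL(fileURLWithPath: "/tmp/object.usdz")

// Start a photogrammetry session pointed at the captured images.
let session = try PhotogrammetrySession(input: imagesFolder)

// Watch the session's async output stream for progress and the finished model.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url.path)")
        case .requestError(_, let error):
            print("Failed: \(error)")
        default:
            break
        }
    }
}

// Generate the model at a reduced level of detail; the available levels
// range from .preview up to .raw.
try session.process(requests: [.modelFile(url: modelURL, detail: .reduced)])
```

The `outputs` sequence also surfaces warnings and overall processing status, which an app could feed into a progress UI.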
Apple said developers like Wayfair, Etsy and others are using Object Capture to create 3D models of real-world objects — an indication that online shopping is about to get a big AR upgrade.
Wayfair, for example, is using Object Capture to develop tools for their manufacturers so they can create a virtual representation of their merchandise. This will allow Wayfair customers to be able to preview more products in AR than they could today.
Image Credits: Apple (screenshot of Wayfair tool)
In addition, Apple noted that developers including Maxon and Unity are using Object Capture to bring 3D content creation into apps such as Cinema 4D and Unity MARS.
Other updates in RealityKit 2 include custom shaders that give developers more control over the rendering pipeline to fine-tune the look and feel of AR objects; dynamic loading for assets; the ability to build your own Entity Component System to organize the assets in your AR scene; and the ability to create player-controlled characters so users can jump, scale and explore AR worlds in RealityKit-based games.
One developer, Mikko Haapoja of Shopify, has been trying out the new technology (see below) and shared some real-world tests on Twitter, where he shot objects using an iPhone 12 Max.
Developers who want to test it for themselves can leverage Apple’s sample app and install Monterey on their Mac to try it out.
Apple says there are over 14,000 ARKit apps on the App Store today, built by more than 9,000 different developers. With over 1 billion AR-enabled iPhones and iPads in use globally, Apple notes it offers the world’s largest AR platform.
Apple's Object Capture on a Pineapple. One of my fav things to test Photogrammetry against. This was processed using the RAW detail setting.
More info in thread pic.twitter.com/2mICzbV8yY
— Mikko Haapoja (@MikkoH) June 8, 2021
Apple's Object Capture is the real deal. I'm impressed. Excited to see where @Shopify merchants could take this
Allbirds Tree Dashers. More details in thread pic.twitter.com/fNKORtdtdB
— Mikko Haapoja (@MikkoH) June 8, 2021
Tomorrow, June 9 is the big day, mobility fans! Get ready to rub virtual elbows with the brightest minds and makers, movers and shakers at TC Sessions: Mobility 2021. You, along with thousands of other attendees from around the world, will find insight, inspiration and, most of all, opportunity to help you make your mobility startup dreams a reality.
Procrastination Station: It’s not too late to join your community and get the inside scoop on the latest mobility trends and tech. Buy your pass now and drive this opportunity like you stole it.
The event agenda features 20 different presentations, interviews, panel discussions and breakout sessions on a range of topics — everything from servicing EV charging stations to autonomous vehicles — and the AI that powers them — to the state of venture capital (come get your SPAC on), public-private partnerships, equity and accessibility and, whoa, so much more.
We’re going to point out just a few of tomorrow’s highlights to whet your mobility whistle and to help you make the most of your time. You can kiss schedule conflicts goodbye, thanks to video-on-demand. Catch any session you miss later at your convenience.
Ready? Take a look at what’s happening tomorrow at TC Sessions: Mobility 2021. Times listed below are EDT, but the agenda will automatically reflect your time zone.
12:05 pm – 12:35 pm
Self-Driving Deliveries: Autonomous vehicles and robotics were well on their way to transforming deliveries before the pandemic struck. In the past year, these technologies have moved from novel applications to essential innovations. We’re joined by execs at Starship Technologies, Gatik and Nuro — each with individual approaches that span the critical middle and last mile of delivery.
1:45 pm – 2:05 pm
Public-Private Partnerships: Advancing the Future of Mobility and Electrification: The future of mobility starts with the next generation of transportation solutions. Attendees will hear from some of the most innovative names on opportunities that await when public and private entities team up to revolutionize the way we think about technology. Trevor Pawl, Michigan’s Chief Mobility Officer, will be joined by Nina Grooms Lee, Chief Product Officer of May Mobility.
3:00 pm – 4:00 pm
Startup Pitch Feedback Session: Tune in as the 28 startups exhibiting at TC Mobility pitch to, and hear feedback from, TechCrunch staff. The pitch deck you improve by watching may be your own.
5:05 pm – 5:15 pm
EV Founders in Focus: We sit down with the founders poised to take advantage of the rise in electric vehicle sales. This time, we will chat with Evette Ellis, co-founder of ChargerHelp!, a startup that enables on-demand repair of electric vehicle charging stations.
12:05 pm – 6:20 pm
Explore the expo: Don’t miss the 28 game-changing startups exhibiting in the expo area. Ask for a live demo, a product walk-through or simply start a conversation and see where it leads. Opportunity awaits.
And that, mobility fans, is the classic tip of the iceberg. Get a good night’s sleep, carbo-load and prepare for a marathon of opportunity at TC Sessions: Mobility 2021. We’ll see you tomorrow!
Last month, Apple announced it would soon add lossless audio streaming and Spatial Audio with support for Dolby Atmos to its Apple Music subscription at no extra charge. That upgrade has now gone live, Apple announced this morning — though many noticed the additions actually rolled out yesterday, following the WWDC keynote.
The entire Apple Music catalog of 75+ million songs will support lossless audio.
The lossless tier begins at CD quality — 16 bit at 44.1 kHz — and goes up to 24 bit at 48 kHz, Apple previously said. Audiophiles can also opt for high-resolution lossless, which goes up to 24 bit at 192 kHz. Apple has said you’ll need an external USB digital-to-analog converter to take advantage of the latter — simply plugging a pair of headphones into an iPhone won’t work.
Apple Music subscribers will be able to enable the new lossless option under Settings > Music > Audio quality. Here, you’ll be able to choose the different resolutions you want to use for different connections, including Wi-Fi, cellular, and download.
When you make your selection in Settings, iOS warns that lossless files will use “significantly more space” on your device, as 10 GB of storage would allow you to store approximately 3,000 songs at high quality, 1,000 songs with lossless, or 200 songs with high-res lossless.
Image Credits: Apple
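Those storage estimates track with the raw PCM bitrates of each tier. A quick back-of-the-envelope sketch of why the tiers differ so sharply in size (these are uncompressed rates; ALAC’s lossless compression typically cuts them roughly in half):

```swift
// Uncompressed stereo PCM bitrate in kbps for a given bit depth and sample rate.
func pcmBitrateKbps(bitDepth: Int, sampleRateHz: Int, channels: Int = 2) -> Double {
    Double(bitDepth * sampleRateHz * channels) / 1_000
}

let cdQuality = pcmBitrateKbps(bitDepth: 16, sampleRateHz: 44_100)  // 1,411.2 kbps
let lossless  = pcmBitrateKbps(bitDepth: 24, sampleRateHz: 48_000)  // 2,304 kbps
let hiRes     = pcmBitrateKbps(bitDepth: 24, sampleRateHz: 192_000) // 9,216 kbps

// A 4-minute track at the hi-res rate is roughly 276 MB before compression.
let hiResTrackMB = hiRes * 240 / 8 / 1_000
```

The hi-res tier carries more than six times the raw data of CD quality, which is why the on-device song counts drop so steeply from tier to tier.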
Meanwhile, Spatial Audio will be enabled by default on hardware that supports Dolby Atmos, like Apple’s AirPods and Beats headphones with an H1 or W1 chip. The latest iPhone, iPad, and Mac models also support Dolby Atmos. Spatial Audio on Apple Music will also be “coming soon” to Android devices, Apple said.
To kick off the launch, Apple Music is today rolling out new playlists designed to showcase Spatial Audio.
Apple is also adding a special guide to Spatial Audio on Apple Music, which will help music listeners hear the difference. This will include tracks from artists like Marvin Gaye and The Weeknd, among others. And Apple will air a roundtable conversation about Spatial Audio featuring top sound engineers and experts, hosted by Zane Lowe at 9 am PT today on Apple Music.
Because songs have to be remastered for Dolby Atmos specifically, these guides and playlists will help music fans experience the new format without having to hunt around. Apple says it’s working with artists and labels to add more new releases and the best catalog tracks in Spatial Audio. To help on this front, Apple notes there are various initiatives underway — including doubling the number of Dolby-enabled studios in major markets, offering educational programs, and providing resources to independent artists.
Apple also said it will build music-authoring tools directly into Logic Pro. Later this year, the company plans to release an update to Logic Pro that will allow any musician to create and mix their songs in Spatial Audio for Apple Music.
Just after the release of iOS 12 in 2018, Apple introduced its own built-in screen time tracking tools and controls. It then began cracking down on third-party apps that had implemented their own screen time systems, saying they had done so via technologies that risked user privacy. What wasn’t available at the time? A Screen Time API that would have allowed developers to tap into Apple’s own Screen Time system and build experiences that augmented its capabilities. That’s now changed.
At its Worldwide Developer Conference on Monday, Apple introduced a new Screen Time API that gives developers access to frameworks for building parental control experiences that also maintain user privacy.
The company added three new Swift frameworks to the iOS SDK that will allow developers to create apps that help parents manage what a child can do across their devices and ensure those restrictions stay in place.
The apps that use this API will be able to set restrictions like locking accounts in place, preventing password changes, filtering web traffic, and limiting access to applications. These sorts of changes are already available through Apple’s Screen Time system, but developers can now build their own experiences where these features are offered under their own branding and where they can then expand on the functionality provided by Apple’s system.
ScreenTime API looks great, I sincerely hope someone provides me a way to bulk change stuff for my kids. If I had known I would have to tweak each kids ScreenTime individually like I do today, I might have had less children. #WWDC21
— Stan Lemon (@stanlemon) June 7, 2021
Developers’ apps that take advantage of the API can also be locked in place, so they can only be removed from the device with a parent’s approval.
The apps can authenticate the parents and ensure the device they’re managing belongs to a child in the family. Plus, Apple said the way the system will work lets parents choose the apps and websites they want to limit, without compromising user privacy. (The system returns only opaque tokens instead of identifiers for the apps and website URLs, Apple told developers, so the third-parties aren’t gaining access to private user data like app usage and web browsing details. This would prevent a shady company from building a Screen Time app only to collect troves of user data about app usage, for instance.)
The third-party apps can also create unique time windows for different apps or types of activities, and warn the child when time is nearly up. When time runs out, the app can lock down access to websites and apps and perhaps remind the child it’s time to do their homework — or whatever other experience the developer has in mind.
And on the flip side, the apps could create incentives for the child to gain screen time access after they complete some other task, like doing homework, reading or chores, or anything else.
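Apple hasn’t published full sample code for this yet, but based on the three frameworks it announced, a minimal sketch of how the pieces might fit together could look like this (the activity name, schedule, and shield setup are illustrative assumptions, not Apple’s documented flow):

```swift
import FamilyControls
import ManagedSettings
import DeviceActivity

let store = ManagedSettingsStore()

// Ask the parent to authorize this app to manage the child's device.
AuthorizationCenter.shared.requestAuthorization { result in
    guard case .success = result else { return }

    // Lock the account and passcode in place, as the API allows.
    store.account.lockAccounts = true
    store.passcode.lockPasscode = true

    // Shield apps chosen via FamilyActivityPicker; the picker hands back
    // opaque tokens, never bundle identifiers, preserving privacy.
    // store.shield.applications = selection.applicationTokens

    // Define a repeating daily window (3pm-5pm here, purely illustrative)
    // and start monitoring it; a DeviceActivityMonitor extension receives
    // callbacks when the interval starts and ends.
    let schedule = DeviceActivitySchedule(
        intervalStart: DateComponents(hour: 15),
        intervalEnd: DateComponents(hour: 17),
        repeats: true
    )
    try? DeviceActivityCenter().startMonitoring(
        DeviceActivityName("afterSchool"),
        during: schedule
    )
}
```

When a monitored interval ends, the app’s extension could re-apply shields, or lift them once the child completes whatever task the developer’s incentive scheme requires.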
Developers could use these features to design new experiences that Apple’s own Screen Time system doesn’t allow for today, by layering their own ideas on top of Apple’s basic set of controls. Parents would likely fork over their cash to make using Screen Time controls easier and more customized to their needs.
Other apps could tie into Screen Time too, outside of the “family” context — like those aimed at mental health and wellbeing, for example.
Of course, developers have been asking for a Screen Time API since the launch of Screen Time itself, but Apple didn’t seem to prioritize its development until the matter of Apple’s removal of rival screen time apps was brought up in an antitrust hearing last year. At the time, Apple CEO Tim Cook defended the company’s decision by explaining that apps had been using MDM (mobile device management) technology, which was designed for managing employee devices in the enterprise, not home use. This, he said, was a privacy risk.
Apple has a session during WWDC that will detail how the new API works, so we expect to learn more as the developer info becomes public.
TechCrunch Early Stage Part Two is set to take place July 8th and 9th. You can still shoot your shot to pitch to an amazing panel of judges and thousands of TC viewers. TechCrunch editors will select 10 founders from around the world to pitch on stage July 9th. Apply here.
Startups will have five minutes to pitch their companies, business models and innovative ideas — followed by a Q&A with our superb panel of judges. The winner will get a feature article on TechCrunch.com, a one-year free subscription to Extra Crunch and a complimentary Founder Pass to TechCrunch Disrupt this fall.
TechCrunch Early Stage Part Two is set to be a game-changer for founders looking to take their startups to the next level. At this two-day virtual event, early-stage founders can take part in highly interactive group sessions with top investors and ecosystem experts, in fields ranging from fundraising and marketplace positioning, to growth marketing and content development.
Without further ado, here are your judges for the Early Stage Pitch-Off:
Ben Sun, Primary Venture Partners
Image Credits: Primary Venture Partners
Ben is a co-founder and general partner at Primary Venture Partners. He has been a serial entrepreneur and investor as a co-founder of LaunchTime, an incubator and investor in early-stage tech startups, and as a co-founder of Community Connect, one of the first social networking companies. Ben focuses his investing primarily on consumer-facing companies. His previous investments include Coupang, Jet.com, MakeSpace, Ollie, Mirror, Slice, Bounce Exchange, Selfmade, Shoptalk and Penrose Hill. Ben has been active in the NYC tech community for almost 20 years. Prior to working as an entrepreneur and investor, he worked at Merrill Lynch in the Technology Investment Banking Group. He graduated from the University of Michigan with a degree in Economics.
Leah Solivan, Fuel Capital
Image Credits: Leah Solivan
Leah Solivan is general partner at Fuel Capital, where she invests in early-stage companies across consumer technology, hardware, marketplaces and retail. She’s passionate about supporting teams who are taking on world-changing ideas. Leah relates so well to founders because she is one herself. She created one of the most widely recognized consumer brands of the past decade with TaskRabbit. As TaskRabbit’s CEO for eight years, Leah scaled the company to 44 cities and raised more than $50 million. In 2016, Leah transitioned into the role of executive chairwoman and in 2017, TaskRabbit was acquired by IKEA.
Shardul Shah, Index Ventures
Image Credits: Index Ventures
Shardul joined Index in 2008. He focuses on security, cloud infrastructure and enterprise software investments. He is a director of Attack IQ, Brightback, Castle Intelligence, Datadog (Nasdaq:DDOG), Expel, Gatsby and Wiz.io. Shardul was previously a director of Adallom (Microsoft), Sourceclear (CA Technologies), Koality (Docker), Lacoon (Check Point), Base (Zendesk) and an investor in Duo Security (Cisco). After graduating from the University of Chicago, Shardul worked with Summit Partners where he focused on healthcare and internet technologies.
Apple today is releasing a new version of its App Store Review Guidelines, the lengthy document that dictates the rules apps must abide by in order to be published on its App Store. Among the more notable changes rolling out today are several sections that will see Apple taking a harder stance on App Store fraud, scams and developer misconduct, including a new process that aims to empower other developers to hold bad actors accountable.
One of the key updates on this front involves a change to Apple’s Developer Code of Conduct (Section 5.6 and 5.6.1-5.6.4 of the Review Guidelines).
This section has been significantly expanded with guidance stating that repeated manipulative or misleading behavior or other fraudulent conduct will lead to a developer’s removal from the Apple Developer Program. Apple says it has already been removing developers for repeated violations, but wanted to ensure the policy was clearly spelled out in the guidelines.
In an entirely new third paragraph in this section, Apple says that if a developer engages in activities or actions that are not in accordance with the developer code of conduct, they will have their Apple Developer account terminated.
It also details what, specifically, must be done to restore the account: the developer must provide Apple with a written statement detailing the improvements they’ve made, which Apple will have to approve. If Apple is able to confirm the changes have been made, it may then restore the developer’s account.
Apple explained in a press briefing that this change was meant to prevent a sort of catch-and-release scenario in which a developer gets caught by Apple, but later reverts their changes to continue the bad behavior.
As part of this update, Apple added a new section about developer identity (5.6.2). This is meant to ensure the contact information developers provide to Apple and customers is accurate and functional, and that the developer isn’t impersonating other, legitimate developers on the App Store. This was a particular issue in a high-profile incident of App Store fraud involving a crypto wallet app that scammed a user out of his life savings (~$600,000) in Bitcoin. The victim had been deceived because the app used the same name and icon as a different company that made a hardware crypto device, and because the scam app was rated 5 stars (illegitimately, that is).
Related to this, Apple clarified the language around App Store discovery fraud (5.6.3) to more specifically call out any manipulation of App Store charts, search, reviews and referrals. The former is meant to crack down on the clearly booming industry of fake App Store ratings and reviews, which can push scam apps higher in charts and search.
Meanwhile, the referral crackdown would address consumers being shown incorrect pricing outside the App Store in an effort to boost installs.
There are hundreds of these. And then, there's hundreds of *real* ones too:
"SCAM. What shady business. Downloaded this app on concept. It doesn’t even work. There is no free version AT ALL. You are tricked into downloading and then asked to pay $7.99 per FREAKING WEEK. Wow."
— Kosta Eleftheriou (@keleftheriou) January 31, 2021
Another section (5.6.4) addresses issues that come up after an app is published, including negative customer reports and concerns and excessive refund rates, for example. If Apple notices this behavior, it will investigate the app for violations, it says.
Of course, the question here is: will Apple actually notice the potential scammers? In recent months, a growing number of developers believe Apple is allowing far too many scammers to fall through the cracks of App Review.
One particular thorn in Apple’s side has been Fleksy keyboard app founder Kosta Eleftheriou, who is not only suing Apple for the revenue he’s personally lost to scammers, but also formed a sort of one-man bunco squad to expose some of the more egregious scams to date. This has included the above-mentioned crypto scam; a kids game that actually contained a hidden online casino; and a VPN app scamming users out of $5 million per year, among many others.
The rampant fraud taking place on the App Store was also brought up during Apple’s antitrust hearing, when Georgia’s Senator Jon Ossoff asked Apple’s Chief Compliance Officer Kyle Andeer why Apple was not able to locate scams, given they’re “trivially easy” to identify.
Apple downplayed the concerns then, and continues to do so through press releases like this one, which noted how the App Store stopped over $1.5 billion in fraudulent transactions in 2020.
But a new update to these Guidelines seems to be an admission that Apple may need a little help on this front. It says developers can now directly report possible violations they find in other developers’ apps. Through a new form that standardizes this sort of complaint, developers can point to guideline violations and any other trust and safety issues they discover. Often, developers notice the scammers whose apps are impacting their own business and revenue, so they’ll likely turn to this form now as a first step in getting the scammer dealt with.
Another change will allow developers to appeal a rejection if they think there was unfair treatment of any kind, including political bias. Previously, Apple had allowed developers to appeal App Store decisions and suggest changes to guidelines.
These are only a handful of the many changes rolling out with today’s updated App Store Review Guidelines.
There are a few others, however, also worth highlighting.
Today’s WWDC keynote from Apple covered a huge range of updates. From a new macOS to a refreshed watchOS to a new iOS, better privacy controls, FaceTime updates, and even iCloud+, there was something for everyone in the laundry list of new code.
Apple’s keynote was essentially what happens when the big tech companies get huge; they have so many projects that they can’t just detail a few items. They have to run down their entire parade of platforms, dropping packets of news concerning each.
But despite the obvious indication that Apple has been hard at work on the critical software side of its business, especially its services side (more here), Wall Street gave a firm, emphatic shrug.
This is standard but always slightly confusing.
Investors care about future cash flows, at least in theory. Those future cash flows come from anticipated revenues, which are born from product updates, driving growth in sales of services, software, and hardware. Which, apart from the hardware portion of the equation, is precisely what Apple detailed today.
And lo, Wall Street looked upon the drivers of its future earnings estimates, and did sayeth “lol, who really cares.”
Shares of Apple were down a fraction for most of the day, picking up as time passed — not thanks to the company’s news dump, but because the Nasdaq largely rose as trading raced to a close.
Here’s the Apple chart, via YCharts:
And here’s the Nasdaq:
Presuming that you are not a ChartMaster, those might not mean much to you. Don’t worry. The charts say very little all-around so you are missing little. Apple was down a bit, and the Nasdaq up a bit. Then the Nasdaq went up more, and Apple’s stock generally followed. Which is good to be clear, but somewhat immaterial.
So after yet another major Apple event that will help determine the health and popularity of every Apple platform — key drivers of lucrative hardware sales! — the markets are betting that all their prior work estimating the True and Correct value of Apple was dead-on and that there is no need for any sort of up-or-down change.
That, or Apple is so big now that investors are simply betting it will grow in keeping with GDP. Which would be a funny diss. Regardless, more from the Apple event here in case you are behind.
It’s that time again! This morning Apple kicked off its annual Worldwide Developer Conference the same way it does each year: a keynote jammed to the brim with new stuff.
Didn’t have time to tune in to the liveblog or watch the stream? We get it. That’s why we’ve boiled all of the biggest news down to the bulletpoints for you below. Skim at your leisure!
Craig Federighi started things off with details on the latest major update to iOS, noting that this release focuses on four points: staying connected, finding focus, using intelligence, and exploring the world. iOS 15 will roll into public beta in July, with a full release “this fall.”
The camera can now recognize text in photos (handwritten or printed) and make it selectable, searchable, copy/pastable, etc. It can recognize things beyond text, as well, including animal breeds, landmarks, etc.
Apple is incorporating home, office, and hotel keys into Apple Wallet, allowing you to unlock doors in those places with your phone. You’ll be able to check into select hotels, for example, and have your room key show up on your phone as soon as your room is ready.
They’re also working with the TSA (in select states, at first) to put an encrypted copy of your state driver’s license (!) in Wallet that will be accepted at security screenings.
Image Credits: Apple
Maps is getting a details-focused overhaul, with the addition of 3D elevation maps, 3D rendered landmarks, crosswalks, bike lanes, etc.
Apple Maps is also taking some cues from Google Maps, including a mode that has you use the camera to scan nearby buildings to more precisely orient the phone and help you figure out which direction you’re supposed to go at the beginning of a walk.
Apple says it’s opening up Siri to third party manufacturers and their devices, allowing Siri to live on things like ecobee thermostats beginning later this year.
Though it’s called the Worldwide Developer Conference, Apple tends to keep the keynote focused largely on the consumer-facing stuff and save the most technical bits for the week’s many breakout sessions. It did, however, touch on a few developer highlights.
Apple today announced a number of coming changes and improvements to the App Store that will help developers better target their apps to users, get their apps discovered by more people and even highlight what sort of events are taking place inside their apps to entice new users to download the app and encourage existing users to return.
The company said its App Store today sees 600 million weekly users across 175 countries, and has paid out more than $230 billion to developers since the App Store launched, highlighting the business opportunity for app developers.
However, as the App Store has grown, it’s become harder for app developers to market their apps to new users or get their apps found. The new features aim to address that.
Image Credits: Apple
One change involves the app’s product page. Starting this year, app developers will be able to create multiple custom product pages to showcase different features of their app for different users. For instance, they’ll be able to try out things like different screenshots, videos, and even different app icons to A/B test what users like the most.
They’ll also be able to advertise the dynamic things that are taking place inside their apps on an ongoing basis. Apple explained that apps and games are constantly rolling out new content and limited time events like film premieres on streaming services, events like Pokémon GO fests, or Nike fitness challenges. But these events were often only discoverable by those who already had the app installed and then opted in to push notifications.
Image Credits: Apple
Apple will now allow developers to better advertise these events, with the launch of in-app events “front and center on the App Store.” The events can be showcased on the app’s product page. Users can learn more about the events, sign up to be notified or quickly join the event, if it’s happening now. They can also discover events with personalized recommendations and through App Store search.
App Store editors will curate the best events and the new App Store widget will feature upcoming events right on users’ home screens, too.
Apple says the feature will be open to all developers, including those who already run events and those who are just getting started.
During the WWDC conference today, Apple unveiled the new macOS 12 Monterey. A major feature in the macOS update is Universal Control, which builds upon the Continuity features first introduced in OS X Yosemite. For years it’s been possible to open a news article on your iPhone and keep reading it on your MacBook, or to copy and paste a link from your iPad to your iMac. But Universal Control takes these features further.
With Universal Control, you can use a single mouse and keyboard to navigate across multiple Apple devices at once. This functionality works across more than two devices — in the demo video, the feature is used to seamlessly move across an iPad, MacBook and iMac. Users can drag and drop files across multiple devices at once, making it possible, for example, to use a multi-screen setup while editing video on Final Cut Pro.
What’s possible in Universal Control isn’t necessarily new — this has been made possible before through third-party apps. Plus, in 2019, Apple debuted Sidecar, which allowed users to connect their iPad as a second monitor for their MacBook or iMac. But, Universal Control improves upon Sidecar — and maybe renders it obsolete — by allowing users to link any Apple devices together, even if it’s not an iPad. Though this update may not be groundbreaking, it’s a useful upgrade to existing features.
Apple didn’t announce the rumored Apple TV device that would combine the set-top box with a HomePod speaker during its WWDC keynote, but it did announce a few features that will improve the Apple TV experience — including one that involves the HomePod Mini. Starting this fall, Apple said, you’ll be able to select the HomePod Mini as the speaker for your Apple TV 4K. It also introduced a handful of software updates for Apple TV users, including a new way to see shows everyone in the family will like, and support for co-watching shows through FaceTime.
The co-watching feature is actually a part of a larger FaceTime update, which will let users stream music, TV, and screen share through their FaceTime calls. The Apple TV app is one of those that’s supported through this new system, called SharePlay. It will now include a new “Shared with You” row that highlights the shows and movies your friends are sharing, as well.
Another feature called “For All of You” will display a collection of shows and movies based on everyone’s interests within Apple TV’s interface. This is ideal if you’re planning to watch something as a family — movie night, for example. And you can fine-tune the suggestions based on who’s watching.
A new Apple TV widget is also being made available, which now includes iPad support.
And the new HomePod Mini support will help deliver “rich, balanced sound” and “crystal clear dialogue” when you’re watching Apple TV with the Mini set up as your speaker, Apple said.
The past year has seen some of the most dramatic updates to Macs in recent memory. At last year’s WWDC, Apple announced its long-awaited move from Intel chips to its own first-party silicon. By the end of the year, the company launched the first three M1 Macs, along with Big Sur, one of the biggest updates to macOS.
At this morning’s kickoff to WWDC, the company unveiled macOS 12 — named, you guessed it, Monterey. Universal Control is the top line new feature here, which further bridges the gap between desktop and tablet. Sticking the iPad next to a Mac, you can move the cursor between devices using the same trackpad and keyboard. The feature works on up to three devices at once.
Apple is rolling out some updates to iCloud under the name iCloud+. The company is announcing those features at its developer conference. Existing paid iCloud users are going to get those iCloud+ features for the same monthly subscription price.
In Safari, Apple is going to launch a new privacy feature called Private Relay. It sounds a bit like the new DNS feature that Apple has been developing with Cloudflare. Originally named Oblivious DNS-over-HTTPS, Private Relay could be a better name for something quite simple — a combination of DNS-over-HTTPS with proxy servers.
When Private Relay is turned on, nobody can track your browsing history — not your internet service provider, nor anyone sitting in the middle of your request between your device and the server you’re requesting information from. We’ll have to wait a bit to learn exactly how it works.
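The privacy claim rests on a two-hop design: the first relay sees who you are but not where you’re going, while the second sees the destination but not your IP address. A toy sketch of that separation — the addresses and data shapes here are illustrative, not Apple’s actual protocol:

```python
# Conceptual sketch of a two-hop relay: no single party sees both the
# user's identity and the destination of the request.

def ingress_relay(user_ip, sealed_request):
    # The first relay sees the user's IP, but the destination is
    # sealed (encrypted) from its point of view.
    visible = {"user_ip": user_ip, "destination": None}
    return visible, sealed_request

def egress_relay(sealed_request):
    # The second relay unseals the destination, but only ever sees the
    # first relay as the sender — never the user's IP.
    visible = {"user_ip": None, "destination": sealed_request["destination"]}
    return visible

sealed = {"destination": "example.com"}  # stands in for an encrypted payload
hop1_view, forwarded = ingress_relay("203.0.113.7", sealed)
hop2_view = egress_relay(forwarded)

# Neither hop sees the full picture.
assert hop1_view["destination"] is None
assert hop2_view["user_ip"] is None
```

The same split is what makes the combination stronger than DNS-over-HTTPS alone, which hides queries from observers but still lets the resolver link you to your destinations.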
The second iCloud+ feature is ‘Hide my email’. It lets you generate random email addresses when you sign up for a newsletter or create an account on a website. If you’ve used ‘Sign in with Apple’, you know that Apple offers you the option to use fake iCloud email addresses. This works similarly, but for any app.
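Conceptually, an email-alias service just needs to mint an unguessable address and remember which real inbox it forwards to. A minimal sketch — the relay domain and addresses are made up for illustration:

```python
import secrets

def make_alias(real_address, registry, domain="relay.example.com"):
    # Mint an unguessable local part; the provider keeps the mapping
    # and forwards incoming mail to the real inbox.
    alias = f"{secrets.token_hex(5)}@{domain}"
    registry[alias] = real_address
    return alias

registry = {}
alias = make_alias("me@icloud.example", registry)

# The alias reveals nothing about the real address...
assert "me@icloud" not in alias
# ...but the provider can still route mail to it.
assert registry[alias] == "me@icloud.example"
```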
Finally, Apple is overhauling HomeKit Secure Video. With the name iCloud+, Apple is separating free iCloud users from paid iCloud users. Basically, you used to pay for more storage. Now, you pay for more storage and more features. Subscriptions start at $0.99 per month for 50GB (and iCloud+ features).
More generally, Apple is adding two much-needed features to iCloud accounts. Now, you can add a friend as an account recovery contact. This way, you can ask your friend to help you regain access to your data. But that doesn’t mean your friend can access your iCloud data — it’s just a way to recover your account.
The last much-needed update is a legacy feature. You’ll soon be able to add one or several legacy contacts so your data can be passed along when you pass away. This is an important addition, as many photo libraries become inaccessible when someone close to you passes away.
Among many updates coming to iOS 15, Apple Maps will receive a number of upgrades that will bring more detailed maps, improvements for transit riders, AR experiences and other changes to the platform. The improvements build on the new map Apple began rolling out two years ago, which had focused on offering richer details, and — in response to user feedback and complaints — more accurate navigation.
Since then, Apple Maps has steadily improved.
The new map experience has since launched in the U.S., U.K., Ireland and Canada and will now make its way to Spain and Portugal, starting today. It will then arrive in Italy and Australia later this year, Apple announced during its keynote address at its Worldwide Developer Conference on Monday.
Image Credits: Apple
In addition, Apple said iOS 15 Maps will include new details for commercial districts, marinas, buildings and more. Plus, Apple has added things like elevation, new road colors and labels, as well as hundreds of custom designed landmarks — for example, for places like the Golden Gate Bridge.
Apple also built a new nighttime mode for Maps with a “moonlit glow,” it said.
For drivers, Apple added new road details to the map to help them better see and understand important features as they move through a city — things like turn lanes, medians, and bus and taxi lanes. The changes are competitive with some of the updates Google has been making to its own Google Maps platform, which brought street-level details in select cities. These allowed people — including those navigating on foot, in a wheelchair, on a bike or on a scooter, for example — to better see things like sidewalks and intersections.
Apple is now catching up, saying it, too, will show features like crosswalks and bike lanes.
It will also render things like overlapping complex interchanges in 3D space, making it easier to see upcoming traffic conditions or what lane to take. These features will come to CarPlay later in the year.
Image Credits: Apple
For transit riders, meanwhile, Maps has made improvements to help users find nearby stations.
Users can now pin their favorite lines to the top, and even keep track on their Apple Watch so they don’t have to pull out their phone. The updated Maps app will automatically follow your transit route and notify you when it’s time to disembark, making the app more competitive with third-party apps often favored by transit riders, like Citymapper, for instance.
Image Credits: Apple
When you exit your station, you can also now hold up your iPhone to scan the buildings in the area and Maps will generate an accurate position, offering direction in augmented reality. This is similar to the Live View AR directions Google announced last year.
This feature is launching in select cities in 2021 with more to come in the year ahead, Apple said.
Image Credits: Apple
Apple has introduced a new feature to its camera system that automatically recognizes and transcribes text in your photos, from a phone number on a business card to a whiteboard full of notes. Live Text, as the feature is called, doesn’t need any prompting or special work from the user — just tap the icon and you’re good to go.
Announced by Craig Federighi on the virtual stage of WWDC, Live Text will be arriving on iPhones with iOS 15. He demonstrated it with a couple pictures, one of a whiteboard after a meeting, and a couple snapshots that included restaurant signs in the background.
Tapping the Live Text button in the lower right gave detected text a slight underline, and then a swipe allowed it to be selected and copied. In the case of the whiteboard, it collected several sentences of notes including bullet points, and with one of the restaurant signs it grabbed the phone number, which could be called or saved.
Certain types of text strings can be recognized, as well: a tracking code will be seen as such and a link to the tracking URL will be made immediately available. Translation can be done quickly too, to or from any language supported by Apple’s other translation tools.
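Classifying recognized text into actionable types can be as simple as pattern matching over the OCR output. A rough sketch of the idea — these regexes are illustrative stand-ins, not Apple’s detectors:

```python
import re

# Toy patterns for classifying actionable strings in recognized text.
PATTERNS = {
    "phone": re.compile(r"\(\d{3}\)\s?\d{3}-\d{4}"),
    "tracking": re.compile(r"\b1Z[0-9A-Z]{16}\b"),  # UPS-style code, as one example
}

def detect(text):
    """Return the first match of each known string type."""
    found = {}
    for kind, pattern in PATTERNS.items():
        match = pattern.search(text)
        if match:
            found[kind] = match.group()
    return found

result = detect("Call (415) 555-0199 about package 1Z999AA10123456784")
# → {"phone": "(415) 555-0199", "tracking": "1Z999AA10123456784"}
```

Each detected type then maps to an action — a phone number becomes callable, a tracking code becomes a link to the carrier’s tracking page.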
The feature is reminiscent of many found in Google’s long-developed Lens app, and the Pixel 4 added more robust scanning capability in 2019. The difference is that the text is captured more or less passively in every photo taken by an iPhone running the new system — you don’t have to enter scanner mode or launch a separate app.
This is a nice thing for anyone to have, but it could be especially helpful for people with visual impairments. A snapshot or two makes any text, otherwise difficult to read, able to be dictated or saved.
The process takes place entirely on the phone, so don’t worry that this info is being sent to a datacenter somewhere. That also means it’s fairly quick, though until we test it for ourselves we can’t say whether it’s instantaneous or, like some other machine learning features, something that happens over the next few seconds or minutes after you take a shot. Your back catalog of photos will be Live Text-ified in your phone’s idle moments, though.
Over a year into a global pandemic, Apple announced some major updates to its FaceTime app at WWDC 2021, which will be available in iOS 15.
Notably, Android users will now be able to join in on FaceTime calls, posing some competition to apps like Zoom and Google Meet, which have boomed during lockdown. This FaceTime makeover will also include links to join calls, which can be sent via Calendar invites in advance of your meeting. These links work across platforms, whether you’re on the web, an Android phone or your iPhone.
Apple is also adding updates that make the experience of video chatting on FaceTime more closely resemble real life conversations.
“When talking in person, our brains process hundreds of social auditory and visual cues. When talking on a video call, many of those signals can get lost, leaving us feeling drained. So this year, we’ve set out to make FaceTime calls feel more natural, comfortable, and lifelike,” said Craig Federighi, Apple’s senior vice president of Software Engineering.
Through a spatial audio feature, FaceTime calls will sound like you’re sitting in the same room with your friends — that means if someone is on the left side of your screen, their audio will come through the left side of your speaker. This functionality might not translate well to smaller devices like an iPhone, but could be interesting on devices like an iMac. The person speaking will have a white ring appear around their video while they’re talking, and users will be able to select a grid view to see the other people on the call, which seems quite similar to Zoom.
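The left/right effect can be approximated with a standard constant-power pan law, where a caller’s horizontal position on screen sets the gain of each stereo channel. A sketch of that mapping — not Apple’s actual audio pipeline, which does full spatialization:

```python
import math

def pan_gains(x):
    """Constant-power stereo pan. x in [0, 1]: 0 = far left, 1 = far right."""
    angle = x * math.pi / 2
    return math.cos(angle), math.sin(angle)  # (left gain, right gain)

# A caller tiled on the left of the screen comes mostly out of the left channel.
left, right = pan_gains(0.1)
assert left > right

# Total power stays constant at any position, so volume doesn't dip mid-pan.
assert abs(left**2 + right**2 - 1.0) < 1e-9
```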
Next, Apple announced voice isolation, which will improve the speaker’s audio quality when calling from a noisy area. The demo video that was shown during the WWDC announcement featured a child walking into the video frame with a leaf blower. In TechCrunch’s liveblog, Darrell Etherington pointed out that the video seemed heavily edited.
Finally, a feature called SharePlay will come to FaceTime, which makes it easier for friends to watch streaming videos together. SharePlay allows for group listening, watching and screen sharing, and Apple shared partners for SharePlay, including Disney+, Hulu, HBO Max, NBA, Twitch, TikTok, MasterClass, ESPN+, Paramount+ and PlutoTV. The SharePlay API will be available so that all video app makers can access it and integrate their own apps.
As part of its FaceTime update in iOS 15, Apple introduced a new set of features designed for shared experiences — like co-watching TV shows or TikTok videos, listening to music together, screen sharing and more — while on a FaceTime call. The feature, called SharePlay, enables real-time connections with family and friends while you’re hanging out on FaceTime, Apple explained, by integrating access to apps from within the call itself.
Image Credits: Apple
Apple demonstrated the new feature during its Worldwide Developer Conference keynote, showing how friends could press play in Apple Music to listen together, as the music streams to everyone on the call. Shared playback controls also let anyone on the call play, pause or jump to the next track.
The company also showed off watching video from its Apple TV+ streaming service, where the video was synced in real time between call participants. This was a popular trend during the pandemic, as people looked to virtually watch movies and TV with family and friends, prompting services like Hulu and Amazon Prime Video to add native co-watching features.
But Apple’s SharePlay goes much further than streaming music and video from just Apple’s own services.
The company announced a set of launch partners for SharePlay, including Disney+, Hulu, HBO Max, NBA, Twitch, TikTok, MasterClass, ESPN+, Paramount+ and Pluto TV. It’s also making an API available to developers so they can integrate their own apps with SharePlay.
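Keeping playback in sync across a call generally comes down to broadcasting small control events — play, pause, seek — that every participant’s device applies to identical local state. A simplified sketch of that idea, not the actual SharePlay API:

```python
class SharedPlayback:
    """Each participant holds identical state; controls broadcast small
    timestamped events that every device applies, keeping playback in sync."""

    def __init__(self):
        self.playing = False
        self.position = 0.0  # seconds into the video
        self._started_at = None  # shared clock time when playback began

    def apply(self, event):
        if event["action"] == "seek":
            self.position = event["position"]
        elif event["action"] == "play":
            self.playing = True
            self._started_at = event["at"]
        elif event["action"] == "pause":
            if self.playing:
                self.position += event["at"] - self._started_at
            self.playing = False

# The same event stream reaches both participants...
alice, bob = SharedPlayback(), SharedPlayback()
for device in (alice, bob):
    device.apply({"action": "seek", "position": 42.0})
    device.apply({"action": "play", "at": 100.0})
    device.apply({"action": "pause", "at": 103.5})

# ...so both end up paused at the same spot.
assert alice.position == bob.position == 45.5
```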
Image Credits: Apple
Users can screen share via SharePlay, too, so you can do things like browse Zillow listings together or show off mobile gameplay, Apple suggested.
“Screen sharing is also a simple and super effective way to help someone out and answer questions right in the moment, and it works across Apple devices,” noted Apple SVP of Software Engineering, Craig Federighi.
The feature will roll out with iOS 15.
As part of the update to iOS 15, Apple will allow iPhone users to better customize how they want to be notified about incoming calls, texts or updates from apps based on their current status. With the new Focus feature, instead of just silencing calls and notifications through Do Not Disturb, you’ll be able to set different types of notification preferences based on your status — like whether you’re driving, working or sleeping — or through custom categories of your own choosing.
The system also uses on-board AI to help determine what your status is at a given time. For example, the phone might suggest you turn off notifications when you’re arriving at the gym.
With the new Focus feature, you can customize how you want to be notified and when. For example, you could choose to only be notified by co-workers or by apps like Mail, Calendar or Slack. You can even dedicate a page on your home screen to match your focus and organize your apps and widgets in a way that reduces temptations by making only your work apps visible. When you use Focus, it will sync to all your other Apple devices, too, Apple notes.
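In effect, each Focus mode is an allow-list over notification sources. A toy sketch of that filtering logic — the mode names and rules here are invented for illustration:

```python
# Each Focus mode is modeled as an allow-list of apps and people.
FOCUS_MODES = {
    "work": {"apps": {"Mail", "Calendar", "Slack"}, "people": {"coworker"}},
    "personal": {"apps": {"Messages"}, "people": {"family", "friend"}},
}

def allowed(notification, mode):
    """A notification gets through if its app or its sender is allow-listed."""
    rules = FOCUS_MODES[mode]
    return (notification.get("app") in rules["apps"]
            or notification.get("sender") in rules["people"])

assert allowed({"app": "Slack"}, "work")                        # work app: delivered
assert not allowed({"app": "Messages", "sender": "friend"}, "work")  # silenced at work
assert allowed({"sender": "friend"}, "personal")                # delivered off-hours
```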
Notifications have a new look too, with contact photos for people and larger icons for apps, making them easier to identify.
In 2019, Tim Cook admitted that he himself had silenced many notifications, famously saying that “Apple never wanted to maximize user time. We’ve never been about that.” But we disputed that claim, arguing Apple could have designed a system that delivered notifications on a schedule, not in real time.
Apparently, Apple now has: it’s called Notification Summary, which bundles and prioritizes incoming notifications based on things like time of day. You can schedule this summary to be delivered at any time you choose — in the morning or evening, for example.
Apple says it uses device intelligence based on how you use your apps to arrange this summary, making it easier to quickly catch up. The most important notifications appear above the less important notifications in the summary.
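The scheduling idea can be sketched as holding notifications until a user-chosen delivery slot, then sorting the bundle by priority. This is an illustration of the concept, not Apple’s ranking model:

```python
from datetime import time

def build_summary(notifications, now):
    """Hold notifications until a scheduled slot, then deliver them
    bundled with the most important on top."""
    DELIVERY_TIMES = [time(8, 0), time(18, 0)]  # user-chosen morning/evening slots
    if now not in DELIVERY_TIMES:
        return None  # hold everything until the next scheduled slot
    return sorted(notifications, key=lambda n: n["priority"], reverse=True)

pending = [
    {"app": "News", "priority": 1},
    {"app": "Messages", "priority": 9},
    {"app": "Deals", "priority": 0},
]

assert build_summary(pending, time(12, 0)) is None  # midday: nothing delivered
summary = build_summary(pending, time(8, 0))
assert summary[0]["app"] == "Messages"  # highest priority on top
```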
Image Credits: Apple
In another change, Do Not Disturb has been integrated directly into iMessage, so other users will know when you don’t want to be bothered by incoming notes.
Now, when you’re using Do Not Disturb, your status is automatically displayed in iMessage. When you reach out to someone who’s indicated they don’t want to be disturbed, you’ll be reminded in that moment. And for truly urgent messages, there’s still a way to get through, Apple noted.
The system can also be set up to deliver your notifications in a single batch at a set time of day, making lock screen notifications a more interactive experience than in previous iOS versions.
These features, which involve better personalizing the phone to the user, somewhat recall the smart launchers that had been popular on Android devices many years ago, like Aviate or EverythingMe, among others, which customized your device based on what you were doing, the time of day and other factors. These apps never took off on iOS because Apple doesn’t allow third-party apps to deeply integrate with its mobile operating system or reconfigure the device’s home screen, and the trend fizzled out.
But Apple more recently has been customizing its iPhone to better reflect user preferences, including with the introduction of widgets and Siri Suggestions, which can even be a widget of suggested apps on the home screen.
During the virtual keynote of WWDC, Apple shared the first details about iOS 15, the next major version of iOS that is going to be released later this year. There are four pillars with this year’s release: staying connected, focusing without distraction, using intelligence and exploring the world.
“For many of us, our iPhones have become indispensable,” SVP of Software Engineering Craig Federighi said. “Our new release is iOS 15. It’s packed with features that make the iOS experience adapt to and complement the way you use iPhone, whether it’s staying connected with those who matter to you most, finding the space to focus without distraction, using intelligence to discover the information you need or exploring the world around you.”
Apple is adding spatial audio to FaceTime. Now the voices are spread out depending on the position of your friends on the screen. For instance, if someone appears on the left, it’ll sound like they’re on the left in your ears. In other FaceTime news, iOS now detects background noise and tries to suppress it so that you can hear your friends and family members more easily. That’s an optional feature, which means that you can disable it in case you’re showing a concert during a FaceTime call for instance.
Another FaceTime feature is “Portrait mode”. By this, Apple means FaceTime can automatically blur the background, like in “Portrait mode” photos. In case you want to use FaceTime for work conferences, you can now generate a FaceTime link and add it to a calendar invite. FaceTime will also work in a web browser, which means that people without an Apple device can join a FaceTime call. All of these features make FaceTime more competitive with other video call services, such as Zoom and Google Meet.
FaceTime is a big focus as Apple is also introducing SharePlay. With this feature, you can listen together to a music album. Press play in Apple Music and the music will start for everyone on the call. The queue is shared with everyone else, which means anyone can add songs, skip to the next track, etc.
SharePlay also lets you watch movies and TV shows together. Someone on the call starts a video and it starts on your friend’s phone or tablet. It is also compatible with AirPlay, picture-in-picture and everything you’d expect from videos on iOS.
This isn’t just compatible with videos in the Apple TV app. Apple said there will be an API to make videos compatible with SharePlay. Initial partners include Disney+, Hulu, HBO Max, Twitch, TikTok and more.
Now let’s switch to Messages. The app is getting better integration with other Apple apps like News, Photos and Music. Items shared via Messages show up in those apps. In other words, Messages (and iMessage) is acting as the social layer on top of Apple’s apps.
Apple is going to use on-device intelligence to create summaries of your notifications. Instead of being sorted by apps and by date, it is sorted by priority. For instance, notifications from friends will be closer to the top.
When you silence notifications, your iMessage contacts will see that you have activated “Do not disturb”. It works a bit like “Do not disturb” in Slack. But there are new settings. Apple calls this Focus mode. You can choose apps and people you want notifications from and change your focus depending on what you’re doing.
For instance, if you’re at work, you can silence personal apps and personal calls and messages. If it’s the weekend, you can silence your work emails. Your settings sync across your iCloud account if you have multiple Apple devices. And it’ll even affect your home screen by showing and hiding apps and widgets.
Apple is going to scan your photos for text. Called Live Text, this feature lets you highlight, copy and paste text in photos. It could be a nice accessibility feature as well. And, iOS is going to leverage that info for Spotlight. You can search from text in your photos directly in Spotlight. These features are handled on-device directly.
With iOS 15, memories are getting an upgrade. “These new memories are built on the fly. They are interactive and alive,” said Chelsea Burnette, senior manager, Photos Engineering. Memories are those interactive movies that you can watch in the Photos app. Now, you can tap with your finger to pause the movie. While music still plays in the background, your photo montage resumes when you lift your finger.
You can now search for a specific song to pair with a memory. It’s going to be interesting to see in detail what’s new for the Photos app.
After a recap of all the features of Apple Wallet, the company announced that you’ll be able to scan your ID and store it in Wallet. It’ll be available in participating states so it’s going to be a slow rollout. When a government service wants some info from your ID, you can choose to share some data with this service directly on your iPhone.
When it comes to the Weather app, it has been updated to include many of the features that were available in Dark Sky, a popular weather app that was acquired by Apple. Expect a new design and more data.
As for Apple Maps, the new mapping data has been rolled out in several countries and Apple is still rolling it out in Europe. Apple has added a ton of new details to some areas, such as San Francisco. You can see bus and taxi lanes, crosswalks, bike lanes, etc. On highways, you see complex interchanges in 3D. All of this is also coming to CarPlay later this year.
With transit, users can pin their favorite lines and view info on their Apple Watch. When you’re in a subway or bus, you can see your location in real time. It sounds a bit like Citymapper’s itinerary feature. You can also get directions in augmented reality by holding your phone in front of you.
Apple is also announcing a bunch of new features for users who have AirPods. There’s a new conversation mode that effectively turns them into a smart hearing aid, boosting conversation volume. You’ll also get more notifications if you’ve activated the “Announce notifications” setting. You can tweak that setting to limit it to certain apps and change it depending on your focus mode.
You can also find your AirPods with the Find My app with audio notifications even when they’re in the case. Spatial audio is coming to the Apple TV and Macs with an M1 chip. As announced a few weeks ago, spatial audio for Apple Music is launching right now.
As you can see, iOS 15 is packed with new features. Apple is releasing a developer beta with an initial release today. The public beta phase will start in July. You can expect beta updates throughout the summer and a final release this fall.
And we’re back. Well, not back-back. But we’re here in the San Jose McEnery Convention Center of the mind. The parking is awful and the hotels all got booked up five months ago, so we’re taking the CalTrain in from Redwood City (of the mind).
We’ve got a full house at this morning’s virtual kick-off to Apple’s annual developer conference. And good thing, too. It’s shaping up to be a packed event. You can read more about that here. You can also check out Apple’s own livestream here. And, of course, we’ll be breaking out the biggest news into bite-sized chunks.
As always, the kick-off event is focused on Apple’s (numerous) operating systems: iOS/iPadOS, macOS, watchOS, tvOS and, perhaps, a new homeOS. Oftentimes that also comes with some new hardware. After all, you’ll need something to run those operating systems on.
Matthew will be leading the show, with help from various members of the TC team. Things kick off today at 10AM PT/1PM ET.