At its Worldwide Developer Conference, Apple announced a significant update to RealityKit, its suite of technologies that allow developers to get started building AR (augmented reality) experiences. With the launch of RealityKit 2, Apple says developers will have more visual, audio, and animation control when working on their AR experiences. But the most notable part of the update is how Apple’s new Object Capture technology will allow developers to create 3D models in minutes using only an iPhone.
Apple noted during its developer address that one of the most difficult parts of making great AR apps is creating 3D models — a process that can take hours and cost thousands of dollars.
With Apple’s new tools, developers will be able to take a series of pictures using just an iPhone (or an iPad or DSLR, if they prefer) to capture 2D images of an object from all angles, including the bottom.
Then, using the Object Capture API on macOS Monterey, it only takes a few lines of code to generate the 3D model, Apple explained.
Image Credits: Apple
To begin, developers would start a new photogrammetry session in RealityKit that points to the folder where they’ve captured the images. Then, they would call the process function to generate the 3D model at the desired level of detail. Object Capture allows developers to generate the USDZ files optimized for AR Quick Look — the system that lets developers add virtual, 3D objects in apps or websites on iPhone and iPad. The 3D models can also be added to AR scenes in Reality Composer in Xcode.
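The workflow described above can be sketched in a few lines of Swift. The file paths below are hypothetical placeholders, and the progress handling is a sketch rather than production code; the API itself (`PhotogrammetrySession` with a `process(requests:)` call) is the Object Capture interface Apple ships on macOS Monterey:

```swift
import Foundation
import RealityKit

// Hypothetical paths — point these at your own capture folder and output file.
let inputFolder = URL(fileURLWithPath: "/Users/me/Captures/Pineapple", isDirectory: true)
let outputFile = URL(fileURLWithPath: "/Users/me/Models/pineapple.usdz")

do {
    // A session ingests the folder of 2D images shot from all angles.
    let session = try PhotogrammetrySession(input: inputFolder)

    // Watch the session's async output stream for progress and completion.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(Int(fraction * 100))%")
            case .requestComplete(_, let result):
                if case .modelFile(let url) = result {
                    print("Model written to \(url.path)")
                }
            case .processingComplete:
                print("Done.")
            default:
                break
            }
        }
    }

    // Request a USDZ model at a chosen detail level (.preview up to .full or .raw).
    try session.process(requests: [
        .modelFile(url: outputFile, detail: .reduced)
    ])
} catch {
    print("Object Capture failed: \(error)")
}
```

Lower detail levels like `.reduced` produce the smaller, AR Quick Look-friendly USDZ files; `.full` and `.raw` are meant for professional pipelines.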
Apple said developers like Wayfair, Etsy and others are using Object Capture to create 3D models of real-world objects — an indication that online shopping is about to get a big AR upgrade.
Wayfair, for example, is using Object Capture to develop tools for their manufacturers so they can create a virtual representation of their merchandise. This will allow Wayfair customers to be able to preview more products in AR than they could today.
Image Credits: Apple (screenshot of Wayfair tool)
In addition, Apple noted that developers including Maxon and Unity are using Object Capture to create 3D content within 3D content creation apps such as Cinema 4D and Unity MARS.
Other updates in RealityKit 2 include custom shaders that give developers more control over the rendering pipeline to fine-tune the look and feel of AR objects; dynamic loading for assets; the ability to build your own Entity Component System to organize the assets in your AR scene; and the ability to create player-controlled characters so users can jump, scale and explore AR worlds in RealityKit-based games.
One developer, Mikko Haapoja of Shopify, has been trying out the new technology (see below) and shared on Twitter some real-world tests in which he shot objects using an iPhone 12 Pro Max.
Developers who want to test it for themselves can download Apple’s sample app and install macOS Monterey on their Mac.
Apple says there are over 14,000 ARKit apps on the App Store today, built by more than 9,000 different developers. With over 1 billion AR-enabled iPhones and iPads in use globally, Apple notes it offers the world’s largest AR platform.
Apple's Object Capture on a Pineapple. One of my fav things to test Photogrammetry against. This was processed using the RAW detail setting.
More info in thread pic.twitter.com/2mICzbV8yY
— Mikko Haapoja (@MikkoH) June 8, 2021
Apple's Object Capture is the real deal. I'm impressed. Excited to see where @Shopify merchants could take this
Allbirds Tree Dashers. More details in thread pic.twitter.com/fNKORtdtdB
— Mikko Haapoja (@MikkoH) June 8, 2021
Last month, Apple announced it would soon add lossless audio streaming and Spatial Audio with support for Dolby Atmos to its Apple Music subscription at no extra charge. That upgrade has now gone live, Apple announced this morning — though many noticed the additions actually rolled out yesterday, following the WWDC keynote.
The entire Apple Music catalog of 75+ million songs will support lossless audio.
The lossless tier begins at CD quality — 16-bit at 44.1 kHz — and goes up to 24-bit at 48 kHz, Apple previously said. Audiophiles can also opt for hi-resolution lossless, which goes up to 24-bit at 192 kHz. Apple has said you’ll need an external USB digital-to-analog converter to take advantage of the latter — simply plugging a pair of headphones into an iPhone won’t work.
Apple Music subscribers will be able to enable the new lossless option under Settings > Music > Audio quality. Here, you’ll be able to choose the different resolutions you want to use for different connections, including Wi-Fi, cellular, and download.
When you make your selection in Settings, iOS warns that lossless files will use “significantly more space” on your device, as 10 GB of storage would allow you to store approximately 3,000 songs at high quality, 1,000 songs with lossless, or 200 songs with high-res lossless.
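For a sense of where those storage figures come from, here is a back-of-the-envelope sketch (my own arithmetic, not Apple's published method): raw stereo PCM bitrate is sample rate × bit depth × channels, and Apple's ALAC codec then losslessly compresses that, which is roughly how ~1,000 lossless songs fit in 10 GB.

```swift
import Foundation

// Raw (uncompressed) PCM bitrate in kilobits per second for a stereo stream.
// ALAC typically shrinks these figures substantially before storage.
func pcmBitrateKbps(sampleRateHz: Double, bitDepth: Double, channels: Double = 2) -> Double {
    sampleRateHz * bitDepth * channels / 1000
}

let tiers: [(name: String, rate: Double, depth: Double)] = [
    ("CD-quality lossless (16-bit/44.1 kHz)", 44_100, 16),
    ("Lossless (24-bit/48 kHz)", 48_000, 24),
    ("Hi-res lossless (24-bit/192 kHz)", 192_000, 24),
]

for tier in tiers {
    let kbps = pcmBitrateKbps(sampleRateHz: tier.rate, bitDepth: tier.depth)
    let mbPerMinute = kbps * 60 / 8 / 1000  // kilobits/sec → megabytes/min
    print("\(tier.name): \(Int(kbps)) kbps ≈ \(String(format: "%.1f", mbPerMinute)) MB/min before compression")
}
```

A five-minute hi-res lossless track works out to roughly 345 MB before compression, which is why that tier fits only about 200 songs in 10 GB even after ALAC does its work.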
Image Credits: Apple
Meanwhile, Spatial Audio will be enabled by default on hardware that supports Dolby Atmos, like Apple’s AirPods and Beats headphones with an H1 or W1 chip. The latest iPhone, iPad, and Mac models also support Dolby Atmos. Spatial Audio on Apple Music will also be “coming soon” to Android devices, Apple said.
To kick off the launch, Apple Music is today rolling out new playlists designed to showcase Spatial Audio.
Apple is also adding a special guide to Spatial Audio on Apple Music, which will help music listeners hear the difference. This will include tracks from artists like Marvin Gaye and The Weeknd, among others. And Apple will air a roundtable conversation about Spatial Audio featuring top sound engineers and experts, hosted by Zane Lowe at 9 am PT today on Apple Music.
Because songs have to be remastered for Dolby Atmos specifically, these guides and playlists will help music fans experience the new format without having to hunt around. Apple says it’s working with artists and labels to add more new releases and the best catalog tracks in Spatial Audio. To help on this front, Apple notes there are various initiatives underway — including doubling the number of Dolby-enabled studios in major markets, offering educational programs, and providing resources to independent artists.
Apple also said it will build music-authoring tools directly into Logic Pro. Later this year, the company plans to release an update to Logic Pro that will allow any musician to create and mix their songs in Spatial Audio for Apple Music.
During the WWDC conference today, Apple unveiled the new macOS 12 Monterey. A major feature in the macOS update is Universal Control, which builds upon the Continuity features first introduced in OS X Yosemite. For years it’s been possible to open a news article on your iPhone and keep reading it on your MacBook, or to copy and paste a link from your iPad to your iMac. But Universal Control takes these features further.
With Universal Control, you can use a single mouse and keyboard to navigate across multiple Apple devices at once. The functionality works across more than two devices — in the demo video, the feature is used to move seamlessly between an iPad, a MacBook and an iMac. Users can also drag and drop files between devices, making it possible, for example, to use a multi-screen setup while editing video in Final Cut Pro.
What’s possible in Universal Control isn’t entirely new — third-party apps have enabled similar setups before. And in 2019, Apple debuted Sidecar, which let users connect an iPad as a second monitor for their MacBook or iMac. But Universal Control improves on Sidecar — and maybe renders it obsolete — by letting users link any Apple devices together, not just an iPad. Though the update may not be groundbreaking, it’s a useful upgrade to existing features.
Among many updates coming to iOS 15, Apple Maps will receive a number of upgrades that will bring more detailed maps, improvements for transit riders, AR experiences and other changes to the platform. The improvements build on the new map Apple began rolling out two years ago, which had focused on offering richer details, and — in response to user feedback and complaints — more accurate navigation.
Since then, Apple Maps has steadily improved.
The new map experience has since launched in the U.S., U.K., Ireland and Canada, and will now make its way to Spain and Portugal, starting today. It will then arrive in Italy and Australia later this year, Apple announced during its Worldwide Developer Conference keynote on Monday.
Image Credits: Apple
In addition, Apple said iOS 15 Maps will include new details for commercial districts, marinas, buildings and more. Plus, Apple has added things like elevation, new road colors and labels, as well as hundreds of custom designed landmarks — for example, for places like the Golden Gate Bridge.
Apple also built a new nighttime mode for Maps with a “moonlit glow,” it said.
For drivers, Apple added new road details to the map to help them better see and understand important features like turn lanes, medians, and bus and taxi lanes as they move through a city. The changes are competitive with some of the updates Google has been making to its own Google Maps platform, which brought street-level details to select cities. Those details allowed people — including those navigating on foot, in a wheelchair, on a bike or on a scooter, for example — to better see things like sidewalks and intersections.
Apple is now catching up, saying it, too, will show features like crosswalks and bike lanes.
It will also render things like overlapping complex interchanges in 3D space, making it easier to see upcoming traffic conditions or what lane to take. These features will come to CarPlay later in the year.
Image Credits: Apple
For transit riders, meanwhile, Maps has made improvements to help users find nearby stations.
Users can now pin their favorite lines to the top, and even keep track on their Apple Watch so they don’t have to pull out their phone. The updated Maps app will automatically follow your transit route and notify you when it’s time to disembark, making it more competitive with the third-party apps often favored by transit riders, like Citymapper.
Image Credits: Apple
When you exit your station, you can also now hold up your iPhone to scan the buildings in the area, and Maps will generate an accurate position and offer directions in augmented reality. This is similar to the Live View AR directions Google announced last year.
This feature is launching in select cities in 2021 with more to come in the year ahead, Apple said.
Image Credits: Apple
Apple has introduced a new feature to its camera system that automatically recognizes and transcribes text in your photos, from a phone number on a business card to a whiteboard full of notes. Live Text, as the feature is called, doesn’t need any prompting or special work from the user — just tap the icon and you’re good to go.
Announced by Craig Federighi on the virtual stage of WWDC, Live Text will be arriving on iPhones with iOS 15. He demonstrated it with a few pictures: one of a whiteboard after a meeting, and a couple of snapshots that included restaurant signs in the background.
Tapping the Live Text button in the lower right gave detected text a slight underline, and then a swipe allowed it to be selected and copied. In the case of the whiteboard, it collected several sentences of notes including bullet points, and with one of the restaurant signs it grabbed the phone number, which could be called or saved.
Certain types of text strings are recognized as such, too: a tracking code will be detected and a link to the tracking URL made immediately available. Translation can also be done quickly, to or from any language supported by Apple’s other translation tools.
The feature is reminiscent of many found in Google’s long-developed Lens app, and the Pixel 4 added more robust scanning capability in 2019. The difference is that the text is captured more or less passively in every photo taken by an iPhone running the new system — you don’t have to enter scanner mode or launch a separate app.
This is a nice thing for anyone to have, but it could be especially helpful for people with visual impairments. A snapshot or two makes any text that’s otherwise difficult to read available to be dictated or saved.
The process takes place entirely on the phone, so don’t worry that this info is being sent to a data center somewhere. That also means it’s fairly quick, though until we test it for ourselves we can’t say whether it’s instantaneous or, like some other machine learning features, something that happens over the seconds or minutes after you take a shot. Your back catalog of photos will be Live Text-ified in your phone’s idle moments, though.
An email has been going around the internet as part of a release of documents from the App Store lawsuit brought against Apple by Epic Games. I love this email for a lot of reasons, not the least of which is that you can extrapolate from it the very reasons Apple has remained such a vital force in the industry for the past decade.
The gist of it is that SVP of Software Engineering Bertrand Serlet sent an email in October of 2007, just three months after the iPhone was launched. In the email, Serlet outlines essentially every core feature of Apple’s App Store — a business that now brings in an estimated $64B per year. And that, more importantly, allowed the launch of countless titanic internet startups and businesses built on and taking advantage of native apps on iPhone.
Forty-five minutes after the email, Steve Jobs replied to Serlet and iPhone lead Scott Forstall, from his iPhone: “Sure, as long as we can roll it all out at Macworld on Jan 15, 2008.”
Apple University should have a course dedicated to this email.
Here it is, shared by an account I enjoy, Internal Tech Emails, on Twitter. If you run the account let me know, happy to credit you further here if you wish:
Bertrand Serlet to Steve Jobs: "Fine, let's enable Cocoa Touch apps"
October 2, 2007 pic.twitter.com/9aTxmjgkRS
— Internal Tech Emails (@TechEmails) June 3, 2021
First, we have Serlet’s outline. It’s seven sentences that lay out the key tenets of the App Store: user protection, network protection, an owned developer platform and a sustainable API approach. There is a direct ask for resources — whoever we need in software engineering — to get it shipped ASAP.
It also has a clear ask at the bottom: ‘do you agree with these goals?’
Enough detail is included in the parentheticals to allow an informed reader to infer scope and work hours. And at no point in this email does Serlet include an ounce of justification for these choices. These are, in his mind, the obvious and necessary framework for accomplishing the rollout of an SDK for iPhone developers.
There is no extensive rationale provided for each item, something that is often unnecessary in an informed context and can often act as psychic baggage that telegraphs one of two things:
Neither one of those is the wisest way to provide an initial scope of work. There is plenty of time down the line to flesh out rationale to those who have less command of the larger context.
If you’re a historian of iPhone software development, you’ll know that developer Nullriver had released Installer, a third-party installer that allowed apps to be natively loaded onto the iPhone, in the summer of 2007 (early September, I believe). It was followed in 2008 by the eventually far more popular Cydia. And there were developers that August and September already experimenting with this completely unofficial way of getting apps onto the phone, like the venerable Twitterrific by Craig Hockenberry and Lights Off by Lucas Newman and Adam Betts.
Though there has been plenty of established documentation of Steve being reluctant about allowing third-party apps on iPhone, this email establishes an official timeline for when the decision was not only made but essentially fully formed. And it’s much earlier than the apocryphal discussion about when the call was made. This is just weeks after the first hacky third-party attempts had made their way to iPhone and just under two months since the first iPhone jailbreak toolchain appeared.
There is no need or desire shown here for Steve to ‘make sure’ that his touch is felt on this framework. All too often I see leaders that are obsessed with making sure that they give feedback and input at every turn. Why did you hire those people in the first place? Was it for their skill and acumen? Their attention to detail? Their obsessive desire to get things right?
Then let them do their job.
Serlet’s email is well written and has exactly the right scope, yes. But the response is just as important. A demand for what was likely too short a timeline (the App Store was eventually announced in March of 2008 and shipped in July of that year) sets the bar high, matching the urgency of the request for all teams to work together on this project. This is not a side alley; it’s the foundation of a main thoroughfare. It must get built before anything goes on top.
This efficacy is at the core of what makes Apple good when it is good. It’s not always good, but nothing ever is, and the hit record is incredibly strong across a decade’s worth of shipped software and hardware. Crisp, lean communication that does not coddle or equivocate, coupled with a leader confident in their own ability and in the ability of those they hired, means there is no need to bog down the process in order to establish a record of involvement.
One cannot exist without the other. A clear, well-argued RFP or project outline sent up to insecure or ineffective management just becomes fodder for territorial games or endless rounds of requests for clarification. And no matter how effective leadership is and how talented their employees, if they do not establish an environment in which clarity of thought is welcomed and rewarded, they will never get the kind of bold, declarative product development they wish for.
All in all, this exchange is a wildly important bit of ephemera that underpins the entire app ecosystem era and an explosive growth phase for Internet technology. And it’s also an encapsulation of the kind of environment that has made Apple an effective and brutally efficient company for so many years.
Can it be learned from and emulated? Probably, but only if all involved are willing to create the environment necessary to foster the necessary elements above. Nine times out of ten you get moribund management, an environment that discourages blunt position taking and a muddy route to the exit. The tenth time, though, you get magic.
And, hey, maybe we can take this opportunity to make that next meeting an email?
If Bertrand Serlet and Steve Jobs could change the world over an email perhaps we don’t need to have that meeting. https://t.co/NZ1HmVAnwb
— Matthew Panzarino (@panzer) June 3, 2021
The European Commission has announced that it has issued formal antitrust charges against Apple, saying today that its preliminary view is that Apple’s App Store rules distort competition in the market for music streaming services by raising the costs of competing music streaming app developers.
The Commission began investigating competition concerns related to the iOS App Store (and also Apple Pay) last summer.
“The Commission takes issue with the mandatory use of Apple’s own in-app purchase mechanism imposed on music streaming app developers to distribute their apps via Apple’s App Store,” it wrote today. “The Commission is also concerned that Apple applies certain restrictions on app developers preventing them from informing iPhone and iPad users of alternative, cheaper purchasing possibilities.”
Commenting in a statement, EVP and competition chief Margrethe Vestager said: “App stores play a central role in today’s digital economy. We can now do our shopping, access news, music or movies via apps instead of visiting websites. Our preliminary finding is that Apple is a gatekeeper to users of iPhones and iPads via the App Store. With Apple Music, Apple also competes with music streaming providers. By setting strict rules on the App store that disadvantage competing music streaming services, Apple deprives users of cheaper music streaming choices and distorts competition. This is done by charging high commission fees on each transaction in the App store for rivals and by forbidding them from informing their customers of alternative subscription options.”
Apple sent us this statement in response:
“Spotify has become the largest music subscription service in the world, and we’re proud for the role we played in that. Spotify does not pay Apple any commission on over 99% of their subscribers, and only pays a 15% commission on those remaining subscribers that they acquired through the App Store. At the core of this case is Spotify’s demand they should be able to advertise alternative deals on their iOS app, a practice that no store in the world allows. Once again, they want all the benefits of the App Store but don’t think they should have to pay anything for that. The Commission’s argument on Spotify’s behalf is the opposite of fair competition.”
Vestager is due to hold a press conference shortly — so stay tuned for updates.
This story is developing…
A number of complaints against Apple’s practices have been lodged with the EU’s competition division in recent years — including by music streaming service Spotify; video games maker Epic Games; and messaging platform Telegram, to name a few of the complainants who have gone public (and been among the most vocal).
The main objection is over the (up to 30%) cut Apple takes on sales made through third parties’ apps — which critics rail against as an ‘Apple tax’ — as well as how it can mandate that developers do not inform users how to circumvent its in-app payment infrastructure, i.e. by signing up for subscriptions via their own website instead of through the App Store. Other complaints include that Apple does not allow third party app stores on iOS.
Apple, meanwhile, has argued that its App Store does not constitute a monopoly. iOS’ global market share of mobile devices is a little over 10% vs Google’s rival Android OS — which is running on the lion’s share of the world’s mobile hardware. But monopoly status depends on how a market is defined by regulators (and if you’re looking at the market for iOS apps then Apple has no competitors).
The iPhone maker also likes to point out that the vast majority of third-party apps pay it no commission (as they don’t monetize via in-app payments), and it argues that restrictions on native apps are necessary to protect iOS users from threats to their security and privacy.
Last summer the European Commission said its App Store probe was focused on Apple’s mandatory requirement that app developers use its proprietary in-app purchase system, as well as restrictions applied on the ability of developers to inform iPhone and iPad users of alternative cheaper purchasing possibilities outside of apps.
It also said it was investigating Apple Pay: looking at the terms and conditions Apple imposes for integrating its payment solution into others’ apps and websites on iPhones and iPads, and at the limitations it places on others’ access to the NFC (contactless payment) functionality on iPhones for in-store payments.
The EU’s antitrust regulator also said then that it was probing allegations of “refusals of access” to Apple Pay.
In March this year, the UK also joined the Apple App Store antitrust fray — announcing a formal investigation into whether Apple has a dominant position and whether it imposes unfair or anti-competitive terms on developers using its App Store.
US lawmakers have, meanwhile, also been dialling up attention on app stores, plural — and on competition in digital markets more generally — calling in both Apple and Google for questioning over how they operate their respective mobile app marketplaces in recent years.
Last month, for example, the two tech giants’ representatives were pressed on whether their app stores share data with their product development teams — with lawmakers digging into complaints against Apple especially that Cupertino frequently copies others’ apps, ‘sherlocking’ their businesses by releasing native copycats (as the practice has been nicknamed).
Back in July 2020 the House Antitrust Subcommittee took testimony from Apple CEO Tim Cook himself — and went on, in a hefty report on competition in digital markets, to accuse Apple of leveraging its control of iOS and the App Store to “create and enforce barriers to competition and discriminate against and exclude rivals while preferencing its own offerings”.
“Apple also uses its power to exploit app developers through misappropriation of competitively sensitive information and to charge app developers supra-competitive prices within the App Store,” the report went on. “Apple has maintained its dominance due to the presence of network effects, high barriers to entry, and high switching costs in the mobile operating system market.”
The report did not single Apple out — also blasting Google-owner Alphabet, Amazon and Facebook for abusing their market power. And the Justice Department went on to file suit against Google later the same month. So, over in the U.S., the stage is being set for further actions against big tech. Although what, if any, federal charges Apple could face remains to be seen.
At the same time, a number of state-level tech regulation efforts are brewing around big tech and antitrust — including a push in Arizona to relieve developers from Apple and Google’s hefty cut of app store profits.
Meanwhile, an antitrust bill introduced by Republican Josh Hawley earlier this month takes aim at acquisitions, proposing an outright block on big tech’s ability to carry out mergers and acquisitions.
Although that bill looks unlikely to succeed, a flurry of antitrust reform bills is set to be introduced as U.S. lawmakers on both sides of the aisle grapple with how to cut big tech down to a competition-friendly size.
In Europe lawmakers are already putting down draft laws with the same overarching goal.
In the EU, the Commission has proposed an ex ante regime to prevent big tech from abusing its market power, with the Digital Markets Act set to impose conditions on intermediating platforms that are considered ‘gatekeepers’ to others’ market access.
In the UK, which now sits outside the bloc, the government is also drafting new laws in response to tech giants’ market power — saying it will create a ‘pro-competition’ regime that will apply to platforms with so-called ‘strategic market status’ — but instead of a set list of requirements it wants to target specific measures per platform.
Huawei’s smartphone rivals in China are quickly divvying up the market share it has lost over the past year.
Some 92.4 million smartphones were shipped in China during the first quarter, with Vivo claiming the crown with a 23% share and its sister company Oppo following closely behind at 22%, according to market research firm Canalys. Huawei, whose smartphone sales took a hit after U.S. sanctions cut key chip parts out of its supply chain, came in third at 16%. Xiaomi and Apple took the fourth and fifth spots, respectively.
All major smartphone brands but Huawei saw a jump in their market share in China from Q1 2020. Apple’s net sales in Greater China nearly doubled year-over-year to $17.7 billion in the three months ended March, part of an all-time record quarter for the American giant, according to its latest financial results.
“We’ve been especially pleased by the customer response in China to the iPhone 12 family,” said Tim Cook during an earnings call this week. “You have to remember that China entered the shutdown phase earlier in Q2 of last year than other countries. And so they were relatively more affected in that quarter, and that has to be taken into account as you look at the results.”
Huawei’s share shrank from a dominant 41% to 16% in a year’s time, though the telecom equipment giant managed to increase its profit margin, partly thanks to slashed costs. In November, it sold off its budget phone line Honor.
This quarter also marked the first time China’s smartphone market has grown in four years, with a growth rate of 27%, according to Canalys.
“Leading vendors are racing to the top of the market, and there was an unusually high number of smartphone launches this quarter compared with Q1 2020 or even Q4 2020,” said Canalys analyst Amber Liu.
“Huawei’s sanctions and Honor’s divestiture have been hallmarks of this new market growth, as consumers and channels become more open to alternative brands.”
When the third minute of Apple’s first product event of 2021 ticked over and the company had already made three announcements, we knew it was going to be a packed one. In a tight single hour this week, Apple launched a ton of new products, including AirTags, Apple Card family sharing, a new Apple TV, a set of colorful iMacs and a purple iPhone 12 shade.
Of the new devices announced, though, Apple’s new 12.9” iPad Pro is the most interesting from a market positioning perspective.
This week I got a chance to speak to Apple Senior Vice President of Worldwide Marketing Greg Joswiak and Senior Vice President of Hardware Engineering John Ternus about this latest version of the iPad Pro and its place in the working universe of computing professionals.
In many ways, this new iPad Pro is the equivalent of a sprinter three lengths ahead going into the last lap who turns on the afterburners to put an undebatable distance between themselves and the rest of the pack. Last year’s model is still one of the best computers you can buy, with a densely packed offering of powerful computing tools, battery performance and portability. And this year’s gets upgrades in the M1 processor, RAM, storage speed, Thunderbolt connectivity, a 5G radio, a new ultra-wide front camera and its Liquid Retina XDR display.
This is a major bump even while the 2020 iPad Pro still dominates the field. And at the center of that is the display.
Apple has essentially ported its enormously good $5,000 Pro Display XDR down to a 12.9” touch version, with some slight improvements. The specs are flat-out incredible: 1,000 nits of brightness, peaking at 1,600 nits in HDR, with 2,500 full-array local dimming zones — compared with the Pro Display XDR’s 576 zones on a much larger panel.
Given that this year’s first product launch from Apple was virtual, the media again got no immediate hands-on with the new devices, including the iPad Pro. This means I have not yet seen the XDR display in action. Unfortunately, these specs are so good that estimating them without having seen the screen is akin to trying to visualize “a trillion” in your head: intellectually possible but not really practical.
It’s brighter than any Mac or iOS device on the market and could be a game-changing device for professionals working in HDR video and photography. But even still, this is a major investment: shipping a mini-LED display in the millions or tens of millions of units with more density and brightness than any other display on the market.
I ask both of them why there’s a need to double down on what is already one of the best portable displays ever made — if not one of the best displays, period.
“We’ve always tried to have the best display,” says Ternus. “We’re going from the best display on any device like this and making it even better, because that’s what we do and that’s why we, we love coming to work every day is to take that next big step.
“[With the] Pro Display XDR if you remember one thing we talked about was being able to have this display and this capability in more places in the work stream. Because traditionally there was just this one super expensive reference monitor at the end of the line. This is like the next extreme of that now you don’t even have to be in the studio anymore you can take it with you on the go and you can have that capability so from a, from a creative pro standpoint we think this is going to be huge.”
In my use of the Pro Display, and in my conversations with professionals about it, one of the common themes I’ve heard is a reduction in overall workload, because color and image can now be managed accurately to spec at multiple points in the flow. The general system in place puts a reference monitor very late in the production stage, which can often lead to expensive and time-consuming re-rendering or new color passes. Adding the Liquid Retina XDR display into the mix at an extremely low price point means that a lot more plot points on the production line suddenly get a lot closer to the right curve.
One of the stronger answers on the ‘why the aggressive spec bump’ question comes later in our discussion but is worth mentioning in this context. The point, Joswiak says, is to offer headroom. Headroom for users and headroom for developers.
“One of the things that iPad Pro has done as John [Ternus] has talked about is push the envelope. And by pushing the envelope that has created this space for developers to come in and fill it. When we created the very first iPad Pro, there was no Photoshop,” Joswiak notes. “There was no creative apps that could immediately use it. But now there’s so many you can’t count. Because we created that capability, we created that performance — and, by the way sold a fairly massive number of them — which is a pretty good combination for developers to then come in and say, I can take advantage of that. There’s enough customers here and there’s enough performance. I know how to use that. And that’s the same thing we do with each generation. We create more headroom to performance that developers will figure out how to use.
“The customer is in a great spot because they know they’re buying something that’s got some headroom and developers love it.”
The iPad Pro is now powered by the M1 chip — a move away from the A-series naming. And that processor is identical (given similar memory configurations) to the one found in the iMac announced this week and the MacBooks launched late last year.
“It’s the same part, it’s M1,” says Ternus. “iPad Pro has always had the best Apple silicon we make.”
“How crazy is it that you can take a chip that’s in a desktop, and drop it into an iPad,” says Joswiak. “I mean it’s just incredible to have that kind of performance at such amazing power efficiency. And then have all the technologies that come with it. To have the neural engine and ISP and Thunderbolt and all these amazing things that come with it, it’s just miles beyond what anybody else is doing.”
As the M1 was rolling out and I began running my testing, the performance-per-watt aspect really became the story. That really is the big differentiator for M1. For decades, laptop users have been accustomed to saving any heavy or intense workloads for the times when their machines were plugged in, due to power consumption. M1 is in the process of resetting those expectations for desktop-class processors. In fact, Apple is offering not only some of the most powerful CPUs but also the most power-efficient CPUs on the market. And it’s doing it in a $700 Mac Mini, a $1,700 iMac and a $1,100 iPad Pro at the same time. It’s a pretty ridiculous display of stunting, but it’s also the product of more than a decade of work building its own architecture and silicon.
“Your battery life is defined by the capacity of your battery and the efficiency of your system right? So we’re always pushing really really hard on the system efficiency and obviously with M1, the team’s done a tremendous job with that. But the display as well. We designed a new mini LED for this display, focusing on efficiency and on package size, obviously, to really to be able to make sure that it could fit into the iPad experience with the iPad experience’s good battery life.
We weren’t going to compromise on that,” says Ternus.
One of the marquee features of the new iPad Pro is its 12MP ultra-wide camera with Center Stage, an auto-centering and cropping video feature designed to make FaceTime calling more human-centric, literally. It finds humans in the frame and centers their faces, keeping them in the frame even if they move — standing and stretching, or leaning to the side. It also automatically includes additional people in the frame if they enter the range of the new ultra-wide 12MP front-facing camera. And yes, it also works with other apps like Zoom and Webex, and there will be an API for it.
I’ve gotten to see it in action a bit more and I can say with surety that this will become an industry standard implementation of this kind of subject focusing. The crop mechanic is handled with taste, taking on the characteristics of a smooth zoom pulled by a steady hand rather than an abrupt cut to a smaller, closer framing. It really is like watching a TV show directed by an invisible machine learning engine.
“This is one of the examples of some of our favorite stuff to do because of the way it marries the hardware and software right,” Ternus says. “So, sure it’s the camera but it’s also the SOC and and the algorithms associated with detecting the person and panning and zooming. There’s the kind of the taste aspect right which is how do we make something that feels good it doesn’t move too fast and doesn’t move too slow. That’s a lot of talented, creative people coming together and trying to find the thing that makes it Apple like.”
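The “taste” Ternus describes — a reframe that moves neither too fast nor too slow — is classically achieved with some form of easing on the crop window. As an illustration only (this is not Apple’s implementation, and the smoothing factor is an invented value), here is a minimal sketch of exponentially easing a crop center toward a detected subject:

```python
# Illustrative sketch of eased auto-framing (NOT Apple's Center Stage code):
# each frame, move the crop center a fraction of the remaining distance
# toward the detected subject, so reframing feels like a smooth camera pan
# rather than an abrupt cut.

def smooth_crop(current, target, alpha=0.15):
    """Ease the crop center toward the target.
    Lower alpha = slower, smoother pans; alpha=1 would be a hard cut."""
    x, y = current
    tx, ty = target
    return (x + alpha * (tx - x), y + alpha * (ty - y))

# Subject jumps from frame center to the left edge (normalized coords);
# the crop glides over across ~1 second at 30 fps instead of snapping.
center = (0.5, 0.5)
target = (0.1, 0.5)
for _ in range(30):
    center = smooth_crop(center, target)
# After 30 frames the crop has closed almost all of the gap.
```

The practical effect of the easing constant is exactly the “doesn’t move too fast and doesn’t move too slow” tuning Ternus alludes to.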
It also goes a long way toward making the awkward horizontal camera placement of the iPad Pro with the Magic Keyboard more workable. This has been a big drawback for using the iPad Pro as a portable video conferencing tool, something we’ve all been doing a lot of lately. I ask Ternus whether Center Stage was designed to mitigate this placement.
“Well, you can use iPad in any orientation right? So you’re going to have different experiences based on how you’re using it. But what’s amazing about this is that we can keep correcting the frame. What’s been really cool is that we’ve all been sitting around in these meetings all day long on video conferencing and it’s just nice to get up. This experience of just being able to stand up and kind of stretch and move around the room without walking away from the camera has been just absolutely game changing, it’s really cool.”
It’s worth noting that several other video sharing devices like the Portal, and some video software like Teams, already offer cropping-type follow features, but user experience is everything when you’re shipping software like this to millions of people at once. It will be interesting to see how Center Stage stacks up against the competition when we see it live.
With the ongoing chatter about how the iPad Pro and Mac are converging from a feature-set perspective, I ask how they would characterize an iPad Pro buyer vs. a MacBook buyer. Joswiak is quick to respond to this one.
“This is my favorite question because you know, you have one camp of people who believe that the iPad and the Mac are at war with one another right it’s one or the other to the death. And then you have others who are like, no, they’re bringing them together — they’re forcing them into one single platform and there’s a grand conspiracy here,” he says.
“They are at opposite ends of a thought spectrum and reality is neither is correct, right? We pride ourselves in the fact that we work really, really, really hard to have the best products in the respective categories. The Mac is the best personal computer, it just is. Customer satisfaction would indicate that is the case, by a longshot.”
Joswiak points out that the whole PC category is growing, which he says is nice to see. But Macs, he notes, are far outgrowing PCs and doing ‘quite well’. He adds that the iPad business is still outgrowing the tablet category (while he still refuses to label the iPad a tablet).
“And it’s also the case that it’s not an ‘either or’. The majority of our Mac customers have an iPad. That’s an awesome thing. They don’t have it because they’re replacing their Mac, it’s because they use the right tool at the right time.
“What’s very cool about what [Ternus] and his team have done with iPad Pro is that they’ve created something where that’s still the case for creative professionals too — the hardest-to-please audience. They’ve given them a tool where they can be equally at home using the Mac for their professional making money with it kind of work, and now they can pick up an iPad Pro — and they have been for multiple generations now and do things that, again, are part of how they make money, part of their creative workflow,” says Joswiak. “And that test is exciting. It isn’t one or the other, both of them have a role for these people.”
Since converting over to an iPad Pro as my only portable computer, I’ve been thinking a lot about the multimodal aspects of professional work. And, clearly, Apple has as well given its launch of a Pro Workflows team back in 2018. Workflows have changed massively over the last decade, and obviously the iPhone and an iPad, with their popularization of the direct manipulation paradigm, have had everything to do with that. In the current world we’re in, we’re way past ‘what is this new thing’, and we’re even way past ‘oh cool, this feels normal’ and we’re well into ‘this feels vital, it feels necessary.’
“Contrary to some people’s beliefs, we’re never thinking about what we should not do on an iPad because we don’t want to encroach on Mac, or vice versa,” says Ternus. “Our focus is, what is the best way? What is the best iPad we can make, what are the best Macs we can make? Some people are going to work across both of them, some people will kind of lean towards one because it better suits their needs, and that’s all good.”
If you follow along, you’ll know that Apple studiously refuses to enter into the iPad vs. Mac debate — and in fact likes to place the iPad in a special place in the market that exists unchallenged. Joswiak often says that he doesn’t even like to say the word tablet.
“There’s iPads and tablets, and tablets aren’t very good. iPads are great,” Joswiak says. “We’re always pushing the boundaries with iPad Pro, and that’s what you want leaders to do. Leaders are the ones that push the boundaries; leaders are the ones that take this further than it’s ever been taken before, and the XDR display is a great example of that. Who else would you expect to do that other than us? And then once you see it, and once you use it, you won’t wonder — you’ll be glad we did.”
Image Credits: Apple
Apple was questioned on its inability to rein in subscription scammers on its App Store during yesterday’s Senate antitrust hearing. The tech giant has argued that one of the reasons it requires developers to pay App Store commissions is to help Apple fight marketplace fraud and protect consumers. But developers claim Apple is doing very little to stop obvious scams that are now raking in millions and impacting consumer trust in the overall subscription economy, as well as in their own legitimate, subscription-based businesses.
One developer in particular, Kosta Eleftheriou, has made it his mission to highlight some of the most egregious scams on the App Store. Functioning as a one-man bunco squad, Eleftheriou regularly tweets out examples of apps that are leveraging fake reviews to promote their harmful businesses.
Some of the more notable scams he’s uncovered as of late include a crypto wallet app that scammed a user out of his life savings (~$600,000) in bitcoin; a kids game that actually contained a hidden online casino; and a VPN app scamming users out of $5 million per year. And, of course, there’s the scam that lit the fire in the first place: a competitor to Eleftheriou’s own Apple Watch app that he alleges scammed users out of $2 million per year after stealing his marketing materials, cloning his app and buying fake reviews to make the scammer’s app look like the better choice.
Eleftheriou’s tweets have caught the attention of the larger app developer community, who now email him other examples of scams they’ve uncovered. Eleftheriou more recently took his crusade a step further by filing a lawsuit against Apple over the revenue he’s lost to App Store scammers.
Though Eleftheriou wasn’t name-checked in yesterday’s antitrust hearing, his work certainly was.
In a line of questioning from Georgia’s Senator Jon Ossoff, Apple’s Chief Compliance Officer Kyle Andeer was asked why Apple was not able to locate scams, given that these fraudulent apps are, as Ossoff put it, “trivially easy to identify as scams.”
He asked why we have to rely on “open source reporting and journalists” to find the app scams — a reference that likely, at least in part, pointed to Eleftheriou’s recent activities.
Eleftheriou himself has said there’s not much to his method: you simply find the apps generating the most revenue, then check them for suspicious user reviews and high subscription prices. When you find both, you’ve probably uncovered a scam.
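The heuristic Eleftheriou describes can be sketched in a few lines. Everything here is hypothetical — the app records, the price threshold and the review-burst ratio are invented for illustration, not drawn from any real App Store data:

```python
# Hypothetical sketch of the revenue/reviews heuristic described above.
# Flags apps that combine a high subscription price with a suspicious
# burst of five-star reviews (a common fingerprint of purchased reviews).

def looks_suspicious(app, price_threshold=9.99, burst_ratio=0.5):
    """Return True if the app is both unusually pricey AND a large share
    of its five-star reviews arrived in the last week."""
    pricey = app["weekly_price"] >= price_threshold
    bursty = (app["five_star_last_week"] /
              max(app["five_star_total"], 1)) >= burst_ratio
    return pricey and bursty

# Invented example records, sorted by revenue in a real pipeline.
apps = [
    {"name": "QR Scanner Pro+", "weekly_price": 9.99,
     "five_star_total": 4000, "five_star_last_week": 3200},
    {"name": "Plain Notes", "weekly_price": 0.0,
     "five_star_total": 900, "five_star_last_week": 40},
]
flagged = [a["name"] for a in apps if looks_suspicious(a)]
```

Here only the high-priced app with a review burst gets flagged — which is the point of Ossoff’s “trivially easy to identify” remark: the signals are coarse, cheap to compute and sitting in plain sight.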
Andeer demurred, responding to Ossoff’s questions by saying that Apple has invested “tens of millions, hundreds of millions of dollars” in hardening and improving the security of its App Store.
“Unfortunately, security and fraud is a cat and mouse game. Any retailer will tell you that. And so we’re constantly working to improve,” Andeer said. He also claimed Apple was investing in more resources and technologies to catch wrong-doers, and noted that the App Store rejected thousands of apps every year for posing a risk to consumers.
The exec then warned that if Apple wasn’t the intermediary, the problem would be even worse.
“…No one is perfect, but I think what we’ve shown over and over again that we do a better job than others. I think the real risks of opening up the iPhone to sideloading or third-party app stores is that this problem will only multiply. If we look at other app stores out there, we look at other distribution platforms, it scares us.”
Ossoff pressed on, noting the sideloading questions could wait, and inquired again about the scam apps.
“Apple is making a cut on those abusive billing practices, are you not?” he asked.
Andeer said he didn’t believe that was the case.
“If we find fraud — if we find a problem, we’re able to rectify that very quickly. And we do each and every day,” he said.
But to what extent Apple was profiting from the App Store scams was less clear. Ossoff wanted to know if Apple refunded “all” of its revenues derived from the scam billing practices — in other words, if every customer who ever subscribed got their money back when a scam was identified.
Andeer’s answer was a little vague, however, as it could be interpreted to mean Apple refunds customers who report the scam or file a complaint — procedures it already has in place today. Instead of saying that Apple refunds “all customers” when scams are identified, he carefully worded his response to say Apple worked to make sure “the customer” is made whole.
“Senator, that’s my understanding. There’s obviously a dedicated team here at Apple who works this each and every day. But my understanding is that we work hard to make sure the customer is in a whole position. That’s our focus at the end of the day. If we lose the trust of our customers, that’s going to hurt us,” he said.
For what it’s worth, Eleftheriou wasn’t buying it.
“Apple’s non-answers to Senator Ossoff’s great questions in yesterday’s hearing should anger all of us. They did not offer any explanation for why it’s so easy for people like me to keep finding multi-million-dollar scams that have been going on unchecked on the App Store for years. They also gave no clear answer to whether they’re responsible for fraudulent activity in their store,” he told TechCrunch.
“Apple appears to profit from these scams, instead of refunding all associated revenues back to affected users when they belatedly take some of these down. We’ve been letting Apple grade their own homework for over a decade. I urge the committee to get to the bottom of these questions, including Apple’s baffling decision years ago to remove the ability for users to flag suspicious apps on the App Store,” Eleftheriou added.
Apple did not provide a comment.
With the spring comes color from Apple. The new iMacs are offered in seven different shades including a nice deep purple. As a refresh to the lineup, Apple has also released an iPhone 12 and iPhone 12 mini in a purple hue as well. I have a preview unit in hand to look at and so look at it I did. The color is great, closer to a violet on the sides and a lilac on the back.
This is a great color — in my opinion, probably the best color of iPhone 12 released so far. Apple releasing this new purple shade also, to me, says to the people that love the mini: don’t worry, this will still be available for a while. But, conversely, it could be a sign that this version of the mini might be the only one we get for a while. Maybe I’m reading into it too much, and this is a “because we could” thematic tie-in that offers a new option for spring buyers. Either way, it’s a really nice-looking phone that ties into the “millennial purple” (read: lilac) trend that is booming in design and fashion right now. Apple’s color theory team is always pretty well on trend, so no change here.
Apple has also released a nice purple silicone case that complements it well.
If you want a deep dive on the seriously capable offering that the iPhone 12 mini is, feel free to reference our review from late last year.
Here are some nice pictures of the purple iPhone 12 mini for you to look at:
I’ve been playing around with Apple’s new AirTag location devices for a few hours now and they seem to work pretty much as advertised. The setup flow is simple and clean, taking clear inspiration from the one Apple developed for AirPods. The precision finding feature enabled by the U1 chip works as a solid example of utility-driven augmented reality, popping up a virtual arrow and other visual identifiers on the screen to make finding a tag quicker.
The basic way that AirTags work, if you’re not familiar, is that they use Bluetooth beaconing technology to announce their presence to any nearby devices running iOS 14.5 and above. These quiet pings are encrypted and (usually) invisible to any passerby, especially while the tags are with their owners. This means that no one ever knows which device actually ‘located’ your AirTag, not even Apple.
“With you,” by the way, means in relative proximity to a device signed in to the iCloud account the AirTags are registered to. Bluetooth range is typically around 40 feet, depending on local conditions and signal bounce.
In my very limited testing so far, AirTag location range fits in with that basic Bluetooth expectation. Which means that it can be foiled by a lot of obstructions or walls or an unflattering signal bounce. It often took 30 seconds or more to get an initial location from an AirTag in another room, for instance. Once the location was received, however, the instructions to locate the device seemed to update quickly and were extremely accurate down to a few inches.
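That variability — walls, obstructions, signal bounce — is well captured by the standard log-distance path-loss model used for rough Bluetooth ranging. This is a generic textbook formula, not anything AirTag-specific, and the calibration values below (RSSI at one meter, path-loss exponent) are assumed for illustration:

```python
# Generic log-distance path-loss sketch: why the same Bluetooth signal
# "reaches" very different distances depending on the environment.
# measured_power is the assumed RSSI at 1 meter; n is the path-loss
# exponent (~2 in free space, 3-4 indoors with walls and furniture).

def estimate_distance_m(rssi, measured_power=-59, n=2.0):
    """Rough distance estimate in meters from a received signal strength."""
    return 10 ** ((measured_power - rssi) / (10 * n))

# The same -80 dBm reading implies very different distances:
open_air = estimate_distance_m(-80, n=2.0)  # clear line of sight: ~11 m
indoors = estimate_distance_m(-80, n=3.5)   # walls/obstructions: ~4 m
```

Note that the open-air figure works out to roughly 37 feet, right in line with the ~40-foot range cited above — and the indoor figure shows how quickly obstructions eat into it.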
The AirTags run for a year on a standard, user-replaceable CR2032 battery. They are rated IP67 for water resistance, which covers submersion in up to a meter of water for 30 minutes. There are a host of nicely designed accessories, like leather straps for bags, luggage tags and key rings.
So far so good. More testing to come.
As with anything to do with location, security and privacy are top of mind for AirTags, and Apple has some protections in place.
You cannot share AirTags — they are meant to be owned by one person, which means AirTags will not show up in your family’s Find My section like other iOS devices might. The only special privilege offered to people in your iCloud Family Sharing group is that they can silence the ‘unknown AirTag nearby’ alerts indefinitely, which makes AirTags workable for things like shared sets of keys or maybe even a family pet. There is now a discrete section within the app just for ‘Items’, including those with Find My functionality built in.
The other privacy features include a ‘warning’ that triggers after a tag has been in your proximity for some time while NOT in the proximity of its owner (i.e., traveling with you, perhaps in a bag or car). Your choices are then to make the tag play a sound to locate it, look at its information (including its serial number), or disable it by removing its battery.
Any AirTag that has been away from its owner for a while — this time is variable and Apple will tweak it over time as it observes how AirTags work — will start playing a sound whenever it is moved. This will alert people to its presence.
You can, of course, also place an AirTag into Lost Mode, offering a choice to share personal information with anyone who locates it as it plays an alert sound. Anyone with any smart device with NFC, Android included, can tap the device to see a webpage with information that you choose to share. Or just a serial number if you do not choose to do so.
This scenario addresses what happens if you don’t have an iOS device to alert you to a foreign AirTag in your presence, as it will eventually play a sound even if it is not in lost mode and the owner has no control over that.
It’s clear that Apple has thought through many of the edge cases, but some could still crop up as the rollout happens; we’ll have to see.
Apple has some distinct market advantages here.
It’s important to note that Apple has announced the development of a specification for chipset makers that lets third-party devices with ultra-wideband radios access the U1 chip onboard iPhones ‘later this spring’. This should approximate the Precision Finding feature’s utility in accessories that don’t have the advantage of a built-in U1 chip like the AirTags do. And, of course, Apple has opened up the entire Find My mesh network to third-party devices from Belkin, Chipolo and VanMoof that want to offer the same basic finding function as AirTags. Tile has announced plans to offer a UWB version of its tracker as well, even as it testified in Congress yesterday that Apple’s advantages make its entry into this market unfair.
It will be interesting to see these play out once AirTags are out getting lost in the wild. I have had them for under 12 hours so I’ve not been able to test edge cases, general utility in public spaces or anything like that.
The devices go on sale on April 23rd.
Elon Musk’s Neuralink, one of his many companies and the only one currently focused on mind control (that we’re aware of), has released a new blog post and video detailing some of its recent updates — including using its hardware to make it possible for a monkey to play pong with only its brain.
In the video above, Neuralink demonstrates how it used its sensor hardware and brain implant to record a baseline of activity from this macaque (named ‘Pager’) as it played a game on-screen where it had to move a token to different squares using a joystick. Using that baseline data, Neuralink was able to use machine learning to anticipate where Pager was going to move the physical controller, and was eventually able to predict it accurately before the move was actually made. Researchers then removed the joystick entirely, and eventually did the same thing with Pong, ultimately ending up at a place where Pager was no longer even moving its hand in the air over a nonexistent joystick, and was instead controlling the in-game action entirely with its mind, via the Link hardware and embedded neural threads.
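The core idea — calibrate a decoder on neural activity recorded while the animal uses the joystick, then predict intended movement from activity alone — can be sketched as a simple linear fit. To be clear, this is a toy with synthetic numbers, not Neuralink’s actual model, which operates on thousands of channels with far more sophisticated decoding:

```python
# Toy sketch of the calibrate-then-decode idea: fit a linear mapping from
# a neural activity feature to joystick velocity during the baseline phase,
# then predict intended movement from activity alone. Data is synthetic.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b in one dimension."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Synthetic calibration data: firing rate (spikes/s) vs. joystick velocity,
# recorded while the joystick is still connected.
rates = [12, 20, 28, 36, 44]
velocities = [0.1, 0.3, 0.5, 0.7, 0.9]
a, b = fit_linear(rates, velocities)

# Joystick removed: activity alone now predicts the intended movement.
predicted = a * 30 + b
```

Once the decoder's predictions match the actual joystick input closely enough, the physical controller becomes redundant — which is exactly the progression the Pager demo walks through.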
The last we saw of Neuralink, Musk himself was demonstrating the Link tech live in August 2020, using pigs to show how it was able to read signals from the brain depending on different stimuli. This new demo with Pager more clearly outlines the direction that the tech is headed in terms of human applications, since, as the company shared on its blog, the same technology could be used to help patients with paralysis manipulate a cursor on a computer, for instance. That could be applied to other paradigms as well, including touch controls on an iPhone, and even typing using a virtual keyboard, according to the company.
Musk separately tweeted that in fact, he expects the initial version of Neuralink’s product to be able to allow someone with paralysis that prevents standard modes of phone interaction to use one faster than people using their thumbs for input. He also added that future iterations of the product would be able to enable communication between Neuralinks in different parts of a patient’s body, transmitting between an in-brain node and neural pathways in legs, for instance, making it possible for “paraplegics to walk again.”
These are obviously bold claims, but the company cites a lot of existing research that undergirds its demonstrations and near-term goals. Musk’s more ambitious claims should, like all of his projections, be taken with a healthy dose of skepticism. He did add that human trials will begin “hopefully later this year” — which is already two years later than he was initially anticipating they might start.
In 2019, Spotify began testing a hardware device for automobile owners it lovingly dubbed “Car Thing,” which allowed Spotify Premium users to play music and podcasts using voice commands that began with “Hey, Spotify.” Last year, Spotify began developing a similar voice integration into its mobile app. Now, access to the “Hey Spotify” voice feature is rolling out more broadly.
Spotify chose not to officially announce the new addition, despite numerous reports indicating the voice option was showing up for many people in their Spotify app, leading to some user confusion about availability.
One early report by GSM Arena, for example, indicated Android users had been sent a push notification that alerted them to the feature. The notification advised users to “Just enable your mic and say ‘Hey Spotify, Play my Favorite Songs.’” When tapped, the notification launched Spotify’s new voice interface, where users are first asked to give the app permission to use the microphone in order to verbally request the music they want to hear.
Image Credits: GSM Arena
Several outlets soon reported the feature had launched to Android users, which is only partially true.
As it turns out, the feature is making its way to iOS devices, as well. When we launched the Spotify app here on an iPhone running iOS 14.5, for instance, we found the same feature had indeed gone live. You just tap on the microphone button by the search box to get to the voice experience. We asked around and found that other iPhone users on various versions of the iOS operating system also had the feature, including free users, Premium subscribers and Premium Family Plan subscribers.
The screen that appears suggests in big, bold text that you could be saying “Hey Spotify, play…” followed by a random artist’s name. It also presents a big green button at the bottom to turn on “Hey Spotify.”
Once enabled, you can ask for artists, albums, songs and playlists by name, as well as control playback with commands like stop, pause, skip this song, go back and others. Spotify confirms the command with a robotic-sounding male voice by default. (You can swap to a female voice in Settings, if you prefer.)
Image Credits: Spotify iOS screenshot
This screen also alerts users that when the app hears the “Hey Spotify” voice command, it sends the user’s voice data and other information to Spotify. There’s a link to Spotify’s policy regarding its use of voice data, which further explains that Spotify will collect recordings and transcripts of what you say, along with information about the content it returned to you. The company says it may continue to use this data to improve the feature, develop new voice features and target users with relevant advertising. It may also share your information with service providers, like cloud storage providers.
The policy looks to be the same as the one that was used along with Spotify’s voice-enabled ads, launched last year, so it doesn’t seem to have been updated to fully reflect the changes enabled with the launch of “Hey Spotify.” However, it does indicate that, like other voice assistants, Spotify doesn’t just continuously record — it waits until users say the wake words.
Given the “Hey Spotify” voice command’s origins with “Car Thing,” there’s been speculation that the mobile rollout is a signal the company is poised to launch its own hardware to the wider public in the near future. There’s already some indication that may be true — MacRumors recently reported finding references to and photos of Car Thing and its various mounts inside the Spotify app’s code. This follows Car Thing’s reveal in FCC filings back in January of this year, which had also stoked rumors that the device was soon to launch.
Spotify was reached for comment, but has so far been unable to provide any answers about the feature’s launch despite a day’s wait. Instead, we were told that they “unfortunately do not have any additional news to share at this time.” That further suggests some larger project could be tied to this otherwise minor feature’s launch.
Though today’s consumers are wary of tech companies’ data collection methods — and particularly their use of voice data, after the major voice assistant providers all confessed to poor practices on this front — there’s still a use case for voice commands, particularly from an accessibility standpoint and, for drivers, from a safety standpoint.
And although you can direct your voice assistant on your phone (or via CarPlay or Android Auto, if available) to play content from Spotify, some may find it useful to be able to speak to Spotify directly — especially since Apple doesn’t allow Spotify to be set as a default music service. You can only train Siri to launch Spotify as your preferred service.
If, however, you have second thoughts about using the “Hey Spotify” feature after enabling it, you can turn it off under “Voice Interactions” in the app’s settings.
Apple has launched a new app, Find My Certification Asst., designed for use by MFi (Made for iPhone) Licensees, who need to test their accessories’ interoperability with Apple’s Find My network. The network helps users find lost Apple devices — like iPhones, AirPods, and Mac computers, among other things — but is poised to add support for finding other compatible accessories manufactured by third parties.
The launch of the testing app signals that Apple may be ready to announce the launch of the third-party device program in the near future.
According to the app’s description, MFi Licensees can use Find My Certification Asst. to test the “discovery, connection, and other key requirements” for accessories that will incorporate Apple’s Find My network technology. It also points to information about the Find My network certification program on Apple’s MFi Portal at mfi.apple.com, which currently lists Find My network as an MFi program technology that’s “launching soon.”
The new app’s screenshots indicate it allows device makers to run a wide variety of tests in areas like connectivity, sound (for example, if the item can make a noise when misplaced), firmware, key management, NFC, power, and more.
Image Credits: App Store screenshot
The app became publicly available on Sunday, April 4th on the iOS App Store, according to Sensor Tower data. It’s brand-new, so it’s not yet ranking in any App Store categories, including its own (“Developer Tools”), and it has no ratings or reviews at this time.
The app’s launch is a step toward the larger goal of opening up Apple’s Find My network to third parties, and toward Apple’s planned launch of its own new accessory, AirTags.
Apple first announced it would open up Find My to third-party devices at last year’s Worldwide Developer Conference, after facing pressure from regulators in the U.S. and Europe who had been looking into, among other things, whether Apple had been planning to give itself an advantage with its forthcoming launch of AirTags, a competitor to Tile’s lost-item finder.
Image Credits: screenshot of FMCA app
A prominent Apple critic, Tile had complained that AirTags would be able to connect with Apple’s U1 chips, which use UWB (ultra-wideband) technology for more precise finding capabilities, and at a Congressional hearing noted that AirTags would work with Apple’s own Find My app, which ships by default on Apple devices. This, Tile believed, would give Apple a first-party advantage in the lost-item finder market that Tile had successfully established and dominated for years.
Apple, in response, opened up third-party developer access to its U1 chip via its “NearbyInteraction” framework last year. As a result, Tile in Jan. 2021 announced its plan to launch a new tracker powered by UWB.
More recently, Apple updated its Find My app to include a new tab called “Items” in preparation for the app’s expanded support for AirTags and other third-party accessories, like those from Tile and others. This “Items” tab is enabled in Apple’s latest iOS 14.5 beta release, where the app explains how the Find My app will now be able to help users keep track of their everyday items — including accessories and other items that are compatible with Find My.
However, Tile (and likely others) feel that Apple’s concessions still disadvantage their businesses, because participation in Apple’s Find My program means that a third-party device maker would have to abandon its existing app and instead require its customers to use Apple’s Find My app — effectively turning over its customers and their data to Apple.
It’s worth noting that, upon launch, the app features an icon that shows three items: headphones, a backpack and a suitcase. Not coincidentally, perhaps, Tile’s first integrations were with Bose headphones and with luggage and bag makers Away and Herschel.
Apple has not responded to a request for comment about the new app’s launch.
U.S. consumers spent an average of $138 on iPhone apps last year, an increase of 38% year over year, largely driven by pandemic impacts, according to new data from app store intelligence firm Sensor Tower. Throughout 2020, consumers turned to iPhone apps for work, school, entertainment, shopping and more, driving per-user spending to a new record and the greatest annual growth since 2016, when spending had popped 42% year over year.
Sensor Tower tells TechCrunch it expects the trend of increased consumer spend to continue in 2021, when it projects consumer spend per active iPhone in the U.S. to reach an average of $180. This will again be tied, at least in part, to the lift caused by the pandemic — and, particularly, the lift in pandemic-fueled spending on mobile games.
Image Credits: Sensor Tower
Last year’s increased spending on iPhone apps in the U.S. mirrored global trends, which saw consumers spend a record $111 billion on both iOS and Android apps, per Sensor Tower, and $143 billion, per App Annie, whose analysis had also included some third-party Android app stores in China.
In terms of where U.S. iPhone consumer spending was focused in 2020, the largest category was, of course, gaming.
In the U.S., per-device spending on mobile games grew 43% year over year from $53.80 in 2019 to $76.80 in 2020. That’s more than 20 points higher than the 22% growth seen between 2018 and 2019, when in-game spending grew from $44 to $53.80.
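Sensor Tower’s growth figures can be sanity-checked against the per-device dollar amounts the firm reports. A quick back-of-the-envelope calculation using the numbers above (the rounding to whole percentages is ours):

```python
def yoy_growth(prev: float, curr: float) -> int:
    """Year-over-year growth as a whole-number percentage."""
    return round((curr - prev) / prev * 100)

# U.S. per-device spending on mobile games (Sensor Tower figures)
growth_2020 = yoy_growth(53.80, 76.80)  # 2019 -> 2020
growth_2019 = yoy_growth(44.00, 53.80)  # 2018 -> 2019

print(growth_2020)  # 43, matching the reported 43% growth
print(growth_2019)  # 22, matching the reported 22% growth
```

Both results line up with the reported figures, and the gap between them (21 points) matches the article’s “more than 20 points higher” characterization.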
U.S. users spent the most money on puzzle games, like Candy Crush Saga and Gardenscapes, which may have helped to take people’s minds off the pandemic and its related stresses. That category averaged $15.50 per active iPhone, followed by casino games, which averaged $13.10, driven by physical casino closures. Strategy games also saw a surge in spending in 2020, growing to an average of $12.30 per active iPhone.
Image Credits: Sensor Tower
Another big category for in-app spending was Entertainment. With theaters and concerts shut down, consumers turned to streaming apps in larger numbers. Disney+ launched in late 2019, just months ahead of the pandemic lockdowns, and HBO Max followed in May 2020.
Average per-device spending in this category was second-highest, at $10.20, up 26% from the $8.10 spent in 2019. For comparison, per-device spending had only grown by 1% between 2018 and 2019.
Other categories in the top five by per-device spending included Photo & Video (up 56% to $9.80), Social Networking (up 41% to $7.90) and Lifestyle (up 14% to $6.50).
These increases were tied to apps like TikTok, YouTube, and Twitch. Twitch saw 680% year-over-year revenue growth in 2020 on U.S. iPhones, specifically. TikTok, meanwhile, saw 140% growth. In the Lifestyle category, dating apps were driving growth as consumers looked to connect with others virtually during lockdowns, while bars and clubs were closed.
Overall, what made 2020 unique was not necessarily which apps people were using, but how often they used them and how much they were spending.
App Annie had earlier pointed out that the pandemic accelerated mobile adoption by two to three years’ time. And Sensor Tower today tells us that the industry didn’t see the same sort of “seasonality” around spending in certain types of apps, and particularly games, last year — even though, pre-pandemic, there were typically slower parts of the year for spending. That was not the case in 2020, when any time was a good time to spend on apps.
Apple has updated its native Maps app with more helpful information designed to assist with travel while mitigating the spread of COVID-19. Apple Maps on iPhone, iPad and Mac will now show COVID-19 health measure information for airports when searched via the app, either through a link to the airport’s own COVID-19 advisory page, or directly on the in-app location card itself.
The new information is made available through a partnership with the Airports Council International and provides details on COVID-19 safety guidelines in effect at over 300 airports worldwide. The information provided includes requirements around COVID-19 testing, mask usage, screening procedures and any quarantine measures in effect. The goal is to make traveling easier while the global pandemic continues, as vaccination programs and other counterefforts set the stage for a global travel recovery.
Earlier this month, Apple also added COVID-19 vaccination locations within the U.S. to Apple Maps, which can be found when searching either via text, with Siri, or using the “Find nearby” location-based feature. Last year, the company added testing sites in various locations around the world and added COVID-19 information modules to cards for other types of businesses.
Hello friends, and welcome to Week in Review.
Last week, I talked a bit about NFTs and their impact on artists. If you’re inundated with NFT talk, just take one quick look at this story I wrote this week about the $69 million sale of Beeple’s photo collage. This hype cycle is probably all the result of crypto folks talking each other up and buying each other’s stuff, but that doesn’t mean there won’t be lasting impacts. That said, I would imagine we’re pretty close to the peak of this wave, with a larger one down the road after things cool off a bit. I’ve been wrong before though…
This week, I’m interested in a quick look at what your kids have been talking about all these years. Yes, Roblox.
(Photo by Ian Tuttle/Getty Images for Roblox)
Roblox went public on the New York Stock Exchange this week, scoring a $38 billion market cap after its first couple days of trading.
Investors rallied around the idea that Roblox is one of the most valuable gaming companies in existence. More than Unity, Zynga, Take-Two, even gaming giant Electronic Arts. It’s still got a ways to go to take down Microsoft, Sony or Apple though… The now-public company is so freaking huge because investors believe the company has tapped into something that none of the others have, a true interconnected creative marketplace where gamers can evolve alongside an evolving library of experiences that all share the same DNA (and in-game currency).
The gaming industry has entered a very democratic stride as cross-play tears down some of the walls of gaming’s platform dynamics. Each hardware platform that operates an app store of its own still has the keys to a kingdom, but it’s a shifting world with uncertainty ahead. While massive publishers have tapped cloud gaming as the trend that will string their blockbuster franchises together, they all wish they were in Roblox’s position. The gaming industry has seen plenty of Goliaths in its day, but for every major MMO to strike it rich, it’s still just another winner in a field of disparate hits with no connective tissue.
Roblox is different, and while many of us still have the aged vision of the image above — a bunch of rudimentary Minecraft/Playmobil-looking mini-games — Roblox’s game creation tools are advancing quickly, and developers are building photorealistic games that are wider in ambition and scope than before. As the company levels up the age range it appeals to, both by holding its grasp on aging gamers on its platform and by using souped-up titles to appeal to a new generation, there’s a wholly unique platform opportunity here: the chance to have the longevity of an app store but with the social base layer that today’s cacophony of titles have never shared.
Whether or not Roblox is the “metaverse” that folks in the gaming world have been hyping, it certainly looks more like it than any other modern gaming company does.
SHENYANG, CHINA – MARCH 08: Customers try out iPhone 12 smartphones at an Apple store on March 8, 2021 in Shenyang, Liaoning Province of China. (Photo by VCG/VCG via Getty Images)
Apple releases some important security patches
It was honestly a pretty low-key week of tech news, I’ll admit, but folks in the security world might not totally buy that characterization. This week, Apple released some critical updates for its devices, fixing a Safari vulnerability that could allow attackers to run malicious code on a user’s unpatched devices. Update your stuff, y’all.
TikTok gets proactive on online bullying
New social media platforms have had the benefit of seeing the easy L’s that Facebook teed itself up for. For TikTok, its China connection means that there’s less room for error when it comes to easily avoidable losses. The team announced some new anti-bullying features aimed at cutting down on toxicity in comment feeds.
Dropbox buys DocSend
Cloud storage giants are probably in need of a little reinvention; the enterprise software boom of the pandemic seems to have created mind-blowing amounts of value for every SaaS company except these players. This week, Dropbox made a relatively big bet on document-sharing startup DocSend. It’s seemingly a pretty natural fit, but can Dropbox turn it into a bigger opportunity?
Epic Games buys photogrammetry studio
As graphics cards and consoles have hit new levels of power, games have had to satisfy the desire for more detail and complexity. It takes a wild amount of time to create 3D assets at that level of complexity, so plenty of game developers have leaned on photogrammetry, which turns a series of photos or scans of a real-world object or environment into a 3D model. This week, Epic Games bought one of the better-known software makers in this space, Capturing Reality, with the aim of integrating the tech into future versions of its game engine.
Twitter Spaces launches publicly next month
I’ve spent some more time with Twitter Spaces this week and am growing convinced that it has a substantial chance to kneecap Clubhouse’s growth. Twitter is notoriously slow to roll out products, but it seems they’ve been hitting the gas on Spaces, announcing this week that it will be available widely by next month.
Seth Rogen starts a weed company
There’s a lot of money in startups, there’s really never been a better time to get capital for a project… if you know the right people and have the right kind of expertise. Seth Rogen and weed are a pretty solid mental combo and him starting a weed company shouldn’t be a big shock.
SeongJoon Cho/Bloomberg via Getty Images
Some of my favorite reads from our Extra Crunch subscription service this week:
Coupang follows Roblox to a strong first day of trading
“Another day brings another public debut of a multibillion-dollar company that performed well out of the gate. This time it’s Coupang, whose shares are currently up just over 46% to more than $51 after pricing at $35, $1 above the South Korean e-commerce giant’s IPO price range. Raising one’s range and then pricing above it only to see the public markets take the new equity higher is somewhat par for the course when it comes to the most successful recent debuts, to which we can add Coupang.” More
How nontechnical talent can break into deep tech
“Startup hiring processes can be opaque, and breaking into the deep tech world as a nontechnical person seems daunting. As someone with no initial research background wanting to work in biotech, I felt this challenge personally. In the past year, I landed several opportunities working for and with deep tech companies.“ More
Does your VC have an investment thesis or a hypothesis?
“Venture capitalists love to talk investment theses: on Twitter, Medium, Clubhouse, at conferences. And yet, when you take a closer look, theses are often meaningless and/or misleading…” More