As anticipated, Apple on Tuesday launched its new iPhone 12 range (the entry-level iPhone 12 mini, the iPhone 12, the iPhone 12 Pro, and the iPhone 12 Pro Max) and a $99 HomePod mini.
We know the company's track record, so it's certain its new smartphones will live up to expectations. But (beyond the chance to get a better mobile device on the purchasing budget) what should enterprise professionals think about?
Now is the time to get into 5G
There is no doubt at all that the iPhone 12 enhancement most likely to be of interest to enterprise professionals will be the introduction of (real) 5G.
This introduction means 5G deployments will now accelerate; Verizon pretty much confirmed this with news of its plans. It's a pattern very likely to be repeated globally, though there may still be a little disappointment given that different countries are using slightly different iterations of the standard. Previous network upgrades have rolled out in a similar way. These problems will pass.
For enterprise professionals, 5G will deliver more secure and (perhaps) more reliable connectivity to get work done wherever they happen to be (subject to contractual limitations and bandwidth charges). It could be a cold glass of water in the desert when you don't have good broadband but do have good mobile connectivity.
What's more important is that the move will accelerate innovation in the 5G space. Switched-on enterprises will want to explore how 5G can be exploited alongside Apple's other tools to deliver new experiences and generate new business, while early-adopting companies will now begin to bring the first iterations of what they've been working on to the public.
It's also good to note the use of the Ceramic Shield, which I assume won't attenuate the 5G signal. With any luck, we won't at some point be told, "You're holding it wrong."
Grab the popcorn and watch the ultra-wideband story
Like Apple Watch Series 6 and the iPhone 11, HomePod mini and the entire iPhone 12 family carry Apple's ultra-wideband U1 processor.
The initial uses of this chip seem pretty modest: sharing files, replacing car keys, and proximity sensing. Apple made it relatively clear that security is part of its overall plan for the standard, and it makes sense to think the big idea isn't just around the home, given that both the iPhones and Apple Watch carry this chip. You take those things outside, too….
Given the information drought that currently exists, it’s hard to accurately predict how things might turn out. Apple appears to be pointing us toward thinking of UWB and how it supports smart homes, but as we know, UWB is already used in the manufacturing and warehousing industries. There is surely more potential for enterprise development here.
We know Apple is creating a platform around this chip. At present, it appears to be working on seeding wide-scale deployment, which means developers must explore what they can do with the chip now in order to be ready to leap into business once Apple delivers new APIs. How, for example, will this relate to Apple’s machine imaging tools, particularly LiDAR?
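As a sketch of the kind of exploration developers can already do, Apple's NearbyInteraction framework (introduced with iOS 14) exposes U1-to-U1 ranging between two devices. The `ProximityRanger` class, the `withinRange` helper, and its 1 m threshold below are my own illustrative assumptions, not Apple's API; exchanging discovery tokens over your own networking layer is also assumed.

```swift
import Foundation

// Pure helper: decide whether a ranged distance (metres) counts as "near".
// The 1.0 m default threshold is illustrative only.
func withinRange(_ distance: Float, threshold: Float = 1.0) -> Bool {
    distance < threshold
}

#if canImport(NearbyInteraction)
import NearbyInteraction

// Hypothetical wrapper around an NISession (iOS 14+, U1-equipped devices).
final class ProximityRanger: NSObject, NISessionDelegate {
    private let session = NISession()

    // Peer discovery tokens must be exchanged out of band
    // (e.g. over your own networking layer) before ranging starts.
    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called repeatedly as the two U1 chips range against each other.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let distance = nearbyObjects.first?.distance else { return }
        if withinRange(distance) {
            print("Peer device is within arm's reach")
        }
    }
}
#endif
```

The interesting question for enterprises is what sits behind that distance callback: access control, asset tracking, or hand-off triggers are all plausible once the APIs broaden.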
A MagSafe opportunity? Really?
As I see it, Apple's history shows much success, but other than the Made for iPod program (which begat Made for iPhone), its record in third-party-focused interconnects seems a little weak.
Can MagSafe buck this trend?
In the pro corner sits a chance to create functionally useful accessories designed to take advantage of the Qi standard; in the con corner, there's a reality in which some accessory vendors may feel that a Qi standard without another entity acting as gatekeeper is just as profitable.
We’ll see how that turns out.
I've a feeling there's a little over-reach here, but I do think the accessories will prove popular; that said, my Spidey-sense isn't singing.
Will enterprises exploit the Neural Engine?
All of the new iPhones make use of Apple's A14 Bionic processor. The chip is world-beating, but what may be of more interest to enterprises (and enterprise developers) is the extent to which its multi-core Neural Engine's proven power can be harnessed to deliver on a wider range of machine intelligence needs.
Can this computational imaging intelligence be successfully harnessed to support things like RPA or edge network security monitoring?
Will Apple deliver APIs to enable developers to innovate in such spaces?
Or does its model rely on rolling out such capacity within its Apple MDM?
Meanwhile, of course, we have WidgetKit, the Vision APIs, Natural Language, and ARKit to feed device-based data into machine learning models.
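To make that concrete, the NaturalLanguage framework already runs sentiment scoring entirely on device, with the Neural Engine accelerating inference where available. A minimal sketch; the `sentimentLabel` helper and its cut-off points are my own illustrative assumptions, not part of Apple's API.

```swift
import Foundation

// Pure helper: map an NLTagger sentiment score (-1.0 ... 1.0) to a label.
// The cut-off points are illustrative only.
func sentimentLabel(for score: Double) -> String {
    switch score {
    case ..<(-0.25): return "negative"
    case 0.25...:    return "positive"
    default:         return "neutral"
    }
}

#if canImport(NaturalLanguage)
import NaturalLanguage

// On-device sentiment scoring; no text leaves the device.
func sentiment(of text: String) -> String {
    let tagger = NLTagger(tagSchemes: [.sentimentScore])
    tagger.string = text
    // The sentiment score arrives as the tag's raw string value.
    let (tag, _) = tagger.tag(at: text.startIndex,
                              unit: .paragraph,
                              scheme: .sentimentScore)
    let score = Double(tag?.rawValue ?? "0") ?? 0
    return sentimentLabel(for: score)
}
#endif
```

The same pattern (framework inference feeding plain Swift decision logic) applies to Vision image requests and Core ML models.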
What does LiDAR bring to the party?
Does anyone else feel like LiDAR is a fantastic solution to a problem we haven’t quite identified? I do, to an extent — the tech always made sense to me as part of a collision detection system for autonomous vehicles, but how does it help me with my own life?
One demonstration, showing how the technology can improve Night Mode, was impressive in terms of achieving amazing results by combining advanced image sensors with computational photography, but what else does LiDAR bring to the party (other than use as a tool to enable 3D and interior design solutions)?
At present, the main uses will likely be around the creation of new augmented reality experiences, making use of the sensor's ability to time light as it travels between external objects and the camera itself, and so measure their distance. Even then, this appears to be a story still to be played out.
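The underlying principle is simple time-of-flight: time the light's round trip, multiply by the speed of light, and halve it. For developers, ARKit 4 (iOS 14) surfaces the result as a per-frame depth map on LiDAR-equipped devices. A minimal sketch; the `DepthReader` class and `distance(fromRoundTrip:)` helper are my own illustrative constructs around the real `sceneDepth` API.

```swift
import Foundation

// Pure helper: convert a time-of-flight round trip (seconds) to distance
// (metres). Light travels out and back, hence the halving.
func distance(fromRoundTrip seconds: Double) -> Double {
    let speedOfLight = 299_792_458.0   // metres per second
    return speedOfLight * seconds / 2.0
}

#if canImport(ARKit)
import ARKit

// Sketch of reading the LiDAR depth map via ARKit 4 (iOS 14+).
final class DepthReader: NSObject, ARSessionDelegate {
    func start(_ session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth is only available on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Per-pixel distances (metres) from the camera, as a CVPixelBuffer.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        print("Depth map: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}
#endif
```

That depth buffer is what AR occlusion, room scanning, and measurement apps build on.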
What about the walled garden?
I'm not so certain it is terribly wise to read too much into Apple's quietly spoken news that it intends to permit support for third-party streaming music services (such as Amazon) on HomePod mini. However, given ongoing discussions around Apple's role as gatekeeper of its platforms and services, might this suggest the company is preparing to take down some sections of the wall around its garden?
The best available (Mac?) processor?
Apple claims the A14 Bionic’s processor and graphics processors are significantly (50%) faster than the processors available in the current crop of competing smartphones. Highlights:
- 70% faster ML accelerators.
- 80% faster Neural Engine.
- 4-core GPU.
- 6-core CPU.
- 16-core Neural Engine.
- 8 billion transistors.
- 11 trillion operations per second on the Neural Engine.
The next big question will be the extent to which the Mac version of these chips (assuming such a thing) will accelerate — or otherwise — Mac performance. We’ll find out more on this quite soon.
Please follow me on Twitter, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe.
Copyright © 2020 IDG Communications, Inc.