As expected, Apple has introduced its new iPhone 12 range (the entry-level iPhone 12 mini, the iPhone 12, the iPhone 12 Pro, and the iPhone 12 Pro Max) and the $99 HomePod mini.

We know the company’s track record, so it is near certain its new smartphones will live up to expectations. But beyond the chance of a better mobile device for the purchasing budget, what should enterprise professionals be thinking about?

Now is the time to get into 5G

There is no doubt at all that the iPhone 12 enhancement most likely to be of interest to enterprise professionals will be the introduction of (real) 5G.

We can take it as read that this introduction means 5G deployments will now accelerate; Verizon pretty much confirmed as much with news of its plans. It’s a pattern very likely to be repeated globally, though there may still be a little disappointment given that different nations are using slightly different iterations of the standard. Previous network improvements rolled out in a similar fashion. These things will pass.

For enterprise professionals, 5G will deliver more secure and, perhaps, more reliable connectivity to get work done wherever they happen to be, subject to contractual limitations and bandwidth charges. It could be a cold glass of water in the desert if you don’t have good broadband but do have good mobile connectivity.
What’s more important is that the move will accelerate innovation in the 5G space. Switched-on enterprises will want to explore how 5G can be exploited beside Apple’s other tools to deliver new experiences and generate new business, while early adopting businesses will now begin to bring the first iterations of what they have been working on to the public.

It’s also nice to note the use of Ceramic Shield, which I’d guess won’t attenuate the 5G signal. With any luck, we won’t in future be told, “You’re holding it wrong.”

Grab the popcorn and watch the ultra-wideband story

Like the Apple Watch Series 6 and iPhone 11, the HomePod mini and the entire iPhone 12 family carry Apple’s ultra-wideband U1 chip.

The initial uses of this chip seem pretty modest: sharing files, replacing car keys, and proximity sensing. Apple made it relatively clear that security is part of its overall plan for the standard, and it makes sense to think the big idea isn’t just about the home, given that both the iPhones and the Apple Watch carry this chip. You take those things outside, too…

Given the information drought that currently exists, it’s hard to predict accurately how things might turn out. Apple appears to be pointing us toward thinking of UWB in terms of smart homes, but since UWB is already used in the manufacturing and warehousing industries, there is surely more potential for enterprise development here.

We know Apple is creating a platform around this chip. At present, it appears to be seeding wide-scale deployment, which means developers should explore what they can do with the chip now in order to be ready to leap into business once Apple delivers new APIs. How, for example, will this relate to Apple’s machine-imaging tools, particularly LiDAR?
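Developers can already begin that exploration: iOS 14 ships the NearbyInteraction framework, which exposes U1-based ranging between two devices. A minimal sketch, assuming you exchange discovery tokens over your own transport (such as MultipeerConnectivity), which is omitted here for brevity:

```swift
import NearbyInteraction

// Illustrative helper: ranges against a peer device that also carries a U1 chip.
class ProximityRanger: NSObject, NISessionDelegate {
    let session = NISession()

    override init() {
        super.init()
        session.delegate = self
    }

    // Share session.discoveryToken with the peer out of band, receive theirs,
    // then start the ranging session with it.
    func startRanging(with peerToken: NIDiscoveryToken) {
        let config = NINearbyPeerConfiguration(peerToken: peerToken)
        session.run(config)
    }

    // Called repeatedly as the U1 chip updates distance (and, when available,
    // direction) to the peer.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let distance = object.distance {
                print("Peer is \(distance) metres away")
            }
        }
    }
}
```

Even this small surface suggests enterprise uses: asset tracking, access control, or precise indoor wayfinding, all without Apple having shipped the richer APIs yet.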

A MagSafe opportunity? Really?

As I see it, Apple’s history shows much success, but (other than the Made for iPod program, which begat Made for iPhone) its record with third-party-focused interconnects seems a little weak.

Can MagSafe buck this trend?

In one corner sits the chance to create functionally useful accessories designed to take advantage of the Qi standard; in the other, a reality in which some accessory vendors may feel that a Qi ecosystem without another entity acting as gatekeeper is just as profitable.

We’ll see how that turns out.

I’ve a feeling there’s a little over-reach here, but I do think the accessories will prove popular, even if my Spidey-sense isn’t singing.

Will enterprises exploit the Neural Engine?

Every model in the iPhone 12 range uses Apple’s new A14 Bionic processor. The chip is world-beating, but what may be of more interest to enterprises (and enterprise developers) is the extent to which its multi-core Neural Engine’s proven power can be harnessed to meet a wider range of machine-intelligence needs.

Can this computational imaging intelligence be successfully harnessed to support things like RPA or edge network security monitoring?

Will Apple deliver APIs to enable developers to innovate in such spaces?

Or does its model rely on rolling out such capacity within its Apple MDM?

Meanwhile, of course, we have WidgetKit, the Vision APIs, Natural Language, and ARKit to gather device-based data that can inform machine-learning models.
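For a sense of what harnessing the Neural Engine looks like today, here is a minimal sketch of on-device image classification via Vision and Core ML, which route inference to the Neural Engine where available. Note that `EnterpriseClassifier` is a placeholder name for any Core ML model compiled into the app, not a real Apple API:

```swift
import Vision
import CoreML

// Illustrative sketch: classify a CGImage on-device with a bundled Core ML
// model. "EnterpriseClassifier" stands in for your own model's generated class.
func classify(_ image: CGImage, completion: @escaping (String?) -> Void) {
    // Wrap the Core ML model for use with the Vision framework.
    guard let coreMLModel = try? EnterpriseClassifier(configuration: MLModelConfiguration()),
          let visionModel = try? VNCoreMLModel(for: coreMLModel.model) else {
        completion(nil)
        return
    }

    // The completion handler receives ranked classification observations.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }

    // Run the request against the supplied image.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

The same pattern (swap the model, keep the pipeline) covers document classification, defect detection, or the kind of visual checks an RPA workflow might need at the edge.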

What does LiDAR bring to the party?

Does anyone else feel like LiDAR is a fantastic solution to a problem we haven’t quite identified? I do, to an extent – the tech always made sense to me as part of a collision detection system for autonomous vehicles, but how does it help me with my own life?

One demonstration, of how the technology can improve Night Mode, was deeply impressive: amazing results achieved by combining advanced camera sensors with computational photography. But what else does LiDAR bring to the party (other than as a tool to enable 3D scanning and interior-design solutions)?

At present, the main uses will likely be around the creation of new augmented-reality experiences, making use of the sensor’s ability to measure how long light takes to travel between the camera and external objects, and thus how far away they are. But even then, this appears to be a story still to be played out.
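Some of that story is already accessible to developers: on LiDAR-equipped devices, ARKit can supply a per-pixel depth map of the scene. A minimal sketch of requesting it:

```swift
import ARKit

// Illustrative sketch: start an ARKit session that captures LiDAR scene depth
// on devices that support it (e.g. iPhone 12 Pro, iPad Pro with LiDAR).
func makeDepthSession() -> ARSession? {
    // Scene depth is only available when the hardware has a LiDAR scanner.
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
        return nil
    }

    let config = ARWorldTrackingConfiguration()
    config.frameSemantics = .sceneDepth

    let session = ARSession()
    session.run(config)
    return session
}

// Each ARFrame the session produces then exposes frame.sceneDepth?.depthMap,
// a CVPixelBuffer of distances in metres, usable for occlusion, measurement,
// or room scanning.
```

That depth buffer is the raw material for the enterprise scenarios mentioned above: 3D capture, interior design, and spatial measurement tools.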

What about the walled garden?

I’m not certain it’s wise to read too much into Apple’s quietly spoken news that it intends to permit third-party streaming music services (such as Amazon Music) on HomePod mini. However, given ongoing discussions about Apple’s role as gatekeeper of its platforms and services, might this suggest the company is preparing to take down some sections of the wall around its garden?

The best available (Mac?) processor?

Apple claims the A14 Bionic’s CPU and GPU are significantly (up to 50%) faster than the processors available in the current crop of competing smartphones. Highlights:

  • 70% faster ML accelerators.
  • 80% faster Neural Engine.
  • 4-core GPU.
  • 6-core CPU.
  • 16-core Neural Engine.
  • 11.8 billion transistors.
  • 11 trillion operations per second on the Neural Engine.

The next big question will be the extent to which the Mac version of these chips (assuming such a thing exists) will accelerate, or otherwise, Mac performance. We’ll find out more on this quite soon.

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Copyright © 2020 IDG Communications, Inc.


