Rob Layton is a Senior Teaching Fellow in Mobile Journalism at Bond University.
Cinematographers, generally, are a fickle lot when it comes to technology.
They work to exacting detail with complicated cameras and even more complex lighting scenarios, the way most of us use an office keyboard and mouse. They’re not easily impressed.
So, in a virtual room of Aussie cinematographers, each holding an iPhone 13 and told this is “Hollywood in your pocket”, you might reasonably expect a reaction akin to a scene from Mad Max: Fury Road.
However, responses around the room last week turned out to be even more startling than Charlize Theron’s prosthetic arm in George Miller’s post-apocalyptic spectacular.
They liked it.
Each September, Apple introduces its latest suite of iPhones with at least one big-ticket camera feature it hopes will sell the range.
This year it’s Cinematic Mode, which gives users the ability to focus on certain people or objects in a video while blurring everything else.
It’s a hallmark of cinematic storytelling: the ability to turn the audience’s attention from one person or thing to another purely through focus.
It’s what sets good cinema storytelling apart from day-to-day video soap operas and reality TV shows, and it has, until now, been out of reach (or at least awkwardly executed) on smartphones.
Swapping emphasis from one person or object to another through selective focus is known in the industry as rack focus. It’s a specialist job and the cinematographer and director rely on the focus puller to get it right first time, every time, or the shot is ruined.
Longer focal length:
The 13 Pros feature a 3x telephoto lens, the equivalent of 77mm in old 35mm camera parlance. That brings distant objects even closer in the image, but the downside is a smaller aperture (f2.8) compared with the 12 Pro’s 65mm lens at f2.2, so it won’t be as good in low light.
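As a quick sanity check on that 77mm figure, the 35mm-equivalent focal length is simply the base lens multiplied by the zoom factor. A minimal sketch, assuming the 26mm equivalent commonly quoted for the iPhone’s main wide camera (the helper name is my own):

```python
# Hypothetical helper: 35mm-equivalent focal length from an optical zoom factor.
# The 26mm base is the commonly quoted equivalent for the iPhone's main wide
# camera; Apple markets the resulting telephoto as 77mm.

def equivalent_focal_length(base_mm: int, zoom_factor: int) -> int:
    """Multiply the base (wide) equivalent focal length by the zoom factor."""
    return base_mm * zoom_factor

print(equivalent_focal_length(26, 3))  # 78, which Apple rounds to the marketed 77mm
```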
New photography features:
You’re probably used to applying filters that change the look of your photos in Instagram, third-party apps or your phone’s native editor. Those filters are applied after the image is taken. Photographic Styles is similar, in that you can dial in more warmth or change the tone, but the difference is that you make the changes before you take the picture. This way the changes happen at the sensor level and become an organic part of the image-making process. Imagine painting a wall, then painting a new colour over the top: that’s a filter. This process is essentially mixing your paints before you dip the brush. If you have a favourite look, you can save it as a preset. Be aware, though, that the look is burned into the image and may be difficult to undo in editing.
Macro photography:
This feature is truly spectacular: now you can get up close with your favourite bugs and small things. Previously, macro photography on a smartphone was only possible with a special clip-on macro lens, but iPhone achieves it by adding auto-focus to the ultra-wide camera (earlier models had fixed focus on the ultra-wide). The 13 Pros let you focus on objects as close as two centimetres, and this works across photography, video, slo-mo and time-lapse. Some users have already complained about the automatic camera switching (if you’re on the wide camera and move in close, the phone automatically switches to the ultra-wide), so a software update is coming that will let users turn this off.
Smart HDR 4:
HDR (high dynamic range) extends how far the camera can capture highlights and shadows in a single scene. In the early days of smartphone photography this looked terribly artificial, but the results are now pleasingly natural, and HDR is a standard feature of every good smartphone. The addition this year is that Smart HDR 4 can identify up to four people in a frame and segment them, giving each individual a personal make-over depending on the rest of the scene. If you are shooting in areas of bright highlights and deep shadows, such as outdoors on a typical summer day, this feature makes the scene look closer to how our eyes perceive it.
ProRes:
The simple answer: if you don’t know what this is, you don’t need it. ProRes is a high-quality video compression format widely used as a final delivery format for high-end work, such as broadcast files and Blu-ray. It’s part of Apple’s push deeper into professional iPhone video production. Smartphones are increasingly popular among enthusiast and professional filmmakers and journalists as viable alternatives to bigger, more expensive cameras, and Apple has recognised that for some time. We are likely to see many more of these pro tools on smartphones. But be warned: a minute of ProRes footage yields roughly a 6GB file, so this may not be the best choice for filming your kid’s next birthday party.
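The storage warning is simple arithmetic: file size is bitrate multiplied by duration. A rough sketch, assuming Apple’s published target rate of about 707 Mbit/s for ProRes 422 HQ at 4K/30 (treat the figure as approximate; actual rates vary with content and codec variant):

```python
# Back-of-envelope ProRes storage: file size = bitrate x duration.
# 707 Mbit/s is approximately Apple's published target for ProRes 422 HQ
# at 3840x2160, 30fps; real-world rates vary with the footage.

def file_size_gb(bitrate_mbps: float, seconds: float) -> float:
    """Approximate file size in gigabytes for a constant-bitrate stream."""
    return bitrate_mbps * seconds / 8 / 1000  # Mbit -> megabytes -> gigabytes

print(round(file_size_gb(707, 60), 1))  # ~5.3 GB for one minute
```

That lands in the same multi-gigabyte-per-minute ballpark as the 6GB figure quoted above.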
Pro features across the range:
Usually the best features are reserved for the Pro phones, but this year a number of them appear across the entire 13 range, including Cinematic Mode, Photographic Styles and sensor-shift stabilisation. The latter was one of my favourite features of last year: it was available only on the 12 Pro Max, but this year it’s on all 13 models. It smooths out video to look as though you are using a gimbal, shifting the sensor up to 5,000 times per second to compensate for shaky movement. It is well illustrated in this video, which I shot during a family holiday to Fraser Island last month. The phone was fitted rigidly to the windscreen of my vehicle, so you’d expect it to bounce around wildly as we negotiate the rough track, but you can see it’s the bonnet of the vehicle that moves, not the surrounding landscape. Note that sensor-shift works only on the wide camera.
ProMotion display:
The Pro display is now brighter and has a faster refresh rate (up to 120Hz). That’s good for gaming, but it’s also a bonus for filmmakers and film-lovers alike. The refresh rate is variable, so it adjusts to match what you are watching. Say you’re settling in to enjoy Tom Cruise’s latest blockbuster and it was shot at 24fps: the screen will match that frame rate, so you’ll see it as the filmmaker intended. You won’t have to worry about Tom coming to your house and berating you for not changing your screen settings to his liking (watch his YouTube appeal made while shooting Top Gun: Maverick). Be aware, though, that streaming apps such as Netflix may need an update before they support ProMotion.
Our cinematographers raised eyebrows of approval at how iPhone 13 tracks subjects and predicts where to focus next using complex artificial intelligence, based, incredibly, on which direction the subject of the scene happens to be looking.
They liked that the dancing yellow reticles locked onto subjects, and that the focus fall-off looked natural and free of the artefacts that plagued earlier attempts at artificial bokeh (blur) in smartphone photography.
But it was the next feature of Cinematic Mode that made their jaws drop.
“You mean I can change focus after I shoot, in the editing?” asked one. “I can change my mind in post-production?”
Such is the power of the iPhone 13’s A15 processor that the neural engine builds a three-dimensional render of a scene that enables users to choose new focal points after they’ve shot their video.
This is due to what’s known as computational photography. The concept has been around for a few years, and at Apple since the iPhone 7 Plus, the first iPhone with dual cameras.
The system works via a method called parallax, in which two cameras – one wide, the other tele (or perhaps the ultra-wide) – view a scene together. The neural engine then uses complex algorithms to analyse the differences between the two viewpoints.
It works in much the same way that human eyes perceive depth: the disparity between the two images enables a depth map to be created, and once that depth map is established, any point on it can be selected as the spot to be in focus after the recording.
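As a toy illustration of the idea – not Apple’s actual pipeline – refocusing from a depth map can be sketched like this: pick a focal depth, then blur each pixel in proportion to how far its depth sits from that plane. Everything below (the function, the tiny 3x3 scene, the depth values) is invented for illustration.

```python
# Toy sketch of depth-map refocusing (hypothetical, not Apple's algorithm):
# blur strength per pixel grows with distance from the chosen focal plane,
# which is why the focal point can be changed after capture.

def blur_radius(depth_map, focal_depth, max_radius=8.0):
    """Per-pixel blur radius: 0 at the focal plane, larger further from it."""
    return [[min(max_radius, abs(d - focal_depth)) for d in row]
            for row in depth_map]

# Depth in metres for a tiny 3x3 scene: a subject at 2 m, background at 10 m.
depth = [[10.0, 10.0, 10.0],
         [10.0,  2.0, 10.0],
         [10.0,  2.0, 10.0]]

# "Tap" the subject (2 m): subject pixels get radius 0 (sharp),
# background pixels hit the maximum blur.
radii = blur_radius(depth, focal_depth=2.0)
print(radii[1][1])  # 0.0 (in focus)
print(radii[0][0])  # 8.0 (fully blurred)
```

Picking a different focal depth after the fact – say 10.0 for the background – simply inverts which pixels stay sharp, which is exactly the post-shoot rack focus the cinematographers were marvelling at.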
This requires some serious computing power, which is why Apple devised the A15, a chip that’s up to 50% faster than its predecessor. At 15.8 trillion operations per second, Apple claims, it is the fastest smartphone chip on the planet.
The camera improvements don’t stop there, although the rest of them may not be quite as Ben-Hur-esque as Cinematic Mode.
Whether they are show-stopping enough to warrant upgrading your phone will depend entirely on how you want to use the camera system (that said, reports show pre-orders for iPhone 13 are already stronger than its predecessor’s over the same period).
Other notable camera features this year include:
- Improved hardware
- New photography features
- Macro photography
- Smart HDR 4
- Some of last year’s Pro features now standard across the 13 range
- ProMotion display
What made the 12 Pro Max such a powerful camera (apart from the ultra-fast A14 Bionic processor) was its larger sensor. This is the part of the camera that captures light. Imagine buckets catching water in the rain: the bigger the bucket, the more rain it captures. That matters most when there’s not much rain – or, in the case of cameras, low light – and you want to catch as much as possible. The iPhone 13 Pro Max features a larger sensor and larger photosites (the tiny individual buckets that make up a sensor) than the iPhone 12 Pro Max, whose sensor was in turn 47% larger than the iPhone 11 Pro’s.
The aperture of any camera is how wide the lens opening is, which dictates how much light passes through onto the sensor. It’s like filling that bucket with a hose: the bigger the hose, the brighter and sharper your images will be. Aperture is measured in f-stops, and the lower the number, the wider the opening. The main wide camera on the 13 Pro Max is only marginally wider than the 12 series’ (f1.5 compared with f1.6), but it’s the ultra-wide camera where the biggest improvement lies: f1.8 on the 13 Pros compared with f2.4 on the 12 series. What those numbers mean in consumer terms, according to Apple, is that the camera now captures 92% more light. Note: the standard 13’s ultra-wide is still f2.4.
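The f-stop maths behind that claim is worth a quick sketch: the light admitted scales with the square of the aperture diameter, i.e. with (1/f-number)². On aperture alone the jump from f2.4 to f1.8 works out at about 78% more light; the 92% Apple quotes presumably folds in sensor improvements as well. The helper below is mine, not Apple’s:

```python
# Relative light gathered scales as (1 / f-number)^2, so the gain when moving
# from one aperture to another is (f_old / f_new)^2.

def light_gain(f_old: float, f_new: float) -> float:
    """Percentage more light admitted when moving from f_old to f_new."""
    return ((f_old / f_new) ** 2 - 1) * 100

# Ultra-wide camera: f2.4 (12 series) -> f1.8 (13 Pros)
print(round(light_gain(2.4, 1.8)))  # 78 -> ~78% more light from aperture alone
```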