sloaah
About
- Username
- sloaah
- Joined
- Visits
- 28
- Last Active
- Roles
- member
- Points
- 141
- Badges
- 0
- Posts
- 30
Reactions
Canon: No camera can truly capture video for Apple Vision Pro
yeah, I’m missing something here too. The Canon virtual 3D lens they talk about, when coupled with an R5, will do four megapixels for each eye at 30 frames per second. What happens if you show 30 frames per second on an Apple Vision Pro that’s rendering at 60? Seems like it would still just be fine.

I'm a filmmaker and have worked in VR in the past, so I can give some insight.
The reason the resolution is apparently so high is that this is for 180º VR films. The videos occupy half of a sphere (180º). Though the Apple displays are 3.6k horizontally, that's over roughly a 105º FoV; so 3.6k/105*180 ≈ 6.2k resolution per eye.
If you're recording both frames on one sensor – which is how it's done with the Canon Dual Fisheye lens, and which is the easiest way to keep the lenses at an inter-pupillary distance of 60mm (roughly the distance between our eyes) – then you need a resolution of 12.4k (horizontal) x 6.2k (vertical) = 77MP. There is also some resolution loss given that the fisheyes are not projecting onto the full sensor – they project just two circles side by side on a rectangular sensor – so I would imagine 100MP would be roughly right to retain resolution across the scene.
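As a quick sanity check, that arithmetic can be sketched in a few lines (the 3.6k and 105º figures are the rough numbers used above, not exact specs):

```python
# Back-of-the-envelope maths for 180º stereoscopic VR resolution,
# using the rough figures from the comment above.

display_pixels = 3600     # approx. horizontal pixels per eye on the headset
display_fov_deg = 105     # approx. horizontal field of view of the display

# Extrapolate the display's pixel density out to a full 180º hemisphere.
pixels_per_degree = display_pixels / display_fov_deg
per_eye = pixels_per_degree * 180                 # ≈ 6.2k per eye

# Both fisheye circles side by side on one sensor: width is doubled,
# height matches the roughly square per-eye circle.
sensor_w = 2 * per_eye
sensor_h = per_eye
megapixels = sensor_w * sensor_h / 1e6            # ≈ 76 MP before any
                                                  # fisheye-projection loss

print(f"per eye ≈ {per_eye/1000:.1f}k, "
      f"sensor ≈ {sensor_w/1000:.1f}k x {sensor_h/1000:.1f}k, "
      f"≈ {megapixels:.0f} MP")
```

Rounding 6.17k up to 6.2k per eye gives the 12.4k x 6.2k ≈ 77MP figure quoted above; the extra headroom to ~100MP accounts for the fisheye circles not filling the rectangular sensor.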
As to frame rate: cinema is 23.98fps with a 180º shutter, which means the shutter is actually closed half the time and open the other half. That leads to a certain strobing which we subconsciously associate with the world of cinema. Nobody really knows why this is so powerful, but maybe it helps remove us a bit from the scene, so our brains treat it more as something we're observing rather than something we're part of. Tbh I'm not really sure.
But with immersive video, we want to do the opposite. Rather than emphasise detachment, we want to emphasise immersion. And so we want to shoot at a frame-rate which is roughly at the upper end of what the human eye can discern, removing as much strobing as possible. That means roughly 60fps. The fact that there are two frames being shown, one for each eye, doesn't alter this equation. It still needs to be 60fps per eye.
The Canon dual fisheye on an EOS R5C produces two images on an 8k frame. The two images render to a single image which is half of 8k. This suggests two synced 8k cameras could work, and that it doesn’t all have to occur on a single sensor as is suggested in that statement.

That is true, but it is difficult to get the lens spacing to match the 60mm inter-pupillary distance that I mentioned. If you remain constrained to this distance, then a single sensor is the most effective way to achieve it, because you don't have any dead space between the sensors and thus you can maximise sensor size. It also ensures that you don't have any sync drift between the left and right eyes, which can be a tricky problem to solve.
In theory you could presumably also create some sort of periscope system so that the two sensors can be entirely detached; but I imagine this would be very costly.
Looking at the BTS shots of the Apple cameras, they interestingly don't follow this inter-pupillary distance rule. Nor does the iPhone 15 Pro for that matter. The Vision Pro isn't available in my region, so I haven't had a chance to see what these spatial videos look like, but I wonder if there is some interesting computational work happening to correct for this. That sort of computational photography work – which essentially repositions the lenses in software by combining the image data with depth data – is definitely implemented in how the Vision Pro does its video pass-through, where the perspective of the cameras at the front of the headset is projected back to where the user's eyes are.
If there is a computational element going on here, then that's hugely interesting because a) it effectively solves the issue of needing to use one sensor, and b) it opens up intriguing possibilities of allowing a little bit of head movement, with corresponding perspective changes (i.e. true spatial video rather than just 3D video – or what is called 6DoF in the industry).
Geekbench reveals M2 Ultra chip's massive performance leap in 2023 Mac Pro
What's far more likely is that the price/performance ratio of the new Mac Pro is so much more compelling that it will open up the Mac Pro to a much larger user base than when it topped out at $54,000

No, that's not the case. The larger user base is covered by the Mac Studio; the Mac Pro still serves a niche crowd.
It's pretty clear what's happened. Mac Pro users traditionally come from the creative industries, where Apple historically has a strong foothold. A few different categories of use cases benefitted from the Mac Pro, including (but not limited to):
1. 3D/VFX
2. 2D/compositing
3. Video post-production
4. Sound/music professionals
Those in (2)-(4) generally don't need massive compute performance, and instead benefit significantly from Apple silicon's SoC approach. Some in (3) and many in (4) need PCIe expansion slots for hardware input/output via industry-niche ports (e.g. SDI for some video post users like colourists). The Mac Pro is really for them... it doesn't really offer any benefit over the Mac Studio except for these sorts of users.
For those users in (1), and for some in (2) who do need massive compute performance... Apple has simply decided that it's easier to drop those markets, at least for the time being. Besides, as mentioned, Apple lost those markets a while ago, and regaining them takes time. Even if they came out with SoCs whose GPUs were on par with the best Nvidia has to offer, the simple fact that all this software is optimised for CUDA – and either not compatible or still very new on Metal – means they would still struggle to get a foothold.
3D/VFX people have been keeping an eye on Apple since Apple Silicon came about, so it's a bit of a shame that Apple didn't have a standout product suitable for them; if they wanted to regain those users, the launch of a powerful Mac Pro would have been that moment in time. As it is, the M2 Ultra – even if it's powerful – is a far cry from their needs, falling massively short in GPU compute and in available RAM capacity.
Apple Vision Pro Optical Inserts pairing process and other details revealed
isaiahmontoya said:
It will still be used at events regardless. If you have perfect vision or you wear contacts, optical inserts aren’t even necessary. The option to wear optical inserts for people who wear glasses is a luxury we haven’t had at these types of events in the past. Thankfully it sounds like they’re working on adaptive liquid lenses for a future version of the device which adjust to your vision without the need for optical inserts.

It means every experience will need to have dozens of custom lenses, plus devices to check the existing prescription of visitors' glasses. And every headset will need to be rebooted to calibrate to the new lenses. It just won’t happen. No large-scale experience will use it when you have 60-100+ people per hour partaking in the experience. I say this as a VR professional with direct experience of this field.
Up close and hands on with Apple Vision Pro at Apple Park
tmay said:
Hopefully, a third party will be able to provide a hot-swappable, dual-battery configuration so that there is essentially limitless power with multiple batteries.

Apparently the Vision Pro is already hot-swappable. There’s presumably a small battery in the headset itself.
iPhone 15 Ultra: What it may look like, and what to expect in 2023
Look at the Apple Watch. They have rounded bottom edges, the glass is part of the round, and the round is more parabolic than circular.

The problem with this approach is that it exposes the glass directly to drops on the edge. For that reason, I think they will keep the glass flat and just curve the titanium.
Kuo reiterates 120 mm tetraprism camera coming to iPhone 16 Pro
Agreed on all the comments about 120mm not being useful. 70-90mm is the ideal portrait range; even up to 105mm in some situations. 120mm is too tight for portraits but also not useful for wildlife etc.
I actively chose the 15 Pro over the Pro Max to avoid the 120mm. Initially the marketing was persuading me in the other direction, but having used the Pro Max, the 120mm is just an underwhelming camera – too tight and (to my eyes) quite noisy.
Apple Intelligence wasn't trained on stolen YouTube videos
wdowell said:
macca said:
Correction: England doesn’t have a Parliament. It’s the British Parliament.
Apple Vision Pro review: six month stasis
I think the frustrating thing is how hands-off Apple is with development. They're approaching it like the iPhone – build it and the developers will come. But the limited market means it's in danger of being more like the Apple Watch, which continues to have a lackluster App Store. At least with the Watch the apps aren't mission-critical, since there's already a strong use case in the built-in health trackers alone.
The only reason the Quest etc. have the organic market they do is because Meta poured money into games development for it. None of those games are being ported over to the Vision Pro, because 1) the market is too small, 2) some of them are still Meta exclusives (typically for three years), and 3) it requires rethinking input because of the lack of controllers on the Vision Pro.
As a VR studio ourselves, we'd love to create content for the Vision Pro... but when the top apps have around 10K downloads in total, of which maybe a quarter at best are paying... who would build an app for a device that only has 2,500 paying users? You're talking about $10-25k in revenue after Apple's 30% cut. The major XR experiences all run to high six figures or even seven figures in development costs.
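Spelling that estimate out (the download and conversion figures are the rough numbers above; the app price range is a hypothetical assumption):

```python
# Rough revenue estimate for a top Vision Pro app, using the ballpark
# figures from the comment above. The price range is hypothetical.

downloads = 10_000                   # total downloads of a top app
paying_fraction = 0.25               # "maybe a quarter at best are paying"
price_low, price_high = 6.0, 14.0    # assumed one-off app price range, USD
apple_cut = 0.30                     # Apple's standard App Store commission

paying_users = downloads * paying_fraction            # 2,500 paying users
net_low = paying_users * price_low * (1 - apple_cut)
net_high = paying_users * price_high * (1 - apple_cut)

print(f"{paying_users:.0f} paying users -> "
      f"${net_low/1000:.0f}k-${net_high/1000:.0f}k net of Apple's cut")
```

At those assumed prices the net works out to roughly $10k-$25k, which is the gap the comment is pointing at against six- or seven-figure development budgets.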
IMO it's a bit of an own goal from Apple... If they offered even a $50mn fund, you'd have around 50 solid apps for the Vision Pro that push its limits. At the moment there's basically nothing that makes full use of it.
Apple's former hardware chief and current Apple Vision Pro lead is retiring
JamesCude said:
The positive spin at the end of this article is kind of embarrassing. AVP is obviously a huge flop to anyone not immersed in the Kool-Aid. Though of course, Apple will be just fine with its many other successful products.

If anything it’s just a reminder of how crazily accurate Apple is at anticipating consumer response. No other manufacturer can achieve sales within a 10-20% margin of their original estimates for a completely new device. Personally, my own view is that the Vision Pro sadly hasn’t received the developer attention it needs, but I can’t see how anybody would call it a huge flop on the metric that actually matters to Apple (sales).
iPhone 16 Pro rumored to get hugely better ultra-wide sensor & optical zoom