Introducing The List

With the increasing popularity of UHD TVs in the market and the Xbox One X’s awesome backwards compatibility support, I’ve been working on a small side project over the past few months to make sense of the increasing complexity of console and handheld gaming. Today I’m launching a public spreadsheet which I’m calling The List. Consider it a console gaming “optimization” guide.

The List is linked under a new Reference section of the blog, which I will be expanding over time to include various tech-related reference information. The goal is to list the best version and means of playing almost every (good) console or handheld game ever, within reason. For example, I’m guessing not many people would realize that the best way of playing some original Xbox games is through an Xbox One X connected to a 1440p FreeSync monitor.

I’m sure this will disappoint some people, but I am not including PC games, for a couple of reasons. In general, 95+% of multi-platform games will of course run best on the PC (on Windows, specifically), so there isn’t much value to add in that regard. Also, optimizing software settings for PC games across all their potential hardware configurations can be maddeningly complicated if you truly want to play without compromises.

Console gaming is just complicated enough that I think a simple guide has merit, especially with the advent of the PlayStation 4 Pro, Xbox One X, and even the Nintendo Switch. This guide was initially inspired by the confusion created by the various display output modes some games offer as part of their PlayStation 4 Pro support.

Unsurprisingly, I am heavily leaning on the excellent analysis of the folks at Digital Foundry, who are the best at head-to-head comparisons of games. While graphical and audio comparisons are fairly straightforward, comparing ports on features and overall quality is much trickier and more subjective. I will try my best to be thorough and fair.

I’m launching The List with 100 games. I’m not sure if more than a couple of people will find this project of value, but it’s worth a shot. It would be awesome if anyone would like to contribute suggestions or help out, but I am by no means expecting anything. And if anyone has any specific games they would like to learn more about, please do message me on Twitter. Thanks.

Guesting on the Android Central Podcast

Thanks to the fine folks at Android Central for hosting me on last week's episode of their podcast. We talked about the color accuracy of both Pixel 2 displays, factory panel calibration, software mitigations for burn-in, color management, Google's statement on the Pixel 2 XL, and more. Check it out if you're interested in a more in-depth discussion on the topic.

Assessing the Pixel 2 displays

There has been an absurd amount of misinformation circulating about the displays of Google's new Pixel phones. I’ve written hurriedly elsewhere about this, but one point I want to stress is that sRGB has little to do with anything. The devices run Oreo and are thus color managed.

To be completely clear, the 2 XL's panel is really bad, but we've also effectively known this was going to be the case for months, based on the LG V30. After discussing it with some smart folks, I’ll go ahead and speculate as to what’s going on. These are just some guesses, and none of this is confirmed.

You should never trust your eyes, and this needs to be properly tested and measured, but it does look like the display of the 2 XL undershoots the red and blue sRGB primaries and overshoots green. The panel is also clearly way too cold, which makes the off-axis color balance look really bad at large angles. There also appears to be a green push similar to the one that could affect the Samsung Galaxy S4. Green shifting is about the worst result you can have for the color rendering of a display, because people associate it with nausea.
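For the curious, here's a rough sketch (in Python) of what that kind of measurement check looks like: convert colorimeter XYZ readings for full-screen primaries into CIE xy chromaticity and compare them against the sRGB targets. The readings below are made-up, illustrative numbers, not measurements of any actual Pixel 2 XL panel.

```python
# sRGB primary and white point targets in CIE 1931 xy chromaticity.
SRGB_TARGETS = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),  # D65
}

def xyz_to_xy(X, Y, Z):
    """Convert CIE XYZ tristimulus values to xy chromaticity coordinates."""
    total = X + Y + Z
    return X / total, Y / total

# Hypothetical colorimeter readings (X, Y, Z) for full-field test patterns.
# These are illustrative placeholders, not real panel measurements.
measured = {
    "red":   (41.0, 21.0, 1.8),
    "green": (36.5, 72.0, 11.0),
    "blue":  (18.5, 7.5, 96.0),
    "white": (95.0, 100.0, 115.0),  # bluer than D65, i.e. "too cold"
}

for name, xyz in measured.items():
    x, y = xyz_to_xy(*xyz)
    tx, ty = SRGB_TARGETS[name]
    print(f"{name:5s} measured ({x:.4f}, {y:.4f}) "
          f"target ({tx:.4f}, {ty:.4f}) "
          f"dx={x - tx:+.4f} dy={y - ty:+.4f}")
```

A positive or negative offset from each target shows whether a primary overshoots or undershoots sRGB, and a white point that drifts toward blue corresponds to the "too cold" rendering described above.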

In addition to all of that, there are clearly various other traditional OLED issues and defects affecting the panel. Samsung Display OLED used to suffer from many of these problems, but the company managed to solve almost all of them over the past few years.

None of this can be fixed in software. It’s just a bad panel.

The calibration itself is software. The Pixel 2 XL’s panel is definitely not individually calibrated, and probably not even batch calibrated. LG Display appears not to have the appropriate equipment and workflow for adequate calibration. Google probably could have paid sufficient money to buy this for the production line and made sure it got done, but it’s also possible the fabs are too immature and this was somehow not feasible.

LG Display was not originally going to ship OLED smartphone displays this year, but seemingly rushed to re-enter the market due to high demand from vendors wanting to better position against the upcoming iPhone X, for all the wrong reasons. (I’m hearing that some vendors now want displays with notches, which is spectacularly depressing.)

The Samsung Display OLED on the 5.0” Pixel 2 meanwhile appears to be pretty reasonable at a distance. These displays are possibly individually calibrated. I strongly suspect it’s actually the same panel that the first Pixel phone used, except now calibrated to the Display P3 color space as well as could be managed. Google added some software to Android Oreo to allow for this, which is the same sort of thing vendors like Samsung and Qualcomm have been doing for many years for tons of devices.

On a final note, one thing Google has always done wrong is include optional or even default color profiles that are deliberately inaccurate. The "Vivid" setting should not exist if the product managers truly care about accuracy, and I hope Google doesn't add any more in response to all this overblown controversy. Perception of "dull colors" is not the problem to solve. No consumers complain about the accurate color rendering of an iPhone.

Resonant Qi and the iPhones

I previously did a really poor job of explaining the new iPhones’ wireless charging support, so I would like to rectify that. The new iPhones are Qi 1.2.3 devices, but the iPhone 8 currently only supports Qi 1.1.X. This means inductive-only charging, and no fast charging.

The resonant extension of the Qi standard was introduced with 1.2 in 2014, and compatible chargers have been available for years. Qi 1.2 also supports simultaneously charging multiple devices with optional WP-ID unique identifiers for power receivers.

The resonant specification allows for charging at greater distances (implementation-dependent) and does not require precise device-charger alignment. The medium power extension for 1.2 allows for charging above 5W using Extended Power Profile chargers, and the first devices came to market in 2016.

Apple is using Broadcom’s 2014 BCM59350, which only supports charging at up to 7.5W, so the new iPhones can’t take full advantage of the many resonant Qi chargers on the market that support up to 15W of power delivery. An upcoming iOS update will add support for Qi 1.2.3 and 7.5W (5V/1.5A) charging on an unknown date.
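For reference, the 7.5W figure is simply the stated voltage multiplied by the stated current, well short of the 15W ceiling mentioned above:

```latex
P = V \times I = 5\,\mathrm{V} \times 1.5\,\mathrm{A} = 7.5\,\mathrm{W}
```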

What I should have originally stated in my iPhone X article is that inductive charging is mostly useless, and resonant charging is only a little better at the moment. Far-field resonant charging will eventually provide a reasonable experience in the future. Of all the available options, Apple has chosen to bet on the future of the Qi specification.


Update

Here's a good explanation of resonant vs. inductive Qi.

"How Google Built the Pixel 2 Camera"

This is an excellent visual overview of smartphone camera imaging. It even covers demosaicing and how vendors perform objective robotic testing.

“There’s a saying in engineering: if you haven’t really tested, it’s broken.”

Uh huh.

"New Theory Cracks Open the Black Box of Deep Learning"

Here is the video of the talk, and here is the associated paper on arXiv.

Geoffrey Hinton, a pioneer of deep learning who works at Google and the University of Toronto, emailed Tishby after watching his Berlin talk. “It’s extremely interesting,” Hinton wrote. “I have to listen to it another 10,000 times to really understand it, but it’s very rare nowadays to hear a talk with a really original idea in it that may be the answer to a really major puzzle.”

Also worth noting: Tishby’s work wasn’t accepted for NIPS 2017.

I have no clue about the likelihood of proposed explanatory theories, but deep learning is one area of computation that I am optimistic about. It is very easy to improve the efficiency of deep learning workloads right now, and it should be straightforward to gradually improve precision in the future.

Apple discusses its 2017 silicon

There are several errors in this article, one of which is a bad one — the CPU is not 70% more efficient.

Nonetheless, I appreciate when Apple's silicon team provides some information. I think the only technical disclosure in this piece is the redesigned secure element (the Secure Enclave, or SEP). I believe we already knew everything else.

A three-year design lead time for the Neural Engine is to be expected. Apple's "fully custom" GPU is still using tile-based deferred rendering (TBDR), which is Imagination Technologies' IP.

Apple’s silicon team is, for example, obsessed with energy efficiency, but never at the expense of responsiveness.

Meh. I appreciate that pushing single-threaded performance is insanely hard, but I've never been sold on this philosophy. You know what would be a lot more efficient? Getting the software teams to truly care about locked-60fps performance again, as they did until iOS 7. Every Apple device still drops tons of frames, no matter how fast the silicon, and ProMotion isn't a magic bullet solution. That's not the silicon teams' fault.

“We’re thinking ahead, I’ll tell you that, and I don’t think we’ll be limited,” and then he added, almost as a post-script, “It’s getting harder.”

That it is indeed.

Thoughts on the iPhone X

Yesterday Apple revealed the iPhone X, which had been extremely hyped for years. The highlight feature of the phone in my opinion is clearly its HDR display. Apple is the first mobile vendor to ship end-to-end support for HDR, while Samsung’s Galaxy S8 was the first device to at least feature full hardware support.

I wrote about this at great length over the past couple of months, so for all the details check out these three articles. The first article was mostly wrong about the UI elements, but overall the articles were hopefully pretty accurate.

Much to my surprise, though, the X’s display really appears to use a diamond PenTile subpixel layout. Apple’s own marketing states that it “uses subpixel anti-aliasing to tune individual pixels for smooth, distortion-free edges.” That’s effectively confirmation, and exactly the same as what Samsung Electronics has always done with OLED. This means there’s definitively no near-term hope that S-Stripe can be scaled up economically to large phone panel sizes at high pixel densities. Samsung hasn’t been holding back.

One thing I will add is that while I appreciate Apple’s intention in advertising a contrast ratio of 1,000,000:1 as opposed to an infinite one, it’s also fine to say it’s infinite for OLED because of the near-perfect blacks. Not shipping ProMotion is probably due to power budgeting, though it’s possibly because of performance constraints.
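To spell out the arithmetic: contrast ratio is just peak white luminance over black luminance, so quoting any finite number implies a non-zero black level. Assuming an illustrative peak of around 600 nits (my number, not Apple's spec), 1,000,000:1 corresponds to a black level of roughly 0.0006 nits, and as the black level approaches the OLED ideal of zero the ratio diverges to infinity:

```latex
\mathrm{CR} = \frac{L_{\mathrm{white}}}{L_{\mathrm{black}}},
\qquad
L_{\mathrm{black}} \approx \frac{600\ \mathrm{nits}}{1{,}000{,}000} = 0.0006\ \mathrm{nits},
\qquad
\lim_{L_{\mathrm{black}} \to 0} \mathrm{CR} = \infty
```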

There are pretty much zero surprises on the silicon side overall. I might write more about the A11 another time, but the IPC improvements are relatively modest. I know many people will raise an eyebrow at this, but I would encourage you to completely ignore Geekbench. And I can’t emphasize enough how almost everything written online about Apple’s CPUs is wrong.

If you subtract out the efficiency gains from removing 32-bit support, you’re left with maybe very roughly a 15% improvement in CPU IPC for the big cores, assuming equivalent clocks to the A10. Apple could have pushed performance and efficiency further, if not for 10FF being really bad. The era of the hyper Moore’s Law curve in mobile is officially over, in my opinion, though maybe the A10 already signaled that. It’s all rough sledding from here on out, based on the state of foundry challenges.
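That estimate is just the standard decomposition of single-threaded performance into IPC and clock frequency: with clocks held roughly equal, whatever per-core speedup remains after discounting the 64-bit-only savings maps directly onto IPC. A sketch of the identity (not Apple's numbers):

```latex
\mathrm{perf} \propto \mathrm{IPC} \times f
\quad\Longrightarrow\quad
\frac{\mathrm{perf}_{\mathrm{A11}}}{\mathrm{perf}_{\mathrm{A10}}}
= \frac{\mathrm{IPC}_{\mathrm{A11}}}{\mathrm{IPC}_{\mathrm{A10}}}
  \cdot \frac{f_{\mathrm{A11}}}{f_{\mathrm{A10}}}
\;\approx\; \frac{\mathrm{IPC}_{\mathrm{A11}}}{\mathrm{IPC}_{\mathrm{A10}}}
\quad \text{when } f_{\mathrm{A11}} \approx f_{\mathrm{A10}}
```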

The design of the CPU itself is completely unsurprising. Once it was leaked that it was hexacore, it was obvious that the smaller cores would have been designed for a higher performance target than last year’s Zephyr cores. The design is fully cache coherent but also clearly not ARM SMP.

Regarding “Apple’s first custom GPU,” well… I’ll put it this way: there are probably even people at Apple who don’t really consider it to be fully custom. Personally, I don’t consider it to be custom based on what I know, but I can’t say more. Sorry to be vague, but it’s complicated.

In terms of performance, Apple only claimed a 30% improvement, which is not massive. 50% power at iso-performance on 10nm is also not necessarily impressive if you think the GPU in the A10 burned too much power in the first place. The reality, again, is that TSMC’s 10FF is really bad, so Apple probably couldn’t achieve more. That Apple went with a tri-core design is interesting. New architectural features are the biggest thing Apple is touting, but I know nothing about graphics and can’t comment on those.

Apple didn’t say anything else about other interesting things they did in terms of silicon, so I will also not talk about them.

Setting aside the (no doubt really expensive) IR system, the single most impressive advancement might be the new camera color filter. This is a really big deal, because it’s insanely hard to improve on the Bayer RGBG color filter array. We haven’t seen any vendors attempt to do this in mobile in years, but it was inevitable that some vendor would try again to ship an alternative. Whether Apple’s implementation is RGBW à la Aptina or another subpixel arrangement, I have no idea. I know essentially nothing about image filtering and demosaicing, so there’s nothing more I can say. For the cameras themselves, Apple has finally shipped larger image sensors, though we don’t know the pixel size yet.
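For context on why the color filter array is such a big deal: each photosite captures only one of the three color channels, and the other two have to be interpolated (demosaiced) from neighboring photosites. Below is a minimal sketch of naive bilinear demosaicing over an RGGB Bayer mosaic; the pattern phase, the use of NumPy/SciPy, and the tiny synthetic frame are all my own illustrative assumptions, and real camera pipelines use far more sophisticated, edge-aware algorithms.

```python
import numpy as np
from scipy.ndimage import convolve

def bayer_masks(h, w):
    """Boolean sample masks for an RGGB Bayer pattern (phase assumed)."""
    r = np.zeros((h, w), bool); r[0::2, 0::2] = True
    g = np.zeros((h, w), bool); g[0::2, 1::2] = True; g[1::2, 0::2] = True
    b = np.zeros((h, w), bool); b[1::2, 1::2] = True
    return r, g, b

def bilinear_demosaic(raw):
    """Fill the two missing channels at each photosite by averaging the
    nearest available samples of that channel (naive bilinear interpolation)."""
    h, w = raw.shape
    kernel = np.array([[1.0, 2.0, 1.0],
                       [2.0, 4.0, 2.0],
                       [1.0, 2.0, 1.0]])
    rgb = np.empty((h, w, 3))
    for c, mask in enumerate(bayer_masks(h, w)):
        weighted = convolve(np.where(mask, raw, 0.0), kernel, mode="mirror")
        weights = convolve(mask.astype(float), kernel, mode="mirror")
        # Keep the measured sample where the channel exists; interpolate elsewhere.
        rgb[..., c] = np.where(mask, raw, weighted / weights)
    return rgb

# Example: demosaic a tiny synthetic 4x4 raw frame into a 4x4x3 RGB image.
raw = np.arange(16, dtype=float).reshape(4, 4)
print(bilinear_demosaic(raw).shape)  # (4, 4, 3)
```

Any alternative filter array has to beat decades of tuning around exactly this interpolation problem, which is why shipping something other than Bayer RGBG is so rare.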

The A11’s video encoding performance is also extremely impressive, and it’s fair to say Apple is way ahead of the competition. 4K60 encode simply requires a massive amount of data bandwidth. I’m not sure at this point if deep learning is being applied in this area or not, but Apple is definitely employing its own special techniques to accomplish this.
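A quick back-of-the-envelope calculation (assuming 8-bit 4:2:0 subsampling, which is my assumption rather than a disclosed detail) shows why: the encoder has to ingest roughly three quarters of a gigabyte of raw pixel data every second before any compression even starts.

```latex
3840 \times 2160 \times 60\ \mathrm{fps} \approx 4.98 \times 10^{8}\ \mathrm{pixels/s},
\qquad
4.98 \times 10^{8} \times 1.5\ \mathrm{bytes/pixel} \approx 746\ \mathrm{MB/s}
```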

Face ID seems to be significantly slower than Touch ID but more secure. I was skeptical about the latter claim until Apple explained the sensors and the dot projector. That seems like more than enough data, but keep in mind there will be corner-case problems, bugs, and oh right, sometimes the deep learning classifier will just miss. Reliability and robustness also matter, in other words. And as Apple tried to lightly dismiss, it won’t work for identical twins. Overall, Face ID should be considered a convenience regression over Touch ID as a necessary concession for the thin bezels of the display, since Apple failed to get fingerprint recognition to work through the display. It works, but Apple is surely unhappy internally.

I recently almost published an article on things I believed Apple needed to improve about the iPhone. The five areas I was going to highlight were: speaker quality, portrait mode quality, shipping a dedicated DLA, camera sensor pixel size, and switching several imaging algorithms to deep learning implementations. As a pleasant surprise, Apple addressed all five of these areas (though I had strong hunches it would do so).

Waterproofing hurts speaker quality, and the primary speaker in the iPhone 7 regressed in some regards despite the improvements to volume and dynamic range. Portrait mode on the iPhone 7 Plus can honestly produce some pretty poor results. The people who work on these algorithms have PhDs in computational photography, though, so I probably shouldn’t criticize what I don't know. I don’t think the previous implementation used deep learning, but I could be wrong. For portrait mode on the front camera of the iPhone X at least, Apple appears to have switched to a deep learning implementation, if I understood Phil Schiller correctly.

Dedicated deep learning ASICs like Apple’s Neural Engine are clearly the direction the industry is moving, so it’s hardly unique or honestly that hard for Apple to do so as well. Inference is too important not to specifically accelerate, so this is something Apple and everyone else clearly need. Implementations should be all over the map and vary quite a bit in terms of results. No one knows how these chips should ideally be designed, so everyone will be experimenting for many years. Apple did disclose a tiny amount of detail on the Neural Engine, such as its being a dual-core design, which was nice of them to do.

There are also a new accelerometer and gyroscope, which is unsurprising if you’re familiar with sensor design standards for VR platforms such as Oculus’s Gear VR or Google’s Daydream. Previously Apple has preferred to source lower power sensor implementations, so perhaps these new sensors are more accurate but draw greater power. The cameras themselves are now also individually calibrated, which is probably really important.

Additionally, there is now hardware codec support for FLAC, which was to be expected given software codec support in iOS 11. ALAC is basically irrelevant now.

The biggest mystery to me going into yesterday’s keynote was the iPhone X’s wireless charging. Since it was revealed to use completely standard Qi charging, I will hazard a guess as to what this is really all about. It’s no secret that Apple wants to push for a completely wireless future, and the Lightning connector’s days are clearly numbered. To get to that point will require far-field wireless charging using resonant technologies.

My understanding, and what I don’t think people realize, is that a resonant specification depends on an inductive specification. If this is the reasoning, then Apple has to help propagate an industry standard to make inductive charging as ubiquitous as possible around the world. Thus, Apple would be pushing for wireless charging now even if it doesn’t see a ton of value in inductive alone. This is just my theory, so I could certainly be wrong.

As an aside, Schiller had to directly contradict his own past comments about the utility of wireless charging (and NFC), which I think is an excellent example of why you should generally avoid saying negative things about other companies or people.

I might write more about the Apple Watch Series 3 in the future, but I at least previously explained all of the cellular details here.

Technical corrections are always appreciated.

"Hardware Architectures for Deep Neural Networks"

This presentation from MIT provides an excellent overview of current techniques and hardware implementations for efficient deep learning computation. There is also an associated paper.

The material requires familiarity with the basics of deep learning and its terminology. Provided you are familiar, though, the presentation is very accessible and easy to follow even if you aren't a machine learning researcher.