Before the iPhone, Apple’s predominant compiler technology used GCC to compile Objective-C applications down to native executable code specific to the computer’s processor. The compiler produced executable “fat binaries”, equivalent to exe files on Windows and ELF files on Linux, but unlike those, a fat binary can contain multiple versions of the same program, so the same executable file could run on different processors. It was primarily this technology that allowed Apple to migrate from PowerPC to PowerPC64, and then on to Intel (and later Intel64). These were marketed under the “Universal Binary” name; while the term was initially used to mean both PowerPC and Intel support, it was later repurposed to mean both Intel 32-bit and 64-bit support. The downside of this approach is that multiple copies of the executable are stored in the file, most of which will never be used: at run time the right version of the code is selected dynamically, but the application carries extra weight in case it is run on a different processor. Thinning utilities (such as lipo) can remove code for incompatible processors to reduce the size of the executable; this changes the size of the application, not its behaviour.
With mobile devices, code size becomes more important, mainly because the device itself has much less storage space than a typical hard drive. As Apple moved from the original ARM processors to the custom A4 processor and onwards, the instruction set changed and different versions of code were needed. These options are set transparently in Xcode based on the minimum level of iOS supported, and the resulting binaries contain multiple variants.
The increasing importance of bitcode, and the migration towards LLVM, started several years ago, when Apple decided to move away from GCC and invest heavily in the LLVM toolchain and infrastructure. LLVM was initially used for compiling GPU-specific code for OpenGL, but the investment later extended to the Clang compiler. As Clang’s support for Objective-C grew, it first became the default compiler in Xcode and then began driving improvements to the Objective-C language itself.
This unlocked the potential for a complete LLVM-based toolchain to compile iOS applications. LLVM provides a virtual instruction set that can be translated to (and optimised for) a specific processor architecture. This generic instruction set has several representation forms: it can be stored in a textual, assembler-like format called LLVM IR (analogous to assembly) or translated to a binary format (analogous to an object file). It is this binary format that is called bitcode.
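As a sketch of the two representations, a trivial C function can be lowered either to textual IR or to binary bitcode. The clang flags in the comments are the standard way to produce each form; the IR shown is illustrative of the general shape rather than the exact output of any particular clang version.

```c
/*
 * Emit textual IR (add.ll):      clang -S -emit-llvm add.c
 * Emit binary bitcode (add.bc):  clang -c -emit-llvm add.c
 *
 * The textual form looks roughly like:
 *   define i32 @add(i32 %a, i32 %b) {
 *     %sum = add nsw i32 %a, %b
 *     ret i32 %sum
 *   }
 * llvm-dis converts the binary .bc file back to this textual form.
 */
int add(int a, int b) {
    return a + b;
}
```

Both forms carry the same information; only the encoding differs.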
Bitcode differs from a traditional executable instruction set in that it maintains the types of functions and their signatures. For example, a set of boolean fields that might be compressed into a single byte in a traditional instruction set are kept as separate values in bitcode. In addition, logical operations keep their logical representation: setting a register to zero is expressed as R = 0, and only when this is translated to a specific instruction set is it replaced with an optimised form such as xor eax,eax. (This has the same effect, setting a register’s value to zero, but encodes the operation in fewer bytes than a direct assignment would take.)
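To illustrate the boolean example, here is a hedged C sketch: at the bitcode level each flag remains a distinct, typed i1 value, while the packing below is the kind of byte-level compression that a machine-level encoding (or a programmer) might apply later. The function names are invented for this example.

```c
#include <stdbool.h>
#include <stdint.h>

/* Pack four logically separate booleans into one byte, as a
 * machine-level representation might. In bitcode, each flag would
 * remain its own typed i1 value until code generation. */
uint8_t pack_flags(bool a, bool b, bool c, bool d) {
    return (uint8_t)((a ? 1 : 0) | (b ? 2 : 0) | (c ? 4 : 0) | (d ? 8 : 0));
}

/* Recover an individual flag from the packed byte. */
bool flag_at(uint8_t packed, int index) {
    return (packed >> index) & 1u;
}
```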
However, bitcode is not completely independent of architecture or calling convention. The size of registers is an important property of an instruction set; more data can be stored in a 64-bit register than in a 32-bit register. Bitcode generated for a 64-bit platform will therefore look different from bitcode generated for a 32-bit platform. In addition, calling conventions can be defined for both function calls and function definitions, specifying (for example) whether arguments are passed on the stack or in registers. Some language constructs, such as sizeof(long), are resolved by the compiler front end before they even reach the generated bitcode layer. In general, 64-bit platforms that support the fastcc calling convention will have compatible bitcode.
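The sizeof point can be seen directly in a small sketch: the expression below is folded to a constant by the front end, so the same source produces different bitcode on 32-bit and 64-bit targets. This is an illustrative example, not a description of Apple’s toolchain specifically.

```c
#include <stddef.h>

/* sizeof(long) is a compile-time constant resolved by the front end,
 * so the generated bitcode already contains 4 or 8 (depending on the
 * target), not the sizeof expression itself. */
size_t long_width(void) {
    return sizeof(long);
}
```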
So why does Apple require bitcode uploads for watchOS and tvOS? By moving the uploads to a centralised Apple server, Apple can optimise the binaries between compilation with Xcode and delivery to the target device. It also allows developers to upload multiple variants without packaging them into a single delivery (which would take up more space on the device). Finally, it allows Apple to perform the code-signing of the application on the server side, without exposing any keys to the developer.
The other main advantage of performing server-side optimisation is being able to take advantage of whole-module and inter-module optimisation. When using a statically compiled language without a dynamic runtime component, the target of a function or method call can often be proven directly, allowing the table indirection to be avoided and replaced with an equivalent direct call. This in turn opens up additional peephole optimisations that allow the function to be optimised further; for example, if it can be proven that a caller always passes a non-null value, then null checks in the called function can be optimised away. Such optimisations are typically enabled through -O flags at compile time, but often only optimise the content of private functions within the same file. Whole-module optimisation can consider optimisations across all functions within the same module, but stops short of module boundaries (such as dependencies on external frameworks). Inter-module optimisation allows code from different modules to be inlined and then optimised further.
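A hypothetical C sketch of the null-check optimisation described above: once sum_or_zero (a name invented for this example) is inlined into its caller, the optimiser can prove the pointer is non-null at that call site and delete the check entirely.

```c
#include <stddef.h>

/* The NULL check is required in general, since any caller might pass
 * a null pointer. */
static int sum_or_zero(const int *p, int n) {
    if (p == NULL)      /* removable once a caller proves p != NULL */
        return 0;
    int total = 0;
    for (int i = 0; i < n; i++)
        total += p[i];
    return total;
}

/* After inlining, &data[0] is provably non-null, so the check above
 * can be optimised away at this call site. */
int caller(void) {
    int data[3] = {1, 2, 3};
    return sum_or_zero(data, 3);
}
```

Within a single file, -O-level optimisation can already do this; whole-module and inter-module optimisation extend the same reasoning across file and module boundaries.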
Each step up the optimisation chain provides more and more benefits but takes correspondingly more and more time to process. By offering optimisation-as-a-service and integrating it within the app store process, Apple allows developers to take advantage of compiler optimisations that may be prohibitively expensive to run at development time but can be batch processed by Apple’s servers at App Store provisioning time.
Perhaps more interestingly, it allows future optimisations to be developed after the application is uploaded and then have the application re-optimised to produce a faster or smaller application executable in future. Bitcode will provide Apple with a wealth of test cases for optimisation experiments; instead of having to construct examples from scratch they will be able to use real world code bases.
Finally, the bitcode on the server can be translated to support new architectures and instruction sets as they evolve. Provided that the calling convention and the word size and alignment are maintained, a bitcode application might be translated into different architecture types and optimised specifically for a new processor. If standard libraries for math and vector routines are used, these can be translated into processor-specific vector instructions to give the best performance for a given application. The optimisers might even generate multiple different encodings and judge them on size or execution speed.
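For example, a simple scaling loop like the sketch below is exactly the kind of code a server-side optimiser could re-target: from the same bitcode, one code generator might emit NEON vector instructions for one ARM core and different (or wider) vector instructions for another. The function is invented for illustration.

```c
/* A straightforward loop that an auto-vectoriser can turn into
 * processor-specific SIMD instructions at code-generation time,
 * without any change to the source or the bitcode. */
void scale(float *out, const float *in, float k, int n) {
    for (int i = 0; i < n; i++)
        out[i] = in[i] * k;
}
```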
Bitcode also has some disadvantages. Developers can debug crash reports from applications by storing copies of the debug symbols corresponding to the binary that was shipped to Apple. When a crash happens in a given stack, the developer can restore the original stack trace by symbolicating the crash report using these debug symbols. However, the symbols are a by-product of translating the intermediate form to the binary; if that step is done on the server, this information is lost to the developer. Apple provides a crash reporting service (InfoQ covered the purchase of TestFlight last year) that can symbolicate crashes on the developer’s behalf, provided that the developer uploaded the debug symbols at the time of application publication. The fact that the developer never sees the exact binary also means that they may not be able to test for specific issues as new hardware evolves. There are also concerns about ceding to Apple the power to perform the final compilation, including the ability to inject additional routines or code snippets; but since Apple is in full control of the publication process, such changes are currently possible whether the developer uploads bitcode or compiled binaries.
In addition, Apple’s initial roll-out of the bitcode and app thinning service was put on hold, because upgrading from one type of hardware to another didn’t restore the right versions of the binaries. The issue was subsequently fixed in iOS 9.0.2 and the feature re-enabled.
It’s called the Apple Watch, it looks like a luxurious synthesis of technology and traditional timepiece craftsmanship, and it uses a “digital crown” to navigate through lists and zoom in on data. Apple finally announced its long-awaited smartwatch on Tuesday, and, boom, just like that, the center of gravity of the shaky wearables market has shifted in a seismic reset.
No, not because the watch looks that revolutionary. It’s because this is an Apple smartwatch. And for better or worse, this is the new wearable’s most important feature.
The new Apple “this changes everything” device starts at $350, and will be available early next year. But here’s the kicker for potential fence sitters: Apple Watch supports the new Apple Pay system, so you can quickly purchase items from retailers ranging from Bloomingdales to Staples with a flick of your wrist.
Apple Watch will come in three different models—from a baseline version to an ultra-luxe 18K gold edition—and six different bands will allow a wide degree of personal customization. “Taptic” feedback puts pressure on your wrist for iPhone notifications, and can even send signals to turn left or right in the watch’s navigation app.
And, yes, this gadget requires an iPhone to work.
In many ways, the Apple Watch feature set looks like it was borrowed from other smartwatch companies. Dig: The gadget tracks your steps and heart rate. It displays smartphone notifications. But Apple’s message is that Apple Watch does all these things better than the competition. And with its perfectly contoured edges, luxury materials, and nifty new UI, it’s got all the visual trappings of another “God device” from Apple. That’s a potentially transcendent advantage that could send competing wearable-tech manufacturers back to their drafting tables.
A physical dial replaces some of your finger taps
Tim Cook proclaimed Apple Watch will “redefine what people expect from its category” and that the wearable is the “next chapter in Apple’s story.” To help achieve such lofty goals, the Apple Watch doesn’t ditch third-party apps—which are always so difficult to implement on wearables—but instead reimagines the smartwatch UI. The key is a new “Digital Crown” that’s used to navigate the teeny-tiny visual elements on the curved, sapphire display.
The new crown translates rotary movement into digital data. But that’s just tech speak for a new navigation scheme that might solve the problem of swiping through icons on a necessarily minuscule display. The crown can be used to zoom in on interface elements and scroll through content—actions that would otherwise require finger gestures in a less advanced UI.
The Apple Watch will come in two case sizes (38mm and 42mm heights), but Apple didn’t mention specifics about display dimensions or resolution, though we know the display is a flexible Retina display underneath a single crystal of sapphire. Regardless, anyone who’s ever used a smartwatch knows new navigation paradigms are welcome.
It’s not so much a watch as an experience
Apple’s grand reveal was relatively light on details, and perhaps the Apple Watch’s most killer feature—Apple Pay support—was tacked on at the very end of the event, as if a last-minute afterthought. There were no details on internal specs, but we learned this wearable doesn’t include a camera, as its main photo feature is a scheme that turns your watchface into a slideshow for your iPhone photos.
That said, Apple did riff at length about industrial design, and highlighted some built-in software features and third-party apps.
Siri voice dictation will let you send iMessages to friends (among other common Siri tricks). There’s also a Digital Touch feature that lets you create something of a walkie-talkie-like connection with your pals. It’s Apple’s new method for sending customized emoji and quick little finger doodles composed with digital ink. And—awwww—with Digital Touch you can even send a visual representation of your heartbeat to loved ones.
Health and fitness features consumed a large portion of Apple’s Watch presentation, but Apple never really delved into the accuracy of its sensor technology. This is a critical area to watch, as activity-tracking features are so unreliable in competing smartwatches. Nevertheless, Apple Watch includes a heart rate sensor and accelerometer onboard, and handshakes with the iPhone’s GPS to track and reveal various activity metrics.
A Move ring displays your daily calorie burn. An Exercise ring tells you how active you’ve been. And a Stand ring reports the embarrassment of your sedentary lifestyle. Once you meet your daily goals for each ring, you earn an achievement, and the ring goes away. There’s also a separate workout app that reveals how far, how fast, and how long you’ve been exercising.
If you’re already a fan of activity-tracking wearables, you know none of these features are revolutionary at face value. But if Apple can deliver a software interface that people love, it will solve an elusive user-experience puzzle.
Data collected by Strategy Analytics notes that global smartphone shipments reached 353.3 million units in the first quarter of 2017. Of that 353.3 million, 21.5 million were the iPhone 7, with 17.4 million the iPhone 7 Plus.
Given those figures, the iPhone 7 claimed 6.1 percent of global smartphone sales, with the iPhone 7 Plus taking 4.9 percent. Rounding out the top 5 are the Oppo R9s at 2.5 percent, and the midrange 2016 Samsung Galaxy J3 and J5 taking 1.7 and 1.4 percent respectively.
The Oppo R9s retails for around $425. The Galaxy J5 sells for $180 with the J3 retailing for $150.
On April 26, Apple announced its second fiscal quarter results. During the quarter, the company sold 50.8 million iPhones, but as usual did not break the figure down by model.
Apple CEO Tim Cook attributed the year-over-year decline in second-quarter iPhone sales partly to more frequent, and earlier, leaks of details about future products. The “iPhone 8” rumor mill started in December of 2015, shortly after the release of the iPhone 6s.