The Apple Watch is one of Apple's newest form factors, and programming for it opens up a whole new level of technology integration in everyday life. The Apple Watch, or really any watch-style wearable, is a great link between the user and the world of IoT.
Xcode 7 includes WatchKit along with a separate watch simulator that is logically paired with an iPhone simulator. Here is a quick overview.
Creating a new project: an AlexWatch application for watchOS.
Once that's done, go ahead and create a new target. There is still a watchOS 1 option, which is probably included as a holdover; I assume it will disappear in the next version of Xcode. But for our basic app it doesn't really matter which one we choose.
Two new folders appear in the project navigator: AlexWatch WatchKit App and AlexWatch WatchKit Extension.
Interface.storyboard now lives in the AlexWatch WatchKit App folder. Drag a label and a button from the utilities panel onto the interface controller and change the text.
The assistant editor takes us to the InterfaceController implementation file. Control-drag the label into the @interface to create a property, and the button into the implementation to create an action method. Let's add a body to the button method so that it changes the label's text when the button is tapped:
[self.labelname setText:@"Hello Flatiron!"];
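For reference, here is roughly what InterfaceController.m ends up looking like after the connections. The action name buttonTapped is my assumption; Xcode names it whatever you type in the connection dialog.

#import "InterfaceController.h"

@interface InterfaceController ()
// Outlet created by control-dragging the label into the @interface.
@property (weak, nonatomic) IBOutlet WKInterfaceLabel *labelname;
@end

@implementation InterfaceController

// Action created by control-dragging the button into the implementation.
- (IBAction)buttonTapped {
    // WKInterfaceLabel is write-only from the extension: we can set its text
    // but never read it back.
    [self.labelname setText:@"Hello Flatiron!"];
}

@end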
Hit ⌘R to run it in the simulator and make sure everything works. Profit!
In order to better appreciate the scope of what a watch can do, it's handy to know what hardware you have at your disposal. Let's take a quick look.
The Apple Watch carries an IPX7 water resistance rating, which means it can withstand up to 30 minutes in up to 1 meter of water.
A 3.8 V, 0.78 Wh lithium-ion battery is claimed to keep the watch running for up to 18 hours.
The Taptic Engine is attached at the hip to the speaker; combined with subtle audio cues from the specially engineered speaker driver, it is designed to produce a distinct tactile sensation for each kind of event.
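On watchOS 2 you can trigger the Taptic Engine yourself from the WatchKit extension. A minimal sketch using the built-in haptic types:

#import <WatchKit/WatchKit.h>

// WKHapticType covers notification, success, failure, click, and a few others;
// each plays its own tap-plus-sound pattern through the Taptic Engine and speaker.
[[WKInterfaceDevice currentDevice] playHaptic:WKHapticTypeSuccess];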
Sensors include a heart rate monitor and blood oxygen sensors, which work much like photoplethysmography: a light emitter shines light into the user's skin, a portion of the light is absorbed by the skin and blood, and the remainder is reflected back to a co-located light sensor. The sensor measures the amount of reflected light, which indicates how much of the emitted light the blood absorbed, and thus the volume of blood present in the skin. That volume is a function of several factors, including the cyclical movement of blood to and from the skin with each heartbeat and the particular physical characteristics of the user's vasculature.
Here is a brief diagram of how it works. It's a fascinating piece of technology that opens up a whole new dimension for app development.
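To put that sensor to work in an app, the heart rate readings it produces are exposed through HealthKit. A minimal sketch of fetching the most recent sample, assuming the app has already been granted HealthKit read authorization for heart rate (the healthStore variable is illustrative):

#import <HealthKit/HealthKit.h>

HKHealthStore *healthStore = [[HKHealthStore alloc] init];
HKQuantityType *heartRateType =
    [HKQuantityType quantityTypeForIdentifier:HKQuantityTypeIdentifierHeartRate];

// Sort by end date descending and take one sample: the latest reading.
NSSortDescriptor *byDate =
    [NSSortDescriptor sortDescriptorWithKey:HKSampleSortIdentifierEndDate
                                  ascending:NO];

HKSampleQuery *query =
    [[HKSampleQuery alloc] initWithSampleType:heartRateType
                                    predicate:nil
                                        limit:1
                              sortDescriptors:@[byDate]
                               resultsHandler:^(HKSampleQuery *q,
                                                NSArray<__kindof HKSample *> *results,
                                                NSError *error) {
        HKQuantitySample *sample = results.firstObject;
        if (sample) {
            // Heart rate is stored as count/time; convert to beats per minute.
            HKUnit *bpm = [[HKUnit countUnit] unitDividedByUnit:[HKUnit minuteUnit]];
            double beatsPerMinute = [sample.quantity doubleValueForUnit:bpm];
            NSLog(@"Latest heart rate: %.0f BPM", beatsPerMinute);
        }
    }];
[healthStore executeQuery:query];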
Originally published at objectivecmakesyouwanttopicklongnames.wordpress.com on June 19, 2016.