When I released the first version of Ohai a few months ago, it had one goal: to be a simple, beautiful place to keep track of memories. As with all first versions, it was limited; you could only capture location-based check-ins with a comment and a photo. And because it was released shortly after iOS 7's announcement, it quickly looked outdated and needed some visual touch-ups. Today I've released the first major update to Ohai, bringing some of the most heavily requested features and making it a better and more beautiful journal.

The most obvious change is the refreshed design. Taking cues from iOS 7, the update has been cleaned up with a bigger emphasis on your entries. Photos are now shown full width, with no frame wasting space. Lines and icons have been thinned out and reduced in size. The check-in screen has been cleaned up and simplified. Text is sized with the new Dynamic Type feature in iOS 7. Design is a continual process of refinement, and this one will keep evolving.

The most significant structural change is the addition of new text and photo entry types in the journal. Locations are now optional everywhere, so if you want to jot down a few thoughts or post a cool photo you took that day, you can do so without worrying about adding a location. You can even set custom dates and times for posts, letting you add journal entries for events in the past. If you make a mistake and want to edit a post, or even delete it outright, tap and hold on any entry to do so.

The last of the new features is support for checking in to Foursquare. I will note that this feature is still a little early, and there may be a couple of issues using it, which I'll be working on. The issues are solvable with real-world data, and I invite you to contact me if you run into problems. To explain why this is tricky, I'd like to talk a bit about how the Foursquare check-in process works.

When you want to check in to a location, Ohai gets your coordinates and searches a database of places provided by App.net to see what's around you. The data returned for each place includes its name, address, city, and a pile of other fields, including a unique identifier that's basically a random string of letters and numbers. There are many providers of places databases (Foursquare being one of them), which contain many of the same places but use different identifiers. In order to check in, Ohai has to search Foursquare using App.net's data, which may be subtly different, and this may not work all the time.
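To make that concrete, here's roughly what that venue matching looks like against Foursquare's public API (an illustrative sketch, not Ohai's actual code; the coordinates, venue name, and OAuth token are placeholders):

    # Search Foursquare for venues near the user that match the name App.net returned.
    curl "https://api.foursquare.com/v2/venues/search?ll=45.5231,-122.6765&query=Powell%27s+Books&oauth_token=USER_TOKEN&v=20131201"

    # If one of the results looks like the same place, check in to it by its Foursquare ID.
    curl -X POST "https://api.foursquare.com/v2/checkins/add" \
      -d "venueId=FOURSQUARE_VENUE_ID&oauth_token=USER_TOKEN&v=20131201"

If the name or address App.net returns doesn't line up with what Foursquare has, that first search can come back empty or ambiguous, and that's where the rough edges show up.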

So that's the new Ohai 1.1. I'm very excited to hear what you think about the new features and design, which reflect the most common and popular requests from people who use it. There are a couple of bugs I've found since the update was submitted, around the new Foursquare integration and how entries are categorized; they will be fixed in an update going to Apple today, although with the app review shutdown, it likely won't be available until around New Year's. But this update is a lot of small things that add up to make your journal more useful and more beautiful. With the holidays coming up, it's the perfect time to get it. Start saving some memories while it's on sale for the rest of the year for $2.99 on the App Store.

 

About three years ago, I had a simple idea. I wanted an app to keep track of the places I'd been. Naturally I tried all the services for this, jumping from Gowalla to Foursquare to Path. But they all want you to broadcast your location, all the time. They're focused on the experience of letting other people know where you are. There's certainly value in sharing your location, but I wanted something that benefited me first.

I built a prototype of this app a few years ago, but it didn't go anywhere. The secret sauce behind any check-in app is a database full of points of interest (or POIs, meaning places like businesses, restaurants, tourist attractions, etc.), and mine was no different. I didn't want to rely on a free API of places that could evaporate at any time. Buying API access to one was prohibitively expensive. And shipping without one meant checking in became a huge data-entry process that was not fun. The project got shelved.

Then, a few months ago, my friends over at App.net announced a new API for finding POIs and attaching metadata about places to posts and private messages. A few months before that, they had released an API for, among many other things, creating a private timeline of posts for individual accounts. Between the two, I saw a way to get both a sustainable POI database and cloud storage for check-in data.

And thus, Ohai was born.

Ohai lets you save location check-ins with photos to a beautiful and simple digital journal. You can share them to App.net and Twitter if you want, but you don't have to. All of your check-ins and their photos are stored safely in a private part of your App.net account. And App.net is a sustainable API in service of their customers and developers, so I can rely on it for a long time. Of course, this requires an App.net account, which you can get for free by downloading the App.net Passport app. You can store as many check-ins as you like, and a few hundred high-resolution photos. If you want more photos, you can always upgrade to a paid App.net account.

One other cool benefit of using App.net for the backend is that the data specification is publicly available. This means other developers could build apps that recognize your journal. So, if the developer of your favorite camera app adds support for Ohai journals, they could save those photos into your journal. Then, the next time you open Ohai, those photos are available. Other developers could build journaling apps for other platforms like Android, or even write competing apps for iPhone. You as the user would not have to export your data and re-import it; it would all just appear when you logged in. It's a wonderful deal for customers: no lock-in at all, with open data standards for interoperability.

Ohai was a pretty quick app to write, and as a result some bugs shipped in the 1.0 version. If you check in to a place without leaving a comment or a photo, that check-in won't appear in your journal (but it is saved). If you tap on the right side of the check-in button, it'll accidentally turn the page instead. And the build, which ran fine on iOS 7 beta 2, crashes at launch on iOS 7 beta 3. All of these bugs will be fixed in the first update, which I will be submitting immediately.

I've wanted to write Ohai for years, and I'm very glad to finally be able to share it with you as my first app as an indie developer. I hope you will check it out. It's available for $4.99 on the Ohai web site and the App Store.

 

Starting over is often a difficult, but necessary, way to revitalize yourself. When I was in middle school, I wrote a column about video games in my school's student newspaper. I've been writing since before I was coding, but lately the coding has superseded the writing in my life. I haven't been content with this for a long time, and have been trying a variety of strategies to make myself write more. My personal blog, SteveStreza.com, has acted somewhat as the outlet for this, and it has succeeded in getting me to write long-form articles, but it has largely failed at producing shorter and more frequent content. The flip side of that is a lack of focus and the burden of ever more long-form work. I love writing, but the hole in what I have been writing has been bothering me for a while.

So today, I'm beginning a new experiment, Informal Protocol. This new blog is focused on development, design, tech, and culture. The goal is to keep most articles to five paragraphs or fewer, and to have at least one new post a day. But as the name implies, this is an informal protocol, and it won't always be followed. Quantity and quality will have peaks and valleys, and the focus may skew one way or another. It's wholly possible the direction will drift and this will become something else entirely. But hey, sometimes you just have to give it a shot.

Informal Protocol is an experiment. Like all experiments, it may fail. But sometimes you have to just jump face first into a new adventure and start over. If you would like to join me on this adventure, you can follow new posts at Informal Protocol via RSS, App.net, or Twitter.

 

For many years, I've used a pile of hard drives for data storage. Movies, TV shows, music, apps, games, backups, documents, and other data have been shuffled between drives and stored in inconsistent places. This has always been the cheap and easy approach, but it has never been really satisfying. And with little to no redundancy, I've suffered a non-trivial amount of data loss as drives die and files get lost. I'm not alone in having this problem, and others have figured out ways of solving it. One of the most interesting is a computer dedicated to one thing: storing data, and lots of it. These computers are called network-attached storage, or NAS, machines. A NAS is a specialized computer that has lots of hard drives, a fast connection to the local network, and...that's about it. It doesn't need a high-end graphics card, or a 20-inch monitor, or other things we typically associate with computers. It just sits on the network and quietly serves and stores files. There are off-the-shelf boxes you can buy to do this, such as machines made by Synology or Drobo, or you can assemble one yourself for the job.

I've been considering building a NAS for myself for over a year, but kept putting it off due to expense and difficulty. A short time ago, I finally pulled the trigger on a custom-assembled machine for storing data. Lots of it; almost 11 terabytes of storage, in fact. This machine is made up of 6 hard drives, and it can withstand the failure of two of them without losing a single file. If any drives do fail, I can replace them and keep on working. And these 11 terabytes act as one giant hard drive, not as 6 independent ones that have to be organized separately. It's an investment in my storage needs that should grow as I need it to, and last several years.

Building a NAS took a lot of research, and other people have been equally interested in building their own NAS storage system, so I have condensed what I learned and built into this post. Doing this yourself is not for the faint of heart; it took at least 12 hours of work to assemble and set up the NAS to my needs, and it required knowledge of how UNIX works to build what I wanted. This post walks through a lot of that, but it still requires skill in system administration (and no, I probably won't be able to help you figure out why your system is not working). If you've never run your own server before, you may find this overwhelming, and would be better served by an off-the-shelf NAS. However, building the machine yourself is far more flexible and powerful, and offers some really useful automation and service-level tools that turn it from a dumb hard drive into an integral part of your data and media workflows.

Before we begin, I'd like to cover the concepts and terminology involved in the assembly. Feel free to skip this section if you already understand RAID, ZFS, and computer assembly.

Data Storage for Newbies

At its core, a NAS is just a computer with a number of hard drives in it. Its only purpose is to store and load data, and make all that stuff available over the network. Since all it's ever doing is holding on to lots of data, you can skip a lot of the things you'd put into a normal computer; a graphics card, keyboard, mouse, and monitor aren't needed very much. You instead buy parts that focus on a few key areas: how many hard drives you can connect, and how fast you can get data in and out. In this case, you need these parts:

  • a motherboard
  • a CPU
  • some RAM
  • a bunch of hard drives
  • a power supply
  • a case to put everything inside of

Your laptop has a hard drive in it. If you've ever plugged in an external drive or a flash drive, you've seen that they're two separate places for you to store stuff. If one of them fails, you lose all of the data on it, but it doesn't affect the data on your other drives. And you have to organize everything yourself. Trying to scale that up to 4 or 6 or 10 drives sounds like a disaster. What we really want is to make all of those drives pretend they're one giant hard drive. And we'd like to be able to survive a hard drive dying without losing data.

There's a tool for this, and it's called RAID, or "redundant array of independent disks". RAID is a set of technologies that takes multiple hard drives, called an array, and combines them under the hood to make them look and act like one giant hard drive. The way this works is complicated, but the basic idea is that RAID takes a file, chops it up into little pieces, and spreads them out across all your hard drives. Then, when you want the file, RAID will grab all those pieces from each hard drive and combine them back into the original file. (Please note: this is an overly simplified discussion of the technology, and is not technically accurate, but is adequate for our purposes of conceptualizing.) There are different strategies called "RAID levels" you can use that will change the specific behavior; some are more focused on redundancy, some are focused on speed.

The benefits you get with most RAID levels are: a bunch of hard drives that look like one storage place, improved speed when reading/writing data, the ability to survive a drive failing, and the ability to replace a dead drive with a new one. However, the downside is potentially a big one. Because the files are never stored as a whole on one drive, if you lose enough drives at once and don't replace them in time, you lose all the data, even on drives that haven't failed. Depending on your RAID level, you can survive zero, one, two, three, or more drives failing. But the more dead drives you want to be able to withstand, the more storage of those drives gets used for redundant data. So it's a balance of how much storage you want vs. how much protection you want from dying drives. You can calculate how much storage you'll have based on how many drives you buy using a RAID calculator. A healthy minimum is that for every 3 drives you buy, you want to be able to withstand one failing. So 2 or 3 drives should withstand 1 drive failing, 4-6 drives should withstand 2 failing, 7-9 should withstand 3, etc.
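As a rough rule of thumb (ignoring filesystem overhead and the difference between marketing terabytes and real ones), usable space for a parity-based array is the drive size times the number of drives minus the number of failures you want to survive:

    # 6 drives x 3 TB each, tolerating 2 failures:
    echo $(( (6 - 2) * 3 ))    # => 12 TB of raw usable space, before overhead

A proper RAID calculator will give you more precise numbers, but this gets you into the right ballpark.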

For this build, I set up my array as a form of RAID called RAID-Z2. RAID-Z and RAID-Z2 are built on ZFS, a modern file system that supports "storage pools". The pool gives us the "make a bunch of hard drives act like one giant hard drive" behavior, which RAID-Z builds on to give us the "survive a hard drive failure" behavior we want. RAID-Z lets you survive one drive failure, RAID-Z2 two, and RAID-Z3 three. The major downside to RAID-Z is that all data has to be processed by the CPU, so you'll want something reasonably fast; the more drives you add, the faster the CPU will need to be.
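For the curious, here's roughly what building that pool looks like at the ZFS level (a sketch only; FreeNAS, which I cover below, does all of this for you from its web UI, and your disk device names will differ):

    # Create a RAID-Z2 pool named "Main" from six disks, then check its health.
    zpool create Main raidz2 ada0 ada1 ada2 ada3 ada4 ada5
    zpool status Main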

Building the Computer

The part that was the most daunting for me was actually purchasing the pieces necessary to build the computer. I'm a software guy who has owned Macs all my life, so I've never actually assembled a computer before (I will take this opportunity to let all the nerds out there get a good laugh in before we move on). If the idea of building your own computer is scary, you may want to just go buy an off-the-shelf NAS, such as the Synology DS413j, and stop reading. Keep in mind, though, that a preassembled NAS will be more expensive and far less flexible than building one yourself.

After waffling on this for months, I finally decided to go the custom-build route. I figured I could make it cheaper, quieter, and able to run whatever services I wanted directly on the machine by building it myself. After putting some pieces together, here are the parts I went with. Prices are what they cost as of September 30, 2012. All links to Amazon are affiliate links, so I get a tiny kickback. Feel free to search for the part names if you wish; you may be able to find these parts cheaper elsewhere on the Internet.

A few notes about this hardware configuration:

  • The case has 6 hard drive slots, so you can put up to 6 drives in it. You can, of course, put fewer in it.
  • The motherboard has 6 SATA ports, but only two are 6 Gbps, while the others are 3 Gbps.
  • The power supply has 5 SATA connections, so if you want to run 6 drives, you'll need a Molex to SATA power adapter.
  • Besides the Molex adapter, the parts mentioned include all the cables necessary for internal setup. But you will need your own power cable.
  • The motherboard includes some onboard graphics, and you'll want to have a DVI monitor available for making sure the machine is booting correctly. You won't need to keep it plugged in beyond setup, however.
  • RAM is cheap, and if you're accessing the same files over and over, they can stay cached in RAM and be served even faster than from disk. It's better not to skimp on this. Just make sure your CPU is 64-bit.
  • There's no Wi-Fi here, so you'll either need to get a wireless card or (ideally) plug an Ethernet cable into it connected to your network.

Installing the OS

For the operating system, I decided to use FreeNAS 8.2, a distro of FreeBSD that is designed to run ZFS-based RAID systems. It includes a web-based administration tool that lets you set up the array, monitor it, set up tests to detect failing drives, run services and plugins, and lots of other stuff. To run it, I copied it to a USB key (at least 2 GB is necessary; you probably want 4 GB) and just leave that plugged into the back of the machine all the time. Once you copy the image onto the key, you set the default boot drive to the USB key, and it will boot from it each time. You will also need a keyboard to get into the BIOS settings (and note, Apple's keyboards will not work with this setup, so have a USB or even a PS/2 Windows keyboard handy). After you have the BIOS auto-boot set up, when you turn the computer on, it'll take a minute or two to get everything going, and then the web admin will be available on your local network. If you have a router that can tell you what's connected, you can get the IP there; otherwise, plug a monitor into the motherboard and it'll tell you the IP. If your router supports it, grab the NAS's MAC address and give it a static IP reservation so it's always available at the same address. Once this is all running automatically, you can disconnect the monitor and keyboard and just run the machine headless.
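If you're writing the image from a Mac, the copy looks roughly like this (the image filename and disk number are examples; run diskutil list first and be very sure you have the right device, because dd will happily overwrite anything):

    # Identify the USB key, unmount it, and write the FreeNAS image onto it.
    diskutil list
    diskutil unmountDisk /dev/disk2
    sudo dd if=FreeNAS-8.2.0-RELEASE-x64.img of=/dev/rdisk2 bs=64k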

The web admin is divided into a few sections. Along the top are the sections and actions that are most commonly used: System, Network, Storage, Sharing, Services, Account, Help, Alert Status, and Log Out. The absolute first thing you should do is click the Account button and change the username and password for the admin account (which you were logged into automatically). Once this is set, nobody will be able to log in to the web admin without these credentials, or without physical access to the machine (and you can disable the console login if you have a monitor/keyboard attached). You'll also want to click the Users tab in that section and create a user for yourself for connecting to the array. Make sure it's in the group "wheel", at the very least.

Once you have that out of the way, you can set up your storage array and actually get those hard drives to do something. Click Storage at the top to view the Active Volumes list, which will be empty, as we haven't set any up yet. Set one up by clicking the Volume Manager button; give the volume a name (I just called mine "Main"), select all the disks from your list, choose ZFS, then choose your RAID-Z level. Click Add, and after some processing, you'll have a giant hard drive. The amount of storage will be considerably less than the combined capacity of the hard drives you put in, as it reports the capacity left after setting aside space for the redundancy data it will eventually be storing. In my case, the 6x3TB drives have about 16.3 TB of raw capacity, but after RAID-Z2's redundancy is accounted for, only 10.7 TB is available. Note: If you added 6 drives to the array, you should see 6 drives in the list when creating the volume; if you don't, you probably didn't connect something correctly inside the machine. Make sure you set the permissions on this new volume so your user can access it, and do this recursively.
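You can fix those permissions from the web UI (Storage > Change Permissions), or, once the SSH service is on (covered below), with a one-liner like this (assuming your user is named "steve"; substitute your own):

    # Recursively give your user ownership of the new volume.
    chown -R steve:wheel /mnt/Main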

ZFS has a cool feature called "datasets". A dataset is just a folder with special rules around how big it can get. You can set a quota, which is the maximum size the folder can grow to, and a reserved space amount, which (as the name implies) reserves a certain amount of space for that folder. You can customize permissions on these separately from the whole array. You can set different compression levels depending on whether you're more concerned with speed or space. All of these values can be changed later. You can also ignore all of this and just use datasets for organization. So, for example, I have two primary datasets (the equivalent zfs commands are sketched after the list):

  • Media, which has no quota or reserved space, permissions set so that anyone can read but only I can write, and no compression so it can stream fast, and
  • Backups, for Time Machine, which has the maximum level of compression (as read/write speed doesn't matter), no access to anyone except my user, and a quota of 500 GB
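If you prefer the shell, those two datasets boil down to something like this (a sketch of the equivalent zfs commands; FreeNAS exposes all of these settings in the web UI):

    # Media: no quota, no compression (favor streaming speed).
    zfs create Main/Media
    zfs set compression=off Main/Media
    # Backups: heavy compression, capped at 500 GB.
    zfs create Main/Backups
    zfs set compression=gzip-9 Main/Backups
    zfs set quota=500G Main/Backups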

Actually Getting Data In/Out

So now I have a ZFS volume running RAID-Z2, /mnt/Main, which has two datasets, /mnt/Main/Media and /mnt/Main/Backups. Now we need to actually make them available for use by other computers. To do this, we set up Shares. FreeNAS has three different types of shares - AFP (for Macs), CIFS (for Windows, also known as SMB or Samba), and NFS (for Unix/Linux/FreeBSD). For our purposes, I will be setting up two AFP shares, one for each of the two datasets.

Shares are a type of Service, which is a program that FreeNAS will run automatically for you. Besides Shares, FreeNAS has services for things like FTP, LDAP, Rsync, SSH, UPS integration, and plugins. At the top of the admin UI, click Services, and click the On/Off switch next to the AFP service to start it up. Feel free to turn on whatever else you like (except Plugins, which will not quite work out of the box, but I'll discuss Plugins at greater length below). You may be prompted for settings before a given service will start.

Now you can create your Shares. Click the Sharing tab at the top, and make sure "Apple (AFP)" is selected. Click the "Add Apple (AFP) Share" button, and you'll be presented with a daunting form. You can leave most of the more confusing fields at their defaults. The fields you really need to worry about are:

  • Name, the displayed name of the share
  • Path, where you want the share to point
  • Share password, if you want to set a password
  • Allow/Deny list and Read-Only/Read-Write Access, to control who can do what on the share
  • Disk Discovery, which will allow the share to be seen if you just ask the server for a list of shares
  • Disk Discovery Mode, which will let you toggle between a normal Finder share and a Time Machine backup share
  • Permissions, which let you control who can read, write, and run programs on the share

Once you have this in place, click OK, and you'll have created the Share. If you enabled Disk Discovery mode, your NAS should appear in the Finder's sidebar. If you did not, you can connect to it by selecting "Connect To Server" from the Go menu in the Finder (⌘K) and typing afp://NAS_IP/SHARE_NAME, filling in the NAS_IP and SHARE_NAME as appropriate. Authenticate if you set up a password, and you should be connected. Then you can drag stuff from your hard drive into the share and it will copy over. You can also use cp from the Terminal to copy data.
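If you prefer the Terminal for the whole thing, the same connection can be made with mount_afp (the user, IP, and paths are placeholders):

    # Mount the Media share at /Volumes/Media and copy a folder of video onto it.
    mkdir -p /Volumes/Media
    mount_afp -i "afp://steve@192.168.1.10/Media" /Volumes/Media
    cp -R ~/Movies/ /Volumes/Media/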

When I originally set this up, I ran into permissions errors. My rules for setting the permissions up are:

  • Make sure the user you want to have read/write access is in both the allow list and the read-write access list
  • If you want read-only access available to everyone, add @nobody to the allow list and the read-only list
  • Set all file/directory permissions to on, with the exception of "other/write".
  • Set the owner of the ZFS dataset to your user, and set all the permissions there to on, with the exception of "other/write".

To test the permissions on the ZFS dataset, the easiest thing to do is enable the SSH service, SSH into the machine with your user account, cd into the dataset, and try to touch a file. If it fails, you can't write. If it does work, cat the file; if it fails, you can't read. If that succeeds, but trying to connect via AFP doesn't let you read/write files, the error is on the AFP share permissions.
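Concretely, that test looks something like this (the user, IP, and dataset are placeholders):

    ssh steve@192.168.1.10
    cd /mnt/Main/Media
    touch permission-test      # fails => your user can't write here
    cat permission-test        # fails => your user can't read here
    rm permission-test         # clean up after yourself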

Keeping Your NAS Healthy

If you have a system dedicated to making sure your data is reliably accessible, you want to know sooner rather than later if you're going to have hard drive problems. FreeNAS includes support for S.M.A.R.T., a system for testing your drives to determine if they are behaving abnormally (higher temperature, higher error rates when reading data, lower throughput, etc.). The results can then be emailed to you on a schedule you decide, for your analysis. These tests are not run on the array as a whole, but rather on individual disks within it. They can be created and found on the sidebar, under System > S.M.A.R.T. Tests.

I rely primarily on the "short" S.M.A.R.T. test, which runs once a day, and occasionally a "long" test, which I run manually when I won't need the array for a while. The short test scans the drive's electrical circuits and selected parts of the disk for errors, and takes only a couple of minutes. The long test scans every bit on the drive for failures; this takes a very long time, especially on high-capacity disks, so it should be run infrequently. There's also a "conveyance" test, which is useful to run before/after moving the drives, to determine if they were damaged during transport. Set these up to your preference.
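The web UI handles the scheduling, but FreeNAS also ships with smartctl, so you can kick off or inspect a test by hand over SSH if you're impatient (the device name is an example; yours may differ):

    # Start a short self-test on one disk, then read back the full S.M.A.R.T. report.
    smartctl -t short /dev/ada0
    smartctl -a /dev/ada0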

The easiest way to see this data is to have it emailed to you. Test reports are sent to the email address associated with the root user. To change this, select Account > Users > View Users from the sidebar; the root user will be at the top of the second list that appears, and the last button on that row lets you change the email address, so set it to your own. You then have to tell FreeNAS how to connect to an SMTP server with an account; you can use Gmail or iCloud for this. On the sidebar, select System > Settings and choose the Email tab. Fill out the fields as appropriate for your mail server. Once this is in place, you can send a test email. If you get it, you're all set up, and your S.M.A.R.T. tests will send their results to you when they run.

Extending with Plugins

Note: This is a more advanced topic, and to make this work you'll need an understanding of how SSH and shell access works, which is beyond the scope of this post.

FreeNAS 8.2 introduced a plugin system based on FreeBSD jails, which are sandboxed environments for programs to run in. Plugins are like other services that run automatically in the background, but instead of being services for managing the array themselves, they are apps that you might want to run directly on your storage array. As they are sandboxed, they will only be able to write to specific folders in your array. A number of services have been ported to the FreeNAS plugin format, and you can use these to extend your array's functionality and provide even more utility. I'll demonstrate how to set up Transmission, the BitTorrent client, to run natively on your NAS. You can find other plugins on the FreeNAS Forums, or even make them yourself if the app has been ported to FreeBSD.

To begin, we need a place on the array to store plugins, and a place to store the jail. Create two ZFS datasets for this (I call them "Jail" and "Plugins"). You'll rarely need to go in here manually, but the plugin system needs a place for this stuff to live. All FreeNAS plugins are .pbi files, and in fact the service that runs the plugins is itself a pbi file, which is not installed by default. Once you have your datasets set up, go to the Services tab and click the settings icon next to the Plugins service. There are three steps to the installation. First, it needs a temporary place to store the plugin while it installs (this will be the root of your ZFS volume). Next, it needs the paths to your jail and plugins datasets, as well as the IP address the jail will use (make this something unique, outside your DHCP range). Finally, it needs the plugin service PBI appropriate for the version of FreeNAS you're using and the architecture of your CPU.

If it installed successfully, you can then install plugins. Near the top is a tab called "Plugins". Here you can upload the pbi for whatever plugin you like. On the page where you downloaded the plugin service PBI, you can also download the pbi for Transmission. Download it from the site and upload it to your NAS. You'll have to set up the parameters before you can turn it on. Make note of the Download directory you specify, as we'll need it later (but you can leave it as the default). Then, you can turn it on and access it by going to http://JAIL_IP:9091/ in your browser.

Now, before we go on a download spree, we need to understand where those files will end up. They go into the Download directory specified in the settings, which for me was /usr/pbi/transmission-amd64/etc/transmission/home/Downloads. But there's a catch: since this is in a FreeBSD jail, that path is relative to the jail root, which is itself part of your array. You can access that folder, but you'll probably want to set up a nicer path for it, one that doesn't go through your jail.

That's where Mount Points come in. A Mount Point makes a folder from outside your jail available inside of it. So you can set up a Downloads dataset at /mnt/Main/Downloads, establish a Mount Point from that to the Transmission download folder, and suddenly everything Transmission downloads will appear in /mnt/Main/Downloads, even though Transmission itself is jailed. In the Plugins tab of Services, there is a "View Mount Points" button. When you add a mount point, it asks for the source and destination you want to map. So for the case above, we need a mount point that looks like this (with the equivalent command sketched after the list):

  • Source: /mnt/Main/Downloads
  • Destination: /mnt/Main/Plugins/Jail/usr/pbi/transmission-amd64/etc/transmission/home/Downloads
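As far as I can tell, a Mount Point is just a FreeBSD nullfs mount under the hood, so the pair above is roughly equivalent to:

    # Make the Downloads dataset appear inside the jail at Transmission's download path.
    mount_nullfs /mnt/Main/Downloads /mnt/Main/Plugins/Jail/usr/pbi/transmission-amd64/etc/transmission/home/Downloads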

Once this is set up, turn it on, and everything Transmission downloads will land directly in your Downloads dataset. You may have to fiddle with permissions; I found I had to make the folder within the jail writable by the user that runs the Transmission process. To enter a jail, SSH in to the NAS box as a user in the wheel group, su root, and run jexec 1 csh. To exit, just type exit.
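For reference, the whole fix looked roughly like this for me (assuming the daemon runs as a user named transmission; check with ps inside the jail, since that's my guess rather than something FreeNAS documents):

    ssh steve@192.168.1.10
    su root
    jexec 1 csh                                 # enter the jail
    ps aux | grep transmission                  # confirm which user the daemon runs as
    chown -R transmission /usr/pbi/transmission-amd64/etc/transmission/home/Downloads
    exit                                        # leave the jail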

Result

The machine, named Holocron, sitting in its new home next to my media center. Given the level of nerdy this project entails, Twilight Sparkle is appropriate here. Forgive the wiring and other clutter; that's one of my next projects.

The case was larger than I expected, but not too large. It's about as tall and deep as my media center, so it sits nicely next to it (which is handy, as that's where my network switch is). The case looks great, with off-black on all sides and no obnoxious branding on the front, and it has some convenient USB ports on top. The only problem with the front is the power button on top, with a REALLY BRIGHT BLUE LED indicating that the machine is on; I would love to figure out a way to turn that off (or at least knock the brightness down). But the real win here is that the case is very quiet. It has noise-insulating material on the walls, which knocks down the sound, and the hard drive trays have rubber grommets on the screw holes, which helps quiet the spinning of the hard drives. The case emits so little sound that, even with 6 hard drives and fans, the entire thing is less noisy than a single Western Digital MyBook (and I had 5 of those to replace). It blew away my expectations for noise.

The machine is quite fast. It handles reading and writing like a champ, downloading and streaming at the same time with no problems. It's been running for weeks at a time with no uptime issues. Even with 7 plugin services running, it has all run very, very smoothly. I've run into one or two bugs in the FreeNAS web admin UI, mostly when saving an options form that includes a permissions field (even when you aren't actually changing permissions). When this happens, a manual reboot of the machine fixes the problem, and since you control when it happens, you can take down connections as you need to. But you really shouldn't have to change these options once they're set up, so this is a setup problem more than anything.

The permissions on the system remain the biggest single headache. I've definitely spent most of my time struggling to make sense of the permission model, which gets more complicated and difficult to track down when you introduce Shares and Mount Points into the mix. But once you have it figured out, you can build in the permissions you want to offer and it will stick. You can also SSH in to the system to see the permissions at the UNIX level, which is helpful if you're familiar with the shell.

The second biggest headache has been learning FreeBSD, which is starkly different from Linux or Mac OS X. There have been several times where I'll type some muscle-memory shell command, like sudo su transmission, and it will fail because FreeBSD does things a little differently (in this case, I've been doing su root followed by su transmission). These are probably just configured differently, and there are ways to get it to do what I want, but it's not a big deal.

However, nits aside, once this system is running, it provides a ton of value. As someone who has always cobbled together storage based on what I had and what was easiest to get set up, this definitely took more discipline to configure and get working properly, but the payoff has been huge. Since everything is pooled together, I have more incentive to keep it organized and optimized for how I want to use it. The conventions I set up for myself and through the plugins mean everything works the way I want and ends up where I need it to be. The extra effort makes it a more useful system.

Building a NAS is not for the cheap or faint of heart. It requires money, time, and effort to build a great storage system. It is also not a storage panacea; you still want to back up critical stuff onto a different drive, ideally offsite or in the cloud, and you still need to worry about drives failing. But if you put that energy in, you'll end up with an indispensable tool that will be more reliable and more powerful than a glued-together system of disparate components and wonky services. It's an investment that I'm hoping will pay off for a number of years.

 

I grew up knowing how to play music. My parents started me on piano lessons when I was very young, and I played violin in middle and high school. When I was in college, I taught myself how to be a human beatbox. Lately, my tastes in music have turned more towards the electronic spectrum; perhaps a fitting choice, given my software engineering background. For years, I've owned a copy of Apple's Logic audio production app (Logic Express 8, recently upgraded to Logic Pro 9). And while I had played with it on and off, I never really finished anything, due both to a perceived lack of skill and a lack of confidence in my ability to actually make something worthwhile.

Fast forward to a time when the Korean pop song Gangnam Style by PSY has taken the Internet by storm, racking up a hundred million views in a few weeks. Mashups of the song came out quickly, and from lots of people. I've had a fascination with this song since I first heard it. That everyone else was getting into mashups of a song I was completely hooked on created an itch in my brain to give it a try. At the very least, I could put some ideas in and see what happened; I was safe in my insecurity about my own ability because I never had to let another soul hear whatever came out.

Last weekend was the three-day Labor Day weekend, and I had nothing better to do. I spent Saturday putting the song together, combining the K-pop music track with vocals from LMFAO, Dev, the Offspring, and the Bloodhound Gang (as unlikely a combo as you'd expect to find in a mashup). I posted it to SoundCloud and got a bunch of positive feedback along with some constructive criticism. Over Sunday I tweaked a few things in the song. And on Monday, I decided to try to put a video together. The whole time, my brain was fighting me to stop, to give up, to pretend I'd never tried and forget it ever existed. But I kept pushing, and out came my first real mashup ever. And here it is.

Apologies to those in Germany, where the video is blocked for copyright reasons. You can download or stream the MP3 or the AAC audio for now.

I put this up on the Internet, tweeted a link, and waited for feedback. I wasn't expecting much to happen; my wildest expectations were that it might get 10,000 views and maybe a link on Reddit's mashups community. Barely even a splash. I had no delusions of grandeur about its value; I just wanted to play with some ideas, get a little experience, and pull this splinter out of my mind.

I was wrong. Very, very wrong. Once you put something on the Internet, you lose any control over what happens to it. Within an hour, my mashup had gotten posted to Buzzfeed where it quickly started to gain traction. Within 12 hours, it had grown to 8,000 views like it was nothing. Within 24 hours, that number shot past my crazy hopeful dream to 16,000. It hit Mashable and The Daily What. That number grew to 34,000. 44,000. 88,000. Blogs and news sites started posting it as their viral video of the day. Know Your Meme picked it up. Radio stations around the world started playing it. 104,000. 122,000.

Then, just as it seemed like it was peaking, something utterly unexpected and amazing happened.

And a few hours later, it got retweeted by the man himself, PSY, in what may become my favorite screenshot ever.

Holy crap, PSY retweeted LMFAO's tweet about my mashup of their work.

I never expected anything like this. Not at all. It is an incredible honor to have a mashup celebrated and shared by an artist you borrowed from, let alone two.

Throughout this whole experience, I've been floored by supporters. People on the Internet tend to tell you what they really think while hiding behind the shield of anonymity. So, much to my surprise, the response has been phenomenal. There are over 5,000 likes of the video on YouTube, with 26 likes for every dislike. It's been shared on Facebook and Twitter over 12,000 times; the LMFAO tweet itself got over 1,000 retweets. The audio files I posted on the YouTube page have been downloaded over 6,000 times. The video got hundreds of comments, most of which are positive. And I've gotten dozens (if not hundreds) of mentions on Twitter, Facebook, and App.net from friends and strangers who love it.

And the most astonishing thing to me? The mashup itself is catchy, but it's far from being technically great. There are issues with some parts of the song not mashing well, clashes between keys, mismatched song syncing and beatmatching, issues with the video clips not being perfectly aligned, etc. It's my first time making a mashup, and it was meant to be a learning experience, not a viral hit with flawless execution. I've heard my own song dozens of times, both during and after the process of making it. When you hear your own work that much, and you know exactly how it is pieced together, the faults are not only obvious, they get in your face no matter how much you try to shut them out. But it's too late to do anything about that now. The Internet has taken the song and given it legs to run. It's out of my hands, technical merits be damned.

I'm very fortunate to have an audience of thousands of people on social networks to seed ideas to. Many of those ideas fall flat on their face. I expected this one to, as well. But then that group took it and started a chain reaction which brought this little weekend project to hundreds of thousands of people. I am deeply grateful for and humbled by the friends and strangers who gave me a week of consistently beaten expectations and holy-crap-I-can't-believe-this-is-real moments. But most of all, I am more encouraged than ever to push forward and keep this little dream of making electronic music alive. I have a renewed sense of confidence and courage to improve, to try again, to turn a what-if into a reality.

So thank you to those who have offered their support, praise, critique, tweets, shares, likes, upvotes and downvotes. Thank you to everyone who overlooked the technical flaws and offered their appreciation. Thank you to LMFAO, Dev, The Offspring, The Bloodhound Gang, and of course PSY, for being the unwitting participants in a learning experiment, for putting something into the world that I could draw from. Thank you to the bloggers, the writers, the curators, and the DJs who gave me the elation of feeling for just a moment like a rock star. I won't ever forget it.

Try something that scares the hell out of you. It just might turn into something wonderful.

 

The desktop UI was invented almost 30 years ago. The original Macintosh had a 9" screen at 72 DPI. The modern conventions of size were established by these constraints; a 20-pixel-tall menu bar gave the fairly precise mouse pointer a target just over a quarter of an inch tall. These constants have stuck with us throughout the lifespan of the desktop, occasionally getting ever-so-slightly smaller as DPI were added to displays here or there. At its peak, the desktop iMac reached about 108 DPI, 1.5x the pixel density of the 30-year-old Macintosh. The MacBook Pro saw more improvement, getting as close as 128 DPI on a souped-up 15" model with a "high-res" display. 30 years of progress brought us to 1.77x pixel density. These improvements were great, and always breathed new life into the desktop, but we never managed to break through to a truly high-density display where pixels were indistinguishable. Despite years of hoping, it seemed as if we had hit a wall in packing pixels tightly together, one that would keep us stuck forever seeing the boundaries of a pixel.

Meanwhile, the mobile revolution happened. The original iPhone blew all of these displays out of the water with what was (at the time) a ridiculous 163 DPI display. Those of us who were paying close attention to pixel density knew at the time what a big deal this was. The iPad also brought a great 132 DPI display, still higher than the best Apple could put into any desktop or notebook. EDIT: Apparently the 17" MacBook Pros had a display at 132 DPI, so equivalent to the iPad. Soon after, those displays had their pixel density doubled, an astounding feat of engineering. The results were truly something else. For most people, the move to a Retina display was immediately noticeable and improved the experience of reading, looking at photos, watching video, and browsing the web. Hopes were renewed that Apple would eventually take this technology back to its roots and bring a Retina-class display to a Mac.

That day has finally come. At WWDC 2012, Apple officially put their toe into the water of the high resolution desktop with the new Retina MacBook Pro. They took the old MacBook Pro's display and packed four pixels together in place of each one. On top of that, they took the opportunity to do what Apple does best - get rid of dead technology. Software and media are largely delivered via the Internet, so they could get rid of the heavy, noisy, and large optical drive. Spinning disk drives also got the axe, with tiny solid-state drives taking their place. Wireless networking is pervasive and ubiquitous, far more so than wired networks, so they got rid of the Ethernet port. In their place, they put more USB ports, more Thunderbolt ports, and even an HDMI port. By removing all this bulk, they got the MacBook Pro down to just under 3/4ths of an inch thick, and just less than 4.5 pounds.

The new Retina MacBook Pro is a solid foundation on which to build pro-level notebooks for the next few years, and is in every way an improvement on older notebooks. It has a small share of early adopter issues, some of which will require new software updates, some of which will require waiting until next year's laptop ships. But as someone who has been waiting for a pixel-free world for the better part of a decade, I couldn't have wished for a better and more focused product from Apple.

Before We Begin…

Prior to getting the Retina MacBook Pro, I was using two laptops, one for personal use, and one for work. I will be making my comparisons against these models:

  • A Mid-2010 15" MacBook Pro, 8 GB of RAM, 500 GB spinning disk drive, and a 1680x1050 matte display
  • A 2011 13" MacBook Pro, 8 GB of RAM, 256 GB solid-state drive, and a 1280x800 glossy display

There will be some screenshots attached to this review. Screenshots presented inline have been compressed and scaled down to fit the blog format, but optimized for Retina displays where available. Clicking a screenshot will take you to a full-size lossless PNG version of it. Be aware, however, that at 2880x1800 pixels, these screenshots run to several megabytes each.

Display

The biggest draw of the Retina MacBook Pro is, of course, its Retina display. Doubled in both directions, it boasts a 220 DPI display that is stunning to look at. While people like to disagree and debate what makes a Retina display, the reality is that pixels are not distinguishable on this display, either when placed on the lap or on a desk with enough room for your arms to type on the keyboard. This leads to the same illusion as viewing an iPhone from close up, or an iPad in your lap; the display falls away and all that's left is a smooth canvas to work with. The difference cannot be adequately explained in words. Much like when the iPhone pixel-doubled and adopted the Retina display, it sounded great on paper, but its impact was not truly felt until placing your eyes on it for the first time. My first words when I turned on the display and looked at it were "oh wow". Curves were smoothed out, text looks like print, and photos are unbelievably detailed.

One angle Apple sells on the new display is that, while glossy, it suffers from less of a glare problem due to the way they engineered it. In my daily life I use two laptops (a personal 15" from 2010 with a matte display, and a 13" for my job with a glossy display), so I was intrigued to see how it performed against both. This weekend was particularly sunny in San Francisco, where I live, so I took all three laptops outside (yes, I actually did this, all at once) to try them out. For me there are really two criteria for outdoor use: can I use it outside without problems, and is the effect of sunlight reflection against the eyes uncomfortable? In both cases, the Retina MacBook Pro exceeded Apple's previous glossy and matte displays, which I did not expect at all. The old glossy display needed full brightness to be usable, and the reflection was sharp and could sting the eyes; it really was not feasible for use in direct sunlight. The matte display, while it smoothed out the sharpness of sunlight, caused the display to appear washed out and difficult to use without squinting. The glossy display on the Retina MacBook Pro, however, was both pleasant and fully readable, even at less-than-full display brightness. The reflectivity, while visible, can be looked through, and didn't cause any eye discomfort after a half hour of outdoor use in direct sunlight.

Size, Weight, and I/O

By removing the optical drive and the spinning disk drive, the machine has gone almost entirely solid-state. The result in use is striking - the machine is crazy quiet and has barely any internal motion. Previous laptops have always had a slight shudder during use, which felt uncanny to me. It has also made the laptop very light, and not trivially so; one-handed pickup places much less strain on the wrist and elbow than previous laptops. This will be a big deal if you're upgrading from any previous MacBook Pro model, but it is still heavier than any MacBook Air. They've used this extra space to pack this puppy with battery, and it shows. On a full charge, I got 5 hours of battery with heavy use, with music playing and the brightness turned up, doing power-intensive tasks like compiling with Xcode and manipulating graphics with Photoshop. The placement of I/O ports along both sides is also a highly welcome addition, as I constantly have smartphones plugged into my laptop, and I no longer have to wrap cables weirdly around my legs to put things where I want them.

One thing I'm not crazy about, however, is the new MagSafe 2 cable. Its design feels like a step back from its predecessor, and it comes disconnected very easily. Perhaps this is because Apple keeps removing weight from their laptops, and the reduced force needed to disconnect the cable is a counterbalance to that. But I've found myself accidentally disconnecting the cable while moving around on my couch, or while moving the cable to a more comfortable position. It's a small annoyance, but a real one. And the tiny clip attached to the cable (used for holding it in a wrapped position in a bag) can no longer clip to the bezel of the laptop display, as it could with previous MagSafe cables.

Heat

In normal and even heavier use, the Retina MacBook Pro does get warm, but not hot. It does a pretty good job of spreading the heat throughout the body of the laptop, mostly centered near the top and less in the area where your wrists are likely to lie. When used for gaming, it does heat up a bit more, though. One area that does become unusually warm, especially while gaming, is the thin strips of metal between the keys of the keyboard. Under normal use, these become very warm, and games push the heat to an uncomfortable level. This is generally not a problem, but if your fingers rest on the keys, they will occasionally fall into the area between keys, creating contact between your finger and the aluminum chassis. Do this with a game like Diablo 3 running and you will quickly feel how hot those contact points can get. This is a pretty significant drawback for me, and in one 30-minute game of Starcraft 2's Nexus Wars, I counted 12 instances where my finger accidentally fell into that gap, creating a sensation just short of stinging. If you plan on gaming with this notebook, you will probably want to use an external keyboard, or learn to deal with the sensation.

The Retina MacBook Pro contains a new type of fan with asymmetrical blades. A typical fan has evenly spaced blades, which creates a distinct, repeated pattern and two forms of disturbance, one audible and one physical. Symmetrical fans have a distinct white noise that makes them sound like a plane taking off, while also causing a shudder within the body that your resting hands can feel. The asymmetric fan design of the Retina MacBook Pro disrupts this pattern for an effect that is far less noticeable, creating a quieter sound that doesn't distract nearly as much. The shudder is still present, but barely. To my ear, the fans seem to ramp up much more slowly, and even after several minutes do not get very loud or distracting. This is a huge improvement in the general fan design, and one which will undoubtedly trickle into the rest of Apple's laptops. It's one more area in which Apple focuses on improving a tiny aspect of the experience, with profound yet invisible benefits over the lifetime of the machine.

Apps

In the case of both iOS and Mac OS X, which share common ancestry, the OS does a lot of the hard scaling work for developers if they play by its rules, drawing text and common controls like buttons with high-resolution artwork. If you stay within Apple's stock apps, the experience is almost entirely high-resolution. The entire OS has been upgraded with high-resolution images and controls, and the core experience remains fast, fluid, and beautiful. Mail, iTunes, Photo Booth, QuickTime, iCal, Address Book, and the App Store have all been optimized for the new display, among others. Even Apple's Dashboard widgets got a full high-resolution makeover, and they look amazing. However, as with all early-adopter technology, you do have some legacy to put up with. Some of Apple's own apps, like GarageBand, have not yet been given the full Retina treatment.

Given that the Retina display is barely a week old, it is entirely forgivable that the thousands of Mac apps were not prepared to take advantage of the extra pixels. That said, of the three platforms Apple has with Retina displays (phone, tablet, and notebook), the notebook is easily where low-resolution content is most noticeable, and the nature of the Mac and its legacy means that some apps using old APIs will not get any love from the system. For example, Photoshop (and other tools in Adobe's Creative Suite) is kicked into an app-wide low-resolution mode, where everything, even the text and controls, is displayed upscaled. Twitter's Mac app, which uses some funky text rendering scheme, looks terrible on the display, with low-resolution text upscaled. Text is the big area where problems are immediately noticeable, rendering some apps unusable. Images, such as those used in custom controls and things like avatars in a Twitter client, are noticeable, but less so.

Twitterrific, Echofon, Osfoora, and Twitter's app. Twitterrific has already been optimized for the Retina display.

Of course, the Web is also a big part of any computer, and this is another place where the low-resolution nature of the desktop really pokes you in the eye. There just aren't many desktop-focused sites that have been optimized for a Retina display. This is, of course, to be expected, as such displays haven't existed until now, but it is still jarring. Text on web pages is rendered cleanly, as are form elements and CSS styles, but the vast majority of sites with images don't have Retina graphics yet. This transition is still happening even on the mobile web, where techniques to show Retina graphics have been around for years but are slow to be widely adopted. The same techniques that provide Retina images for the mobile web apply to the desktop, so at least the frontier is better known this time around. As Retina displays become more ubiquitous, and the techniques for supplying these assets become cleaner, there will hopefully be more of an effort to update sites with higher-res graphics. But that process will probably take time. Right now, Safari is the only shipping browser which supports Retina graphics, though the Google Chrome "Canary" experimental builds support it as well.

One interesting thing I found is that Apple decided that images should not be displayed at native resolution, but upscaled. In many cases, when you look at an image, you're seeing it doubled in size. For example, if you click a link to a standalone JPEG in Safari, it will be upscaled to double its original size, regardless of the DPI embedded in the image. (Curiously, screenshots taken on the Retina display have a DPI of 72 set within the image.) Opening them in Preview and viewing at actual size, however, does display the images with pixel accuracy. Video is also generally upscaled to 2x. This is a real problem, as even the highest-definition video you can find generally maxes out at 1080p, which is about 40% of the number of pixels on a Retina MacBook Pro. There is some video available in the 4K format, which will downscale to the display, but not much. 1080p video looks great, but it'll still be upscaled.

Pixel-accurate 1080p video only fills 40% of the pixels of the Retina display. To achieve this without upscaling, you have to run at 2880x1800 at 1x scale, which Apple makes difficult.

Apple has been pushing developers toward modern APIs to accommodate these higher resolution displays for several years. One area of that push is understanding the difference between a "point" and a "pixel". A pixel is a physical dot on the screen that you can put a color value into, while a point is an abstract unit whose size in pixels depends on the display's scale factor. A Retina display renders at 2x the normal resolution, so the menu bar (at 20 points tall) is 20 pixels on a non-Retina display (at 1x scale) and 40 pixels on the Retina display (at 2x scale). Viewed another way, 1 point multiplied by the scale factor gives the number of pixels. This is what you want as a developer when scaling things like controls and UI, but when drawing things like text and images you probably want to target the actual pixels that will be shown. Up until now, with displays at 1x, a point has been a pixel, and the two have been used interchangeably in code, which causes a few minor rendering bugs in some applications. These issues will be fixed over time as developers learn the correct way to use the APIs, and as Retina displays become ubiquitous on the desktop.
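
To make the arithmetic concrete, here's a minimal sketch in Swift (a modern stand-in for the Cocoa code of the era; the helper name is mine) using AppKit's backingScaleFactor, which reports 1.0 on a standard display and 2.0 on a Retina display:

    import AppKit

    // A point is an abstract unit; multiplying by the display's scale factor
    // yields the number of physical pixels it covers.
    func pixels(forPoints points: CGFloat, on screen: NSScreen) -> CGFloat {
        return points * screen.backingScaleFactor
    }

    // Example: the 20-point menu bar is 20 pixels at 1x and 40 pixels at 2x.
    if let screen = NSScreen.main {
        print("Menu bar height: \(pixels(forPoints: 20, on: screen)) pixels")
    }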

Games and Graphics

Four times the pixels means you need one incredible graphics chip to push every one of them around at a high framerate. This is an area the Retina MacBook Pro did not skimp on. Both the integrated chipset (used for lower power at reduced performance) and the discrete GeForce GT 650M from Nvidia scream on this machine. The OS's drawing has been GPU-accelerated for many years, which gives apps smooth, fast animations.

But of course, a machine like this is dying for games, and I was all too eager to try out how some games ran at higher resolution. So far, I've put these games on the MacBook Pro:

  • Starcraft 2
  • Diablo 3
  • Galaxy on Fire 2 Full HD
  • Minecraft
  • Kerbal Space Program

The first three are capable of running full screen at native resolution; the last two run windowed. In all cases, the performance was astoundingly good. I had Starcraft 2 and Diablo 3 running at 2880x1800 with low-quality graphics at a very high framerate. At higher quality settings, the framerate did stutter a bit, but was still impressive for the resolution. In more intense scenes the smoothness of the gameplay did degrade, especially in one game of Starcraft where there were several hundred units on screen in combat. Blizzard's games scaled up to the new resolution very well, both for in-game graphics and game/menu UI. Galaxy on Fire 2 scaled up to the resolution and looked amazing, but with one drawback - the UI of the game menus didn't scale with it. This leads to a lot of squinting at very tiny buttons and words, as well as a lot of mouse movement to hit those tiny targets. I'd guess this is a similar problem to the points/pixels issue laid out above, but I'm not sure. It's just one more tiny (pardon the pun) early adopter issue to deal with, one which will undoubtedly be resolved soon.

Starcraft 2 (Nexus Wars) at 2880x1800 with high quality graphics.
Galaxy on Fire 2 Full HD at 2880x1800 with high quality graphics.

Minecraft and Kerbal Space Program are simply pixel-doubled for the higher-resolution display. In both cases, the gameplay was remarkably smoother than on my previous laptops. A game like Kerbal Space Program, with extensive physics and fine-grained control, really requires a smooth framerate to deliver on its promise. With my old laptop, it was very sluggish, especially with more sophisticated rockets. It's significantly smoother on the Retina MacBook Pro. In general, I've seen performance as good as or better than a two-year-old laptop with a quarter of the pixels.

Nice Touches

An Apple product is all about the details that you don't immediately notice, but appreciate immensely once you do.

  • At the bottom of the display on all of Apple's laptops, there is an insignia denoting the name of the laptop. This is gone in the Retina MacBook Pro. The bezel is all black, with no note of what the laptop is. While not distracting in prior laptops, it's just one piece of clutter that I don't need to see every day, and now I don't have to.
  • On prior MacBook Pros, the keyboard had an eject button, and the power button was detached from the keyboard and embedded into the corner of the case. Of course, no optical drive means no eject button is needed. The power button has taken its place, meaning no weird button hanging out in the corner.
  • The optical drive used to take up pretty much the entire right side of the body of the laptop, which meant all of the I/O ports had to be centralized, and thus unbalanced, on the left side. With the Retina MacBook Pro, I can keep different USB devices hanging off each side and better organize my cable layout for how I work.
  • The edge around the laptop has been smoothed ever so slightly, as has the thumb tray (where you lift the monitor up to open the notebook). It's much less sharp on the wrists and on the thumb, making extended typing much more enjoyable. The corners of the thumb tray are, however, still fairly sharp.
  • The sound quality from the built-in speakers is far better than in previous laptops. It seems to emanate from the entire surface of the device, rather than just through two speakers. It's difficult to describe, but noticeable when in a quiet room.
  • If you connect a second display, it's probably not going to be a Retina display (at least not yet). Luckily the OS seems to handle this all magically, downscaling windows appropriately and without any input. If you drop a window halfway between the two displays, the half on the Retina display will be high-resolution, and the other half will be downscaled. In other words, it just works.

Denouement

After years of waiting for the needle to move on display technology, we have finally arrived at high-resolution displays with graphics cards capable of driving every one of those 5 million pixels quickly. Apple has taken this incredible and breathtaking technology and put it into a leaner, faster machine that is better in every respect than its pro-level predecessors. And this is certainly just the beginning. I would be surprised if, within the next year, Apple wasn't putting Retina displays into all of its products, both notebook and desktop. The quality difference is astounding and needs to be seen, in the same sense that it was when the Retina iPhone and iPad came out. This is not incremental progress; this is a technological breakthrough.

Of course, with all breakthroughs in technology come the costs. Early adopters will spend more for this technology, and will derive less benefit from it, than we will two or five years out from now when this same display technology is more ubiquitous. Apps and websites will update to provide high resolution art and fix the bugs associated with high-resolution displays, but on the Mac it will take more time to get to a truly complete Retina experience than it did on mobile. Many apps on this platform have been around far longer, and haven't necessarily been updated to use the best APIs for things like rendering text. This transition will take time, and early adopters will pay the cost for this transition, as they always have.

But this new display and the shedding of legacy technology like optical drives offer a few key conceptual takeaways. First, Apple is still absolutely dedicated to the Mac, spending a ton of engineering time and effort on the most fundamental component of any portable computer - the display. They've built this up as a foundation for the next five years' worth of computers, and they wouldn't do that without being fully committed to executing on it. Part of that execution involves shedding rarely used legacy technologies. But, more importantly, this finally begins the transition to a completely high-resolution world of computing. Other manufacturers will compete to bring high-DPI displays to market, and customers will benefit. In a few years, displays with only 100 DPI will be viewed as primitive and difficult to look at. And who knows, perhaps TV manufacturers will begin moving toward the next generation of high-definition TV and film to surpass 1080p video.

For those of us who have been waiting for the high-resolution desktop and the imperceptible pixel boundary for years, this moment is exciting and invigorating. The PC is alive, healthy, and ready to tackle the next several years of innovation and discovery with the new display and the new technical foundation it's built upon. Welcome to the age of the Retina personal computer, and the new MacBook Pro is a wonderful machine to begin this transition.

 

Server teams are made up of the people who write and maintain the code that makes servers go, as well as those who keep that code working. Twitter, Facebook, Instagram, Amazon, Google, and every other service in the world have one or more people in this role. When things go right, nobody notices, and they get no praise. When things go wrong, their phones ring at 3 in the morning, they're up fixing a new issue, and they're answering calls from every rung of the corporate ladder screaming about how much money they're losing. It's a thankless job.

Diablo 3's launch has had a number of issues around load. When you have millions of people swarming a server, ruthlessly trying to log in every few seconds, it causes a huge amount of load. At this scale, things you expect to go right suddenly break in strange and unfamiliar ways. No amount of load testing could adequately prepare the server team behind Diablo 3 for firepower of this magnitude. The way you fix this kind of issue is to look at what's slow, fix it to make it less slow, and hope it works. Do this until load stops being a problem. Oh, and you have to do it quickly, because with every second that goes by, people are getting more and more upset. And you can't break anything else while you do it.

The people who work on the servers for services of any decent scale cope with new problems every day to keep the thing alive and healthy. The Diablo server team has been moving quickly, solving issues of massive scale in a short time, and getting the service running again. Players notice and yell about the service when it stops working, but the response and maintenance have been quick and effective.

Next time you're using an Internet service, or playing a multiplayer game, think about the people who keep it running. If you know any of these people, tell them thanks. They're the unsung heroes of the Internet.

 

Smartphones have replaced lots of types of small devices. iOS and Android have made it easy to build apps that perform all kinds of functions, replacing other standalone devices like media players and GPS units. Many have wondered whether they would replace handheld gaming devices, and for many people they have. For a while, I thought they had, at least for my needs. But after trying to play games on touchscreen-only devices for years, I've largely been left unenthused by the deeper and more engaging games that come from big studios. These games require a level of precision that touchscreens just can't deliver.

The PS Vita caught my attention about a month before its launch in the US. It combines a lot of the best features of smartphones with the controls of console games. It has a gorgeous, large, high-resolution touchscreen (and a back panel that is touch-sensitive), as well as a tilt sensor and cameras for augmented reality games. But it also has almost all of the buttons of a typical PS3 controller, including two analog sticks. Sony managed to cram all of this functionality into a device that, while large, is not too big to fit into my pocket, and with long enough battery life for a busy day interspersed with some gaming. The combination of apps and games (which I will describe as just "apps" for the sake of this review) is powerful, and the hardware power and display size make it a compelling device.

Hardware

Put simply, the Vita is a delight to look at. Its black and silver case is easy on the eyes, and falls away while playing games. The device itself is almost entirely plastic, which does make it feel a little bit cheaper than the iPhone, but it's still quite comfortable to hold, if you don't have to use the back touch sensor (more on this below). The physical controls are small, but placed well; I have no difficulty moving my hands between the buttons and the analog sticks for the kind of twitch gaming that hardware buttons excel at.

The display is stunning to look at; at 220 DPI, pixels are almost never noticeable, and the color depth and contrast provide some incredible graphics. The pixel density is not as high as the newest iPhones or Android phones, but it's just not an issue. The screen itself is a multitouch display with amazingly low latency; swiping feels ever so slightly faster than on the iPhone (which may or may not be real, but it's at least as good). There were some minor issues with the display. It seems prone to banding in a few cases (an issue where a smooth transition between two colors appears as stripes, or bands, on the display). And the graphics, while high-resolution, occasionally showed some slight jagged edges, especially in the OS UI. These issues are tiny, though, and aren't hugely apparent in gameplay.

Spanning the back of the device is a touch sensor which can be used for controlling games; it's as responsive as the front, but is almost too large. There are grips for your fingers, but these grips are too small for my hands. If I need to use them for a game which relies on the back touch sensor, I have to grip the device somewhat awkwardly. It's not uncomfortable, but it does make me worry a bit that I will drop the device due to a loose grip (a problem that has never actually happened in use).

There are a number of input ports on the device. Along the top are two trays, one containing an accessory port, and one for inserting the tiny game cartridges. These trays have a plastic cap that I found incredibly difficult and frustrating to open with just my hands, which will probably limit how many physical games I end up buying versus downloading through the store. Along the bottom is a proprietary "multi-use" port similar to Apple's dock connector, a headphone jack, and a memory card slot. The memory card slot is the only other covered port, and it is thankfully far easier to open than the trays on top. I have to wonder if this was a conscious decision by Sony to encourage purchasing games over the Internet; make physical game cartridges a hassle to swap, but make the memory card (which you can store downloaded games on) easy to get at, and people will tend to buy more online. There are also front and rear cameras; these take terrible photos and videos, and are basically useless for anything other than augmented reality games, which are actually really interesting (more on that in the Games section below). But it's not like you're buying this to replace a camera anyway.

The usual wireless technologies are here. Wi-Fi worked pretty well and generally connected automatically to 802.11b/g/n networks. I paired my Sennheiser MM 100 Bluetooth headphones to the Vita and they sounded great. You can also get a version of the Vita with 3G data; I did not get the 3G model, but it's limited to 20 MB downloads (so basically no games), and multiplayer games cannot be played over 3G. It's basically useful for messaging and browsing, and that's about it. If you have a smartphone with tethering, it's probably best to just stick with that. I did run into a few issues in common use. While the Vita had no issue auto-connecting to the Wi-Fi at my home and my office, it didn't seem to want to connect to my iPhone's tethering Wi-Fi until I went into the Settings app and turned it on. Similarly, the Vita had no end of trouble automatically connecting to my Bluetooth headphones, leading to a similar trip through the Settings app. Hopefully these are 1.0 issues that will be resolved with software updates, but they limit the usefulness of these features when you only have 10 minutes to play a quick game.

OS

The system OS is pretty well thought out in terms of interaction, though it has some rough edges. It's controlled entirely by taps and gestures on the touchscreen; none of the buttons do anything. You can use either your index finger or both thumbs to reach every pixel on screen, and all the gestures are usable with just a single thumb. This might be an issue if you have smaller hands, but I have no issues with it. Navigating the home screen is more fluid than on any touch device I've ever used, and animates at a very high framerate (probably 60 FPS).

Your apps are listed on the leftmost screen, stored on multiple pages you can access by swiping vertically. You can organize them as you would on a smartphone, and assign different wallpapers to each page. Tapping on any icon opens its LiveArea, which you can use to then launch the game. This is one thing I rather dislike about the Vita OS, as it requires two taps to open anything.

To the right of the app pages, you can find the list of recently running apps. Each shows what Sony calls a "LiveArea", a nearly full-screen page showing information about the game, some meta controls, and recent activity about the game (when you last played, recent Trophies you've gotten, etc.). App developers can place stuff on the LiveArea, such as announcements and links to downloadable content. The system also shows some common controls for apps, like an update button, a button to do a web search for the game name, and on-device instruction manuals for the games. You can close any of the LiveAreas just by swiping from the upper right to the bottom left, with a nice paper effect of throwing the page away.

The graphical style of the OS is not great, but it's livable. App icons are glossy bubbles on the home screen, which looks kind of cheesy. As far as I can tell, the Vita doesn't use anti-aliasing (at least on the home screen), which causes the round bubbles to appear extremely jaggy. If the display were higher resolution, this might work, but it just isn't quite high enough to warrant eliminating anti-aliasing. The LiveAreas look nice, but some of the stock apps use this to excess with bright, conflicting colors that just look under-designed. But it works, and it's intuitive.

The interaction between software and the OS is generally pretty great. When inside an app, the OS disappears except for a few interactions (loading/saving data, for example). Some popups will appear occasionally, such as when you unlock a Trophy or a friend comes online, in the upper right corner. At any time you can press the PlayStation button to suspend the app and return to its LiveArea; you can then switch to a few of the other apps, such as Settings or the Twitter app, do something, and return to the app in the exact same state. Unfortunately you can only have one app open at a time, which can be annoying (specifically for the Browser app). But this doesn't get in the way all too often.

Apps

The Vita comes with a handful of stock apps, none of which are particularly great, but they get the job done. I haven't gotten to play in-depth with all of them, primarily because this is a gaming device first. The Friends app lists your PSN friends and who is online, but has a lot of whitespace, leaving you to see only six people onscreen at a time. The Messaging app is handy for chatting with your friends, and requires no setup other than your PSN account, which is convenient. Maps is pretty capable, using your geolocation to show you places and driving/walking directions, and storing favorites (but has no public transit, which is a dealbreaker for me personally). The Browser is okay, and can view basic pages, but anything taking advantage of newer HTML5 features will probably not render well. Hopefully these are 1.0 issues that will be improved with system updates.

As of this writing there are four apps you can download from the PS Store - LiveTweet, Facebook, Flickr, and Netflix. I could not get the Facebook app to work; it just showed a "connecting to Facebook services failed" dialog and a cryptic error code. The Netflix app was slow and not particularly aesthetically pleasing, but it worked, streams start fairly quickly, and video plays very well. The best app is definitely the Twitter app, LiveTweet, which is a surprisingly full-featured Twitter client, supporting reading your timelines, pull-to-refresh, uploading images to Twitter's image sharing service, and lots of other little nuances of Twitter. It's a pretty great app, though it has some polish issues that will surely be resolved in updates.

The PS Store is the app you use to buy stuff and download free apps. It features one of the best LiveAreas in the system, showing popular content that you can find within the store. The Store itself works pretty well, albeit slowly and with some organization problems. You can see featured apps, new releases, and the most popular downloads. There are also a number of categories, such as Vita-specific games, PSP games, games that run on either the Vita or the PS3, and smaller games called "minis". Within these categories, games are generally grouped alphabetically by title, which is weird to me, as I prefer exploring all the games, not just the ones whose first letter is between E and H. You can also sort games by genre, which is probably my favorite view (but is inexplicably buried at the end of the list). There are a number of genres (including both a "Shooters" genre and a "Shooting" genre) to explore. Sadly the game pages themselves don't show screenshots, previews, or customer reviews; just an aggregate rating, a description, and the ESRB rating. This should really be fleshed out to show more detail, similar to the App Store or the Android Market.

UI-wise, there are some nice affordances. If you reach the end of a scrollable area, the content will either stretch (in the case of a single piece of content like a web page), or the items in the list will space themselves out (in the case of the Twitter or Messaging apps). Apps can fire off notifications, which appear in any app as a bubble and are collected in a notification space on the home screen, accessible by tapping the bubble in the upper right corner. Text input is generally easy, although there is no selection/cut/copy/paste (though you can tap-hold anywhere to zoom into the text to place the cursor where you like). The keyboard is pretty good, with a fairly intuitive layout and some OK autocorrect features which work similarly to Android's suggestion tray above the keyboard.

Games

There are over a dozen full Vita games available at launch, as well as a huge online catalog of PSP ports and mini games you can download through the PS Store. Of the games available at launch, I've played:

  • FIFA '12 (Vita)
  • Uncharted: Golden Abyss (Vita)
  • Unit 13 (Vita, demo)
  • Fireworks (Vita, a tech demo)
  • Final Fantasy IV (PSP)

So far, my favorites have been FIFA '12 and Final Fantasy IV. FIFA is EA's well-known soccer game, and its scope is huge for a portable device. It's so complete, it feels like it belongs on my TV. Tons of gameplay modes, a huge array of national teams, an extensive Career Mode, and tons of character customization. The touchscreen controls are OK, but can be kind of gimmicky and in practice are only occasionally useful. It often takes too long to move your hand from the buttons to the touchscreen and back to be useful in action-packed gaming. It's more useful for throw-ins and other less intense moments. The rear touch surface lets you shoot the ball on goal extremely accurately, and this is where the touch controls truly shine in FIFA. And it just looks amazing.

Final Fantasy IV is a remake of the SNES original, one of the greatest Japanese RPGs ever made. Square Enix completely remade the graphics for a great PSP version, and it looks great on the Vita's screen. PSP games get some additional features on the Vita, accessed by tap-holding on the touchscreen, such as changing how the image is upscaled and colored, determining which camera to use, and picking what to control with the right analog stick. If the $29 price tag is off-putting for a 20-year-old game, at least you're getting a polished remake with high-quality pixel graphics, cutscenes and video, and a bunch of supplemental material.

Uncharted was a game I was looking forward to, but it has mostly been disappointing. The jagged edge effect is more noticeable here than in any of the other games, simply because there's a lot going on onscreen. The game has so far tended to hold your hand throughout the entire process; walk for 50 feet, then a cutscene tells you exactly where to look and what to do. The combat controls are fairly good, but with one huge exception. If an enemy gets too close to you, it enters "melee mode", which wants you to draw gestures on the touchscreen to attack your opponent. As with FIFA, the switch from physical buttons to touchscreen is not fast, and the whole thing is somewhat jarring. The game uses the touchscreen for some "puzzles", which are so far pretty boring and repetitive tasks like "wipe this thing off" and "spin this object around to look at it". The one good use of the touchscreen is climbing. You can draw a gesture along rock walls to signal to the character where to climb, which is handy and doesn't seem to come up during fights.

Unit 13 is a tactical shooter game by Sony, which makes good use of the physical controls of the device. The graphics look pretty good, but not stellar, mostly like a PS2 game. The controls work very well, and I had no problems with moving around or hitting my target. And the game doesn't coddle you - it doesn't point out where enemies are, and it will happily let you die mid-mission if you take a few shots from the enemy. It makes light use of the touchscreen for controls, but it does so when you're supposed to have cleared the room of targets, so you're encouraged to avoid using it while in twitch mode, and to use it when things calm down. That's smart use of the touchscreen, and I hope more game developers will do that. I only have the demo, but will probably pick up the full game soon.

Fireworks is a free tech demo published by Sony that uses the augmented reality feature of the Vita very nicely. The system comes with six AR "cards", which are about as big as the Vita, with QR-like shapes printed on them. The idea is that you set one of these down on a table, point the Vita at it, and the camera will recognize the card and project graphics on top of it. In the case of Fireworks, it shows a small house shooting off fireworks, and you tap the rounds to make them explode. I've played AR games and apps on the iPhone and found them to be lacking; if you moved the device, it was too slow to respond, leading to a disconnect between the real world and the augmented world. Not so on the Vita. The camera, display, and accelerometer work extremely well together, and it really feels like you're projecting onto the real world. If you move the device, the lag before the game updates is nearly imperceptible. I can't really explain this one, you just have to see it in action. I hope to see more games (and apps) take advantage of this.

In general, games and demos look great, and the physical controls are quite snappy. This truly was built to be a gaming system first, and it shows. Touchscreen input is OK for instances where you don't have to make a lightning fast reaction, or don't need accuracy beyond a tap or a swipe, but you're not going to want to do it often. It sucks for all the reasons intense games suck on touchscreen-only devices. I'd love to see more use of the rear touch surface, though, which is handy because your hands are already there. And the augmented reality stuff could open up some really awesome possibilities, if everyone manages to keep from losing their AR cards.

Future

Sony is positioning this as another long-life console, like they are with the PS3. It very well could be; it certainly has the raw horsepower, a great (if maybe too large for the average person) form factor, and a wonderful blend of console mainstays and fresher smartphone ideas. It's pretty clear that the smartphone manufacturers aren't terribly interested in making the input side of gaming much better. And Nintendo's 3DS is gaining some traction, but is certainly not as big a success as they'd hoped. The big question remains: can a gaming device remain a standalone product and gain enough traction to warrant being a separate device?

I, for one, hope so. The Vita is extremely capable and, with some updates to the stock OS/apps and some additional software, could be (and this is probably a stretch) a competitor to the iPod touch. It makes sense to me that, as people want to use technology in ever-more-mobile spaces, they'd want to bring powerful games along with them, and smartphones just can't provide that beyond flinging birds and other simple games. The Vita shines because of its ability to provide the immersive experience, and it does that very well. I can't remember ever seeing three hours disappear playing an iPhone game; I did that this weekend on the Vita.

One way they can definitely attract consumers is by expanding the available apps to include all kinds of content, as well as indie games. The mobile software industry is exploding right now on all platforms that offer everyone the ability to tinker, from massive companies to hobby hackers to teenagers. A Vita that ignores that opportunity is leaving money on the table, both from lost software sales and from unsold devices. Sony has announced an SDK for developers, called the PlayStation Suite, but it's unclear whether this will take the less restricted approach of iOS and Android, or the locked down, tightly controlled approval process that has been the status quo in the gaming industry since its inception.

If Sony can keep game makers interested in bringing massive new titles to the Vita, we are probably looking at some of the best days for mobile gaming ahead. Hopefully customers will notice, and be willing to fork over the premium for a better gaming experience. But it will be a tougher sell in a world of mobile computers crammed into smartphones.

Conclusion

Five days in, I love my Vita. I've been spending at least an hour or two on it every day, and that's only been going up. The games are pretty great for first-gen titles. This truly feels like a console experience merged with the best ideas the smartphone world has been building for years. There are some 1.0 issues, and some hardware quirks that are metaphorical rough edges, but the overall experience is solid and thought out. If you are a fan of gaming, or are disappointed in the state of gaming on cell phones, a Vita will be a great asset to you. Hopefully Sony can sell a bunch of these things and keep game developers interested over the long run. And hopefully they open it up to indies and app developers to add that much more value as a great Internet communication device.

Edit 2/27/2012: As Kevin Ballard pointed out, I incorrectly called the LiveArea feature "LiveTile", which is actually a feature of Windows Phone 7. The Vita's feature is called LiveArea.

 

2011 is coming to a close, so I'd like to take a moment to highlight a few apps and games on Mac and iPhone that have been invaluable to me. I broke this out into four categories, each with two apps. I have purposely omitted iPad, because frankly, I rarely use my iPad (and I prefer the TouchPad over the iPad), and don't feel I've played with enough iPad apps to really give it a fair shake. So I've left that off to focus on iPhone and Mac apps and games. I hope you'll check out all of these great apps.

DISCLAIMER: I am friends with the guys at Tapbots (makers of Tweetbot) and the guys at TapTapTap (makers of Camera+). However, these apps would not have made the list if they were not of the highest quality, and those friendships have not influenced my reviews. I have deliberately excluded apps made by any company that I have worked for, either now or in the past. I have also not included affiliate links.

Best iPhone Apps

Tweetbot

$2.99 - Tweetbot came out this year as a pretty full-featured Twitter client, but naturally everybody has their own pet features they would like. The guys at Tapbots have steadily improved the app over the year, adding support for push notifications, muting, Favstar integration, and plenty more. It has since become the best designed and most full featured Twitter client, far exceeding Twitter's iPhone app.

Camera+

$0.99 - The iPhone has the best camera of any mobile device (and I test a lot of mobile devices). Camera+ has many features that go beyond the included Camera app. The most important ones actually help you take better photos, such as the image stabilizer, which uses the iPhone's gyroscope and only captures a photo when your hands aren't shaking. The touch-up tools are very handy, and the filters look pretty good compared to other photo apps. And a suite of sharing tools helps you share your moments with your Twitter, Facebook, and Flickr friends. It's the tool you should reach for when taking photos, and it shows how good a replacement the iPhone can be for a standalone camera.
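
The stabilizer idea is simple enough to sketch. This is my own rough approximation in Swift using Core Motion, not Camera+'s actual code, and the 0.05 rad/s threshold is an assumed value: watch the gyroscope and only fire the shutter once rotation drops below the threshold.

    import CoreMotion

    let motion = CMMotionManager()
    let stillnessThreshold = 0.05 // radians/second; an assumed cutoff for "not shaking"

    // Wait for the device to be (nearly) still, then take the photo.
    func captureWhenSteady(takePhoto: @escaping () -> Void) {
        motion.gyroUpdateInterval = 1.0 / 60.0
        motion.startGyroUpdates(to: .main) { data, _ in
            guard let rate = data?.rotationRate else { return }
            let magnitude = (rate.x * rate.x + rate.y * rate.y + rate.z * rate.z).squareRoot()
            if magnitude < stillnessThreshold {
                motion.stopGyroUpdates()
                takePhoto()
            }
        }
    }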

Best iPhone Games

Super Stickman Golf

$0.99 - I'm a sucker for a good physics game, and Super Stickman Golf is a great one. Get the ball into the hole while navigating steep terrain, dodging death traps, and staying under par. With dozens of 9-hole courses and a grab bag of power ups, there is a ton of replayability here. It also includes an exhilarating multiplayer mode which completely changes the mechanics from precision to speed. One of the greatest physics games for iPhone that isn't just another Angry Birds clone.

Jetpack Joyride

$0.99 - The form factor of the iPhone lends itself to simpler games over more complex ones. One-tap games like Canabalt are great because of their simplicity. Jetpack Joyride takes this to great lengths, packing a ton of variety into a single tap. And it shows how to do free-to-play games right, in a way that doesn't give an unfair competitive advantage or make you feel forced into spending more money. Gorgeous graphics with a ton of tiny nuance help seal the deal.

Best Mac Apps

Spotify

Free w/ optional subscription - Spotify combines a cloud-powered streaming music service, a lovely native Mac app, and excellent sharing features into one great app. Though Spotify has been around for awhile, it only recently became available in the US. It lets you combine their huge streamable catalog with music in your iTunes library. It syncs your playlists in the cloud, and copies tracks to your iPhone via Wi-Fi. And it lets you share tracks and playlists, either to one person or the world, quickly and easily. I've ranted before about how I don't like iTunes, and have searched long and far to find a suitable replacement. This year, Spotify became that replacement.

Pixelmator

$29.99 - Photo-editing tools typically come in one of two flavors: way too complicated, or way too simple. Pixelmator bridges the gap by bringing much of the power of Photoshop to mere mortals at an affordable price. While pro designers probably won't drop the Adobe tools for it, it's a fantastic and flexible tool for editing photos. And it has great support for core Mac OS X technologies like Core Image and Auto Save. It's the perfect tool for going beyond red-eye reduction and one-click canned filters to make photos and images look fantastic.

Best Mac Games

Minecraft

$26.95 - Not many games have staying power for me; I typically lose interest in most games after a couple weeks. Not Minecraft. I started playing Minecraft in September of 2010 and have been continually coming back to it ever since. A Lego-style deformable terrain lets you create the world in your image, getting lost in massive caves, and dodging monsters who hunt you down and try to kill you. The game has incredible depth, forcing you to find raw materials and learn how to turn them into weapons, armor, building materials, and so much more. If you've got some friends who play, and one of them is technically savvy, you can set up a multiplayer server where everyone can contribute to the same world, helping each other survive and creating ever-cooler worlds. The 1.0 version, released in November, has a series of objectives and an end-game, though you can happily ignore it and just have fun making your world. You'll spend hours playing Minecraft, and actually have something to show for it after.

Galaxy on Fire 2

$19.99 - One dream that will probably go unrealized in our lifetimes is that of interstellar space travel. We've all wanted to live out battle scenes from Star Wars. Galaxy on Fire 2 is a gorgeous space action game that brings that to life. You can hunt pirates, trade minerals, go on side quests, and hunt down the mysterious race known only as the Voids. A large economy lets you customize your ship to give you the edge in battle, or help you carry more precious resources across the galaxy. A massive story, incredible graphics, great voice acting, tremendous depth, and perfectly-tuned controls make this a must play. It's available on iPhone and iPad, but with so much graphical detail, you'll want to check this one out on the big screen.

 

Adobe is finally putting an end to Flash Player. They've announced they're stopping development of the mobile Flash Player, and mobile is where the future of tech innovation is heading; the writing is on the wall for the desktop Flash Player as well. This is a good thing for a myriad of reasons, both technical and political.

However, it is important to remember that Flash drove much of the innovation on the web as we know it today. When Flash was conceived over a decade ago, the web was a glimmer of what it is today. Creating something visually impressive and interactive was almost impossible. Flash brought the ability to do animation, sound, video, 3D graphics, and local storage in the browser when nothing else could.

Without Flash, MapQuest would not have been able to provide maps for years before Google did it in JavaScript. The juggernaut YouTube would not have been possible until at least 2009, four years after its actual launch. Gaming on the web, which has been around as long as Flash itself, would only now be becoming possible, a decade later. Flash enabled developers to create rich user experiences in a market dominated by slow-moving browser developers. Even in 2011, Flash exists to provide those more powerful apps to less tech-savvy people who still use old versions of Internet Explorer.

Flash Player itself was always a means to an end. Macromedia, and then Adobe after acquiring them, sells the tools you use to build Flash content. Thus, Adobe's incentive was not to build a great Flash Player, but a pervasive one that would sell its tools. The player's technical stagnation gave browser developers a market opportunity to provide natively what Flash had been providing. But as a result of those tools, Adobe still has huge market dominance in tooling for building rich web apps, tooling that HTML5 lacks.

This puts Adobe in a unique position. As HTML5 continues to negate the need for Flash Player, Adobe has the experience to build the same kind of authoring tools for HTML5, and a market eager for them. Hopefully this move signals that Adobe will head in that direction, because the web DOES need great HTML5 tools for people who aren't savvy in JavaScript, especially the people who previously relied on Flash for exactly that.

HTML5 offers developers the ability to build high-performance, low-power apps and experiences. Browser innovation has never been faster; Apple, Google, Microsoft, and Mozilla are all competing to bring the best new features to their browsers in compatible ways. But they're just now filling in many features Flash Player has had for years. Adobe can harness this to help build a better web, and few others can. Hopefully they seize this moment.
