movq

Wherein I Move a Lot of Words Around

On Gaming Input Devices

Having the desire to upgrade my input devices at the home, I started looking around for a good keyboard and mouse combo. While the business-oriented lines were nice in their own ways, they lacked a certain flair and were woefully short of buttons and standard layouts. (What's with everyone screwing with the standard keyboard layout? Stop it. I like my buttons.)

As a result, I started to look at the gaming series of devices. I'm not sure how I wound up looking at them, honestly, but once I started to look at the options it was clear to me that all the attention on making input devices better at a hardware level was going into that market instead: the keyboards were mostly mechanical, the mice were high-DPI and loaded with buttons, and the quality was far and away higher — as were the prices, of course.

After some period of research I picked up the Corsair K70 keyboard and Corsair M65 mouse. Neither is too gratuitous with the lights off, and both are quite helpful if you set up the lights accordingly. By which I mean: think back to the 80s and keyboard overlays for Lotus. Like that, but with colors. So when switching to a game you can have the lights come up for the keys you normally use and then color-code them into groups (movement, actions, macros, etc.).

When using them for daily stuff in Windows, and in some games, they proved quite the nice combo. The mouse has buttons to raise and lower the DPI and a quick-change button at the thumb for ultra-precise movement (think snipers in an FPS game, or clicking a link on an overly-designed web page where the fonts are 8pt ultra-lights — yes, I really used it for that once).

However, I quickly found the Corsairs had a very large weakness: they literally only worked in Windows. I don't mean the customization and macros, I mean the devices themselves did not show up as USB HID devices on Linux or the Mac. They failed to be a keyboard and mouse on every other computer I have. Normally, I would blame this on my KVM's USB emulation layer, but I connected them directly and nothing changed. That is, until I read the manual and discovered the K70 had a switch in the back to toggle "BIOS mode". Now it worked, but I lost some customization and the Scroll Lock light flashed constantly to tell me I wasn't getting my 1ms response time anymore (no, you can't turn that off — it flashes forever).

To add to the fun, the keyboard has a forked cable. One USB plug is for power and one is for data. If you connect the data cable to a USB 3 port on the computer itself then it can get the power it needs and you don't need the other. If you use a KVM or USB 2 hub then you're using both.

Overall, the frustrations outweighed the utility and I returned them both. I did some more research and found that, of all companies, Logitech fully supported their gaming devices on both Mac and Windows and their devices started in USB HID mode and only gained the fancy features when the software was installed on the host machine.

Taking that into consideration I went ahead and picked up the G810 Orion Spectrum keyboard and the G502 Proteus Spectrum mouse.

To summarize the differences:

I'm especially happy that both keyboards have a dedicated button for turning the lights off when I just want a good mechanical keyboard, and back on when I'm doing something the lighting adds value to. That's a nice selling point for both, really.

At any rate, if you have a Mac, it appears only Logitech still cares about you. That's perfectly fine with me as they make some good stuff, overall.

WQHD, DVI, HDMI, Oh My

Even though I knew that video modes were a nightmare mess made barely tolerable by standards, I had no idea of the hell that awaited once one passed the 1200p edge.

A short history of video modes before we begin (this helps the pain later). Video is a three-dimensional concept that must yield to the laws of computer science and become a stream of two-dimensional arrays in order to go down the wire to the screen (and also in order to be stored in memory, but let's not complicate things more). You may be wondering if I added an extra dimension in the previous sentence, but I did not -- that additional dimension is time.

In computer science a list of numbers is an array (stop it, Melvin, you know where I'm going with this). A two-dimensional array is a list of lists (think of an outline). A picture, or a frame of video, is a 2D array of pixels (and each pixel is an array of component values -- usually RGB). You can quickly see why pictures take up a lot of space:

(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)

That's just a 6x6 image filled with black. But it's formatted for humans. A computer would see that closer to:

000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000

That raises a very important issue: how big is that, and what's what? It's just a mess of zeroes without some kind of structure around it to describe what goes where. That's metadata. Specifically, it's a glob of metadata that conforms to a pre-set standard and describes what pixel format is being used, how long a row is, and how many rows make up a single image. Using this information, a computer can break up that bitstream into chunks at the row boundaries, decode the pixel colors, and then draw row after row on-screen and create an image from it.
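To make that concrete, here's a minimal sketch in Python of the idea. The field names are mine, not from any particular standard -- real formats (BMP headers, raw video formats, and so on) differ in the details:

# A minimal sketch: the metadata a decoder needs to turn a flat byte
# stream back into rows of pixels. The constants describe the 6x6
# all-black image above.
WIDTH = 6            # pixels per row
HEIGHT = 6           # rows per frame
BYTES_PER_PIXEL = 3  # one byte each for R, G, B

# The "wire" form: one flat run of bytes with no structure at all.
flat = bytes(WIDTH * HEIGHT * BYTES_PER_PIXEL)   # 108 zero bytes, i.e. all black

# With the metadata, slicing the stream back into rows of (R, G, B)
# pixels is just arithmetic.
row_stride = WIDTH * BYTES_PER_PIXEL
rows = []
for y in range(HEIGHT):
    row = flat[y * row_stride:(y + 1) * row_stride]
    pixels = [tuple(row[x:x + BYTES_PER_PIXEL])
              for x in range(0, row_stride, BYTES_PER_PIXEL)]
    rows.append(pixels)

print(len(rows), len(rows[0]), rows[0][0])   # 6 6 (0, 0, 0)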

As one might imagine, it's very important to get that description correct. If you cut at the wrong boundary or your pixels are a different format (or size!) then it all goes to pot fairly reliably. There are ways to detect when this goes haywire, but it's better if it doesn't in the first place.

That brings me to video. So take the above and then add in metadata about time. That is, how many frames will be delivered in one second, what a frame's data will start with (just in case a partial frame is delivered -- don't worry about it), and what the current stream time is for any given frame (that is, when to display it in relation to other frames should it need to hold a frame on-screen longer than another one). There's a lot of other stuff, but it starts to get rather complicated after this.
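As a rough illustration of the time part (the 60 fps figure is only an example), the per-frame timestamps fall straight out of the frame rate:

# A rough sketch of the timing side: given a frame rate, every frame
# gets a presentation time so the receiving end knows when to show it.
FPS = 60
frame_interval = 1 / FPS                      # ~16.7 ms between frames

# Presentation timestamps (in milliseconds) for the first five frames.
timestamps_ms = [round(n * frame_interval * 1000, 2) for n in range(5)]
print(timestamps_ms)                          # [0.0, 16.67, 33.33, 50.0, 66.67]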

To simplify all of this, there are video modes. These are pre-agreed upon configurations of all of the above that two devices can use to simplify this whole process. You've probably heard them by the name "resolutions" but it's the same thing (Melvin, hush).

The standard-bearer resolution at one sad point in time was 640x480 for the Video Graphics Array, or VGA. Realizing this sucked rather massively, the industry moved to 800x600 and slapped "SUPER!" in front of it and we got SVGA. Both of these supported the mind-boggling limit of 16 colors due to an infinitesimal pixel storage size. Because the set of computer scientists doing video did not intersect the set of artists in the world, black and white are considered colors for the sake of this discussion. You got 14 colors, and you liked it.

Realizing this wasn't really so "super", the industry put forth the 1024x768 resolution and slapped the fancy XGA name on it. Among the improvements was a widening to 8 and 16 bpp, which allowed for 256 and 64K colors in those modes. Things got crazy after that, and Wikipedia can tell you all about it should you decide to inflict that madness upon yourself.
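The color counts fall directly out of the pixel width, since the number of representable colors is just two raised to the bits-per-pixel:

# Colors available at a given pixel width: two raised to the bits-per-pixel.
for bpp in (4, 8, 16):
    print(f"{bpp:>2} bpp -> {2 ** bpp:,} colors")
#  4 bpp -> 16 colors (the VGA/SVGA limit above)
#  8 bpp -> 256 colors
# 16 bpp -> 65,536 colors (the "64K")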

The Point, Please Get to It.

As the frames got larger and the pixels wider and the refresh rates faster, there was an increasing demand put on the connection from the computer to the display. With the old VGA connector type it was not-trivial-but-not-impossible to keep adding these new modes to it because it was essentially broadcasting the video signal over the wire as a waveform. Enhance the transmitter and receiver and voila! -- more pixels.

Digital was quite different. After VGA (the connector) came DVI -- the Digital Visual Interface. It's a mess of a standard because it was bridging the analog world into a brave new digital era, but it did impressively well for what it was. However, the digital connection was limited by available bandwidth to 1920x1200 (if you've been keeping up with the names, prepare for a winner: WUXGA -- Widescreen Ultra Extended Graphics Array; when in doubt, add a word). That means that if you want to power a higher-resolution screen then you'll need two of those links to get the bandwidth needed to break past that limit. More on that later.

When DVI was made it came in several flavors, but the most common were DVI-A (analog), DVI-D (digital), and DVI-I. The I in DVI-I stands for Integrated, which is the nice way of saying "we considered all you laggards and included some analog pins in here so we can deprecate that god-awful VGA connector and you can just hook up an adapter to extract the analog signal and power your radiation box that way". It was a really efficient naming scheme, I think.

While they were busy shoving pins into this connector, someone had the bright idea that displays might keep getting larger and more dense and it would be nice to consider the future just a little. So a collection of pins in the middle were designated a second digital "link" and could be used to shove more bits down the pipe. Using both links, you could drive about four million pixels sixty times a second (2560x1600@60Hz). Thus we have Single-Link DVI and Dual-Link DVI.
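The back-of-the-envelope math shows why. A single DVI link tops out at a 165 MHz pixel clock; counting only the active pixels (blanking intervals add a bit more on top), a rough check in the same spirit as before:

# Rough sanity check of single- vs dual-link DVI. A single TMDS link is
# capped at a 165 MHz pixel clock; dual-link doubles that. Blanking
# intervals add overhead, so these figures are optimistic lower bounds.
SINGLE_LINK_MHZ = 165

for name, w, h, hz in (("WUXGA", 1920, 1200, 60), ("WQXGA", 2560, 1600, 60)):
    active_mhz = w * h * hz / 1e6        # pixel rate from active pixels alone
    links = "single-link" if active_mhz <= SINGLE_LINK_MHZ else "dual-link"
    print(f"{name}: ~{active_mhz:.1f} MHz of pixels -> {links} DVI")
# WUXGA: ~138.2 MHz -> single-link; WQXGA: ~245.8 MHz -> dual-link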

Still Not Seeing a Point

Because displays could live on the sustenance provided by SL-DVI for many years, lots of things just didn't support DL-DVI (or DVI-A/I for that matter). That means things like displays that didn't need it, video cards that couldn't push it, cables for those folks, and KVM switches for the same reasons (oh, and cost). If you were the lucky owner of a 1920x1200 display then you were perfectly fine living in the Single-Link world. That includes all common HDTV formats, by the way, so a very large number of people are in this bucket (HDMI is compatible with SL-DVI for the most part).

But displays did, in fact, grow larger. As they grew larger, the previous generation of "larger" trickled down to a more common position on the distribution curve, and now we're seeing a pair of video modes grow in popularity that hit this pain point right in the Y-joint: Quad HD (QHD) and Wide Quad XGA (WQXGA). If you take 720p video and double the height and width, you'll get Quad HD (2560x1440). If you take that 16:9 signal and make it 16:10 you'll get WQXGA (2560x1600). Remember that resolution? Yeah, that's the limit of DL-DVI. We found the edge of the world (again)!

(By the way, you may have also heard these called by another -- trendier -- name: 4K. Strictly speaking that refers to the horizontal pixel count, so a 2560-wide panel is really closer to "2.5K", but the shorthand gets stretched; along the same line, 1080p sometimes gets called 2K video.)

The Point (Really!)

I bought one of these bastards. Namely, I bought a 32" QHD display to replace my 27" Apple Thunderbolt Display because I wanted to use it with more than just my Mac (seriously, Apple?). With a DisplayPort to Mini DisplayPort cable the Mac runs it just fine. With a DL-DVI, DisplayPort, or HDMI (1.3+) cable the other computers can use it just fine. With a SL-DVI cable -- or a DL-DVI cable running through a SL-DVI KVM -- I get 1920x1200.

It turns out that this really sucks. It sucks more that the DL-DVI KVMs for four computers (really) run about $400+, the DisplayPort KVMs appear to be limited to exactly two hosts (WTFSRSLY?), and HDMI KVMs only advertise Single-Link resolutions.

However, if you read the descriptions of the HDMI KVMs you'll notice something funny: they advertise being HDMI 1.3 switches. Yes, in spite of only listing 1080p-class resolutions, they pass HDMI 1.3's full bandwidth and will happily switch your lovely QHD (near-4K) display. Also, since HDMI includes full 7.1 audio, you can switch that as well.
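The same pixel-clock arithmetic as before explains it: HDMI versions up to 1.2 shared single-link DVI's 165 MHz ceiling, while 1.3 raised it to 340 MHz, which is plenty for a QHD panel. A rough check, with the same caveats as above:

# Same back-of-the-envelope check for HDMI. Versions up to 1.2 share
# single-link DVI's 165 MHz pixel-clock ceiling; 1.3 raised it to 340 MHz.
# Blanking intervals still add overhead on top of the active pixels.
HDMI_1_2_MHZ = 165
HDMI_1_3_MHZ = 340

w, h, hz = 2560, 1440, 60                     # the QHD display in question
active_mhz = w * h * hz / 1e6
print(f"QHD@60 needs at least ~{active_mhz:.1f} MHz of pixel clock")
print("fits HDMI <= 1.2:", active_mhz <= HDMI_1_2_MHZ)   # False
print("fits HDMI 1.3+: ", active_mhz <= HDMI_1_3_MHZ)    # True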

So, very long story short: if you need to switch your 4K display for a price under 4K, use HDMI to connect it instead of DisplayPort or DL-DVI. In fact, if you handle the KM part yourself, you can get a 4K HDMI switch for about $30, and it'll do PIP and have a remote as well.

In the end, it's all about the bandwidth and the associated video mode standards. If the pipe is big enough, the pixels will flow.

Bluetooth Keyboards

For a mess of reasons too boring to get into, I wanted to get a Bluetooth keyboard for my iPad Mini. After poking around for a bit I found some good candidates, but holy hell is the market for BT keyboards crap right now.

The Apple Wireless Keyboard is pretty much a desk keyboard. While I really hate wires, I hate replacing batteries for needlessly portable devices more, so I always go wired at desks when I can (my love for the Magic Trackpad is an exception to the rule). There are a dozen clones of that keyboard out there (from manufacturers both named and unnamed) but the design is very much for the desktop.

I did a search and found a long list of keyboards that would possibly work for me and settled on three to seriously consider: the solar-powered Logitech K760, the Logitech K480, and the Microsoft Universal Mobile Keyboard. I chose these because they looked well-designed, supported multiple devices (once I learned this was A Thing™ I realized I wanted it for my Apple TV), and were available for purchase locally so returns would be easier.

Well, after going to a few places that listed the K760 as being in stock, it turned out it never was. Popular belief has it that it's been discontinued and that's kind of restricting supply. Oh well. I did find the other two locally and brought them home for some testing.

The Microsoft one is just lovely. It's very Surface-like, small, can pair with three devices, includes a cover that doubles as a detachable device stand, and is rechargeable over USB (and a charge lasts six months, supposedly). I love this thing so much. I'm rather sad it didn't work out, because so much attention to detail has clearly gone into this product that I was almost unashamed to carry around an MS keyboard with my iPad.

Alas, there are two flaws with this keyboard, and one is fatal with zero recourse. The first flaw is that I have man hands and this was apparently made for small children. Every key I hit was wrong. Even when I got used to it, I found myself smacking something random every 20 characters or so. Livable flaw, especially given how much I liked the thing, but annoying.

The big flaw, however, is a rather unforgivable oversight. The keyboard supports three devices: a PC, an Android device, and an iOS device. I mean that literally. You must have exactly those devices, because the positions on the device switch are hard-wired to keymaps for those platforms and there is no way to change that. If you have three PCs, you're boned. If you have three of anything, or even two of anything, you're boned. I have three Apple devices. I was boned. The modifier keys change for each expected platform, and in anything other than the iOS position the Command key is mapped to Control.

Think about that. That means to get Command back I have to remap Control to Command, thus losing Control (which I use in the shell -- a lot). I could switch Caps Lock to Control like a Proper Neckbeard but I never learned to care for that layout so it'd be a frustrating change.

Also, there aren't real function keys; the media keys are mapped onto the F-key row for each expected platform. That kills a lot of CLI work as well.

So, I returned it. Sadly. I really liked it outside of that. The cover stand even held my Smart Case-wrapped iPad correctly without my having to take the case off.

Next came the Logitech K480. This thing is big. It's about the size of an 11" MacBook Air, but it's light enough that that's not a big deal. While the size initially gave me much pause, I realized it meant the keys would feel properly spaced out, and they do. I can type on this like any other keyboard without any issues at all. There are F-keys, and other than fn and Control being swapped it feels great. Best of all, it solves the keymap problem very well. The knob at the side lets you pick which pairing slot to use at the moment, and when you pair a device you press either the pc or i button, which establishes the keymap for that slot at the time of pairing. Whenever you switch back to it, the right keymap is used. It works great on the desktop, iPad, Apple TV, and my Linux server as a result. The battery is not rechargeable, however; it uses two standard AAA cells, though they claim a set will last a couple of years. We'll see. It won't be the end of the world if not.

Overall, though, what I discovered is that there are a lot of crap options out there and relatively few well-designed ones (at any price). I had hoped that by now there'd be some nice things out there but I guess everyone's making crap hardware to go with the crap freemium apps. At least I found a couple of options that show that some folks are still thinking about usability.

On the Importance of Knowing What You've Made

I have four Backup Plus Desktop drives. Two were bought two years ago and two last year.

All of them had an issue where they would drop off the USB bus sometimes and require a power cycle to return. I looked around and found a firmware update that promised a fix. However, it would only apply to two of them (the newer ones). The older ones (which looked identical, of course) didn’t show up in the tool.

Too busy to be bothered, I let this go for a while and just dealt with it.

Recently, however, I decided to RAID them together as a backup device for my server’s RAID. (How do you back up a RAID? With another RAID.) Well, you can imagine how well RAID likes it when a disk randomly falls off the bus.

I looked into this further and found something darkly hilarious (after trying to update the firmware again). Those older drive models that wouldn’t appear in the updater? System Profiler showed them as having a USB identifier of 0xa0a1. The new ones have 0xa0a4. Seagate omitted an entire line of devices from the update tool.

Well, okay, Seagate can eat an RMA then, because a drive falling off the bus is a failure in my head (and my warranty is up in September, so this is my window). I went through the process on their site several times and was rejected every time with an invalid product ID.

SEAGATE DOESN’T KNOW THEY MADE THESE DRIVES.

That’s the only conclusion I can come to. The firmware updater hasn’t heard of them and the warranty system hasn’t heard of them (kind of — the checker gave me the dates just fine, but I couldn’t start an RMA).

In the end, I found that using the part number of the newer drive with the serial of the older worked to get it in the system.  Clever, no?

Well, it would have been.  Turns out that while that let me send in the RMA, their backend apparently does know about the older model somewhere deeper down, because when I got my two replacement drives one had the newer a4 ID and the other the older a1 ID.  Lovely.

I did have a solution at the ready, though.  I have yet another drive of that model that I use as a solo backup drive for my Mac.  The USB widget attached to it reads as an a4, so I swapped that part over and carried on, with the only a1 device being the one that’s intermittently connected to my Mac, where a random long-term drop-off wouldn’t be noticed.

Lesson: if you get a Seagate Backup Plus drive, ensure the USB family ID is 0xa0a4 and use the latest firmware.  They’re pretty solid devices at that point.
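If you want to check which one you have before trusting it, here's a quick, hedged sketch. It assumes the value in question is the USB product ID and uses lsusb on Linux; on a Mac the same IDs show up in System Information under USB:

# A quick check for the two IDs mentioned above. Assumes the value is the
# USB product ID that lsusb prints as vendor:product; adjust as needed.
import subprocess

GOOD, BAD = "a0a4", "a0a1"

output = subprocess.run(["lsusb"], capture_output=True, text=True).stdout
for line in output.splitlines():
    lowered = line.lower()
    if GOOD in lowered:
        print("OK, updatable family (a0a4):", line.strip())
    elif BAD in lowered:
        print("Older family the updater ignores (a0a1):", line.strip())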