Wherein I Move a Lot of Words Around

On Gaming Input Devices

Wanting to upgrade my input devices at home, I started looking around for a good keyboard and mouse combo. While the business-oriented lines were nice in their own ways, they lacked a certain flair and were woefully short on buttons and standard layouts. (What's with everyone screwing with the standard keyboard layout? Stop it. I like my buttons.)

As a result, I started to look at the gaming series of devices. I'm not sure how I wound up looking at them, honestly, but once I started to look at the options it was clear to me that all the attention on making input devices better at a hardware level was going into that market instead: the keyboards were mostly mechanical, the mice were high-DPI and loaded with buttons, and the quality was far and away higher — as were the prices, of course.

After some period of research I picked up the Corsair K70 keyboard and Corsair M65 mouse. Neither is too gratuitous with the lights off, and both are quite helpful if you set up the lights accordingly. By which I mean: think back to the 80s and keyboard overlays for Lotus. Like that, but with colors. So when switching to a game you can have the lights come up for the keys you normally use and then color-code them into groups (movement, actions, macros, etc.).

When using it for daily stuff in Windows, and some games, it proved quite the nice combo. The mouse has buttons to raise and lower the DPI and a quick-change button at the thumb for ultra-precise movement (think snipers in an FPS game or clicking on a link on an overly-designed web page where the fonts are 8pt ultra-lights — yes, I really used it for that once).

However, I quickly found the Corsairs had a very large weakness: they only worked in Windows. I don't mean the customization and macros, I mean the devices themselves did not show up as USB HID devices in Linux or on the Mac. They failed to be a keyboard and mouse on every other computer I have. Normally, I would blame this on my KVM's USB emulation layer, but I connected them directly and nothing changed. That is, until I read the manual and discovered the K70 had a switch on the back to toggle "BIOS mode". Now it worked, but I lost some customization and the Scroll Lock light flashed constantly to tell me I wasn't getting my 1ms response time anymore (no, you can't turn that off -- it flashes forever).

To add to the fun, the keyboard has a forked cable. One USB plug is for power and one is for data. If you connect the data cable to a USB 3 port on the computer itself then it can get the power it needs and you don't need the other. If you use a KVM or USB 2 hub then you're using both.

Overall, the frustrations outweighed the utility and I returned them both. I did some more research and found that, of all companies, Logitech fully supported their gaming devices on both Mac and Windows and their devices started in USB HID mode and only gained the fancy features when the software was installed on the host machine.

Taking that into consideration I went ahead and picked up the G810 Orion Spectrum keyboard and the G502 Proteus Spectrum mouse.

To summarize the differences:

I'm especially happy that both keyboards had a dedicated button for turning the lights off when I just wanted a good mechanical keyboard, and back on when I wanted to do something the lighting adds value to. That's a nice selling point for both, really.

At any rate, if you have a Mac, it appears only Logitech still cares about you. That's perfectly fine with me as they make some good stuff, overall.


Even though I knew that video modes were a nightmare mess made barely tolerable by standards, I had no idea of the hell that awaited once one passed the 1200p edge.

A short history of video modes before we begin (this helps the pain later). Video is a three-dimensional concept that must yield to the laws of computer science and become a one-dimensional bitstream of two-dimensional arrays in order to go down the wire to the screen (and also in order to be stored in memory, but let's not complicate things more). You may be wondering if I added an extra dimension in the previous sentence, but I did not -- that additional dimension is time.

In computer science a list of numbers is an array (stop it, Melvin, you know where I'm going with this). A two-dimensional array is a list of lists (think of an outline). A picture, or a frame of video, is a 2D array of pixels (and each pixel is an array of component values -- usually RGB). You can quickly see why pictures take up a lot of space:

(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
(0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0), (0,0,0)
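The space claim is easy to quantify. Here's a minimal Python sketch (the layout and names are mine, not any real image format), counting one byte per color component:

```python
# A frame as a nested list of (R, G, B) tuples, one byte per component.
def frame_bytes(width, height, bytes_per_pixel=3):
    """Raw storage for an uncompressed frame."""
    return width * height * bytes_per_pixel

# The toy 6x6 all-black image above.
black_frame = [[(0, 0, 0)] * 6 for _ in range(6)]

print(frame_bytes(6, 6))        # 108 bytes for the toy image
print(frame_bytes(1920, 1080))  # 6,220,800 bytes (~5.9 MiB) for one 1080p frame
```

At 60 of those a second, uncompressed video gets big in a hurry, which is why everything downstream cares so much about describing and compressing it.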

That's just a 6x6 image filled with black. But it's formatted for humans. A computer would see that closer to:

000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000

One long, undifferentiated run of zeroes (108 of them here -- one per component value).
That raises a very important issue: how big is that, and what's what? It's just a mess of zeroes without some kind of structure around it to describe the layout. That's metadata. Specifically, it's a glob of metadata that conforms to a pre-set standard and describes what kind of pixel description is being used, how long a row is, and how many rows make up a single image. Using this information, a computer can break up that bitstream into chunks at the row boundaries, decode the pixel colors, and then draw row after row on-screen and create an image from it.
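Concretely, once the metadata tells you the row length and pixel size, the row-chunking step is nearly a one-liner. A sketch with made-up names, not any real decoder:

```python
def rows_from_stream(stream, width, bytes_per_pixel):
    """Cut a flat byte stream at row boundaries, per the metadata."""
    stride = width * bytes_per_pixel  # bytes per row
    return [stream[i:i + stride] for i in range(0, len(stream), stride)]

# A hypothetical 2x3 image: 2 pixels wide, 3 rows, 3 bytes per RGB pixel.
flat = bytes(2 * 3 * 3)  # 18 zero bytes, flat like it arrives on the wire
rows = rows_from_stream(flat, width=2, bytes_per_pixel=3)
print(len(rows), len(rows[0]))  # 3 rows of 6 bytes each
```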

As one might imagine, it's very important to get that description correct. If you cut at the wrong boundary or your pixels are a different format (or size!) then it all goes to pot fairly reliably. There are ways to detect when this goes haywire, but it's better if it doesn't in the first place.

That brings me to video. So take the above and then add in metadata about time. That is, how many frames will be delivered in one second, what a frame's data will start with (just in case a partial frame is delivered -- don't worry about it), and what the current stream time is for any given frame (that is, when to display it in relation to other frames should it need to hold a frame on-screen longer than another one). There's a lot of other stuff, but it starts to get rather complicated after this.
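Assuming a constant frame rate (real containers allow variable timing, but don't worry about it), the time half of that metadata boils down to arithmetic like this sketch (names are mine):

```python
from fractions import Fraction

def presentation_times(fps, frame_count):
    """Stream time of each frame: frame n is shown n/fps seconds in."""
    return [Fraction(n, fps) for n in range(frame_count)]

# Four frames of a hypothetical 30 fps stream.
for t in presentation_times(30, 4):
    print(float(t))
```

The exact fractions matter: accumulating floating-point frame durations drifts over a long stream, which is why real formats carry rational time bases rather than decimal seconds.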

To simplify all of this, there are video modes. These are pre-agreed upon configurations of all of the above that two devices can use to simplify this whole process. You've probably heard them by the name "resolutions" but it's the same thing (Melvin, hush).

The standard-bearer resolution at one sad point in time was 640x480 for the Video Graphics Array, or VGA. Realizing this sucked rather massively, the industry moved to 800x600 and slapped "SUPER!" in front of it and we got SVGA. Both of these supported the mind-boggling limit of 16 colors due to an infinitesimal pixel storage size. Because the set of computer scientists doing video did not intersect the set of artists in the world, black and white are considered colors for the sake of this discussion. You got 14 colors, and you liked it.

Realizing this wasn't really so "super" the 1024x768 resolution was put forth and slapped with the fancy XGA name. Among the improvements was a widening to 8 and 16 bpp which allowed for 256 and 64K colors in those modes. Things got crazy after that and Wikipedia can tell you all about it should you decide to inflict that madness upon yourself.
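The color counts fall straight out of the pixel width -- each extra bit doubles the palette. A quick sketch of the arithmetic:

```python
def color_count(bits_per_pixel):
    """Distinct colors expressible in a pixel of the given width."""
    return 2 ** bits_per_pixel

for bpp in (4, 8, 16):
    print(f"{bpp:2} bpp -> {color_count(bpp):,} colors")
# 4 bpp gives the 16 colors of VGA/SVGA, 8 gives 256, and 16 gives the
# 65,536 that gets rounded off to "64K".
```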

The Point, Please Get to It.

As the frames got larger and the pixels wider and the refresh rates faster, there was an increasing demand put on the connection from the computer to the display. With the old VGA connector type it was not-trivial-but-not-impossible to keep adding these new modes to it because it was essentially broadcasting the video signal over the wire as a waveform. Enhance the transmitter and receiver and voila! -- more pixels.

Digital was quite different. After VGA (the connector) came DVI -- Digital Visual Interface. It's a mess of a standard because it was bridging the analog world into a brave new digital era, but it did impressively well for what it was. However, the digital connection was limited by available bandwidth to 1920x1200 (if you've been keeping up with the names, prepare for a winner: WUXGA -- Widescreen Ultra Extended Graphics Array; when in doubt, add a word). That means that if you want to power a higher-resolution screen then you'll need two of those links to get the bandwidth needed to break past that limit. More on that later.

When DVI was made it came in several flavors, but the most common were DVI-A (analog), DVI-D (digital), and DVI-I. The I in DVI-I stands for Integrated, which is the nice way of saying "we considered all you laggards and included some analog pins in here so we can deprecate that god-awful VGA adapter and you can just hook up an adapter to extract the analog signal and power your radiation box that way". It was a really efficient naming scheme, I think.

While they were busy shoving pins into this connector, someone had the bright idea that displays might keep getting larger and more dense and it would be nice to consider the future just a little. So a collection of pins in the middle were designated a second digital "link" and could be used to shove more bits down the pipe. Using both links, you could drive about four million pixels sixty times a second (2560x1600@60Hz). Thus we have Single-Link DVI and Dual-Link DVI.
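The bandwidth arithmetic behind those limits is easy enough to check yourself. A single DVI link tops out at a 165 MHz pixel clock, which at 24 bits per pixel is roughly 3.96 Gbit/s of pixel data. The numbers below ignore blanking intervals (real pixel clocks run somewhat higher), so treat this as a sketch of the shape, not the spec:

```python
def gbit_per_s(width, height, hz, bpp=24):
    """Raw pixel bandwidth for a mode, ignoring blanking intervals."""
    return width * height * hz * bpp / 1e9

single_link_limit = 165e6 * 24 / 1e9  # ~3.96 Gbit/s per DVI link

for mode in ((1920, 1200, 60), (2560, 1600, 60)):
    need = gbit_per_s(*mode)
    links = 1 if need <= single_link_limit else 2
    print(mode, round(need, 2), "Gbit/s ->", links, "link(s)")
```

WUXGA at 60Hz squeaks under one link; WQXGA needs the second one. Hence the edge of the world.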

Still Not Seeing a Point

Because displays could live on the sustenance provided by SL-DVI for many years, lots of things just didn't support DL-DVI (or DVI-A/I for that matter). That means things like displays that didn't need it, video cards that couldn't push it, cables for those folks, and KVM switches for the same reasons (oh, and cost). If you were the lucky owner of a 1920x1200 display then you were perfectly fine living in the Single-Link world. That includes all common HDTV formats, by the way, so a very large number of people are in this bucket (HDMI is compatible with SL-DVI for the most part).

But displays did, in fact, grow larger. As they grew larger, the previous version of larger trickled down to a more common position of the distribution curve, and now we're seeing a pair of video modes grow in popularity that hit this pain point right in the Y-joint: Quad HD (QHD) and Wide Quad XGA (WQXGA). If you take 720p video and double the height and width, you'll get Quad HD (2560x1440). If you take that 16:9 signal and make it 16:10 you'll get WQXGA (2560x1600). Remember that resolution? Yeah, that's the limit of DL-DVI. We found the edge of the world (again)!

(By the way, you may have also heard a trendier name thrown around here: 4K. Strictly, that refers to a horizontal resolution of roughly 4,000 pixels -- 3840x2160 for consumer UHD -- which makes QHD more like 2.5K, but marketing likes to round up. Along the same line, 1080p sometimes gets called 2K.)

The Point (Really!)

I bought one of these bastards. Namely, I bought a 32" QHD display to replace my 27" Apple Thunderbolt display because I wanted to use it with more than just my Mac (seriously, Apple?). With a DisplayPort to Mini-DisplayPort cable the Mac runs it just fine. With a DL-DVI, DisplayPort, or HDMI (1.3+) cable the other computers can use it just fine. With a SL-DVI cable -- or a DL-DVI cable running through a SL-DVI KVM -- I get 1920x1200.

It turns out that this really sucks. It sucks more that the DL-DVI KVMs for four computers (really) run about $400+, the DisplayPort KVMs appear to be limited to exactly two hosts (WTFSRSLY?), and HDMI KVMs only advertise Single-Link resolutions.

However, if you read the descriptions of the HDMI KVMs you'll notice something funny. They advertise being HDMI 1.3 switches. Yes, in spite of saying that they only handle 2K video, they are using the 4K standard and will happily support switching your lovely 4K display (or near-4K). Also, since HDMI includes full 7.1 audio, you can switch that as well.
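That checks out on paper, too. HDMI 1.3 raised the maximum TMDS clock to 340 MHz, which at 24-bit color is about 8.16 Gbit/s of pixel data. A back-of-envelope comparison (ignoring blanking intervals, as before) shows both QHD and WQXGA at 60Hz fit under it with room to spare:

```python
hdmi_1_3_payload = 340e6 * 24  # max TMDS clock x 24 bpp, in bit/s

def pixel_rate(width, height, hz, bpp=24):
    """Raw pixel bits per second for a mode, ignoring blanking."""
    return width * height * hz * bpp

for mode in ((2560, 1440, 60), (2560, 1600, 60)):
    print(mode, pixel_rate(*mode) < hdmi_1_3_payload)
```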

So, very long story short: if you need to switch your 4K display for a price under 4K, use HDMI to connect it instead of DisplayPort or DL-DVI. In fact, if you handle the KM part yourself, you can get a 4K HDMI switch for about $30, and it'll do PIP and have a remote as well.

In the end, it's all about the bandwidth and the associated video mode standards. If the pipe is big enough, the pixels will flow.

Auto Unlock Requirements

After installing the GMs to Sierra and Friends I was eager to try out the Auto Unlock feature with the Watch as passwords generally suck and re-entering them time and again sucks more.

With all my devices on the same Apple ID and updated, I went to the Security prefs on the Mac and lo … no option for it. After reading around I learned all the devices must be marked as Trusted in iCloud, which means you need Two-Factor Authentication (not Two-Step). I set this up and re-added each device to iCloud until they were marked appropriately. Still no option on the Mac.

Then I dug around some more and discovered the actual requirements for this feature, which I couldn't find anywhere on Apple's site. While early release notes for the betas said any 2013 Mac would work, it turns out that only mid-2013 and later models will work — specifically those with the 802.11ac network card.

I ain't upgrading for this. Luckily, other folks have some solutions for this that work with less-stringent requirements on how recently your Apple tithe was paid, such as Knock and MacID.

Is It Illegal to Make Your Spouse Ride on the Roof of the Car?

Is It Illegal to Make Your Spouse Ride on the Roof of the Car? | Lowering the Bar

This question arises from the recent arrest of a Florida man (credit: The Smoking Gun) after he was stopped by a police officer who wished to inquire as to why there was a woman clinging to the roof of his car. The answer to that question, as you might expect, turned out to be complicated.

Every Developer's Nightmare

State: Answers were erased on 14,220 STAAR tests

State officials are threatening to reconsider a $280 million contract with its testing vendor after answers to 14,220 state standardized tests were erased because of a computer glitch last week.

Programming Sucks

Also, the bridge was designed as a suspension bridge, but nobody actually knew how to build a suspension bridge, so they got halfway through it and then just added extra support columns to keep the thing standing, but they left the suspension cables because they're still sort of holding up parts of the bridge. Nobody knows which parts, but everybody's pretty sure they're important parts.

Programming Sucks

Every project I've ever worked on has this smell somewhere.

Not Paying Attention

People like to think Microsoft is the dean of proprietary software companies. Nonsense! Microsoft is making serious investments in open-source software. Apple, though, now there's a company that likes to lock down its code.

Apple's Swift Comes to Linux - ZDNet

Likes to lock down their code? WTH? Pay attention to that which you critique. The OS is open source, the compiler has always been open source, WebKit is open source, the core frameworks are open source, and they publish all changes to GPL code as they should. Seriously, what's your standard here, especially when comparing to MS?

I should expect this from ZD though. I think they're beating CNet to the bottom.

The Academy and Diversity

The issue is larger than the folks running the show can fix. Their members vote based on what they see as talent. Their membership is not at all diverse. Even this, though, isn't in their control. They mainly have A-level members with some scattering of Bs. Folks at that level trend towards the pale end of the spectrum as a product of the viewership's perceived preferences ("Ain't no white family going to see a movie with a black lead!" uhh, Lethal Weapon? The Matrix? A hundred others?)

So if we want a villain – and who doesn't? – it's the casting and hiring directors who are using outmoded descriptions of the viewership (from their producers and other management) from god-knows-when that limit the influx of diverse talent into the industry to begin with.

Fix that and then the cast and crews become diverse (as the people on-screen in the theaters become more diverse). Then they get their membership and vote. Then the nominees reflect this.

It's a huge problem, and not at all fair for anyone on any side to blame them for what their members chose. It's like blaming the county clerk for who won a political race. The voters chose. Fix the voters.

Silent Trade

The silent trade: universal objective ethics in action:

Once upon a time, back during the Age of Exploration, there was a marvellous practice called the “silent trade”. It was a solution to a serious coordination problem between groups who had no languages in common, or distrusted each other so much that they refused to come within range of each others’ weapons.

I do love interesting bits of trivia.

However Bad It Is…

… it could always be Mac OS 9.

(Image: Mac OS 9 crashing with extensions off)