Tales of the Rampant Coyote

Adventures in Indie Gaming!

The Rise of the Microconsoles Continues…

Posted by Rampant Coyote on June 28, 2013

Google is apparently going to enter the fray with an Ouya-killer of their own. Possibly in response to Apple’s potential push for supporting game controllers and making gaming integral to Apple TV:

Forbes: Google Takes Aim at Gaming Market With New Android Video Game Console

WSJ: Google Building Android Game Console

The rumors of Apple TV based gaming are of course interesting. Forbes has another article this morning about the possibilities: “Google and Apple May Bring Us a Console War We Didn’t See Coming.”

And the GameStick is coming out pretty soon…

And then there’s the Mad Catz MOJO. Which is more of a different way to play vanilla Android games.

Green Throttle Gaming seems to be doing kind of the same thing – letting you play stock Android games on a TV with a controller.

And then there’s GamePop – a subscription-based microconsole that I still can’t get my head around. Yet.

And we’ve got the new consoles from the major console manufacturers hitting. Wow.

Ladies and gentlemen… welcome to the wild, wonderful world of market fragmentation. It won’t last forever (thankfully), but it will certainly be a land of opportunity… littered with the corpses of lots and lots of earnest but ultimately unsuccessful companies. As Dave Thier comments today, “I see a storm of new technology and evolving tastes on the horizon, and I wonder what the long tail for these consoles, and the AAA gaming industry in turn, is going to look like. All the lines are blurring right now. The distinctions between console, PC, phone, tablet and watch are approaching academic. This makes for a cloudy future for anything with a more narrow focus.”

But in the meantime, as all this develops: Your choices as a gamer this year are pretty impressive: Do you go with Intel or AMD? NVidia or ATI? Definitely a lot to think about… 🙂

(Seriously, if Microsoft tries to drag PC gaming kicking and screaming down into potentially another walled garden of a “me-too” mobile-style environment, my commitment to PC gaming may not be all it once was. It’s still where my heart is, but I’m worried about being abandoned. So… while I’m still leaning on computers of various flavors, I’m definitely fascinated by the evolving console / mobile landscape).


Filed Under: Biz - Comments: 8 Comments to Read



  • Califer said,

    Too darn many consoles. Maybe I’m just feeling my age here, but I just haven’t had the time to play games on my 360 or my Wii. No plans to get any of the new consoles (unless I’m porting), but if I were I’m not sure where I would go. Interesting that most of them are Android related. I imagine that only one or two of those are going to get any traction at all though.

  • Rampant Coyote said,

    The last time I *remember* it being this bad, it was Apple, Commodore, IBM, Tandy, Atari, Sinclair, Texas Instruments, BBC, Coleco, Amstrad, and others on the “home computer” front.

    And on the console front, you had Atari, Sears, Coleco, Mattel, Magnavox, Vectrex (Milton Bradley), Bally, and those arcade upstarts Nintendo and Sega…

  • Andy_Panthro said,

    Considering the ever-increasing quality of Android (and mobile in general) gaming, it’s not surprising that these things are catching on.

    After all, why buy an expensive console, when you can link up your Google Play or Apple account with your TV?

    I’m still amazed you can get games like Carmageddon on mobile devices (looks amazing on my Nexus 7, and even runs on my old HTC phone). I feel like the gap between mobile platforms and the console/desktop is shrinking in many respects. You can even get XCOM (the new one) for iOS now, which I hope makes its way to Android at some point.

    On a vaguely related note, Carmageddon is actually free right now on Google Play and the Apple store (only this weekend, so I’d grab it while you can).

  • Anon said,

    I also compare this with the mid-eighties and the home computer platform plethora.

    If you were playing it safe you went with either a Commodore, an Atari, an Apple (if you had more money) – or if you were in the UK a Sinclair Spectrum.
    Everything else would be for “individualists” (not needing masses of software) or people with no knowledge (well meaning parents that bought a TI99/4a for example…).

    I also agree that most of these companies will die a horrible death like back then. There’s practically only one surviving computer maker from the eighties that still makes small computers: Apple – not only were they as greedy as the rest, but they were also the smartest (pushing their Apple IIs into schools was a genius move – and they didn’t spend too much money on questionable projects).
    The market will support three or four companies, but not with the same success/profit, so some of the remaining big companies will consider leaving the market, too.

    However, regarding platform choice (which the gamer has) and architecture choice (which the console maker has) the gamer shouldn’t consider the performance of the hardware in detail (which is an architecture question).

    The gamer should instead consider the features of a console and the software/shop support. And obviously the costs. 😉

    Why? Because the last time, the PS3 was hailed as the most powerful console – but the majority of multi-platform titles looked sharper and more detailed on the X360 (both platforms had great exclusives, but one can’t really compare those). In the end it was a tie; the platform advantages of the PS3 (in the beginning: hardware PS2 compatibility, card readers, and Linux support) dissolved pretty quickly.

    The generation before that the PS2 won the hearts of the majority of gamers. Why? Not because it was powerful but because it got great software support from all corners, after a slightly rugged start (it was hard to program, too).
    Its competition?
    – The first Xbox was x86-compatible and perhaps the most powerful of the bunch. But it was Microsoft’s initial console, and while it was innovative in some aspects (it had a hard disk and broad online gaming), it had several problems: too damned expensive, very few great exclusives, and too similar to PCs (it was too far ahead ;-)). Many people simply bought the cheaper equivalent for their PC, which was a major gaming platform back then (this has changed; it now simply is one of many platforms).
    – The Gamecube was practically as powerful and had much worse support (even though it was easier to program) and it was seen as a kiddie console.
    – The Dreamcast on the other hand was doomed when Sony’s marketing department got up to speed and EA and other companies completely abandoned it. While the Dreamcast perhaps wasn’t as powerful as the other two “next gen” consoles it still was way more powerful than the generation before it (PS1, N64). It’s telling that this console still enjoys some commercial support…

    So what the companies learned from their successes and failures of the past is that they shouldn’t spend resources on developing their own processors.
    Sony especially was “healed” by their costly PS3 experience, I guess.
    Microsoft on the other hand initially had x86, tried an IBM multicore PowerPC RISC (to combat the Cell RISC CPU Sony used) and is now back on x86 – which they also do to save costs.
    The only remaining big console maker, Nintendo, still bets on IBM CPUs and ATI graphics tech (which is of course owned by AMD, who is now in all three major consoles…) because their Wii was such a big success. It will be interesting to see if the Wii U gains ground or remains a failure, thanks to insufficient software support – both in quantity and quality (support for specific Wii U features like the controller) – and the problem that its predecessor was first and foremost a casual console, and casuals don’t buy consoles as regularly as core gamers…

    So now all big console makers use tech from the shelves of the market leaders. While some adapt it to their specific needs (cores, cache size, rendering pipelines, shader units…) it still remains existing tech.

    The Android consoles, however small, are even more extreme: Most of them don’t even customize the chips, they really use existing parts (vs. existing designs that need to be manufactured).
    It will be interesting to see what the bigger players of the “casual market” (Apple, Google) will announce but I don’t expect drastic custom chip designs either.

    In the end all manufacturers use existing designs and the gamer will have to decide what flavour he wants: A mass market core gamer console from Sony or Microsoft, a mass market casual platform from Apple/Google, or an indie platform on the least powerful architecture from smaller companies that don’t spy on the user as much…

  • Felix said,

    Now, to be fair, adapting a game to various Android devices should be much easier than rewriting a C64 game for the Speccy. It’s hardly the Tower of Babel we had in the 8-bit era. That’s not the problem.

    What bothers me is all the pundits who seem convinced that the open console was invented last year. Most of these new devices will fail, yes, but not because of market fragmentation or other such buzzwords. They’ll simply go the way of the GP32, GP2X, Wiz, Dingoo, OpenPandora…

    Open consoles have failed to make any inroads even when small portable consoles were a thing. Nowadays? Look around you next time you take a train, and you’ll notice everyone is playing on a smartphone or tablet. Not one PSP or DS in sight anymore. And if they want to play on a big screen, they can either get HDMI output from the same device, or more likely they already own one of the major consoles.

    Not to mention that the OUYA at least seems exclusively focused on games despite being based on a general purpose operating system. And people will want to use it to show pictures, movies, listen to music and browse the Web. They always do, even when they have to jump through hoops to do it. Makes me wonder if all these people who make consoles (whether for corporations or start-ups) ever bother to meet some real potential customers apart from the perpetual early adopters who buy everything new on principle.

    Set-top media player manufacturers have figured out what people want a long time ago. It can’t be that hard.

  • Anon said,

    > And people will want to use it to show pictures, movies, listen to music and browse the Web. They always do, even when they have to jump through hoops to do it.

    The same goes for the very vocal minority that wants to install Linux on everything that remotely resembles a “Von Neumann architecture”…

    But a console manufacturer has to consider costs and most of them will save a few cents if 95% of the owners don’t show interest.

  • Felix said,

    Anon, that very vocal “minority” includes everyone I’ve ever seen using a portable console. Everyone. Not in the sense of installing Linux, but in the sense of using it for media.

    See, I’ve been thinking about this lately. We all want to do a little of everything with our computers. Play, work, surf the Internet, listen to music… The trick is, we want to use the same device for everything, because it’s a lot more convenient than having lots of different electronics piling up. That’s how Web browsers ended up doing games, video, spreadsheets and e-mail.

    It’s just that some of us work more, and some of us play more. The former will choose a desktop or laptop, while the latter will go for a console. More recently, touchscreen devices are also an option for those who mostly just chat, read books and play Bejeweled. But we all want to do a little of everything with every device.

  • Anon said,

    My experience with other people (family, friends, colleagues) is exactly the opposite: they buy a device for a specific functionality. Most of these people are certainly no hard-core tech geeks, though.

    The audience you are referring to is 15 to 30 years old and the geeks you know are of course proficient.
    Now ask your dad how he coped with home electronics and what he actually uses.

    Every TV set above a certain price level is now a “Smart-TV”, being able to connect to the internet, playing back youtube videos etc.
    Guess how many owners actually connect their device to the net? One manufacturer claimed this: Only 14% do so!
    So obviously a lot of multi-functionality is wasted, just as a lot of people never use most of the features of their mobile phones.

    Don’t get me wrong – I personally use such multi-functional devices (like my Galaxy Note 2 smartphone which seems to be able to do everything and then some) but sometimes they are barely usable: Slow browsers, limited video playback (missing codec, stuttering) etc.

    What we need for broader acceptance are three things:
    – Very easy user interface (I have friends who think their PC is like a toaster and refuse to learn how to operate it)
    – Enough processing power for basic tasks like playing back full HD video streams (next year all geeks cry for 4K video streams, of course…). Similar to how all devices of the last ten years or so can play back hifi audio without any problems.
    – Adaptability/flexibility – for updates like codecs. This is an illusion of course as all manufacturers would love to sell you a new device.
