Firefox 29 – Not the most customizable Firefox ever

So I upgraded to Firefox 29 and was pretty excited after reading their claim that it was the most customizable Firefox ever. Upon opening the browser, I was greeted with a very new look and here are my first thoughts:

  • The bar with the buttons, address bar, etc. got a hell of a lot thicker. So despite saving space in the tabs, they went backwards and ended up eating my precious vertical space.
  • Where the fuck is the refresh button?
  • 10 seconds later. Oh, there it is in the most hard-to-click spot ever (its size, plus being a bit off, means you click the address bar dropdown instead).
  • How do I get my navigation controls back to how I want? Home, Forward, Back, and Refresh buttons.
  • 3 minutes later. Ok… maybe I’m blind but I really don’t see that Refresh button in the customization options.
  • 2 minutes later + some Google searching. Ok, I’m not blind. Firefox took it out.

So…

WTF Firefox?

How can you call this the most customizable Firefox ever when you REMOVED the ability to add a refresh button (among other options for navigation)? I can see all the work that went into this new version but right now I’m trying to find a way to downgrade because of this usability fuck up.

I should have seen this coming after the whole “let’s remove features because no one uses them (except for all of Africa)” discussion. Firefox is starting to get the GNOME Shell bug where the designers are beginning to ignore users in an effort to push their own vision of simplicity over functionality. Do you know what I say to these designers? You suck at your job.

Rather than try to find a way to better organize and group features, you remove anything you cannot deal with. You implement things in a way that works for a single workflow instead of keeping things flexible to accommodate all users. That makes you all hacks. Unable to think outside the box, you decide to shove everyone into your box.

Have you designers ever thought about how this would affect users with poor eyesight? Or those with disabilities who may not have 100% accuracy when moving their mouse? Or how about people who don’t use a standard DE and place their menu bar vertically on the left/right (hint: moving all of the browser controls to the left/right is a huge usability boon for these people)?

No you didn’t think of these people.

And if you did, then that’s worse because you consciously ignored them.

With Firefox going down the same route as Chrome, who do we non-90% users turn to? Chrome is just as bad. Opera is effectively dead. IE was never good. Safari isn’t exactly good on non-OS X systems. All of the various Linux browsers? Nope, not powerful enough. This trend of companies trying to push everyone into their own box is very disturbing. It’s hard to make sense of and leaves me wondering what I can even do.

Sigh…

Right now, I guess the only thing I can do is stand on my box and shout.

One Microsoft, One Life Left

Yo Microsoft, let’s sit down and talk.

You have a “new strategy” (which really reads like a vision so that’s what I’m calling it) that states:

Going forward, our strategy will focus on creating a family of devices and services for individuals and businesses that empower people around the globe at home, at work and on the go, for the activities they value most.

Compare this with the old vision of “put a PC in every home” (paraphrasing). Which seems more concise? Which can you read and say, “Now that is one heck of a vision. How do we get there?” As much as I like some of your developer products and a few other things, your new “strategy” is representative of what has slowly driven me away from using Microsoft products. I just don’t get you as a company.

Where exactly do you want to go? Your vision is grand but has no clear goal. How do you measure progress? Or, to put it better, how do you know that you are headed in the right direction? As a consumer, I am worried when I buy and use your products. Not because they are headed in a new direction, but because I don’t think your vision will result in me using the best possible product.

Compounding the issue is that your vision is big but has no “WOW” factor. It’s not something that makes people think, “Damn, would that be cool if it happened.” Why? Because you are simply following industry trends instead of paving a new path forward.

And there is the crux of Microsoft’s struggles. You are no longer a pioneer, so you’ll always be one step behind everyone else. In a world where interoperability is inevitable, with various open protocols and where even the closed ones at least have APIs, your vision is not something that surprises me. Hell, it’s something I already expected from you.

So please Microsoft, I beg you to reach deep down and pull out a vision truly befitting of your vast resources.

Or, you know, ask Bill for some advice. His post-Microsoft philanthropic goals have the same “WOW” factor as his original vision at Microsoft. It’s clear he is a natural visionary and that is what Microsoft needs. And yes, this is a jab at the Microsoft leaders. I don’t think any of them are tech visionaries. Instead, I believe they are all “business visionaries” (if that even exists) whose only talents are growing a company. Not helping a company innovate. And I feel this will lead to the death of the Microsoft we all once knew.

Experience Gaming on Ubuntu – Part 1

I have a Windows 7 / Linux dual boot desktop for gaming, general web browsing, and media playback. I built it about a year ago and during that time never booted into the Linux partition (running Mint Debian). Why? Too annoying to get gaming to work on it (though Desura ran fine if I recall). Instead, I always booted into Windows and played games there. That was perfectly fine except the computer would randomly black screen and crash due to some video card driver – kernel conflict. I spent about 2 weeks trying various fixes proposed by Microsoft and none worked (even RMA’d some parts hoping that would do it but no luck). Despite this, I still used Windows cause that’s where all my games were.

In comes Steam on Linux. With this release, I carved out a weekend to change my Linux partition over to Ubuntu 12.10 (I wanted LTS but after much research and testing, it turns out my video card requires a newer kernel in order to get sound working over HDMI). After building my system from the ground up using the minimal CD to ensure no bloat, I installed Steam & Desura and gave the system a trial run. I’ll get into my gaming experience in another article, but first, my experience with Ubuntu in general.

As a disclaimer, I am clearly not a Linux newbie. I’ve been using it as my primary work system for almost 5 years now. All of my computers are either dual boot or pure Linux. I even have a massive BSD server running ZFSGuru, and my laptop (which has Nvidia Optimus) runs on Arch Linux. So yeah, if I have any trouble or pain points with a distro, I can guarantee that any “normal” end user has zero chance of solving the problem. And oh boy, did I have some pain points.

Linux + Graphics Cards… Bring on the pain!

For reference, my configuration is:

  • Intel Pentium G630 SandyBridge CPU
  • ASRock H77M-ITX Motherboard
  • HIS Radeon HD7750 (Silent version with no fan)
  • A-DATA XPG Gaming series DDR3-1600 RAM AX3U1600GC4G9-2G (4GB×2)
  • SilverStone SST-SG05B-B USB3.0 Case with 300W PSU
  • Hitachi HTS725050A9A364 500GB 2.5″ HDD
  • XBox 360 Wireless Receiver + 4 wireless controllers for gaming

It’s built to be portable, consume very little power, be able to handle games at decent settings (which it does), and be absolutely silent (which it is).

So what pain points were there in getting Ubuntu running? The GPU driver. Everything else worked out of the box and didn’t require me to open a terminal. Great start right there. But when it came to drivers, I went through hell and back to get everything working perfectly. Here is a list of all the hurdles I had to go through to get this working:

  1. Uninstall Linux Mint 14 and go to Ubuntu (with Unity) because it turns out that there is an open bug with the Cinnamon Desktop + Steam that causes most games to just display a black screen when in full screen mode. You can check out the bug progress here: https://github.com/linuxmint/Cinnamon/issues/1513
  2. Uninstall the drivers from the repositories and install the 13.1 drivers manually. The ones in the repo are too old and give you a “Graphics Card Unsupported” message on the screen when combined with my GPU.
  3. Disable the automatic underscan via the command line. The Catalyst Control Center does NOT save settings and apply them after reboot. So to get the stupid underscan to go away, you need to run:
    sudo aticonfig --set-pcs-val=MCIL,DigitalHDTVDefaultUnderscan,5
    The 5 will set the under/overscan to 0%. Yes, there is no documentation anywhere on what the number represents, and I had to test it out to see what it did.
  4. Reinstall Ubuntu (yeah, FUCK!) from 12.04 to 12.10. Out the window goes my plan to use an LTS release. This is required because you need a newer kernel than what is provided on 12.04 in order to get sound via HDMI to be recognized by the OS.
  5. Set the sound to HDMI by default and… wait, that doesn’t work. Turns out you need to run alsamixer to unmute the channel, set the card, and then use speaker-test to get the sound to start properly working. Yes, it doesn’t work until AFTER you run:
    speaker-test -c 2
    I figured this one out by tons of testing and putting together clues from all of the failed attempts by other people to get sound over HDMI working with the fglrx drivers (yes, I believe I am the first to get this working). Oh, and no this has nothing to do with me installing via a minimal CD. I tried this out on the default installs for Ubuntu 12.04, 12.10, and Linux Mint 14 as a sanity test before figuring all this out.
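
To save anyone else the digging, here is a rough consolidation of the terminal steps from hurdles 2, 3, and 5. Treat it as a sketch: the Catalyst installer filename and exact package names are from memory (grab the real 13.1 package from AMD’s site), and which alsamixer channel you need to unmute depends on your card.

    # Hurdle 2: purge the too-old repo driver, then install Catalyst 13.1 manually
    sudo apt-get purge fglrx fglrx-amdcccle                           # package names may vary
    sudo sh amd-driver-installer-catalyst-13.1-linux-x86.x86_64.run   # filename approximate
    sudo aticonfig --initial                                          # generate an xorg.conf for fglrx

    # Hurdle 3: kill the default underscan (5 maps to 0% under/overscan)
    sudo aticonfig --set-pcs-val=MCIL,DigitalHDTVDefaultUnderscan,5

    # Hurdle 5: unmute/select the HDMI output in alsamixer, then kick it with speaker-test;
    # sound over HDMI only starts behaving after speaker-test has run
    alsamixer
    speaker-test -c 2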

So yeah, FUCK AMD DRIVERS! As much as I try to like and support ATI/AMD, their Linux support has gone down the shithole ever since my first (perfect) experience with them 5 or so years ago.

Cool, the GPU drivers and sound are all working now. Amazingly, I had no other hardware issues. I didn’t have to cast insane commands via the terminal to get networking running, or do any custom modifying of conf files to get suspend working (no hibernate since I don’t use swap). Everything else just worked out of the box. Even the wireless XBox 360 controllers / receiver (minus a few noted bugs with the LED lights mentioned on the Ubuntu Wiki).

Next comes the software. What I did have trouble with was configuring Unity to not do annoying stuff like display results from the Software Center (remember – minimal install, so no Amazon already, but damn, they baked the Software Center stuff right into the core Unity libraries). Getting things like the notification area to appear, getting certain indicators working, and dealing with lens crashes (still don’t have the Photos Lens working yet) were also a bit of a pain. None of this is stuff you want newbies to deal with. But at least it isn’t impossible to solve with the help of Google. As for my experience using Unity, I think it is a good DE but still needs a lot of polish, especially with regards to application discovery. But overall, the experience is good since I am quite used to the layout (I have Windows 7 set up the exact same way on all my Windows partitions).
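
If it helps anyone, these are roughly the knobs I ended up poking at to rein in the Dash and the notification area. I’m listing the gsettings keys from memory, so treat the exact names as assumptions and double check them on your install with gsettings list-recursively before relying on them:

    # Hide the "apps available for download" (Software Center) suggestions in the Dash
    gsettings set com.canonical.Unity.ApplicationsLens display-available-apps false

    # Let legacy applications show their icons in the notification area again
    gsettings set com.canonical.Unity.Panel systray-whitelist "['all']"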

And then the moment of truth, installing Steam, Desura, Emulators, Wine, and PlayOnLinux to run games. I’ll get to each of these in my followup posts.

2013, PlayStation 4, and the mark of the beginning (or end) of the console era.

Like most people, I’m following the PlayStation 2013 Event (albeit filtered via Reddit updates) for news about what Sony has in store for us in 2013. And it’s interesting.

The rumored (leaked?) prices for the PlayStation 4 are $429 and $529. If true, that means the next gen consoles have finally entered into PC level pricing. Combine this with the move to x86, which will make it far easier for gamedevs of all levels (indie, AAA studios, etc.) to create cross-platform console/PC games, and potential up and coming consoles will have a much greater range of strategies to use to compete with PS4 and probably whatever Microsoft decides to come out with next for the Xbox. Specifically, the two big players to watch are the OUYA and SteamBox.

The OUYA is already clearly going to undercut the PS4 as it targets the market for low-end and indie games. It is far cheaper as a console and much more portable, making it easy to use as a LAN system. It will be interesting to see if the ability to play locally with friends can beat out playing over the net (the clear move Sony is making with the share button on their controller). In addition, gamedevs now have an interesting targeting choice for their primary platform – develop cheaply on the OUYA or put in cash to develop on Sony (which should be easier with the move to x86, but there is still a publishing cost associated with Sony). Indie devs will probably target the former while AAA studios stick with the latter, mainly because running AAA titles on an OUYA is a non-starter given the startup costs. However, it is fully possible that there will be a shift for the AAA studios over the next few console generations depending on the performance of the SteamBox.

The SteamBox is really just a PC under the hood. Early prototypes have already shown that they are targeting a small, portable form factor which will compete head-on with the OUYA in terms of group play. However, they also have the Steam network built in, which will allow them to compete with, and in my opinion trump, Sony’s services. This leaves Sony to compete with the SteamBox on price and hardware platform to garner the attention of both gamedevs and consumers. Unlike the OUYA, which uses an ARM CPU, the SteamBox is most certainly going to use an x86-compatible CPU like the PS4. This will make it much easier for gamedevs to create cross-platform games on both the SteamBox (PC) and the PS4. Because of this, there is a very good chance that both indies and AAA studios will target both platforms by keeping the majority of their code base portable and using Greenlight as a measuring stick to see if polishing the game (porting any non-compatible code, testing, and control tweaks) for the PC is worth the cost. This puts the PS4 in a position of competing on hardware – a losing position in my opinion, as the SteamBox now has plenty of pricing breathing room to outspec (or undercut) the PS4.

With Sony’s admitted failure of the Cell on the PS3, they realized that gamedevs have far more power over the success of their platform and that their name will not hold strong forever. So their move to x86 is both an indicator that they will concede power to gamedevs, and in a bigger move, Sony will be banking their future on their hardware platform and services (really, more the services than hardware). This isn’t a bad move as the very calculated risk can have great long term rewards for the company if pulled off well. By using their current market position, they can maintain a healthy lead in premier titles over unproven consoles (seen by having launch titles such as Destiny, Final Fantasy, and even Diablo III!) while at the same time repairing or building up good relationships with AAA studios. If they can use their market position to buy time to fix all the issues with their gaming & social services before their next gen console is released (most certainly going to completely close the console-PC gap for specs per price), then they can close the gap or overtake the Xbox (superior multiplayer) and Steam (superior social) as the gaming platform of choice. A very profitable market as everyone is moving to the online purchasing model (goodbye retailers, aka: GameStop).

On the flip side, if the SteamBox (or OUYA – unlikely) can draw indies and AAA studios to their platform, then this may mark the beginning of the era where consumers begin moving towards the PC as their gaming platform of choice.

Of course, this is all speculation and very much based on how the big AAA companies decide to move in terms of target platforms. That, in turn, is based on how consumers will see the PS4 and either balk at or swallow the price tag. Very circular, but this definitely puts the standalone console market into interesting times, as the way the next few years play out will be discussed and analyzed by the industry as either a major turning point for consoles or an explanation of why PCs failed.


Why PCs failed? Wait what? Yes, this article doesn’t end here. There are a few other major players (aside from Xbox – the main competitor to Sony and potential industry changer) that can change the entire gaming ecosystem. They are iOS and Android (and maybe Ubuntu; we’ll see how they do. Random aside: I’ll write about Ubuntu in the future as I already switched my gaming desktop away from Win7 to test how well Ubuntu holds up now with Steam.). Yes, I’m talking about the mobile market (phones, tablets). They have already made a clear impact in the handheld gaming market previously dominated by Nintendo and only partially dented by Sony. Especially since Sony is again copying Nintendo and merging the PS Vita play experience with their console, the PS4. Mobile devices are becoming increasingly popular as casual gaming devices and if they can garner enough power, they may become a better platform for AAA gaming in the future due to their “play anywhere” capabilities (this will require innovation in gameplay but I’m certain this will happen). If this happens before the SteamBox can take flight (or after the SteamBox gets crushed by consoles), then PC gaming has a very good chance of straight up dying for the mainstream. Yes, there will always be a lot of PC gamers but the market will forever remain a niche as future games will increasingly move to other devices. Imagine if you can have great MMO gameplay on a mobile device that isn’t a clunky laptop. The social capabilities are endless.

So this may also mark the beginning of a shift in competing platforms from consoles vs PCs to consoles vs mobile (phones, tablets, and I forgot to mention, the web).

Speaking of casual gaming and mobile platforms, Nintendo is (and has been for a while) in a crazy interesting position. They clearly target a different kind of consumer – casual gamers of all ages. Their core market already doesn’t see a standard PC as an alternative because of the nature of their games (Wii still beats the Kinect and PS Move) and how family friendly their franchises are (Mario, Kirby, etc.). Mobile is their only threat, which is why the Wii U was created (a clear counter to the growing mobile platform by making it so you can play mobile-style games on the Nintendo but with more gameplay design choices). The only thing holding them back from creating a stranglehold on that section of the market (and potentially crippling mobile platforms from competing with consoles & PCs) is how hard it is for gamedevs to create games for Nintendo. It will be interesting to see what moves Nintendo makes after seeing Sony move to x86 and give up so much power to gamedevs.

Google Attempting to Unify Google+ and YouTube Accounts

Interesting. Google is now pushing for having people use the same name on their Google+ and on YouTube (and probably across all Google services). This would mean having your real name on YouTube. Yeah… how about no?

To expand on my decision:

1. There are simply some services that you don’t want to display your real name on. YouTube is one of them because you can potentially have an account for posting things like streamcasts, where you are well known by your online handle and not your real name.

2. I already get annoyed by how the default display for YouTube is not Uploads Only. I can only imagine what would happen once Google starts integrating all their services and I suddenly get YouTube videos posted by friends showing up on my YouTube page.

3. Or vice versa, in my Google+ page. I complained almost instantly about the lack of grouping/filtering for your stream in Google+ which got fixed like a year later (way too late, you failed Google; this is one reason why everyone just stuck with Facebook). If they do start merging services and updates, let’s just say that I don’t think they will put in any worthwhile fixes for their initial overreaching/broken implementation anytime soon. So it’s better to just tell Google to fuck off straight from the start.

4. Laws haven’t caught up with privacy in job hiring yet. So I personally avoid linking my real name to my online presence as much as I can. You never know when an employer will Google your real name in search of what you do online. Sure, I can secure my public info like I painstakingly do with Facebook, but really, why go through the trouble? If I just stick with an online handle, I don’t have to think about this problem at all. There are just no benefits I can see to linking the two unless you are trying to streamline your online presence.

Which takes me to why I believe Google is now making the push here. It was pretty damn obvious we’d get to this point; it all depended on Google+’s progress. Now that they have a fairly good group of celebrities, companies, businesses, and even the government using Google+, they want to provide more “benefit” to these groups by unifying their online presence. This will give all of their various Google service pages more visibility and make it easier to manage… for them. For the rest of us average joes, what benefit does this provide us, Google?

Desktop Environment Rewrite Failures – The Corner Cases

DE rewrites fail due to many obvious reasons – bugs, new workflow, new paradigm, etc. But one of the less obvious failures is that a mature DE has had time to implement features and fixes for various corner cases. Things that affect maybe 1% of your userbase. When DEs go through a rewrite or complete redesign, a lot of those features are tossed aside and labeled as something to be worked on later. GNOME 3 has so far failed at addressing this. Instead they are relying on their Extensions framework as a way to try and rectify corner cases. This is poor, no, actually completely irresponsible design on their part. And here is why.

Corner cases are by definition a problem that affects very few users. The chances of a user within that small percentage being able to design, code, and maintain a solution for a particular use case is so infinitesimally small that it isn’t a proper solution. Then you add in the fact that GNOME 3 hasn’t found a way to not break extensions at each update and you get 1% of users that cannot function using your DE.

Ok, so 1% is small. No biggie. This is a stupid way of thinking. You have to remember that there are various, sometimes overlapping, groups of 1% corner cases that cannot use your DE. All of these tiny groups add up to a very large percentage of inconvenienced users. Hence why GNOME Shell gets so much flak. You piss off every vocal minority group and offer no help to resolve their issues. Even going so far as to claim some are invalid (shutdown option debacle). These situations can be handled much better (Cinnamon is doing a great job of acknowledging concerns and Unity is at least better at this than GNOME, though not perfect) and until they are, I think GNOME Shell will eventually cause so much fragmentation in Linux DEs that Linux will never be able to catch up to Microsoft or even Apple in popularity. With Steam on Linux coming out soon, we have an opportunity to gain ground in the desktop space, but with so many flat out idiots running crucial projects in Linux, I’m afraid we will blow our chance to maximize this opportunity.

Here’s to hoping one of the following groups pulls off a miracle during this crucial 1 year time frame – Canonical with Unity, Linux Mint with Cinnamon, anyone with KDE, and elementary with Pantheon. These are the people I’m pinning my hopes on (and I say this as an XFCE/Compiz user). I hope one of them manages to catch up to GNOME 2’s feature set and overall polish/simplicity this year. If not, Linux will forever remain a non-factor in desktop computing.

I am not Aaron Swartz, but are you?

I’ve had a long enough time to process my thoughts about Aaron Swartz’s death. If you don’t know who he was, I urge you to read his Wikipedia page, check the news, read the transcript from Aaron’s memorial service, filter out the hyperbole, and focus on the facts to form your own conclusions about what kind of person he was and about the situations he faced. Greater people than I have expressed their thoughts on the matter in a clearer manner than I could, so I will not express my thoughts on justice here. Instead, I will focus on Aaron’s death and what this means for me.

As usual, I was browsing Hacker News today and came across the latest article from yet another famous person about Aaron’s death. I don’t know why I keep reading these articles. I guess I am expecting to find some sort of life changing inspiration or trigger some sort of epiphany that will lead me to the forefront of the activism movement. But really, I’m not that person. It’s not who I want to be. I will be a supporter, someone who speaks out and gives donations, but even with my own technical skills, this is not a direction I can see myself taking. I am far too selfish and desire wildly different things in life. Does this make me a bad person? I hope not, but I cannot shake the feeling that I should be doing more.

Why?

Aaron’s death is one that has elevated him from activist to martyr. So say the many people who are voicing their opinions and pushing changes like Aaron’s Law. Here is this person who attempted to change the world, and it is only after his death that we, as a society, have begun to move towards enacting his visions. Except we aren’t. The initial reaction is to prevent another tragedy like this from happening. Sure, this is progress, making technical activists feel safer, but all we are doing is putting up signs saying, “If you want to step up to the plate, you can now do so knowing that 1 of the 3 umpires is now fairer.” We should be taking this opportunity to step up to the plate and swing. Not get into a pissing match with an umpire. We have the crowd on our side and if anything, this is the time to make drastic moves like Aaron did and force changes that will better the world. Screw how crooked the umpires are, ignore the other team, just step up and swing damnit! Swing!!

I guess that is why I feel so antsy. I think this is why I feel like I should be abandoning my life, my dreams, to pick up Aaron’s work. But… that’s not who I am. I’m just part of the crowd and the best I can do is cheer. So please, someone, anyone, step up and give me something to cheer for.

Windows vs Linux – Stability and Predictability

Disclaimer: I wrote this in a single, several hour long session without doing any editing or research. These are simply opinions that I felt like shouting out to the world. If you disagree, agree, or have suggestions on expanding this, feel free to comment below.

 

I’ve been thinking about why it is so hard to convince people to move to or even just try Linux, whereas people are much more open to the idea of using OS X. Of course there are the usual suspects like gaming. But with Steam arriving on Linux, there are almost no games that my friends play that aren’t playable on Linux. Or drivers, but that has been a slowly improving situation. So long as you pick certified hardware, this isn’t an issue. Then you have marketing creating misconceptions, but quick demos and a recommendation from the guy who fixes all their computers easily dismiss those incorrect notions. And of course there is the “I don’t want to reinstall everything” excuse. However, I’m only suggesting this to people getting new hardware and simply suggesting they try it out on a Live CD or VM (which I will set up). And yet, it is still very hard to change people’s minds. So what is it that Linux lacks, or what scares people away from even trying Linux? I think it all comes down to stability, predictability, and support.

Stability

Let’s first look at stability. No, not stability in the sense of the computer crashing (BSOD anyone?) but rather in the sense of environment stability. In all major, popular versions of Windows (95, XP, 7, and perhaps 8 in the future), the desktop environment (I’m including the window manager, file manager, and absolute basics here) has been rock solid stable. There are very few bugs noticeable to the normal user. Windows itself may crash, but Windows Explorer has been one of the least buggy interfaces I’ve ever used. Compare this with the primary default DEs in major Linux distros – GNOME, KDE, Unity, and Cinnamon. Also look at some other DEs that aren’t scary to Windows users – XFCE, LXDE, Pantheon (Elementary OS’s DE). Out of all of these DEs, how many have been paragons of stability over the past 5, 10, 15 years?

GNOME 2.x – As much as I loved GNOME 2.x (with or without Compiz), it wasn’t exactly bug free. It was stable by the time I started using it, but I still ran into a lot of bugs that made me think, wtf? While it outpaced Windows Explorer in features, those new features often brought with them an endless amount of annoying bugs.

GNOME 3.x – I like the idea of GNOME Shell but have been adamant since release that it was released too early. 3.0 should have been an Alpha test and they should have only moved this to a Beta after a few releases with extensions in place. Even now, they aren’t anywhere near RC state. For anyone who wants stability, I would steer clear of GNOME Shell. Too many bugs and too much jumping before thinking with how they handled extensions (who here has out of date extensions after upgrading?).

KDE 3.x – I wasn’t around for this so no comment here. However, the fact that it was dumped for a complete rewrite in KDE 4.x will be covered in my support section.

KDE 4.x – I tried this out when it was new and damn was that shit buggy. I’ve tried it on and off for years and I will say that it is fairly stable now. However, it does have performance issues, and the sane defaults set by most major distros don’t make it stand out from Windows Explorer because Windows 7 copied features from KDE (and did it without stability issues to boot). For KDE 4.x to really shine, you need distros to thoroughly customize KDE to vastly outshine Windows 7. This is possible, but I have never seen any distro come out of the box with 1) well designed eye-candy settings that mesh super well like in Pantheon and 2) an easy to understand walkthrough of how to use the DE features in KDE. I’ve attempted doing both, but for someone not well versed in KDE, the plethora of settings is just too overwhelming to wade through and create a polished DE.

Unity – I will admit, I like Unity because it is set up similarly to how I use most of my DEs (launcher on the side). However, it has the same problems as GNOME Shell in that it is still too buggy to use or suggest.

Cinnamon – I like the direction this is going but there is still a TON of polish needed and basic bugs to be ironed out before I would suggest this to someone coming from Windows. Hell, the text cursor in the menu still doesn’t blink and we are already at version 1.6. What is up with that?

XFCE – Version 4.10 finally caught up with Windows 7 (aero-snap) and Thunar 1.6 finally had them outshine Windows Explorer with tab support. However, as a full-time XFCE user, I will say that this environment is not 100% better than Windows 7. It is about even on features and stability. So telling friends to try out XFCE isn’t going to convince them to change.

LXDE – Same boat as XFCE but fits a slightly different niche. I wouldn’t expect using this to convince anyone to use Linux over Windows.

Pantheon – The most interesting of the new DEs that I have been following since inception. It is still too new though and only recently went to Beta status, so again I wouldn’t suggest this to people who want a stable DE.

Looking at all of these DEs, it is clear that Linux is at the forefront of innovation. The problem with that is that too many bugs are introduced that screw with the user experience. If you pick a stable DE, then you end up with a situation where Linux is simply on par with Windows. And finally, the last piece of the puzzle – OS X. It takes a middle ground between Windows and Linux – it is slightly more innovative than Windows, much more polished than Linux, and crazy stable like Windows. While it does require a paradigm shift to switch to OS X, users are content knowing that things will work as they expect. This is what will be needed by Linux DEs to convince users to switch away from Windows.

Predictability

While innovation is great, it is clear that stability is more important. Whatever innovation does get introduced, though, must be predictable. Both Windows and OS X only introduce new features that have been thoroughly tested by focus groups and tweaked to seamlessly integrate with the current DE. Innovation in Linux is like the wild wild west. New stuff is being implemented all the time but you have no idea where things are headed. In fact, it is often the case that Windows and OS X simply take the most popular new features in Linux, test them into the ground with focus groups, and then implement them in a super stable way months/years down the road. They are basically Debian but much faster. To give an example of what I mean, let’s walk through the DEs again.

GNOME 2.x – Very predictable. Nice! But it is no longer supported which leads to my third issue discussed later.

GNOME 3.x – Very unpredictable. Boo! In fact, this is the least predictable of all DEs I have ever encountered in my life. With each release, I have no idea what will change and what muscle memory I will need to retrain to use this DE.

KDE 3.x – Same comment as above.

KDE 4.x – Very predictable. Nice! But there is no tutorial to teach you what everything in the DE does. Seriously, I’ve looked everywhere for one and the information out there is simply too sparse and contains too few examples for anyone to learn KDE without trying everything out themselves.

Unity – Sort of predictable, but Canonical has shown that they aren’t afraid of making fairly big changes to the GUI (moving the min/max/close buttons anyone?). Since this DE is still fairly new, I would be afraid of suggesting it to anyone for fear that other big changes are in store.

Cinnamon, XFCE, LXDE, Pantheon – All very predictable. Nice! In fact, the reason why they are predictable is because it is easy to be predictable when a DE focuses on simplicity and getting a few things right at a time.

So we have a few candidates here which can be both innovative and predictable. However, they all fail the stability test above which is fairly important. They are also fairly young in most cases and that leads to my final problem.

Support

Let’s first get this out of the way. Microsoft is clearly not the paragon of support. Neither are the Linux Communities with the differing personalities leading the various software projects. I don’t know about Apple so I won’t comment on them. What I mean by support is expected longevity and ease of upgrade.

Windows generally doesn’t upgrade all that often and when it does, they are major upgrades that people expect to be a painful upgrade process (e.g., anything dealing with reinstallation). What they do well is that their OSes don’t have many major releases. Instead, you just continually get updates for as long as Windows is supported, which for Windows is a VERY long time. Hell, my older brother still uses Windows XP. Combined with how software updates don’t just stop on Windows, you can technically have the most recent version of all your software at all times on supported versions of Windows.

Apple, from what I understand, is an easy to upgrade system. In fact, I believe upgrading major versions is less painful than on Windows 7 and their support in helping you migrate is top notch. They are (fairly close to?) the ideal half-rolling release model.

Then there is Linux. You have a choice of normal release models (Ubuntu, Debian stable, Fedora, etc.) similar to Windows, half-rolling release models (Debian Testing, Chakra Linux, OpenSUSE Tumbleweed, etc.) similar to OS X, and rolling release models (Arch, Gentoo, etc.). For each of these major models, I will break down the issues users have with Linux:

Normal Release Model – Updates are either too frequent (Ubuntu, Fedora) or too slow (Debian stable). Neither of these would be issues except for the fact that 1) software updates often slow down to a crawl or stop (Ubuntu PPAs alleviate this issue but you are at the mercy of the maintainer) and 2) support even for LTS versions has never matched the longevity of Windows support. As someone who has used an LTS to end-of-life and simply never reinstalled (I get no updates on my netbook anymore), I will say that it sucks because all the repos die. Unlike Windows, it is a pain in the ass to install “new” software (really old versions that are supported on your OS but must be manually compiled because the repos are gone) due to hunting down dependencies and compiling manually. Even if Win 95 isn’t supported anymore, it is still easy to install software so long as you get the installer.

Half-Rolling Release Model – This has the fewest problems for Linux. In fact, I would consider this to be a very good tradeoff if it weren’t for one issue – software updates tend to stop coming along with the core updates. So while Windows and OS X are getting the latest versions of software, Linux ends up lagging behind. Chakra Linux is the only candidate that seems to do this right BUT they are not a stable distro. Their main issue is that they still haven’t released a GUI updater that is able to handle major upgrades without user intervention. Everything is still command line and involves editing of files (they derive from Arch so they have the same pain points of manual file editing when something in the core changes). LMDE, meanwhile, has a great updater but is based off of Debian Testing, which is embarrassingly slow at getting software updates.

Rolling Release Model – Nothing like Windows or OS X and would blow them out of the water IF it were stable. Always having the latest and greatest is an issue because you don’t know how well a given piece of software is tested. As an Arch user, I’ll give a very recent example I had and still cannot solve. Ibus was upgraded in the Arch repos to an unstable version because GNOME 3.x did something to make the previous version of Ibus incompatible with it. Of course, the unstable version is… well, UNSTABLE. It literally does not work in all but one use case (Japanese Anthy) and even then, that is the only part of the software that works. Every other feature (configuration, appearance, etc.) is broken. Solution? None. You have to find a way to roll back to a stable version while resolving the dependency rollback yourself. All of this is manual and all of this is painful. Last point to make is that rolling release distros tend not to have any sort of easy to use graphical updater due to the nature of the model.
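
For reference, the least painful rollback path I know of on Arch is to reinstall the old package straight from pacman’s cache and then pin it so the next -Syu doesn’t pull the broken version right back in. The filename below is a placeholder for whatever version you still have cached (and anything that got bumped along with it, like ibus-anthy, needs the same treatment):

    # Reinstall the previous ibus build from the local package cache
    sudo pacman -U /var/cache/pacman/pkg/ibus-1.4.2-1-x86_64.pkg.tar.xz   # placeholder version

    # Pin it by adding ibus to the IgnorePkg line in /etc/pacman.conf:
    #   IgnorePkg = ibus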

You can see that Linux is tantalizingly close to having a great model for releases, but none of them are perfect. They all fall short in major ways that make both Windows and Apple stand out as competitors. But this isn’t the worst part about Linux support. What really stings is when the support for your DE dies before your distro does. Specifically:

GNOME – Support for 2.x ended, and while it is stable, it isn’t bug free like Windows Explorer or OS X. This includes software like Nautilus. While Debian Wheezy will have GNOME 3.x, Squeeze still has 2.x and there likely won’t be any way to upgrade it without upgrading the entire OS. MATE is an attempt at keeping this alive but that isn’t exactly bug free either. You can make endless comparisons about how Windows is worse than even GNOME 2.x, but all users of XP, 7, and eventually 8, will know that bug fixes won’t suddenly end before the OS itself is unsupported. This tying together of distro and DE support is something GNOME does not have, and because of it, they should take end user usage into consideration instead of selfishly doing what’s best for the GNOME project. Hence why I still believe GNOME 3.x should be an alpha/beta instead of an actual released DE.

KDE – This is where the 3.x to 4.x move bothers me. Full rewrites are never bug free and often much buggier for a few years after release. The fact that KDE pulled the same thing as GNOME and released a fairly unstable 4.x line without maintaining the 3.x line simultaneously is something that really would have screwed end users. Again, they have no distro to tie support to (NetRunner and Kubuntu are close as they are sponsored by Blue Systems but there is no official KDE distro) so they really should consider end users over their own project’s interests. Trinity did pick up the 3.x line to continue support, but just like MATE, it isn’t exactly a good alternative except in the short term. Users will not want to deal with the hassle of converting to Trinity or MATE until the new line (4.x and 3.x respectively) becomes stable a few years down the road.

But there is hope. Outside of the big 2 (GNOME and KDE), there are other distros maintaining their own DE which means updates forever on their LTS versions. Unity, Cinnamon, and Pantheon all fall under this and for any user using those three, being able to see the DE slowly improve without having to upgrade the base OS is great. However, the keyword here is improve. As mentioned earlier, none of these DEs are stable enough to be suggested as alternatives to Windows or OS X.

Then there are the simple, stable DEs – XFCE, LXDE. Both of these are pretty distro independent and their updates are fully dependent on which release model you follow. But as mentioned earlier, they aren’t good candidates because they don’t offer a clear improvement over Windows or OS X.

So there you have it. Why Windows and OS X beat out Linux even after decades of innovation. I still have hope that the community will eventually get it right (heck, even I’ve been trying to find the right combination of software to make an unbreakable XFCE Archlinux desktop) but who knows when that will happen. For now, I’ll continue following these distros in hopes that one of these will make that next big step into being THE perfect Linux distro. One with all of the right pros and none of the crushing cons.

  • Linux Mint / Linux Mint Debian Edition w/ Cinnamon DE
  • Ubuntu w/ Unity DE
  • Chakra Linux w/ KDE
  • Elementary OS w/ Pantheon DE

I have been following all of those since they’ve been created and the future looks bright. Let’s just hope it isn’t far away.

 

Google Fiber – A boon for small towns

Edit: So I read up on Google Fiber more thoroughly and it turns out it is only in Kansas right now. So ignore the petition part of my post. The rest of what I’m saying still makes sense though.

So Google Fiber was announced today and I have to say, it is amazing. If you take a look at the price plans, you will be amazed at what you can get for the price. Google is making a very smart move here by providing the “free” option. It will undercut any provider not named Verizon (fios) and press the telcos to respond accordingly. This is great for competition and even better for us consumers.

Even after looking at the fine print, this is a damn good deal. The free option is a one-time payment of $300 or $25 a month for 12 months. Include the $10 you need to sign up to petition for Google Fiber and it will cost you $310 for a minimum (yes, stated in their TOS) of 7 years of internet at 5 Mbps speeds. I’m sure most rural places without any competition (meaning overpriced, shitty telcos) would be dying to have this. It comes out to less than $50 a year for 7 years.
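
To spell out the math: $300 up front (or 12 × $25 = $300) plus the $10 sign-up fee is $310, and $310 spread over the 7-year minimum works out to roughly $44 a year.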

The other benefit is that you get 5 Mbps speeds on a fiber network, which means you won’t be capped. You will get actual 5 Mbps speeds, unlike what you get from most broadband providers today. For example, Optimum Online advertises some 10 Mbps speeds but you can never go past a sustained 3 Mbps. It’s hilariously misleading but legal because they say “up to 10 Mbps”.

So what’s the problem? You need to get most or a large chunk of your town to commit to the $10 fee to petition for Google Fiber to come to your town. To be effective, you need the local government to get involved and push to get everyone to sign up (a close to 100% conversion rate would definitely make Google want to pick up your town). I can see this being extremely beneficial economically for small towns. They can break free of the telco monopolies and it would save residents a ton of money. It will also bolster certain businesses by allowing them to utilize the internet as part of their establishment.

Going for coffee at the local cafe? Free wifi.
Eating lunch at your local pizza joint? Free wifi.
Waiting at the local bank? Free wifi for your phone.

Tons of ways to make life super convenient. Having so many free wifi spots around town will also allow residents to save money on cell phone fees by avoiding data plans. Overall, residents get to save a ton of money which can then be spent on local businesses (who are now better off).

So I guess what I’m trying to say is, get your town to sign the fuck up y’all!

The new mintBox – is it worth it?

The Linux Mint team, in conjunction with CompuLab, has released a new SFF PC named the mintBox. It has a small (nettop-sized) SFF design centered around an all-metal case that allows the PC to be cooled without fans. This near-silent design (the hard drive is not an SSD but can be replaced with one on your own) puts this PC ahead of many other Linux computers simply because it goes one step further than what you can get with off-the-shelf parts. The premium you are paying for hardware is no longer the cost of labor – it is the cost of pushing unique Linux PCs to the market.

Of course, this sounds great but everyone is going to want to know the price. Is the markup too high? Well this post’s aim is to give you a comparison between the mintBox, similar nettops, the Mac Mini, and DIY solutions. Hopefully, it will give you enough information for you to make your own decision about whether or not the mintBox is priced correctly.

The mintBox Basic and Pro

Here’s the price and spec breakdown of the two versions being offered (all of this is straight from the Linux Mint blog post linked above):

| mintBox Basic | mintBox Pro |
| --- | --- |
| APU G-T40N (1.0 GHz dual core + Radeon HD 6290 – 9W) | APU G-T56N (1.65 GHz dual core + Radeon HD 6320 – 18W) |
| 2 RAM Slots Available – 4GB RAM | 2 RAM Slots Available – 8GB RAM |
| Bay for 2.5″ SATA HDD – 250GB HDD | Bay for 2.5″ SATA HDD – 250GB HDD |
| Flat metal case | Ribbed metal case |
| Dual-head display HDMI + DisplayPort | Dual-head display HDMI + DisplayPort |
| Digital 7.1 S/PDIF and analog 2.0 audio, both input and output | Digital 7.1 S/PDIF and analog 2.0 audio, both input and output |
| Gigabit Ethernet | Gigabit Ethernet |
| WiFi 802.11 b/g/n + BT combo with dual antennas | WiFi 802.11 b/g/n + BT combo with dual antennas |
| 2 USB3 ports + 6 USB2 ports | 2 USB3 ports + 6 USB2 ports |
| 2 eSATA ports | 2 eSATA ports |
| 2 mini-PCIe sockets / 1 mSATA | 2 mini-PCIe sockets / 1 mSATA |
| Serial RS232 port | Serial RS232 port |
| $476 + shipping, duty & VAT | $549 + shipping, duty & VAT |

Note: The two versions differ only in APU, amount of RAM, case design, and price.

 

Looking at these specs isn’t all too helpful though. We need to be able to compare this to similar PCs so let’s go shopping!

Barebones Nettop Kit

Doing a quick search on Newegg, we can easily find barebones nettop kits with similar CPU/GPU specs. Foxconn is the cheapest of these. Asus and Acer also offer similar components for slightly more (though usually not barebones). For now, I’ll use Foxconn as the baseline comparison since that is the cheapest you can go, giving us a good minimum baseline for prices.

These two nettops are the closest to the specs of the mintBox Pro and run about $160 on average:

Since I’ll be using the Foxconn nt-A3700-0H0WBANA in the comparison build, let’s list this as $150.

Next, we can look for some RAM and a hard drive. Currently, RAM is fairly cheap but hard drives are still higher than pre-flood prices. Looking at our options, this seems to be a fair choice of components to compare with:

Grand Total: $237

SFF DIY Solution

We aren’t done yet! Next, we will want to look at a custom DIY solution to make a similar SFF PC – ideally a quiet build similar to the mintBox – and see what kind of components we can get. There are several sites with great information on building such systems, often for HTPC use, such as SilentPCReview, [H]ard|Forum, and AVS Forum. For the comparison build, I will base my parts list off of this Habey HTPC build. However, I will swap the case with a slim version and remove the slim DVD drive used in the build. Fudging some of the parts for a closer comparison to a Mac Mini (pretty sure these parts will still work together) gives us a parts list of:

* Another case option is the Viako cases but I’m not sure about their availability and am too lazy to look up builds using these cases. They do look very spiffy though!
** 120W version is only $4 more – I’m not building this thing so I’m allowing myself some fudge numbers.
*** I’m aware this is no longer sold at Newegg but I did use this in a build a year ago so I know the previous price. I also picked this over the better Intel motherboard because it has built in wifi (Intel one would require a USB wifi dongle).
**** It is also possible that you may need a better CPU heatsink and fan (low profile one that is quieter), but I’m leaving that out since this is just a theory build. I assume the default CPU heatsink and fan included with the i3 is a low profile one.

Grand Total: $455

Comparing

Finally we can compare! From the Mac Mini lineup, I will be using the default $599 configuration (cheapest price) as upgrades will quickly shoot up the price. Also, before we compare, I’d like to point out that 10% of the mintBox price is a donation to Linux Mint. So the actual prices break down like this:

  • mintBox Basic – $428.40 + shipping, duty & VAT + $47.60 donation
  • mintBox Pro – $494.10 + shipping, duty & VAT + $54.90 donation

Now let’s put everything in that giant table and compare!

| Features | mintBox Basic | mintBox Pro | Foxconn Barebones | Habey DIY | Mac Mini |
| --- | --- | --- | --- | --- | --- |
| CPU | APU G-T40N (1.0 GHz dual core) | APU G-T56N (1.65 GHz dual core) | AMD E-450 APU (1.65GHz dual core) | Intel Core i3-2105 (3.1GHz dual core) | Intel Core i5-2415M (2.3GHz dual core) |
| GPU | Radeon HD 6290 | Radeon HD 6320 | Radeon HD 6320 | Intel HD Graphics 3000 | Intel HD Graphics 3000 |
| Power Consumption | 9W (8W idle, 17W load) | 18W (9W idle, 24W load) | ? | ? | |
| RAM | 2 RAM Slots Available – 1x4GB = 4GB | 2 RAM Slots Available – 2x4GB = 8GB | 1 RAM Slot Available – 1x4GB = 4GB | 2 RAM Slots Available – 2x4GB = 8GB | 2 RAM Slots Available – 1x2GB = 2GB |
| Hard Drive | Bay for 2.5″ SATA HDD – 250GB HDD | Bay for 2.5″ SATA HDD – 250GB HDD | Bay for 2.5″ SATA HDD – 250GB HDD | Bay for 2.5″ SATA HDD – 250GB HDD | Bay for 2.5″ SATA HDD – 500GB HDD |
| Case Design | Flat metal case | Ribbed metal case | Plastic nettop case | 3mm thick aluminum case | Unibody aluminum case |
| Dimensions | 6.3″ x 6.3″ x 1″ (16cm x 16cm x 2.5cm) | 7.48″ x 6.3″ x 1.575″ (19cm x 16cm x 4cm) | 7.48″ x 5.31″ x 0.98″ | 9.0″ x 8.0″ x 2.25″ | 7.7″ x 7.7″ x 1.4″ |
| Case/CPU Fans | No | No | Yes | Yes | Yes |
| Video Outputs | Dual-head display HDMI + DisplayPort | Dual-head display HDMI + DisplayPort | HDMI + DVI (DVI to VGA adapter included) | HDMI + DVI + VGA | Dual-head display HDMI + Thunderbolt (HDMI to DVI adapter included) |
| Audio | Digital 7.1 S/PDIF and analog 2.0 audio, both input and output | Digital 7.1 S/PDIF and analog 2.0 audio, both input and output | Line-out jack (supports S/PDIF out) + frontside headphone/mic jacks | 3 audio ports (8 channels – Realtek ALC892), speaker, headphone, mic | Audio line in/out minijack, headphones + built-in speakers |
| Ethernet Ports | Gigabit Ethernet | Gigabit Ethernet | Gigabit Ethernet | Gigabit Ethernet | Gigabit Ethernet |
| Wifi | WiFi 802.11 b/g/n + BT combo with dual antennas | WiFi 802.11 b/g/n + BT combo with dual antennas | 802.11 b/g/n – single antenna | Wi-Fi 802.11b/g/n – dual antennas | 802.11n Wi-Fi wireless networking; IEEE 802.11a/b/g compatible – no antenna |
| USB Ports | 2 USB3 ports + 6 USB2 ports | 2 USB3 ports + 6 USB2 ports | 2 USB3 ports + 4 USB2 ports | 2 USB3 ports + 4 USB2 ports | 0 USB3 ports + 4 USB2 ports |
| eSATA Ports | 2 eSATA ports | 2 eSATA ports | 0 eSATA ports | 1 eSATA port | 0 eSATA ports |
| FireWire Ports | 0 FireWire ports | 0 FireWire ports | 0 FireWire ports | 0 FireWire ports | 1 FireWire port |
| Thunderbolt Ports | 0 Thunderbolt ports | 0 Thunderbolt ports | 0 Thunderbolt ports | 0 Thunderbolt ports | 1 Thunderbolt port |
| Card Reader | None | None | SD/SDHC/MS/MS Pro/MMC 5-in-1 card reader | None | SDXC card slot |
| Bluetooth | Yes | Yes | No | No | Yes |
| External Extras | Serial RS232 port | Serial RS232 port | Includes a VESA mount and stand | Optical S/PDIF out | Apple Remote (optional upgrade) |
| Internal Extras | 2 mini-PCIe sockets / 1 mSATA | 2 mini-PCIe sockets / 1 mSATA | 1 mini-PCIe socket (wifi card) | 1 mini-PCIe socket (wifi card) / PCI Express 2.0 x16 / 2 x SATA 6Gb/s / 2 x SATA 3Gb/s | ? |
| Most Noticeable Feature | Fanless case design | Fanless case design | Cheapest | Upgradable, customizable | No power brick, built-in speakers, unibody case, upgradable, eGPU capable |
| Total Cost | $428.40 + shipping, duty & VAT + $47.60 donation | $494.10 + shipping, duty & VAT + $54.90 donation | $237 | $455 | $599 + shipping |
| Markup (no donation, based on mintBox Pro) | $-65.70 | $0 | $-257.10 | $-39.10 | $104.90 |

Note: The “Most Noticeable Feature” row calls out what stands out about each system in comparison to the others. Obviously based on my own opinion.

Drawing Conclusions

So what do we learn from this gigantic table? I have no idea… yet. I would like to make a comparison between the Foxconn build and the mintBox Pro, as that paints a clearer picture of what extras you are paying for.

First off, the markup is about $257. Now let’s see what is missing from the barebones setup that you get in a mintBox Pro:

  1. extra mPCI-e slot (2 instead of 1, just remember in both systems that 1 of these is the wifi card – or at least I assume the mintBox to be using one of these for wifi)
  2. Bluetooth
  3. mSATA
  4. 2 eSATA ports
  5. 2 extra USB2 ports
  6. Display Port
  7. Dual wifi antennas
  8. extra RAM slot and an extra 4GB of RAM
  9. Fanless case design (I have an older Atom Foxconn and it is loud!)

Note: You do lose out on a card reader.

Now in that list, I’d say 1-7 aren’t worth more than $40. The RAM (extra slot and 4GB) is a good extra and I can see demand pushing this to be worth ~$30. Then you have the biggest differentiator: the case design. It is fanless, sturdy (metal, not plastic), and looks sleek. Knowing the prices of good SFF cases, I’d say the difference between a low quality and a high quality case can be at least $40-50. This cuts the markup to about $147, so let’s say $150.
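
To spell out that arithmetic (taking the low end of the case range): $257 − ~$40 (items 1-7) − ~$30 (extra RAM slot + 4GB) − ~$40 (case quality) ≈ $147.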

So what is in this $150 markup? Well, there’s CompuLab’s R&D, labor costs, and profit share, but is that really worth $150? Probably not; I’d say a more reasonable markup would be $75 or $100. Then again, I can’t really blame a small company for wanting more money after taking a risk and pushing a new type of Linux PC – well designed, custom, and branded. Usually Linux PC makers don’t have custom designed cases; we are usually just paying for labor and testing.

Verdict: Price is a tad bit high but that’s a fair price to pay for trying to break into a market. Also, if you are an early adopter, it should be expected that you will be paying more than those who decide to purchase a couple months later. This is just my opinion though; what’s yours?
