Blu-ray: It’s nice, but not perfect

During the Hi-Def disc war, I reviewed the features each side supported. But over time, I came to the conclusion that Blu-ray had limitations. Below are my thoughts on the subject.


Early Blu-ray adopters ended up with problems. The standard changed so much that original Blu-ray players would not play some later-produced discs. Some players got firmware updates that solved the problem, but others were simply left to rot, creating a lot of electronic waste.

This brings up the fact that for years after Blu-ray won the Hi-Def disc war, the standard kept changing. I am not referring to 3-D or additional lossless audio (unless the disc was made without a backward-compatible audio track).

One prime example appears to be Avatar (and NOT the 3-D version). Without a reasonably new Blu-ray player (or a firmware upgrade), Avatar simply would not play. Years after the video’s release it’s not a problem, but it was when Avatar 2-D first came out.


Now current Blu-ray players (2011 and on) almost always require Internet access, if only for new firmware. Having a device (any device) connected to the Internet on a regular basis is really a poor idea. Even with good firewalls and no open incoming ports, having your Blu-ray player announce itself to the world by making unwanted network connections leaves me with serious concerns.

Certain things like extra content might be nice to have. Except then the content providers can track not only your IP address, but also the specific video disc you are playing, and when. Talk about a lack of privacy.

Even without the privacy concerns, man-in-the-middle attacks become possible. Do you really believe that the Blu-ray player firmware writers are top-notch? And care about security? Or privacy?

Skipping those issues, how about an actual breach at the studio? As if that has never happened (Sony, anyone?). Or a studio-initiated rootkit (again, looking at you, Sony!).


Old DVD players rarely needed a firmware update; I have seen it happen, but it was rare. If Blu-ray players start doing automatic firmware updates without your consent, they could end up ‘bricking’ the player.

Unless the manufacturer warrants its players against firmware updates that ‘brick’ them, I would not trust automatic updates. And I mean warrant the firmware update for as long as 10 years; my old DVD player lasted that long.

Looking at the computer industry as an example (and what else is a Blu-ray player except a computer?), we should see one of two things. First, allow two firmware images in the player with a recovery method to “boot” off the alternate (prior) image. Or second, the ability to perform a manual firmware update using an older image. Perhaps even allowing the existing firmware to be written to a Blu-ray disc! (Or a USB flash drive.)
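The first option can be sketched out. This is only an illustration of the A/B-slot idea in shell; the function and the health flags are my own invention, since a real player would do this in its bootloader, with the flags kept in NVRAM:

```shell
# Sketch of the dual-image ("A/B slot") recovery idea.
# Hypothetical: a real player would do this in the bootloader,
# with health flags stored in NVRAM rather than shell variables.

# choose_slot CURRENT_OK ALTERNATE_OK -> prints which image to boot
choose_slot() {
    current_ok=$1     # 1 if the newly flashed image booted and verified
    alternate_ok=$2   # 1 if the previous image is still intact

    if [ "$current_ok" -eq 1 ]; then
        echo "current"      # new firmware is healthy, use it
    elif [ "$alternate_ok" -eq 1 ]; then
        echo "previous"     # new firmware failed: fall back, player not bricked
    else
        echo "recovery"     # both bad: ask for a firmware disc / USB stick
    fi
}
```

With a scheme like this, a failed automatic update falls back to the previous image instead of bricking the player.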

Some people may wonder at that extra effort and simply say that their Blu-ray player is not that important. Except when they have just rented or purchased some movies on Blu-ray and can’t use them due to “bricking”. Especially if that happens on a holiday weekend, or the Blu-ray player is out of warranty.


Another issue is that, in theory, Blu-ray player manufacturers can create new firmware that blacklists a highly pirated disc, and thus refuses to play it. This is fine except that the user may well have bought the disc from a reputable store. Which, when you come to think about it, is likely: how else would a pirated Blu-ray disc get wide enough distribution that the Blu-ray people (studios or content owners) would want it blacklisted?

So they “punish” the users who bought from a reputable store, instead of going after the so-called reputable store (or, more likely, its supplier). In fact, it would be easier to go after the store, supplier, or distributor than to go after the end users. After all, the supposed “reason” for this copy protection is money, right? Get the money, and the copyright owner should not care what the end user has.


I am all for digital, in that the information on disc is digital, its processing is digital, and it is then transmitted to the display and amplifier as digital. So HDMI sounds like a good idea. The cable is not too thick, and the connector, while not perfect, is reasonable.

However, in order to meet some licensing requirement(s), something called HDCP (High-bandwidth Digital Content Protection) has to be used. Again, I would have no objections if it worked right. Basically, I get occasional flashes on the screen. This seems to indicate that HDCP failed for a fraction of a second, dropping the video frame and replacing it with a static color frame.

HDCP should have a method to delay failure. Meaning, if it thinks its handshake is failing, give it a second or two to try again before it aborts and gives the user garbage. With this fault, we actually return BACK to the VCR era with its video tracking problems.
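The retry idea is simple enough to sketch. Everything here is a stand-in (the handshake() stub just simulates a link that glitches twice), but it shows the behavior I would want: hold the last good frame and retry for a second or two before giving the user garbage:

```shell
# Sketch of the "retry before failing" behavior I would like HDCP to have.
# handshake() is only a stand-in, simulating a link that glitches twice
# and then recovers.

HS_ATTEMPT=0
handshake() {
    HS_ATTEMPT=$((HS_ATTEMPT + 1))
    [ "$HS_ATTEMPT" -ge 3 ]     # fails twice (a glitch), then succeeds
}

hdcp_negotiate() {
    tries=0
    while [ "$tries" -lt 3 ]; do
        if handshake; then
            echo "ok"                   # picture stays up; user never sees the glitch
            return 0
        fi
        tries=$((tries + 1))
        sleep "${RETRY_DELAY:-1}"       # hold the last good frame instead of blanking
    done
    echo "failed"                       # only now tell the user something is wrong
    return 1
}
```

With a grace period like that, a momentary handshake hiccup would never reach the screen.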

Also, it appears that HDCP takes a noticeable amount of time to handshake. But it should take no time whatsoever. Why should it? Most people set up their home theater (or simple HDMI T.V.) and don’t change it for months. During that time they may watch hundreds of hours of video that should not have any problems.

For example, each display should support 5 or so HDCP-enabled sources. So when a source becomes the active one, it should take no time whatsoever to bring up its display. We can even allow one slot of the HDCP cache to be used for on-the-fly handshaking as it works today, thus still supporting portable sources.


Current Blu-ray discs support 25GB and 50GB (dual layer) formats. This seems to be plenty of storage. So much that we could easily put entire seasons of older SD (Standard Definition) T.V. series onto a single Blu-ray disc. We could even reap some space reductions by using MPEG-4 encoding over MPEG-2, getting between twice and four times as much on a Blu-ray for that single reason. Remember, T.V. series from before 2000 are generally SD even if they went to wide-screen. (I was surprised that some of the Babylon 5 seasons were wide screen, as they came out in the early ’90s.)
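Some back-of-the-envelope arithmetic supports this. The bitrates are rough assumptions on my part (about 2 Mbit/s for an SD MPEG-4 encode, roughly half a typical DVD MPEG-2 rate):

```shell
# Back-of-the-envelope: how many 45-minute SD episodes fit on one 50GB disc?
# The 2 Mbit/s MPEG-4 bitrate is a rough assumption.

minutes=45
mbit_per_sec=2
disc_gb=50

episode_mb=$(( minutes * 60 * mbit_per_sec / 8 ))   # megabytes per episode
episodes=$(( disc_gb * 1000 / episode_mb ))         # episodes per disc

echo "Each episode: ${episode_mb} MB"
echo "Episodes per disc: ${episodes}"
```

That works out to roughly 675 MB per episode and around 74 episodes per disc, or three-plus full seasons of a typical 22-episode series on a single Blu-ray.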

Except that it appears Blu-ray is being targeted at Hi-Def only. Some older T.V. series like Star Trek the Original Series are being re-released on Blu-ray. Yet they went and “improved” STOS. To be fair, the changes appear to be nice for STOS (as the original planets looked kind of bad by today’s standards). But we are unlikely to see such re-releases for the tens of thousands of SD T.V. series that have already been released on DVD.

Of course, one way to avoid disc changing for a T.V. series is to use a media server and media player. In some cases a Blu-ray player will actually play network-accessible videos and music. So, taking the DVDs, trans-coding them from MPEG-2 to MPEG-4 (to cut the file size by more than half), and putting the entire series on your media server means it takes seconds to go from one episode to the next.
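As a sketch of that trans-coding step, something like the following loop, using ffmpeg with H.264 (MPEG-4 AVC) video. The file names and the -crf 20 quality choice are my assumptions, not the only reasonable ones, and by default the loop only echoes the commands so you can review them first:

```shell
# Sketch: trans-code ripped MPEG-2 episodes to H.264/MPEG-4 for a media server.
# Assumptions: rips are named episode_*.vob; -crf 20 and copied audio are my
# usual choices. DRY_RUN defaults to "echo" so commands are printed, not run;
# set DRY_RUN= (empty) to really trans-code.
DRY_RUN=${DRY_RUN:-echo}

transcode_all() {
    for vob in episode_*.vob; do
        [ -e "$vob" ] || continue        # skip if no rips are present
        out="${vob%.vob}.mp4"
        $DRY_RUN ffmpeg -i "$vob" -c:v libx264 -crf 20 -c:a copy "$out"
    done
}
```

Run it once in dry-run mode, check the printed commands, then run it for real and point the media server at the resulting files.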

Yet the same thing then applies to HD T.V. series released on Blu-ray. If they came on more than one Blu-ray disc, we could copy them to a media server as well, thus eliminating the disc change. Except that the Blu-ray people think this is “bad” or “illegal”, and use what they think is serious copy protection to prevent it.

Gnome over: I gave up on Gnome

There have been many articles and comments about Gnome 3.x and its new way of doing things.

Reading what others have had to say about Gnome 3.x and having to learn a new way of doing things, I had to think twice about upgrading. Fortunately I had time, as Gentoo Linux did not force me to make a decision until I had an opportunity to review my options.


Reviewing my use of Gnome 2.x, I found that it was slowing down with each new sub-release. A tiny bit perhaps, but enough that over the years I noticed it. And that’s using it with faster computers.

Plus, every time I updated my computers (some use Gentoo Linux), I had to worry about new packages being required just to maintain existing usage. Thus, it got bloated to the point that re-loading a desktop took most of a day, both to download the Gnome 2.x packages’ sources and to compile them (Gentoo Linux builds most or all packages from source).


At one job I used Evolution as my mail client. It worked, perhaps a bit slowly but it worked. I got used to it and lived with it, as the alternative was to use MS-Windows XP on my work desktop. A while later that employer moved to a new mail system, which allowed me to use Thunderbird.

That made E-Mail much faster and easier to deal with. Except I could not seem to get rid of Evolution (it takes forever to compile on Gentoo Linux). Gnome 2.x seemed to require it, and if I forced Gentoo Linux not to load it, some other applications broke. Eventually I was able to make Gnome 2.x not run it (either the Evolution server or client).
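For the record, the “forcing Gentoo not to load it” attempt looked roughly like the following. The package and USE-flag names here are from memory and may not match your Portage tree, so check the output of equery uses first:

```shell
# /etc/portage/package.use  (illustrative only; verify flag names with
# `equery uses gnome-base/gnome` before relying on them)
gnome-base/gnome -evolution
```

This is the standard Portage mechanism for turning off an optional dependency, but as noted above, in my case it broke other applications.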

That helped, but pointed out something else I needed to work around. Nevertheless, it still had to be on-disk (and compiled when needed).


Gnome 2.x had many configuration options. Except I could not find one to change the login background screen, or the screen-saver unlock screen. Whatever that green mess was, I found it ugly and distracting. I would prefer a simple, fast login and unlock screen.

Over the years I looked and never did find a real solution to this problem. So I simply continued to overwrite the green mess with a nice picture of Jupiter. Any time a Gentoo Linux update backed out my change, I’d have to go back in and fix it. Annoying, but straightforward, and I clearly documented how to perform the fix.


Other configuration options to make the changes I wanted existed, and simply needed to be applied with each new OS / Gnome load. Though one thing I could never find an option to change was the menu bar’s three items. The first was the normal menu and the last was system actions (as I recall). But the middle one was labeled “Places” and had things in it I never used, or could work around if “Places” did not exist. Annoying.

Eventually I had to document all my changes so that I would not have to look them up each time. For example, I never use the desktop icons (computer, home, etc…), which I consider an outdated way of working. So I would prefer never to see those icons and have a blank screen (except for the tool bar).

This worked well, except indications are that Gnome 3.x removed or changed many customization features. Another reason to pursue other options.


One thing in particular that I could not solve with Gnome 2.x was the login delay. If my broadband modem was on and the Internet accessible, then the login process was reasonably quick. But if the Internet was not accessible, there was about a 60-second delay between the time I entered my password and the time the login process started opening my applications.

This particular problem was annoying, as my home office security keeps the broadband modem off except when I need it. Further, my netbook may not have a network connection when I want to use it to pull up the info I have loaded on it. All in all, I ran across this login delay problem thousands of times. Looking back, I can’t even remember when it started showing up. But it definitely did not occur early on with Gnome 2.x.

Many searches of Gnome 2.x login bugs and configuration options led to some known issues. For example, “localhost” needing to be first in the “/etc/hosts” file, along with the real name of the box. For example:

/etc/hosts
127.0.0.1 localhost mybox
10.10.10.10 mybox

That did not help in my case.

Next, the obvious problem could be that DNS was searched first, and then hosts. But no, I never set up my boxes with DNS first. It’s always:

/etc/nsswitch.conf
hosts: files dns

Last, I could have used port mirroring on my Ethernet switch and then used “tcpdump” or Wireshark to grab the network packets to see what was really going on.
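For completeness, that capture would have looked something like this; the interface name and filter are examples only:

```shell
# Capture login-time traffic arriving on the mirror port (run as root;
# adjust the interface name and filter for your own setup):
#
#   tcpdump -i eth0 -n -s 0 -w gnome-login.pcap not port 22
#
# Then open gnome-login.pcap in Wireshark and look for name-service
# lookups that time out around the 60-second mark.
```

With the packets in hand, a lookup retrying until it times out would stand out immediately.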

But why should I have to go to such great lengths for something that should just work?

In the end, I put up with the issue until Gnome 3.x pushed me over the edge and gave me the opportunity to solve this problem.


An odd rumor about Gnome 3.x is that you would have to log out before using the “shutdown” command; otherwise it would cause problems. With Gnome 2.x and other window managers, you don’t need to do that. In fact, it’s a bit of security to be logged into the console, meaning you are in front of the box, not remote.

This would have been an annoyance, if true. I routinely shut down my desktop(s) while logged into their GUI, either from the shutdown menu or simply with:

 shutdown -h now

After the controversy about Gnome 3.x hit a threshold, several Gnome 2.x forks and look-alike / work-alike projects came forth. Most were not available across Linux distributions in general, being targeted at specific distros. That seemed reasonable at first, so I waited to see if one would be better supported than another, as well as how many distros would support which Gnome 2.x work-alikes.

No specific winner appeared during the time I monitored the situation. They all looked like nice projects, but it would’ve been better if a Gnome 2.x fork had simply existed and continued. In the end, with too many choices and none having widespread support, I had to dismiss them all and pursue other options.


When it came down to it, I chose to use XFce4. My current desktop is reasonably powerful, and my other desktop boxes have enough horsepower to run fast even with a graphics-heavy desktop. My miniature, always-on server does not run a GUI, both for security and because it’s not very powerful. Thus, it has no need to load or run Gnome (or any GUI) at all.

So, continuing to use Gnome did not seem like it would be a problem from a CPU/GPU performance standpoint. However, I have a cheap netbook (Asus EeePC 900) I use on-site at times. It was slow starting Gnome 2.x, but was quite usable after bringing up the desktop. Going to Gnome 3.x sounded like a disaster as far as speed was concerned. (If Gnome 3.x would even work, as it sounds like it needs special graphics drivers.)

Further, XFce4 seemed more lightweight and could be made to work just the way I had used Gnome 2.x. Certain things worked differently, but with very little effort I got things working great. And re-loading my netbook did not take forever. (I wanted to make sure all traces of Gnome that I did not need were gone, as I have limited Flash disk space on that netbook.) XFce4 had the side benefit of taking up less than half the disk space that Gnome 2.x did. That made the decision to use two separate boot environments for doing upgrades easier.

All in all, an easy transition with minimal impact, as opposed to what a Gnome 2.x to Gnome 3.x transition seemed like it would be. Plus, my netbook runs the same programs and GUI as my main desktop, and fast. So fast that it is a reasonable stand-in for my main (quad-core AMD64) desktop when using a separate video monitor, external keyboard, and mouse.


There comes a time when newer is not better. In my opinion, Gnome 3.x may be useful for new users, but it really should have been called something else (Gnome NG?), with Gnome 2.x continued as something backward compatible.


Edit 2017/03/12: After using XFce4 for several years, it’s worked out quite well, even to the point of forgetting about Gnome (2.x or 3.x). Thinking back, I wish I had moved over sooner. Sometimes using what many people are using is not the best course.