Friday, December 30, 2011

How to fix blank/default icons on Windows 7 Task Bar (oops!)

My office is in the process of converting from Windows XP to Windows 7 and I am one of the early testers.  One problem I ran into was that Acrobat PDF files would open up, but without the proper Acrobat icon on the Task Bar.  Normally you could simply right-click on the Task Bar icon, then right-click again on the program name ('Adobe Acrobat') and then click on 'Change Icon' to fix it.  But because of the way Acrobat installed, the 'Change Icon' link was greyed out.

When I googled this situation I found lots of varied suggestions, none of which seemed to apply.  What seems to have worked, fortunately, was the following procedure, which I believe triggered a refresh of the cached icon image for Acrobat.


  1. Located the original 'Acrobat.exe' in C:\Program Files\Adobe\Acrobat 9.0\Acrobat.
    (I noted that it had the correct icon, but the icon was missing from the Task Bar once opened.)
  2. Created a shortcut to the .EXE.
    (The shortcut also had the correct icon.)
  3. Opened the shortcut, and it showed the correct icon on the Task Bar!
  4. At this point I temporarily pinned the app to the Task Bar.
  5. Tried opening some PDFs and found that they opened in the pinned app, with the correct icon.
  6. Unpinned the app from the Task Bar.
  7. Opened some PDFs again, and the icon was still correct.

UPDATE: And just like that it's gone...  I thought I had fixed it this morning but now it's broken again.

Wednesday, December 21, 2011

Convert phone contacts to Gmail

I just helped a co-worker convert all her built-in phone contacts to Gmail contacts, in preparation for getting a new phone.  (This was on an older Android smartphone.)  The trick is sort of counter-intuitive -- you need to disable the built-in contacts rather than the Gmail ones!

FYI - Here's where I found the solution (though it's not written as clearly there as I hope to manage below):

  1. (This might not be necessary?)  Open your Contacts app, click the menu button > More > Import/Export Contacts > Export to SD card.  This creates a .VCF file in the root folder of the SD card.
  2. Again click the menu button > Display Options, and disable the view of all built-in Phone contacts.  Leave only your Gmail account selected.  Don't freak out when the Contacts list goes blank :-)
  3. Again click the menu button > More > Import/Export Contacts > Import from SD card > Import from VCF.

At this point all the contacts should re-appear, and now they'll be marked (via the tiny icon to the right of each entry) as both 'G' and 'P' for Gmail and Phone.  Log in to your Gmail account, click on the drop-down next to the big MAIL link (in the top-left) and select Contacts.  Everything on the phone should now be there!
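For the curious: the exported .VCF is just a plain-text vCard file, one small block per contact.  A minimal, entirely made-up entry looks roughly like this:

```text
BEGIN:VCARD
VERSION:2.1
N:Doe;Jane
FN:Jane Doe
TEL;CELL:555-0123
END:VCARD
```

This is handy to know because you can open the file in any text editor to sanity-check the export before wiping anything.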

Wednesday, October 19, 2011

Quick fix for secure erase of used SSD

I just bought a Corsair Force3 SSD on sale at Fry's.  I saved money coming and going, in fact, as there was a mail-in rebate and the drive had been returned by someone else -- so it was discounted another $15.  These particular drives use the new 'SandForce' SSD controller, which has had lots of reported reliability problems, so they probably need to offer the rebates to move them off the shelves...  Now, why did I bother buying one??

The first thing I checked was the firmware on the drive.  'Fortunately' it was not running the current v1.3.2, which tells me any problems the original purchaser had might be fixed by a firmware upgrade.  I proceeded to install the new firmware and had no problems.  Now all that remained was a secure erase, to ensure I got the correct 'New Drive' performance.  (The write speed on SSDs gradually drops as they fill up and the free space becomes fragmented.  Since the physical memory 'sectors' are remapped to 'logical sectors', you can't use a traditional OS or application defragger.)

Most of the online instructions said to download the Parted Magic ISO, boot from it, then select the 'Erase this drive' option in the GUI.  You also need to select the 'Internal' mode (which tells the drive to run its built-in erase command) rather than 'External' mode (which manually rewrites the entire drive and just adds wear to an SSD).  I knew that Parted Magic relies on the same GParted and 'hdparm' tools found on my beloved Ubuntu, so I decided to just use an Ubuntu boot-drive I already had set up.

Here's where the problem came in, and the fix:

For whatever reason, the Ubuntu flash-boot and/or the BIOS locked the drive.  I was able to run the basic query hdparm command successfully,
 > sudo hdparm -i /dev/sda

But when I tried to run the advanced query,
 > sudo hdparm -I /dev/sda

or when I tried to trigger the internal secure-erase command,
 > sudo hdparm --security-erase NULL /dev/sda

I got an "Input/Output Error" message.

After much googling I finally found one discussion (on the Intel SSD forums) which made passing reference to either hot-swapping the drive or sleeping/waking the machine.  From my Ubuntu 10.10 flash-drive boot, I was able to click the 'Suspend' command, then power-on again, and the hdparm command now worked.
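In hindsight, that "Input/Output Error" appears to be the classic symptom of a drive the BIOS has left in the ATA 'frozen' security state, which is exactly what the suspend/resume trick clears.  A rough sketch of checking for that state first; note the hdparm output below is an illustrative sample, not captured from my drive:

```shell
# Illustrative 'Security' section as printed by 'hdparm -I' (sample text):
security_info='Security:
        Master password revision code = 65534
                supported
        not     enabled
                frozen'

# A line reading just 'frozen' (as opposed to 'not frozen') means the BIOS
# froze the drive at boot, and the erase command will fail until it's cleared.
if printf '%s\n' "$security_info" | grep -qE '^[[:space:]]*frozen[[:space:]]*$'; then
    echo "drive is frozen: suspend and resume the machine, then re-check"
fi

# Once the drive shows 'not frozen', the erase command from above should work:
#   sudo hdparm --security-erase NULL /dev/sda
```

If the check reports frozen, suspend and resume, then re-run 'hdparm -I' until the Security section shows 'not frozen' before attempting the erase.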

Sunday, September 25, 2011

Why my next computer will be Windows, not Linux

For those of you who have been following my blog (anyone?!) it may be surprising to hear that I'm abandoning Linux.  It is not a decision I am making lightly.  But after several years dealing with the limitations of Linux (specifically, Ubuntu) I have realized that the Windows 'ecosystem' really is better for me.  Let me explain...

Multimedia support
I use my computers as my primary source of video, including TV shows, movies, and my extensive collection of home videos.  On Linux the playback of high-definition video has always been problematic, requiring never-ending experimentation with replacement video drivers (the proprietary vendor 'blobs' versus the open-source drivers).  H.264 hardware acceleration still doesn't work properly with ATI video cards, even with extensive manual revision of the installed drivers.

On Windows, this multimedia support is trivial, as Microsoft has created a well-organized system (DirectX) that all the manufacturers use.

Multi-user support
Linux is inherently a multi-user OS, yet the graphical interface is poorly tested.  My wife and I share our Linux workstation, taking turns as the current login, and we can attest to these problems first-hand.  Whenever one of us connects a flash drive, the other user gets nonsensical error messages.  This is a known problem with a longstanding Launchpad bug; never fixed.  When we switch from user to user (using the longstanding 'fast-user-switch' feature) the login screen always flashes the name of the current user, then corrects itself... 'always' except for the occasional crash!  Clearly this feature has not been properly tested.  The first issue is cosmetic, the second clearly more than that, but neither has ever been fixed.

RAID support
One of the reasons I was initially thrilled with Linux was the ability to set up a software-based RAID mirror and get improved read performance (as well as peace of mind).  I used this on my combined boot/data drive(s).  But now I am switching to a single SSD for my boot/OS drive and a separate pair of mirrored data drives.  This combination has always been supported under Windows, so Linux has no advantage.  Plus, on Linux, there was no built-in GUI for checking the state of the RAID array.  I managed to find an old unsupported utility to do it, but I have no idea if it would actually warn me of a failure.  On Windows, I'm quite sure this GUI is available and supported.

Printer and Scanner support
I have had non-stop problems with both my printer and scanner under Linux.  From kernel rev to kernel rev the support for my peripherals has come and gone, unpredictably.  This has been true for my Canon inkjet, my Brother laser, and my Epson scanner.  In theory I appreciate the use of generic hardware drivers -- shared drivers are better tested than individual, per-device drivers.  But that advantage is moot if they don't work correctly on your hardware.

To add insult to injury, everything prints slower, especially the Brother laser.  It's like there's an extra layer of translation taking place during print-file generation.  Printing even a single page involves a 20-second delay, whereas printing from Windows is nearly instantaneous.

I also have a 'pro-sumer' Epson scanner with 'ICE' support for correcting scratches in photos.  But this feature is only usable through the Windows driver; there is no support for it in the generic scanner driver.

USB wake from sleep
By default, Windows lets you wake a machine from sleep mode by moving the mouse and/or typing on the keyboard; Linux does not.  It is possible to configure Linux to do the same, but it can't be set as the default?!  Instead, the best you can do is add a startup script to manually change the power settings at boot.  And the configuration of this script varies from distribution to distribution, and even from release to release (as the startup system changes or is replaced).  On Ubuntu there is a Launchpad request for either a change in the default behavior, a GUI to easily edit it, or even just a way to make a permanent change in the configuration.  It has languished for years without action.

Windows-only apps
I have also resolutely tried to use only native Linux apps.  Some are actually better than their Windows equivalents, e.g. MP3 music managers.  But I continue to run into unsolvable situations where the only functional option is the Windows version.  My scanner, as I already mentioned, can only use its ICE scratch-recovery with the native Windows driver.  My Oregon Scientific weather station is only recognized under Windows; some other models have Linux drivers, but not mine.  The only newsgroup binary reader with bandwidth control is on Windows.  Quicken is still the best checkbook app.  Outlook is still the best email/organizer app.  Yes, for many years now I've been using Evolution in place of Outlook...  And I have a handful of unresolved bugs on file to prove that Outlook works better.

For the past couple of years I've resorted to running these Windows apps in a Virtualbox VM.  In general this has been a satisfactory workaround.  But I've long suspected that the majority of my system crashes were related to this VM sub-system.  Surely it's better to run a single OS?

All these experiences have made me appreciate the relative pros and cons of open-source software like Linux versus for-pay software like Windows.  Obviously, no one in the Linux developer community is concerned enough about the various glitches I keep hitting to do something about them.  I still love the freedom of Linux and plan to continue using it on my personal single-user workstation.  But, much as I hate Microsoft, I am forced to admit that the nominal cost of Windows is a small price to pay for a 'sane' OS.

My plan for US economic recovery

The Democrats and Republicans keep arguing about the economy and taxes but without really connecting the two.  Because of the seeming disconnect I think it's necessary to go back to 'first principles', to sort of start over.

The first and most important fact, which I don't think the politicians ever mention, is that work creates wealth.  If you pay me to do something -- anything! -- for you, the profit I earn on my labor is newly-created wealth.  If, in turn, I then use this new wealth to pay YOU to do something for ME, then you have now created wealth!  Even though the physical cash is a 'zero sum' game, we both end up wealthier for having traded services.  (The same is true for selling goods.)  This is the reason GDP is so important.  It is supposed to be the measurement of all economic activity in our society and, all other things being equal, the more activity the better.  (This is also the reason that the Federal Reserve has been aggressively lowering interest rates, to try and promote economic activity of ANY KIND.)

The problem is what happens with this new wealth. 

Is it put back into the economy to pay for still-more goods and services, hence still more wealth?  Or is it put into savings accounts, effectively removing it from the 'active' side of the economy?  Obviously, we need to have some reasonable amount of savings so that there is money available for investment.  But, as Alan Greenspan correctly noted in his (post-Fed Chairmanship) autobiography, there has been a worldwide glut in savings for the past two decades.  The proof: a steady decline in the interest rates available to savers.

So, for Republicans to claim we need to continue reducing taxes on the rich is clearly false.  There is already far more than enough savings available for startups and capital investments.  Instead, the tax policies of the past decade (cough*Bush*cough) have simply allowed the ultra-rich to redirect a significant amount of wealth into their savings -- and away from more productive uses in the hands of workers. 

This isn't class warfare.  It's simple economics.

The solution, I feel, is a new corporate tax based on executive pay.  Ideally this would be implemented as part of a general tax SIMPLIFICATION plan.  History has shown that stockholders are unable to influence or control executive compensation, so I think the government needs to weigh in. 

Herewith, the new 'TOVAR POLICY' (though I don't think I'm really the first to say this):
Any corporation whose executives earn more than 20x the median income of its employees must pay an additional corporate tax.

If an executive committee really thinks their CEO is responsible for such a large share of their profits, they'll need to pay for it.  My hope, of course, is that they'll choose instead to lower executive compensation, or give raises to their employees!  The point being: a larger percentage of the wealth generated by the company should go to the employees and stockholders, where it will be returned to the economy, and less to the handful of ultra-rich executives who can't possibly spend it all.
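To make the threshold concrete, here is the rule worked through with entirely made-up numbers (both figures below are hypothetical, purely for illustration):

```shell
# Hypothetical figures: median employee income and top executive pay.
median_income=50000
executive_pay=12000000

# The proposed trigger: executive pay above 20x the employee median.
threshold=$(( 20 * median_income ))
if [ "$executive_pay" -gt "$threshold" ]; then
    echo "over the 20x threshold: additional corporate tax applies"
fi
```

In this example the threshold is $1,000,000, so the company would owe the additional tax unless it either raised the median (by paying workers more) or lowered executive pay.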

Friday, September 16, 2011

Home Theater Acoustics 101

Excellent intro to room acoustics!

Really, I should call it an excellent summary, as it covers all the most urgent things to do.

Saturday, August 27, 2011

Using CHNTPW to fix weird bug in Windows 7

I just spent a good many hours trying to fix a Windows 7 machine with a ‘broken’ login.  The user hadn’t changed their password (nor had it expired) but suddenly the computer insisted their password was wrong.  My first thought was just to reset (hack) the password.

I had previously used the ‘chntpw’ tool to manually tweak the built-in SAM password database on Windows XP – but did it work on Windows 7, too? I carry a flash-drive on my key-chain which boots Ubuntu 10.10 so I started there.  ‘Chntpw’ is in the 10.10 repository (the ‘Universe’ alternate) and I successfully installed it, then ran it against the Win7 machine’s hard-drive and SAM file. It said it had 'successfully' blanked the user’s password but Windows still reported ‘invalid password’. I also tried resetting to a new password; same result.

I then tried one of the official Windows 7 repair techniques. I booted the install disc, then selected Repair > Recovery Console. Once I had the command-line open I tried the following command which was supposed to activate the built-in Administrator account,
> net user administrator /active:yes

Unfortunately, even though this was using a ‘legit’ Microsoft process it still didn’t work; there was no Administrator login option.

At this point, I thought maybe there was a newer version of ‘chntpw’ so I booted a newer Ubuntu 11.04 LiveCD. But there was no version of ‘chntpw’ in its Universe repository. Instead I had to download the .deb from Launchpad (just google “ubuntu 11.04 chntpw”). Unfortunately, attempts to both blank and also reset the user password still failed.

Finally, I tried unlocking the Administrator account with chntpw and it finally worked! I still couldn’t login as the regular user-account but I was able to click on Switch User > Other User > manually enter ‘administrator’ (sans password) and it logged me in.
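For reference, the session that finally worked looks roughly like this from the Ubuntu live environment.  The partition device and mount point below are placeholders, and the actual unlock happens inside chntpw's interactive menu (choose the option to unlock/enable the account):

```shell
# Placeholders for this sketch; identify the real Windows partition first.
win_mount=/mnt/win
sam_path="$win_mount/Windows/System32/config/SAM"

# These steps need the real hardware, so they are shown commented out:
#   sudo mkdir -p "$win_mount"
#   sudo mount /dev/sda2 "$win_mount"
#   sudo chntpw -l "$sam_path"                 # list the local accounts
#   sudo chntpw -u Administrator "$sam_path"   # then pick the unlock/enable option
```

The SAM database always lives under Windows\System32\config on the system drive, which is why chntpw is pointed there.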

Now, here's the weird bug part:

From the Administrator login I was able to successfully reset the user's password and log in as them.  I immediately saw a weird error message, something about "user profile directory c:\windows\system32\config\Desktop not found."  The user's Desktop folder is not supposed to be under C:\Windows?!  Also, the system loaded a generic desktop.  I switched back to the Administrator login and searched the C:\Users folder.  The user's original Desktop folder was right where it was supposed to be, and I noticed something unusual: there was a copy of the files from a Norton Antivirus install CD filling the Desktop.  I also saw the AUTORUN.INF and, on a suspicion, I deleted it.  Voila!  The user's account now worked properly.

So, apparently there is an unpatched Microsoft bug wherein Windows will try to 'obey' an AUTORUN file in the user-profile directory.

Wednesday, July 13, 2011

Unacknowledged problems with Intel SATA?

For a long time I've had to deal with ongoing 'flakiness' problems with the hard-drives in our Dell laptops. Recently, it got so bad that I decided I had to do something about it! The only clue I had was the large number of 'device timeout' errors in the Windows XP Event Viewer. I had previously been told (by multiple co-workers) that the messages were false/cosmetic. They were easy to check for by applying a filter to the System event-log for either source= 'atapi' or source= 'iastor'.


The first thing I figured out was that some of the errors were cosmetic, but only on our newer E-series laptops (the E6400 and E6410).  They had only ever been set up with the older/generic version of the Intel SATA driver, 'iastor' v8.8.  Once I upgraded these machines to the newer v9.6 the errors went away.

I also figured out a relatively reliable way of testing for the error.  Without explicitly testing for it, I was seeing the 'timeout' errors every 1-4 weeks under normal usage.  If I left a machine logged-in and running overnight, however, it almost always reported new errors in the middle of the night.


Most importantly, I was now able to demonstrate that these errors were NOT cosmetic on any of our Latitude D630 laptops.  I spent about a MONTH swapping HDs between different machines, reimaging them, using the latest Intel SATA driver versus the older/default, even re-installing in AHCI vs ATA mode.  The results were conclusive: the timeout errors came from the motherboard.  And the problem only occurred on the machines I had purchased between January 2008 and October 2008, i.e., it really seemed to indicate a bad batch.

Unfortunately, by this point, most of my D630s were out of warranty; most by just a few months.  Dell was willing to replace the motherboards on all the in-warranty machines, of course, but I had a real argument with them about the rest.  I pointed out how the problem seemed to be a manufacturing defect and that it had taken me a LONG time to prove.  They ended up granting me an additional 60-day 'grace' period on the warranties.  So 7 of the 13 affected laptops are going to be repaired.


All of this got me thinking: Why do any of these machines show a SATA 'timeout' at all? I heard that the latest 'Sandy Bridge' chipset had been recalled because of an obvious defect in the SATA-2 controller. But is there a larger problem with Intel's chipsets? Why does the error occur on E-series laptops if you use an older revision of the Intel driver? Is it really a cosmetic error, or is it a universal problem which the newer version of the driver simply ignores?

Thursday, June 9, 2011

Comparing old JVC KW-XR810 to new KW-HDR720

I have the JVC KW-XR810 double-DIN receiver in my car and love it.  But it still has its limitations.  Most of all I miss having more EQ frequencies to play with, plus I'd like to have HD Radio.  (Though I suspect HD Radio is slowly dying.)

Now JVC has introduced the new KW-HDR720 receiver, the 'mid-range' model in their line-up of double-DIN radios.  The two units are very similar, and JVC has not made it easy to compare their detailed features.  So I have painstakingly reviewed both User Manuals and present my findings below.

older XR810 ($300 at Crutchfield)
  • includes Bluetooth adapter
  • supports HD Radio (tuner not included)
  • parametric EQ has 3 bands each with 3 frequency settings
  • subwoofer level can only be adjusted by digging through Menu settings
  • supports Sirius satellite radio
  • supports Bluetooth profiles: HFP 1.5, OPP 1.1, PBAP 1.0, A2DP 1.2, AVRCP 1.3
  • separate High-Pass and Low-Pass crossovers for mains vs subwoofer
  • dedicated 'Phone' button
  • 'full-dot' LCD
newer KDR720 ($220 at Crutchfield)
  • supports Bluetooth (adapter not included)
  • includes HD Radio
  • parametric EQ has 3 bands each with 4 frequency settings
  • subwoofer level adjustable directly from EQ window
  • no support for Sirius satellite radio
  • User Manual has no details about Bluetooth profiles
  • no separate High-Pass crossover for mains; just 'on/off' ?
  • dedicated 'Tag' button for HD Radio which doubles as 'Phone' when used with bluetooth
  • old-style 'segmented' LCD lettering
 They also have the 'low-end' KW-XR610 model, which is the same as the XR810 minus the $100 USB Bluetooth adapter; the KDR720 uses a different adapter.

I'm happy to see the additional EQ frequencies but wish they'd included even more.  Also, I'd like to know if the Bluetooth support has been improved.  My current setup (XR810 paired with an HTC Hero running Android 2.3.3) has inconsistent phonebook support and is also missing any audio-track display functionality (which I think requires AVRCP 1.4).

UPDATE: Ha!  No sooner had I written up all this than I noticed the following PDF on the JVC product web-page.  It shows that the KDR720 has the same level of Bluetooth audio support as the XR810, but not the phonebook support.  It also looks to have superior radio-tuner performance in general (not just with HD).
Comparison Chart of JVC Car Audio receivers

Friday, May 27, 2011

Is Sprint coverage getting worse?

In the past month I've noticed that I lose 3G data service in lots of places, and more importantly in places where I used to have plenty of coverage.  I decided to google "is sprint coverage getting worse" and learned that I am not alone.  In the past couple of months there have been lots of similar discussion threads.

The really significant one seems to be a detailed post on the Sprint Community forum.

My initial suspicion had been that Sprint was preparing to announce their "new 4G service" here in San Diego, and that they were re-purposing existing 3G towers for the effort.  Now I'm not so sure.  Even in areas with existing 4G service, the coverage map has dramatically shrunk recently.  The discussion on Sprint Community is too long for me to read right now but the gist seems to be that Sprint never really had full 3G coverage -- instead they were paying for 'seamless' roaming on other networks (cough*Verizon*cough) and now they're not.

I was already thinking of switching to T-Mobile for their better selection of Android phones and existing '4G' coverage.  This is now yet another reason to do so.

Sunday, May 15, 2011

Considering NAS and DLNA servers

My current network consists of a 24/7 Ubuntu workstation (with a dual 1 TB RAID) and a Win7 HTPC (also with dual 1 TB drives).  I am also seeding a bunch of torrents (mostly Ubuntu distros) and hosting the Squeezebox server for the Logitech Radio in our kitchen.

I am considering scrapping the Ubuntu workstation and reinstalling it with Win7 and an SSD.  I would like to save on our electricity bill and improve the file-sharing and backup.  So I am investigating Network Attached Storage (NAS) options.  So far the Netgear ReadyNAS seems like the best option as it's a Linux-based OS with add-ons for all the major services.  Right off the bat it has Tivo and DLNA support, and there are also plug-ins for the Transmission bittorrent app as well as for the Squeezebox server.  The ReadyNAS Ultra 2 has a single-core Intel Atom while the Ultra 2 Plus has dual-core.

The one thing the ReadyNAS wouldn't be able to do is host my new weather station (really just an Oregon Scientific RMS300A three-source temperature and humidity monitor).

I am also debating the usefulness of DLNA.  One of my friends has a new Samsung plasma HDTV with DLNA support and such a thing should -- in theory -- negate the need for an HTPC (and XBMC).  The problem seems to be the limited codec and encoding support in DLNA.  I assume this was on purpose, to prevent people from downloading ripped content and directly playing it on DLNA players.

Fortunately there's a workaround -- a DLNA server with transcoding!  Windows 7 ships with Windows Media Center, which includes a DLNA server, but unless the original files are already in DLNA-compatible formats you won't be able to play them on something like the Samsung TV.  I also already own the Nero 9 Suite, which includes the Nero Home DLNA server, but again I don't think that includes transcoding.

Instead, I have found the following two options for DLNA transcoding:

Mezzmo, $30

iSedora, $17 (single-device) / $48 (multi-device)

At this point the whole investigation is moot since I'm not about to scrap my primary workstation!  Since it's running Ubuntu 10.10, though, it'll eventually lose support/updates and that will certainly force my hand.

Saturday, April 16, 2011

How to make ATI 785G (4200) pass DD and DTS through HDMI

I just replaced my WinXP-based HTPC with a new Windows 7 Premium system. The new PC has an ATI 785G motherboard with the 4200 IGP video card. I have the HDMI output connected to my Denon receiver and everything seemed to be working ok.

When I tried playing a movie in XBMC, however, all I could get was stereo. I went into the System > System > Audio settings and set Output=HDMI and enabled both 'Dolby Digital capable' and 'DTS capable'. But then the movie went silent and stuttered badly. I tried the same test with Media Player Classic - Homecinema and got the same result. What did I have to do to get surround-sound?!

After much research I finally determined the following:
  1. The ATI 785G chipset supports DD and DTS passthrough on HDMI, but not with the standard ATI drivers!  (It does not, however, support HD audio passthrough, e.g. from Blu-ray, under any circumstances.)
  2. The standard ATI drivers create both an "ATI HDMI" audio device and a virtual Realtek SPDIF device.  Deleting the latter doesn't help (nor is it recreated).
  3. The audio chip on the motherboard is from Realtek, and they provide their own alternate (better!) driver.
  4. You can download the improved driver at their web-site under High Definition > ATI HDMI Audio Device.
Once I installed this alternate driver I was able to go into the Control Panel > Hardware and Sound > Manage Audio Devices > Realtek HDMI > Properties. Under the 'Custom' tab there's a new option for "Allow AC3/DTS/WMA output (Reboot required)". Enabling this, then rebooting, finally fixed the problem!

UPDATE 9/14/11 - I started having problems where the audio-link wouldn't sync-up (i.e., no sound) even though the Windows volume-control said it was working.  So I uninstalled the Realtek HDMI driver, then updated to the latest ATI Catalyst 11.08 driver.  Everything seems to be working now, and without the 'unofficial' fix!

Tuesday, April 12, 2011

New recommendation re printers for Ubuntu

I just heard that the forthcoming Ubuntu 11.04 will include automatic support for Epson printers! This is very exciting. Epson now joins HP and Brother in having good Linux/Ubuntu support. I have had too many problems with HP, however, to ever recommend their scanners or inkjets to people. I currently have a Brother fax/laser setup and it's working fine, if slow. (Not sure if there's a software problem or not, i.e., it might be trying to print at an overly-high resolution?)

As for recommending Ubuntu itself, I'm still on the fence. I continue to love Ubuntu and prefer it over Windows 7 for my personal work. But I continue to have weird hardware compatibility problems which come-and-go with different kernel updates, and my sound and fast-user-switch continue to fail. So I'm no longer so enthusiastic in my Ubuntu recommendations.

I've also been using Windows 7 a lot and haven't had any problems with it. Which is good, since you have to pay for it!

Tuesday, March 22, 2011

Configure XBMC 10 to use Media Player Classic

I use XBMC on a regular basis for my home-theatre PC and in general I love it. Some of its only limitations revolve around hardware acceleration. First of all, on my current Windows XP setup there is none! I have a new Windows 7 based machine in the works, however, and even then I find it limited. Specifically, there is no deinterlacing of my Sony camcorder recordings, which are 1080i AVCHD. So I set about configuring XBMC to selectively use Media Player Classic - Home Cinema edition (MPC-HC) for only those video files which require it.

MPC-HC is a DirectShow-based video player which has both its own built-in 'MPC' decoder as well as access to any installed DirectShow codec, e.g. Windows 7's DTV decoder.

Along the way I found a lot of gotchas in XBMC's support for external players, so I decided to write it up as this new post.

  • You can't simply edit the original configuration file in-place
  • If not done correctly the configuration file will make MPC-HC the default player
  • I haven't been able to make this work on Windows XP without MPC-HC becoming the default player -- regardless of the rules I specify.
  • The 'rule name' check also applies to the folder name, i.e., a 1080p mkv located in a folder called 'converted from 1080i' will still play back with MPC-HC!
  • The formatting is completely unforgiving (and arbitrary)! For instance, you MUST MUST MUST include the extra / character at the end of each new rule you add.
GENERAL PROCEDURE: To set up XBMC v10 on Windows 7 so that it uses MPC-HC to play back 1080i and .m2ts (AVCHD) files:
  1. Locate the 'playercorefactory.xml' in the folder,
  2. Copy it to your personal folder,
  3. Edit the new .xml to add the 2 sections below
*** Add this to the '<players>' section (edit the .exe path as necessary)
<player name="MPC-HC" type="ExternalPlayer" audio="true" video="true">
  <filename>C:\Program Files\Media Player Classic - Home Cinema\mpc-hc.exe</filename>
  <args>"{1}" /fullscreen /close</args>
  <hidexbmc>true</hidexbmc>
</player>
*** Add this to the '<rules>' section
<!-- Play "1080i" and *.m2ts with MPC-HC -->
<rules action="prepend">
  <rule filetypes="m2ts" player="MPC-HC"/>
  <rule name="1080i" player="MPC-HC"/>
</rules>

FINAL NOTE - Make sure everything is inserted before the final </rules> and </playercorefactory> closing tags!

UPDATE: I found that I had to specify <hidexbmc>true</hidexbmc>, or else XBMC interfered with MPC-HC. I've updated the example to use this setting.

Sunday, February 13, 2011

How I went about moving and updating my ‘trapped’ Wordpress site

As the volunteer web-editor for a local non-profit, I decided to replace our old-style SHTML site with a new Wordpress CMS-based site. I didn't want to replace the current/live site, however, until I had a complete replacement (and the approval of the management). So I created the new Wordpress site as a temporary sub-site of our existing site. Unfortunately, because of the provider's stupid install script, the new site was ‘trapped’ – I could neither update the Wordpress software nor reconfigure any plugins. The primary reason for this was that the provider had set it up as a (partly) shared install, i.e., some of the system files were in a shared read-only folder somewhere (not in my personal folders).

In order to remedy this I had to do the following:
  • make a backup of the new Wordpress sub-site
  • create a new properly-working copy in the new location
  • merge the two copies into a proper replacement site
  • confirm that I could update and configure everything properly
There were lots of instructions online about how to go about moving a Wordpress site but none seemed to apply to my situation -- un-trapping a script-created site!


I initially tried installing an up-to-date copy of Wordpress as a new install, and confirmed that it worked properly by installing all the same plugins and theme as before. I also restored the ‘wp-content’ files from my original site. I then used the Tools > Import function to restore the “WXR” (.XML) file I’d exported from the original site. Unfortunately, this only restored the Pages but none of the theme or widget settings. As a last attempt I dumped the new site's MySQL database and restored it from a tweaked copy of the original DB (where I’d done a search-and-replace for all the URLs). For whatever reason this caused the whole setup to reset to defaults. Fail.

  1. Deleted all the files in the new location and started over.
  2. Copied all the install files from a matched version of WordPress (v3.0.1 downloaded from their Release Archive).
  3. Manually updated only the wp-config.php.
  4. Copied all the files from the old site. I had to rename the script-created folder, 'wordpress-content', to match the standard folder-name, 'wp-content'
  5. Used the phpMyAdmin technique to download the SQL DB from the initial site in uncompressed text format.
  6. Manually fixed all the URL and folder references using search-and-replace:
    wp/wordpress-content >> wp-content
    wordpress-content >> wp-content
  7. Went into phpMyAdmin on the new site, dumped all the tables and restored the edited SQL file.
  8. Finally, I tested the Wordpress auto-update as well as installing/configuring new plugins, and it all works properly!
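The search-and-replace in step 6 can be sketched with sed. The INSERT lines below are made-up stand-ins for the real dump file; note that the order of the two substitutions matters, since running the shorter pattern first would leave a stray 'wp/' prefix behind:

```shell
# Stand-in for the SQL dump downloaded in step 5 (sample rows, not real data)
printf 'INSERT INTO wp_options VALUES ("/wp/wordpress-content/uploads");\n' > site.sql
printf 'INSERT INTO wp_options VALUES ("/wordpress-content/themes");\n' >> site.sql

# Rewrite the longer pattern first, then the shorter one
sed -i 's#wp/wordpress-content#wp-content#g' site.sql
sed -i 's#wordpress-content#wp-content#g' site.sql

cat site.sql
# INSERT INTO wp_options VALUES ("/wp-content/uploads");
# INSERT INTO wp_options VALUES ("/wp-content/themes");
```

The edited site.sql is then what gets restored through phpMyAdmin in step 7.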

My struggles with Wordpress and our hosting provider

I am the volunteer web-master for a local non-profit. The web-site I inherited was a mess of hand-coded HTML and SHTML and was a nightmare to update. For the past 3 years I’ve limited myself to correcting and updating the text, and I’ve made copious notes about which HTML editors work or don’t work.

Now I’ve decided to re-do the site using a modern CMS and in particular I’ve chosen Wordpress. I thought this would be an easy process since Wordpress is extremely popular and our hosting provider includes an auto-install script in their Control Panel. Boy, was I wrong about that ‘easy’ part! Yes, I was able to get the initial site set up in minutes, but then I quickly started running into roadblocks – of the arbitrary kind put up by the provider!

First, their install script only installed Wordpress 3.0.1, circa July 2010. I tried to update to the latest v3.0.5 using the built-in updater but got an error message about access rights. I asked the provider about this and they said they didn’t support updates by users. Period. No other explanation; support ticket = Closed. I had to submit another request to ask if they were taking responsibility for installing Wordpress security updates and they said yes. (And then closed that ticket, too.) Needless to say, they haven’t updated anything.

Since I was still just experimenting with the software I decided to give the security updates a pass, for the moment. But then I discovered that I could not install new plugins either! This time I picked up the phone and called the provider, and this time they were able to modify my installation to allow new plugins. So far, so good? No, because they also broke the site admin login! Yet another call, and finally everything seemed to be working.

THEN I realized that I couldn’t configure the plugins! The 'wp-plugins' folder was apparently marked read-only even though it was in my personal site, not a shared folder. As if to stress the absurdity of the situation, the provider now said they could not ‘support’ third-party software. At this point I resolved to do a manual installation (and to find us another hosting provider).

Well... yes, Wordpress is renowned for its ease of installation. Unfortunately, that’s dependent on your hosting provider supplying accurate and up-to-date information about your account configuration. I tried to use the provider’s online knowledgebase but their article re MySQL configuration had the wrong server-host information. Again, I was forced to call and – to their credit – they were immediately able to point me to the Control Panel > MySQL Database Manager, which listed the correct server-host. They also clarified that they use the standard MySQL port (3306). Finally, I was able to get a new manual install of WordPress to complete successfully.