Sunday, October 24, 2010


Hundreds of billions of dollars have been pumped into Africa since 1980 and it’s now poorer than it was then. What exactly did that money do?

The money built roads, schools, governments, wells and farms. It hasn't worked. Why not? I think the answer is clear: it hasn't worked because African culture doesn't work. There. I said it. Now I'll pause briefly so you can throw tomatoes at me... done? OK, moving on... Why do we hold culture up as something intrinsically good and untouchable? Why is it taboo for me to say "culture is the problem"? The only answer, I think, is that we are afraid of such an objective eye being turned on our own culture. Eliminating culture from the possible causes of poverty has prevented us from addressing it. To address culture, we need to build people, not roads. We need to build people, not schools; we need to build people, not governments. If we build roads, they'll get washed out; if we build schools, they'll crumble; if we build governments, they'll become kleptocracies. If we build people, they will build and maintain roads; if we build people, they will build and teach in schools; if we build people, they will build and run governments. If we build people, they will build a better culture.

What is building a person? In order to build a person we first need a standard for him to be built to: what is a good person? What is the ideal man? ...And we need to be realistic: we won't be able to build perfect people, and we won't be able to build the ideal man. But if we can make "the standard" the goal of the imperfect person himself, then I think we have succeeded.

Here is a standard I think most will agree on: someone who is honest; someone who cares most about things that matter most and least about things that matter least (I understand that "what matters most" needs to be defined, but for now I'm just going to say some things clearly don't make the cut, like soccer); someone who does to others what he would like to have done to himself; someone who works hard; someone who is loving, generous, kind, patient and persevering.

We now have a goal and a set of standards. The goal is to make adherence to the aforementioned standards the goal of the individual. Now why would he do such a thing? Why would he rebel against a culture that is more comfortable with cronyism and chronic shortsightedness? What will inspire him to such heights of altruism? ...I know only one man for the job: Jesus. Jesus teaches the values that Africa needs. Changing African culture has to happen by first changing individuals, and that doesn't mean abandoning the African arts or forcing Americanization down anyone's throat. The futility of continued operations in Africa is an embarrassment, one that reflects the embarrassing contents of human nature. While it's uncomfortable for some, I really think Africa's need for God has become quite inescapable. As Matthew Parris wrote in the December 27, 2008 issue of The Times: "As an atheist, I truly believe Africa needs God."

When atheists start advocating God, there's probably something to it. Africa will benefit when we escape the traditional faux pas and go with strategies that actually work.

OK, I understand that writing doesn't in and of itself help anyone, but I had to get this one off my chest, and I do plan to put hands and feet to these words.

Wednesday, October 20, 2010

HPalm Fails: Ignores the new Smartphone Paradigm

The HP (formerly Palm) Pre 2 has been announced but not yet launched. It will be a commercial failure.

How can I announce the failure of a product I haven't even touched? First, let me add that I'm not saying HPalm won't make money on it (they may). Now let's examine the flaws that scuttled the original Pre and have been unhappily inherited by the Pre 2.

The original Pre launched with arguably the best OS on the market: better integrated and smoother than Android, more capable and flexible than iOS. It was an absolutely world-class operating system. The reasons the original Pre was not the smashing success it could have been are:

1. Hardware shortcomings
2. Launching on a second-tier carrier
3. Mediocre Marketing

Point 3 is obvious if you've seen the downright weird commercials that accompanied the Pre's launch.
Point 2 is also obvious: Sprint's smaller user base means fewer potential customers; Sprint does not have the prestige of AT&T or VZW, and the cutting-edge consumers the Pre targeted are more carrier-prestige aware than most; also, Sprint's network is inferior to VZW's and (arguably) AT&T's.
Point 1... this was the single biggest factor in the Pre's mediocre sales and is also probably the most open to debate... so let's talk hardware.

First, let's start with what Palm got right. The Pre was the first top-tier smartphone to launch with an ARM Cortex-A8 SoC. Definitely no mistake there. It also had 256 MB of RAM: no mistake there. Its keyboard was well executed: no mistake. It looked attractive and fit nicely in your hand... again, no mistake there. Palm nailed all the details but failed on the broader, more general points of design, though in fairness to Palm it may have been impossible to spot this flaw at the time the device went into production.

The first jet-powered commercial aircraft, the de Havilland Comet, had four jet engines built into its wing roots. Later airliner designs had high wings and low wings; 2, 3 or 4 engines mounted inside, underneath and on top of the wing; T-tails and conventional tails; subsonic and supersonic cruise speeds. Now there is almost zero variation in modern passenger jets. They are all low-wing, conventional-tailed, subsonic aircraft with two engines mounted under the wing. Sure, there are a few models that still have four engines, but they are the exception, and none of them are selling very well right now. Cars have similarly converged to conformity; so have laptops, bubble gum and pretty much every other mature product. The reason all these products now look basically the same is that best has been found, and when you've found best... why go for anything else? Now best isn't the same for everyone, so there is room for outliers in all these product categories, but outliers can only ever occupy niches; they cannot hold high market-share positions.

With a 3.1" screen, the Pre 2 is an outlier. The last year has seen a trend toward ever-increasing screen sizes. First the iPhone's 3.5" screen seemed to be the standard, then the Motorola Droid launched with a 3.7" screen and was followed by a string of other 3.7" devices, then a slew of 4.0"-and-above handsets were launched. Currently the Droid X and EVO 4G are the screen-size champs at 4.3" each, and both are selling faster than HTC and Motorola can make them.
When given the choice between browsing the internet on a 3.1" screen and a 4.3" screen, everyone takes the 4.3" screen. This is hugely important because internet browsing is now a primary smartphone function. Similarly, when given the choice between viewing photos or movies on a 3.1" screen and a 4.3" screen, everyone again picks the latter, and while not as big a deal as internet browsing, this is still a significant issue for many users. 4.3" may not sound that much bigger than 3.1", but you have to keep in mind that screen area goes with the square of the diagonal, so actual screen area on the EVO is (4.3^2/3.1^2) 92% greater than the Pre's (ignoring aspect ratio differences). The advantages of a big screen are numerous and the advantages of a small screen are few; there's a good reason why screens have been trending bigger: they're better at most things for most people. Screen size alone makes the Pre 2 an "also-ran" device; I would guess that only 3% of prospective smartphone buyers would consider a phone with such a small screen.
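The area arithmetic above can be checked in a couple of lines (assuming, as noted, that both screens share the same aspect ratio, so area scales with the square of the diagonal):

```python
# Screen area scales with the square of the diagonal
# when aspect ratios are equal.
def area_ratio(diag_a, diag_b):
    """How much larger (as a fraction) screen A's area is than screen B's."""
    return (diag_a ** 2) / (diag_b ** 2) - 1.0

evo_vs_pre = area_ratio(4.3, 3.1)
print(f"EVO screen area is {evo_vs_pre:.0%} greater than the Pre's")  # 92%
```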
The hardware keyboard is also a factor. Apple's original iPhone was greeted with much skepticism for its lack of a physical keyboard; it was an extremely edgy and innovative move by Apple. And it paid off: good touch-screen keyboards work better than most physical keyboards, as I found out after purchasing a D1. This is why most high-end smartphones forgo the physical keyboard. There's still a decent market for smartphones with physical keyboards, but it represents only about 10% of the market.

With the screen eliminating 97% of prospective buyers and the keyboard leaving another 90% less than enthused, I've got $20 that says the Pre 2 will not achieve greater than a 1% market share. HPalm has ignored the new smartphone paradigm: they have not aimed for best in all categories, and so they will not be successful. Better luck next time, HPalm.
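Here's the back-of-the-envelope math behind that bet, using my own estimates from above (3% of buyers tolerate a 3.1" screen, 10% want a hardware keyboard). Treating those two preferences as independent is an assumption; in reality they probably overlap, which would only make the number smaller or larger by a little:

```python
# Rough addressable-market estimate for the Pre 2, using the
# guesses stated in the text. Independence of the two
# preferences is an assumption, not measured data.
screen_tolerant = 0.03   # fraction of buyers OK with a 3.1" screen
keyboard_seekers = 0.10  # fraction of buyers who want a physical keyboard

addressable = screen_tolerant * keyboard_seekers
print(f"Addressable market: {addressable:.1%} of smartphone buyers")  # 0.3%
```

Either way you slice it, the addressable slice lands well under the 1% line.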

Wednesday, October 6, 2010

What is Software's Jurisdiction?

Computers will seduce you. They will work perfectly every time for ten years. And then one day... they'll kill you.
-Dr. Art Draut,
Test Pilot,
Professor of Aerodynamics and Computer Programming

As a flight instructor, one of the first things I teach my students in the Cessna 172 is how to "lean" the "mixture" correctly. Most aircraft have separate controls for the throttle (black) and the mixture (red). The throttle controls how much fuel and air get sucked into each cylinder during the intake stroke of the Cessna's four-stroke reciprocating engine (unless the mixture is pulled full out, in which case the cylinders get all air and no fuel). The mixture controls exactly how much fuel is added to the air on its way into the cylinder. Cars originally didn't have mixture controls because they usually stay at around the same altitude. This meant that at higher elevations, or on hotter days, cars burned fuel inefficiently: they added the same amount of fuel at 4,000' as they did at sea level, but at 4,000' there is less air, so less fuel is required to achieve the optimum fuel/air ratio. Later, with the advent of electronic fuel injection, automakers quickly realized that they could add a chip to the engine that sensed air density and adjusted the mixture perfectly to ambient conditions for each individual intake stroke of the engine. This significantly increased both engine performance and engine efficiency.
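To see how much leaner the mixture needs to be up high, here's a rough sketch using the standard-atmosphere (ISA) density model and a 14.7:1 stoichiometric air/fuel ratio for gasoline. Both the ISA model and the 14.7:1 figure are textbook assumptions for illustration, not numbers from any Cessna manual; real leaning targets vary with power setting:

```python
# Standard-atmosphere density below the tropopause, and the fuel
# flow needed to hold a fixed air/fuel ratio as air thins out.
def density_ratio(altitude_ft):
    """Air density at altitude relative to sea level (ISA troposphere)."""
    h_m = altitude_ft * 0.3048
    t_ratio = 1.0 - 0.0065 * h_m / 288.15  # temperature lapse vs. sea level
    return t_ratio ** 4.256                # density varies as T^(g/(L*R) - 1)

def fuel_flow(air_flow_sea_level, altitude_ft, afr=14.7):
    """Fuel mass flow for a given sea-level air mass flow, leaned for altitude."""
    return air_flow_sea_level * density_ratio(altitude_ft) / afr

print(f"Air density at 4,000 ft: {density_ratio(4000):.1%} of sea level")
```

At 4,000' the model gives roughly 11% less air than at sea level, which is why a mixture set for sea level runs rich up there.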

The auto industry in the United States is about 700 times larger than the light-aircraft industry (by revenue), and the makers of aircraft engines did not have enough money to redesign their engines with electronic fuel (mixture) control. As a result, airplanes are stuck with vintage 1950s engines that still have mixture controls. Many old-school pilots are very happy about this, because they simply do not trust electronics. There is a fundamental trade-off that occurs when you put a computer in charge of something: performance goes up (hopefully), but complexity goes up as well; when you add electronic fuel control to an engine, you are adding one more part that can break and kill your engine. Electronics, however, are very reliable. A friend of mine is an engineer employed by Cessna, and he reports that mechanical components on Cessna aircraft must have a reliability rating of 10^-5 (1 failure in 100,000 cycles) or better, while electrical components must have a reliability rating of 10^-9 (1 failure in one billion cycles) or better. Electronics are especially reliable when they only have to do one thing and cannot be told to do other things, such as the chips that control the mixture in jet engines (which every jet engine in the sky today has). In the case of the Cessna's engine, I believe handing the job of leaning the mixture to a computer is a no-brainer: I lean the mixture several times a flight, but a computer will do it more than 1,000 times a second. Whenever the engine sits at a less-than-optimal mixture setting, it becomes more likely to suffer a failure in the future, so my relatively infrequent leaning is comparatively bad for engine health; thus, in this specific case, the risk of failure would likely go down.
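Those reliability ratings are easier to feel if you turn per-cycle rates into the chance of at least one failure over a component's life. The 100,000-cycle lifetime below is my own illustrative number, not a Cessna figure:

```python
# Chance of at least one failure over n cycles, given a per-cycle
# failure rate, using the ratings quoted above: mechanical parts
# at 1e-5 per cycle, electronics at 1e-9 per cycle.
def p_failure(per_cycle_rate, n_cycles):
    return 1.0 - (1.0 - per_cycle_rate) ** n_cycles

cycles = 100_000  # illustrative component lifetime
print(f"Mechanical: {p_failure(1e-5, cycles):.1%} chance of a failure")  # ~63%
print(f"Electronic: {p_failure(1e-9, cycles):.4%} chance of a failure")  # ~0.01%
```

Over the same lifetime, the mechanical part is more likely than not to fail at least once, while the electronic part almost certainly never does.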

Moving on from the Cessna to more serious hardware: the problem with electronics is that the more things you want them to do, the more likely they are to fail. The chips controlling the mixture on a GE90 have never failed, according to the NTSB database, but Windows (which is designed to do pretty much everything) fails every day. Windows not only has the disadvantage of being more complex because it's designed to do everything, it also has the disadvantage of being externally accessible. If your computer is physically connected to the internet, then it is possible for your computer to get hacked.

Continuing to even more serious hardware: the flight management system on a modern fighter jet is both complex (it must be aware of and correct for temperature, pressure, landing gear position, payload door position, payload weight, fuel weight, GPS data, flight regime, other aircraft, pitch, bank, roll, yaw, radar data and much, much more) and accessible. Modern fighter-jet FMS systems actually run an operating system that is upgradeable, and to be upgraded it must be accessible. It is only "internally" accessible, however (to my knowledge, anyway; this is really just an educated guess): you must physically connect to the aircraft to modify its software. It seems comical that software, something we usually associate with Microsoft Windows and unicorns named Charlie, could cause something as deadly serious as an air superiority fighter jet to crash, but that is exactly what happened to the world's best fighter jet (a prototype) in 1992 and then to a production aircraft in 2004.

Continuing to even more serious hardware: the American electrical grid is planning to "get smart." Last year the Obama administration set aside 3.4 billion dollars (for a total of 11 billion now) to make the electrical grid smarter, a move the DOE calls a necessity. It is estimated that intelligent grid control could make the system 10% more efficient... and guess what communications network is to be used to control this new dynamic system? That's right: the internet.
Let's pause for a second and think about America's electrical grid. Generating about 4 trillion kilowatt-hours of electricity a year and powering hundreds of millions of appliances, the electrical grid quite literally keeps us alive. Think I'm exaggerating? Let's think about what electricity does for us: it powers our computers, phones and lights; our freezers, refrigerators and home heating/cooling systems; our traffic lights, gas stations and air traffic control system. Maybe you don't need any more convincing: the electrical grid is a serious piece of hardware. If you dabble in the safety sciences you will frequently see a chart like this one:
Software first crept its way from the bottom left of this chart to the bottom right. Now it is climbing up from the bottom right toward the top right corner. While very few dispute that software makes cars, Cessnas and jet engines better, its use in fighter jets might give you pause. There is a concept in the safety sciences called "blood priority": it basically states that even though people know a problem exists, they won't fix it until it kills someone. As people get more comfortable with software, they allow it more leeway. As an (amateur) programmer myself, I love software, but it cannot solve everything, and it cannot be held responsible for anything. At some point or another, if we continue to allow software to expand its jurisdiction into more and more of our lives, we will reach a point of dangerous reliance and/or vulnerability. This is exactly what Doc Draut meant when he uttered the quote at the top of this page.
He does not mean that everyone will get killed by a T-1000 in ten years; he simply means that we humans need to remain ultimately in control of the software that runs our lives. And nobody is in full control of the internet, so I don't believe allowing it to run a life-critical system like the electrical grid is an acceptable risk. Referring to the above chart again, I think everyone agrees that the potential severity of a nationwide electrical failure belongs in the "catastrophic" row. The question is, which column does it belong in? Lockheed Martin seems to think it is not in the far-right column.

In the interests of not sounding like a prophet of doom I would like to end this post with this link.